
amirsaffari / online-multiclass-lpboost


Online Multi-Class LPBoost and Gradient Boosting

Home Page: http://www.ymer.org/amir/software/online-multiclass-lpboost/

License: MIT License

Languages: C++ 98.41%, Makefile 1.59%

online-multiclass-lpboost's Introduction

Online Multi-Class LPBoost

This is the original implementation of the Online Multi-Class LPBoost [1], Online Multi-Class Gradient Boost [1], and Online Random Forest algorithms [2].

Read the INSTALL file for build instructions.

Usage:

Input arguments:

  -h | --help : will display this message.
  -c : path to the config file.
  
  --ort : use Online Random Tree (ORT) algorithm.
  --orf : use Online Random Forest (ORF) algorithm.
  --omcb : use Online Multi-Class Gradient Boost (OMCBoost) algorithm.
  --omclp : use Online Multi-Class LPBoost (OMCLPBoost) algorithm.
  --larank : use Online LaRank algorithm.

  --train : train the classifier.
  --test : test the classifier.
  --t2 : train and test the classifier at the same time.

Example: ./OMCBoost -c conf/omcb.conf --omclp --train --test

Config file:

All settings for the classifier are passed via the config file. You can find example config files in the "conf" folder. The meaning of each setting is described below; a minimal illustrative sketch follows the setting descriptions.

Data:

  * trainData = path to the training data (features)
  * trainLabels = path to the training labels
  * testData = path to the test data (features)
  * testLabels = path to test labels

Forest:

  * maxDepth = maximum depth for a tree
  * numRandomTests = number of random tests for each node
  * counterThreshold = number of samples to be seen for an online node before splitting
  * numTrees = number of trees in the forest

LaRank:

  * larankC = C regularization parameter for LaRank SVM

Boosting:

  * numBases = number of weak learners
  * weakLearner = type of weak learners: 0: ORF, 1: LaRank
  * shrinkage = shrinkage factor for gradient boost
  * lossFunction = type of loss function for gradient boost: 0 = Exponential Loss, 1 = Logit Loss
  
  * C = regularization parameter for LPBoost
  * cacheSize = size of the cache for LPBoost
  * nuD = dual gradient descent step size
  * nuP = primal gradient descent step size
  * annealingRate = rate of decay for step sizes during the training
  * theta = theta factor for the augmented Lagrangian
  * numIterations = number of iterations per sample

Experimenter:

  * findTrainError = find the training error over time
  * numEpochs = number of online training epochs

Output:

  * savePath = path to save the results (not implemented yet)
  * verbose = defines the verbosity level (0: silence)
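
As an illustration, a config file using these settings might look like the sketch below. The project reads its settings with libconfig (see the #include <libconfig.h++> in src/hyperparameters.cpp), but the grouping, exact syntax, and all values here are assumptions made for illustration, and the file paths are placeholders; the files shipped in the "conf" folder are the authoritative reference.

  Data:
  {
    trainData   = "data/my-train.data";      # placeholder paths
    trainLabels = "data/my-train.labels";
    testData    = "data/my-test.data";
    testLabels  = "data/my-test.labels";
  };

  Forest:
  {
    maxDepth         = 10;
    numRandomTests   = 10;
    counterThreshold = 100;
    numTrees         = 100;
  };

  LaRank:
  {
    larankC = 1.0;
  };

  Boosting:
  {
    numBases      = 10;
    weakLearner   = 0;      # 0: ORF, 1: LaRank
    shrinkage     = 0.5;
    lossFunction  = 0;      # 0: Exponential Loss, 1: Logit Loss
    C             = 5.0;
    cacheSize     = 1;
    nuD           = 2.0;
    nuP           = 1e-6;
    annealingRate = 0.9999;
    theta         = 1.0;
    numIterations = 10;
  };

  Experimenter:
  {
    findTrainError = true;
    numEpochs      = 1;
  };

  Output:
  {
    savePath = "results/";  # not implemented yet
    verbose  = 1;
  };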

Data format:

The data format is a simple ASCII format. Data is represented as a matrix (vectors are matrices with one column). The first line of a data file gives the size of the matrix in the form: #numrows #numcols. The matrix itself starts on the next line. You can find a few datasets in the "data" folder; check their headers for examples.

Currently, there is only one limitation on the data files: the classes should be labeled with consecutive integers starting from 0. For example, for a 3-class problem the labels should come from the set {0, 1, 2}.
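
For illustration, a hypothetical training set with 4 samples and 3 features, and its matching label file, could look like this (the values are made up):

  4 3
   0.5   1.2  -0.3
   1.1   0.0   2.4
  -0.7   0.9   0.1
   2.2  -1.5   0.8

  4 1
  0
  1
  2
  0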

REFERENCES:

[1] Amir Saffari, Martin Godec, Thomas Pock, Christian Leistner, and Horst Bischof, "Online Multi-Class LPBoost", in IEEE Conference on Computer Vision and Pattern Recognition, 2010. PDF: http://www.ymer.org/papers/files/2010-OMCLPBoost.pdf

[2] Amir Saffari, Christian Leistner, Jakob Santner, Martin Godec, and Horst Bischof, "Online Random Forests", in 3rd IEEE ICCV Workshop on On-line Computer Vision, 2009. PDF: http://www.ymer.org/papers/files/2009-OnlineRandomForests.pdf

online-multiclass-lpboost's People

Contributors

amirsaffari


online-multiclass-lpboost's Issues

Got Wrong Label in Test

Hello,

I have applied Online Gradient Boosting based on decision tree regression to the MAGIC dataset (from the UCI Machine Learning Repository). As described for OMCBoost, I trained on the data. But when I predict with the model, it always returns the last label seen during training. For example, if the last training label is 1, it returns 1 for all test data. So I think I did something wrong when testing the model. Could you share pseudocode for testing an online Gradient Boosting model?

Thank you so much for your time.

errors in "make"

src/utilities.h: In function ‘double randDouble()’:
src/utilities.h:48:47: error: ‘getpid’ was not declared in this scope
srand(TV.tv_sec * TV.tv_usec + getpid() + getDevRandom());
^
Makefile:31: recipe for target 'src/online_rf.o' failed
make: *** [src/online_rf.o] Error 1
even though I installed eigen2 and libconfig-1.5.
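
A likely cause (an assumption, not verified against this checkout) is that newer GCC/glibc versions no longer declare getpid() transitively through the other headers; it lives in <unistd.h>, so adding the include near the top of src/utilities.h should let the srand() seeding line compile:

  // src/utilities.h (sketch): getpid() is declared in <unistd.h> on POSIX
  // systems, so include it explicitly for the srand() seeding call.
  #include <unistd.h>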

Samples assigned by fathernode are not updated in childnode's test

Hi,

In the random forest, I find that samples assigned by the father node (depth = 1) are not updated in the child node's (depth = 2) test. Thus, those inherited samples stay at the child node (depth = 2) and are never passed on to the grandchild node (depth = 3), because they are not in the sample set of any test. I noticed this when checking the number of samples across all leaf nodes: the sum of the sample counts over all leaf nodes does not equal the number of samples imported. I'd like to ask whether I have misunderstood your algorithm or whether this is a bug. The related code is at src/online_rf.cpp:75-85.

Thanks,
Yi

need more details on how/where to install Eigen

it seems it always reports
from src/experimenter.cpp:17:
src/data.h:23:22: fatal error: Eigen/Core: No such file or directory
#include <Eigen/Core>
^
compilation terminated.
In file included from src/OMCBoost.cpp:20:0:
src/data.h:23:22: fatal error: Eigen/Core: No such file or directory
#include <Eigen/Core>
^
compilation terminated.
In file included from src/classifier.h:17:0,
from src/classifier.cpp:14:
src/data.h:23:22: fatal error: Eigen/Core: No such file or directory
#include <Eigen/Core>
^
compilation terminated.

even though I successfully installed eigen2.
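
A common cause of this error (a guess, not confirmed for this setup) is that the Eigen 2 headers are installed under a prefix such as /usr/include/eigen2, which is not on the compiler's default include path, so <Eigen/Core> is never found. Something along these lines usually helps:

  # Assumes a Debian/Ubuntu-style system; package and path names may differ.
  sudo apt-get install libeigen2-dev     # or unpack the Eigen 2 sources manually
  # then point the build at the header location in the Makefile, e.g.
  #   CXXFLAGS += -I/usr/include/eigen2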

Error when building

I get this error when building the project using make
src/hyperparameters.cpp:15:25: fatal error: libconfig.h++: No such file or directory
#include <libconfig.h++>
^
compilation terminated.
make: *** [src/hyperparameters.o] Error 1
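
As with the Eigen issue above, this usually means (again an assumption) that the libconfig++ development headers are either not installed or not on the include/link paths:

  # Assumes a Debian/Ubuntu-style system; package names may differ.
  sudo apt-get install libconfig++-dev
  # or, if libconfig-1.5 was built from source into a custom prefix, add it
  # to the Makefile flags, e.g.
  #   CXXFLAGS += -I/path/to/libconfig/include
  #   LDFLAGS  += -L/path/to/libconfig/lib -lconfig++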
