Provided by: mlpack-bin_4.5.0-1_amd64

NAME

       mlpack_adaboost - AdaBoost (adaptive boosting) classifier

SYNOPSIS

        mlpack_adaboost [-m model file] [-i int] [-l matrix file] [-T matrix file] [-e double]
        [-t matrix file] [-w string] [-M model file] [-P matrix file] [-p matrix file] [-h] [-v]
        [-V]

DESCRIPTION

       This program implements the AdaBoost (or Adaptive Boosting) algorithm. The variant of
       AdaBoost implemented here is AdaBoost.MH. It uses a weak learner, either decision stumps
       or perceptrons, and over many iterations builds a strong learner that is a weighted
       ensemble of weak learners. Iterations continue until the change in the weighted training
       error falls below a tolerance value.
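
       In the standard binary formulation, which AdaBoost.MH extends to the multiclass setting,
       the strong learner produced after T iterations is a weighted vote of the weak hypotheses
       h_t, each weighted by a coefficient alpha_t derived from its weighted training error:

              H(x) = sign( sum_{t=1}^{T} alpha_t h_t(x) )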

       For more information about the algorithm, see the paper "Improved Boosting Algorithms
       Using Confidence-Rated Predictions", by R.E. Schapire and Y. Singer.

       This program allows training of an AdaBoost model, and then application of that model to
       a test dataset. To train a model, a dataset must be passed with the '--training_file (-t)'
       option. Labels can be given with the '--labels_file (-l)' option; if no labels are
       specified, the labels will be assumed to be the last column of the input dataset.
       Alternately, an AdaBoost model may be loaded with the '--input_model_file (-m)' option.
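
       For instance, assuming a training dataset 'train.csv' (a hypothetical filename) whose
       last column contains the class labels, a model could be trained without a separate
       labels file:

       $ mlpack_adaboost --training_file train.csv --output_model_file model.bin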

       Once a model is trained or loaded, it may be used to provide class predictions for a
       given test dataset. A test dataset may be specified with the '--test_file (-T)'
       parameter. The predicted classes for each point in the test dataset are output to the
       '--predictions_file (-P)' output parameter. The AdaBoost model itself is output to the
       '--output_model_file (-M)' output parameter.

       For example, to run AdaBoost on an input dataset 'data.csv' with labels 'labels.csv' and
       perceptrons as the weak learner type, storing the trained model in 'model.bin', one could
       use the following command:

       $ mlpack_adaboost --training_file data.csv --labels_file labels.csv \
             --output_model_file model.bin --weak_learner perceptron

       Similarly, an already-trained model in 'model.bin' can be used to provide class
       predictions from test data 'test_data.csv' and store the output in 'predictions.csv'
       with the following command:

       $ mlpack_adaboost --input_model_file model.bin --test_file test_data.csv \
             --predictions_file predictions.csv
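
       Predicted class probabilities for each test point can be saved as well, and the boosting
       process can be tuned with '--iterations (-i)' and '--tolerance (-e)'. A sketch reusing
       the same hypothetical filenames, training and predicting in a single invocation:

       $ mlpack_adaboost --training_file data.csv --labels_file labels.csv \
             --iterations 500 --tolerance 1e-8 --test_file test_data.csv \
             --predictions_file predictions.csv --probabilities_file probabilities.csv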

OPTIONAL INPUT OPTIONS

       --help (-h) [bool]
              Print help information.

       --info [string]
              Print help on a specific option. Default value ''.

       --input_model_file (-m) [AdaBoostModel file]
              Input AdaBoost model.

       --iterations (-i) [int]
              The maximum number of boosting iterations to be run (0 will run until
              convergence). Default value 1000.

       --labels_file (-l) [1-d index matrix file]
              Labels for the training set.

       --test_file (-T) [2-d matrix file]
              Test dataset.

       --tolerance (-e) [double]
              The  tolerance  for change in values of the weighted error during training. Default
              value 1e-10.

       --training_file (-t) [2-d matrix file]
              Dataset for training AdaBoost.

       --verbose (-v) [bool]
              Display informational messages and the full list of parameters and  timers  at  the
              end of execution.

       --version (-V) [bool]
              Display the version of mlpack.

       --weak_learner (-w) [string]
              The type of weak learner to use: 'decision_stump', or 'perceptron'. Default
              value 'decision_stump'.

OPTIONAL OUTPUT OPTIONS

       --output_model_file (-M) [AdaBoostModel file]
              Output trained AdaBoost model.

       --predictions_file (-P) [1-d index matrix file]
              Predicted labels for the test set.

       --probabilities_file (-p) [2-d matrix file]
              Predicted class probabilities for each point in the test set.

ADDITIONAL INFORMATION

       For further information, including relevant papers, citations,  and  theory,  consult  the
       documentation found at http://www.mlpack.org or included with your distribution of mlpack.