Provided by: mlpack-bin_3.2.2-3_amd64

NAME

       mlpack_adaboost - AdaBoost

SYNOPSIS

        mlpack_adaboost [-m unknown] [-i int] [-l string] [-T string] [-e double] [-t string] [-V bool] [-w string] [-o string] [-M unknown] [-P string] [-h -v]

DESCRIPTION

       This program implements the AdaBoost (or Adaptive Boosting) algorithm. The variant of AdaBoost
       implemented here is AdaBoost.MH. It uses a weak learner, either decision stumps or perceptrons, and over
       many iterations creates a strong learner that is a weighted ensemble of those weak learners. Iterations
       continue until the change in the weighted training error falls below a given tolerance.

       For more information about the algorithm, see the paper "Improved Boosting Algorithms  Using  Confidence-
       Rated Predictions", by R.E. Schapire and Y.  Singer.

       This  program allows training of an AdaBoost model, and then application of that model to a test dataset.
       To train a model, a dataset must be passed with the '--training_file (-t)' option. Labels  can  be  given
       with the '--labels_file (-l)' option; if no labels are specified, the labels will be assumed to be the
       last column of the input dataset. Alternatively, an AdaBoost model may be loaded with the
       '--input_model_file (-m)' option.
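
       For instance, assuming the labels are stored separately in a file 'labels.csv' (the file names here are
       only placeholders), a model with explicitly supplied labels could be trained as follows:

       $ mlpack_adaboost --training_file data.csv --labels_file labels.csv --output_model_file model.bin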

       Once  a model is trained or loaded, it may be used to provide class predictions for a given test dataset.
       A test dataset may be specified with the '--test_file (-T)' parameter. The predicted classes for each
       point  in  the  test  dataset are output to the '--predictions_file (-P)' output parameter.  The AdaBoost
       model itself is output to the '--output_model_file (-M)' output parameter.

       Note: the following parameter is deprecated and will be removed in mlpack  4.0.0:  '--output_file  (-o)'.
       Use '--predictions_file (-P)' instead of '--output_file (-o)'.

       For  example,  to  run AdaBoost on an input dataset 'data.csv' with perceptrons as the weak learner type,
       storing the trained model in 'model.bin', one could use the following command:

       $ mlpack_adaboost --training_file data.csv --output_model_file model.bin --weak_learner perceptron

       Similarly, an already-trained model in 'model.bin' can be used to provide  class  predictions  from  test
       data 'test_data.csv' and store the output in 'predictions.csv' with the following command:

       $ mlpack_adaboost --input_model_file model.bin --test_file test_data.csv \
             --predictions_file predictions.csv
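
       Training and prediction can also be combined in a single run. As a sketch (file names again placeholders),
       the following limits boosting to 100 iterations and tightens the convergence tolerance while classifying
       the test set in the same invocation:

       $ mlpack_adaboost --training_file data.csv --test_file test_data.csv --predictions_file predictions.csv \
             --iterations 100 --tolerance 1e-7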

OPTIONAL INPUT OPTIONS

       --help (-h) [bool]
               Print default help information.

       --info [string]
              Print help on a specific option. Default value ''.

       --input_model_file (-m) [unknown]
              Input AdaBoost model.

       --iterations (-i) [int]
               The maximum number of boosting iterations to be run (0 will run until convergence). Default value
               1000.

       --labels_file (-l) [string]
              Labels for the training set.

       --test_file (-T) [string]
              Test dataset.

       --tolerance (-e) [double]
              The tolerance for change in values of the weighted error during training. Default value 1e-10.

       --training_file (-t) [string]
              Dataset for training AdaBoost.

       --verbose (-v) [bool]
              Display informational messages and the full list of parameters and timers at the end of execution.

       --version (-V) [bool]
              Display the version of mlpack.

       --weak_learner (-w) [string]
               The type of weak learner to use: 'decision_stump' or 'perceptron'. Default value
               'decision_stump'.

OPTIONAL OUTPUT OPTIONS

       --output_file (-o) [string]
               Predicted labels for the test set. (Deprecated; use '--predictions_file (-P)' instead.)

       --output_model_file (-M) [unknown]
              Output trained AdaBoost model.

       --predictions_file (-P) [string]
              Predicted labels for the test set.

ADDITIONAL INFORMATION

       For further information, including relevant papers, citations,  and  theory,  consult  the  documentation
       found at http://www.mlpack.org or included with your distribution of mlpack.