Function
Performs fixed mode bootstrapped back propagation training on an MLP. Bootstrapped training is useful when there are many examples of some classes and only a few examples of others. During training a subset of each class is randomly selected and used to train the network. After a specified number of epochs the training set is rebuilt and training continues.
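The sketch below (Python, illustrative only, not the BootMLP source) shows the bootstrapped resampling idea described above: a fixed number of examples is drawn at random from each class, and the working training set is rebuilt every so many epochs. The mlp.train_epoch call and the data layout are assumptions made for the sketch.

    import random

    def build_training_set(class_data, examples_per_class):
        # Draw a fixed-size random subset from each class and pool them.
        subset = []
        for data, n in zip(class_data, examples_per_class):
            subset.extend(random.sample(data, n))
        random.shuffle(subset)
        return subset

    def bootstrapped_training(mlp, class_data, examples_per_class,
                              epochs, change_after, learning_rate, momentum):
        training_set = build_training_set(class_data, examples_per_class)
        for epoch in range(epochs):
            # Rebuild the working training set every change_after epochs.
            if epoch > 0 and epoch % change_after == 0:
                training_set = build_training_set(class_data, examples_per_class)
            # Hypothetical call: train the network for one epoch on the subset.
            mlp.train_epoch(training_set, learning_rate, momentum)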
Usage
BootMLP parameterfile
Example Parameter File
WeightFile = Phoneme-01.wgt
DestinationFile = Phoneme-01-boot.wgt
#DataClasses = 2
DataFile = Phoneme-01-pos.trn
DataFile = Phoneme-01-neg.trn
Examples = 100
Examples = 300
Epochs = 1000
ChangeAfter = 10
LearningRate = 0.5
Momentum = 0.5
Batch = true
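The parameter file is a simple "key = value" format in which DataFile and Examples appear once per data class. A rough sketch of how such a file could be parsed is shown below; this is hypothetical code, not the parser used by BootMLP, and numeric values (Examples, Epochs, etc.) are left as strings.

    def parse_parameter_file(path):
        # Keys that may appear once per data class are collected into lists.
        repeatable = {"DataFile", "Examples"}
        params = {"DataFile": [], "Examples": []}
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#") and "=" not in line or "=" not in line:
                    continue
                key, value = (part.strip() for part in line.split("=", 1))
                if key in repeatable:
                    params[key].append(value)
                else:
                    params[key] = value
        return params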
Parameter File Explained
WeightFile : file to load the MLP to be trained from
DestinationFile : file to save the trained MLP to
#DataClasses : number of training data classes
DataFile : name of a training data file. The number of DataFile entries
must be equal to the number of data classes
Examples : number of examples to take from each training data file. The
number of Examples entries must be equal to the number of data classes,
and they are matched to the DataFile entries in order
Epochs : number of epochs to train for
ChangeAfter : number of epochs after which the training set is rebuilt
LearningRate : learning rate training parameter
Momentum : momentum training parameter
Batch : true = use batch mode training, false = don't
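As a rough illustration of how LearningRate, Momentum and Batch interact, the sketch below shows a generic gradient-descent weight update with momentum: in batch mode the gradients are accumulated over the whole epoch before the weights change, otherwise the weights are updated after every example. This is a generic MLP update assumed for illustration, not taken from the BootMLP source, and gradient_fn is a hypothetical helper.

    def train_epoch(weights, examples, gradient_fn, learning_rate, momentum,
                    batch, previous_delta):
        # gradient_fn(weights, example) is assumed to return the error gradient
        # for a single example, with the same shape as the weights.
        if batch:
            # Batch mode: sum the gradients over the epoch, then update once.
            grad = sum(gradient_fn(weights, ex) for ex in examples)
            delta = -learning_rate * grad + momentum * previous_delta
            weights = weights + delta
            previous_delta = delta
        else:
            # Per-example mode: update the weights after each example.
            for ex in examples:
                delta = (-learning_rate * gradient_fn(weights, ex)
                         + momentum * previous_delta)
                weights = weights + delta
                previous_delta = delta
        return weights, previous_delta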
This page is maintained by Michael Watts (http://mike.watts.net.nz)
Last modified on: 12/10/98.