Enum fann_sys::fann_train_enum
pub enum fann_train_enum { FANN_TRAIN_INCREMENTAL, FANN_TRAIN_BATCH, FANN_TRAIN_RPROP, FANN_TRAIN_QUICKPROP, }
The training algorithms used when training on fann_train_data with functions like fann_train_on_data or fann_train_on_file. Incremental training alters the weights after each input pattern is presented, while batch training alters the weights only once, after all the patterns have been presented.
Variants
FANN_TRAIN_INCREMENTAL | Standard backpropagation algorithm, where the weights are updated after each training pattern. This means that the weights are updated many times during a single epoch. For this reason some problems will train very fast with this algorithm, while other more advanced problems will not train very well. |
FANN_TRAIN_BATCH | Standard backpropagation algorithm, where the weights are updated after calculating the mean square error for the whole training set. This means that the weights are only updated once during an epoch. For this reason some problems will train slower with this algorithm. But since the mean square error is calculated more correctly than in incremental training, some problems will reach better solutions with this algorithm. |
FANN_TRAIN_RPROP | A more advanced batch training algorithm which achieves good results for many problems. The RPROP training algorithm is adaptive, and therefore does not use the learning rate. |
FANN_TRAIN_QUICKPROP | A more advanced batch training algorithm which achieves good results for many problems. The quickprop training algorithm uses the learning rate parameter along with other more advanced parameters. |