Struct fann::FannTrainer

pub struct FannTrainer<'a> {
    // some fields omitted
}

A training configuration. Create this with Fann::on_data or Fann::on_file and run the training with train.
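For example, a basic (non-cascade) training run might look like this. This is a minimal sketch: it assumes the XOR example data shipped with FANN, a fully connected 2-3-1 network created with Fann::new, and that Fann::on_data borrows the network mutably; the epoch and error values are only illustrative.

let td = fann::TrainData::from_file("test_files/xor.data").unwrap();
let mut fann = fann::Fann::new(&[2, 3, 1]).unwrap();
// Train for at most 100_000 epochs, or until the MSE drops below 0.001.
fann.on_data(&td).train(100_000, 0.001).unwrap();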

Methods

impl<'a> FannTrainer<'a>

fn with_reports(self, interval: c_uint) -> FannTrainer<'a>

Enables periodic progress reports: a report is printed every time interval neurons have been added (for cascade training) or every interval epochs (otherwise).
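Continuing the sketch above, this prints a report every 100 epochs (the values are again only illustrative):

// Print a progress report every 100 training epochs.
fann.on_data(&td).with_reports(100).train(100_000, 0.001).unwrap();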

fn with_callback(self, interval: c_uint, callback: &'a Fn(&Fann, &TrainData, c_uint) -> CallbackResult) -> FannTrainer<'a>

Configures a callback to be called periodically during training. The callback runs every interval epochs (for regular training) or every time interval new neurons have been added (for cascade training). It receives as arguments (see the sketch after this list):

  • a reference to the current Fann,
  • a reference to the training data,
  • the number of steps (added neurons or epochs) taken so far.
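A hedged sketch of such a callback, reusing the bindings from the examples above; it assumes that CallbackResult has Continue and Stop variants (only Continue is used here) and that c_uint comes from the libc crate:

// Hypothetical progress callback: log the step count and keep training.
// (Assumes CallbackResult::Continue exists; return a stopping variant to end early.)
let callback = |_fann: &fann::Fann, _data: &fann::TrainData, steps: libc::c_uint| {
    println!("{} steps taken so far", steps);
    fann::CallbackResult::Continue
};
fann.on_data(&td)
    .with_callback(100, &callback)
    .train(100_000, 0.001)
    .unwrap();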

fn cascade(self) -> FannTrainer<'a>

Use the Cascade2 algorithm: this adds neurons to the neural network while training, starting from an ANN without any hidden layers. The network should use shortcut connections, so it needs to be created like this:

// The XOR example data shipped with FANN.
let td = fann::TrainData::from_file("test_files/xor.data").unwrap();
// A shortcut-connected network with no hidden layers: cascade training adds them.
let mut fann = fann::Fann::new_shortcut(&[td.num_input(), td.num_output()]).unwrap();
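Building on that example, cascade training could then be run like this. A minimal sketch: it assumes Fann::on_data borrows the network mutably, and the step count and error target are only illustrative.

// Cascade training: add at most 30 neurons, or stop once the MSE drops below 0.001,
// printing a report after every added neuron.
fann.on_data(&td)
    .cascade()
    .with_reports(1)
    .train(30, 0.001)
    .unwrap();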

fn train(&mut self, max_steps: c_uint, desired_error: c_float) -> FannResult<()>

Train the network until either the mean square error drops below desired_error or the maximum number of steps is reached. If cascade training is activated, max_steps is the number of neurons to add; otherwise it is the maximum number of training epochs.
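Since train returns a FannResult, errors can be handled instead of unwrapped; a short sketch reusing the bindings from above:

// Train for at most 100_000 epochs, or stop once the MSE drops below 0.001.
match fann.on_data(&td).train(100_000, 0.001) {
    Ok(()) => println!("training finished"),
    Err(e) => println!("training failed: {:?}", e),
}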