Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hama Wiki" for change 
notification.

The "MultiLayerPerceptron" page has been changed by YexiJiang:
https://wiki.apache.org/hama/MultiLayerPerceptron?action=diff&rev1=28&rev2=29

  The two phases alternate until the termination condition is met (for 
example, when a specified number of iterations has been reached).
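
As a rough, self-contained sketch (not Hama's implementation; `forwardPass` and `backwardPass` are placeholder names, and the error update is a stand-in), the alternation of the two phases looks like:

{{{
// Schematic training loop: the forward and backward phases alternate
// until a fixed iteration budget (the termination condition) is exhausted.
public class TrainingLoopSketch {
    static double error = 1.0;

    static void forwardPass()  { /* propagate inputs through the layers */ }
    static void backwardPass() { error *= 0.9; /* stand-in for weight updates */ }

    public static void main(String[] args) {
        int maxIterations = 50;  // termination condition: iteration budget
        for (int i = 0; i < maxIterations; i++) {
            forwardPass();   // phase 1: compute activations
            backwardPass();  // phase 2: propagate error, update weights
        }
        System.out.println(error < 0.01);  // error shrank over the iterations
    }
}
}}}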
  
  
- 
- 
  == How to use Multilayer Perceptron in Hama? ==
  
  MLP can be used for both regression and classification. For both tasks, we 
first need to initialize the MLP model by specifying its parameters. 
  
+ === Train the model ===
  For training, the following things need to be specified:
   * The '''''model topology''''': the number of neurons (besides the 
bias neuron) in each layer; whether the current layer is the final layer; and 
the type of squashing function.
   * The '''''learning rate''''': Specifies how aggressively the model learns from 
the training instances. A larger value can accelerate learning but decreases the 
chance of convergence. A value in the range (0, 0.5] is recommended.
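
The effect of the learning rate can be illustrated with plain gradient descent on a one-dimensional quadratic (a generic sketch, unrelated to Hama's API):

{{{
// Gradient descent on f(w) = w^2 (gradient 2w) with two learning rates.
// A moderate rate converges toward the minimum at 0; too large a rate
// overshoots on every step and the iterates diverge.
public class LearningRateSketch {
    static double descend(double rate, int steps) {
        double w = 1.0;
        for (int i = 0; i < steps; i++) {
            w -= rate * 2 * w;  // w <- w - rate * f'(w)
        }
        return Math.abs(w);
    }

    public static void main(String[] args) {
        System.out.println(descend(0.1, 20) < 0.1);  // small rate: converges
        System.out.println(descend(1.1, 20) > 1.0);  // large rate: diverges
    }
}
}}}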
@@ -93, +92 @@

  || convergence.check.interval || If this parameter is set, the model is 
checked for convergence whenever the iteration count is a multiple of this 
parameter. If the convergence condition is satisfied, training terminates 
immediately. ||
  || tasks || The number of concurrent tasks. ||
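
The role of convergence.check.interval can be sketched as a training loop that tests for convergence only every few iterations (a schematic, not Hama's implementation; the error update and threshold are stand-ins):

{{{
// Check a convergence condition only every `interval` iterations, and
// terminate early once the change in error falls below a threshold.
public class ConvergenceCheckSketch {
    public static void main(String[] args) {
        int maxIterations = 1000, interval = 10;
        double error = 1.0, previousError = Double.MAX_VALUE;
        int stoppedAt = maxIterations;
        for (int i = 1; i <= maxIterations; i++) {
            error *= 0.5;  // stand-in for one training iteration
            if (i % interval == 0) {                 // convergence.check.interval
                if (previousError - error < 1e-9) {  // converged: stop early
                    stoppedAt = i;
                    break;
                }
                previousError = error;
            }
        }
        System.out.println(stoppedAt < maxIterations);  // terminated early
    }
}
}}}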
  
+ === Use the trained model ===
+ 
+ Once the model is trained and stored, it can be reused later.
+ 
+ {{{
+   String modelPath = ...;  // the location of the existing model
+ 
+   DoubleVector features = ...; // the features of an instance
+   SmallLayeredNeuralNetwork ann = new SmallLayeredNeuralNetwork(modelPath);
+   DoubleVector labels = ann.getOutput(features);  // the label evaluated by the model
+ }}}
  
  === Two-class learning problem ===
  To be added...
