Hi Felix, you are correct: the current implementation is a simple
online/stochastic gradient descent network that uses back-propagation for
optimization. The user can set the number of layers, the number of neurons
in each layer, and various parameters (such as the learning rate,
regularization weight, etc.). The CLI version simplifies the configuration
because basic users do not need that many parameters.
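
To make the update rule concrete, here is a minimal sketch in Java of one
online SGD step with a fixed learning rate and an L2 penalty (the method
and variable names are illustrative, not the actual Mahout API):

    // One online SGD update for a single training example: apply the
    // gradients computed by back-propagation, plus an L2 regularization
    // term, using a fixed learning rate. Names are illustrative only.
    static void sgdStep(double[][] weights, double[][] gradients,
                        double learningRate, double regularizationWeight) {
      for (int i = 0; i < weights.length; i++) {
        for (int j = 0; j < weights[i].length; j++) {
          double g = gradients[i][j] + regularizationWeight * weights[i][j];
          weights[i][j] -= learningRate * g;
        }
      }
    }

Because this update runs once per training example rather than once per
pass over the whole set, the descent is online/stochastic rather than batch.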

Regards,
Yexi


2014-07-14 7:36 GMT-07:00 Felix Schüler (JIRA) <[email protected]>:

>
>     [
> https://issues.apache.org/jira/browse/MAHOUT-1551?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14060688#comment-14060688
> ]
>
> Felix Schüler commented on MAHOUT-1551:
> ---------------------------------------
>
> Ted, thanks for the feedback!
> As far as we understand it, the implementation is a simple
> online/stochastic gradient descent using backpropagation to calculate the
> gradients of the error function. Weights are then updated with a fixed
> learning rate. As we (I always say 'we' because I am working on it with
> someone else for a university class) have described in MAHOUT-1388, the
> CLI version only performs a fixed number of iterations, n, where n is the
> size of the training set. So each example is fed into the network exactly
> once, which, in the case of a dataset as small as the iris dataset, does
> not lead to acceptable performance. The unit test for the MLP iterates
> 2000 times through the dataset to achieve good performance, but as far as
> we can tell, the stopping criterion does not depend on learning progress
> or weight updates even though regularization is implemented.
> We could add this information to the implementation section of the
> documentation.
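>
> To make the two regimes concrete, a short Java sketch (the Mlp type and
> the trainOnline method are illustrative placeholders, not the actual
> Mahout classes):
>
>     // CLI behaviour: a single pass, so each example updates the weights
>     // exactly once (n updates for a training set of size n).
>     static void trainSinglePass(double[][] trainingSet, Mlp mlp) {
>       for (double[] example : trainingSet) {
>         mlp.trainOnline(example);
>       }
>     }
>
>     // Unit-test behaviour: 2000 full passes over the dataset; the fixed
>     // epoch count alone decides when training stops, independent of how
>     // the weights evolve.
>     static void trainLikeUnitTest(double[][] trainingSet, Mlp mlp) {
>       for (int epoch = 0; epoch < 2000; epoch++) {
>         trainSinglePass(trainingSet, mlp);
>       }
>     }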
>
> As for the DSL, we are very tempted to implement the MLP or a more general
> neural network framework. We will think about it and see if we can find the
> time.
>
> > Add document to describe how to use mlp with command line
> > ---------------------------------------------------------
> >
> >                 Key: MAHOUT-1551
> >                 URL: https://issues.apache.org/jira/browse/MAHOUT-1551
> >             Project: Mahout
> >          Issue Type: Documentation
> >          Components: Classification, CLI, Documentation
> >    Affects Versions: 0.9
> >            Reporter: Yexi Jiang
> >              Labels: documentation
> >             Fix For: 1.0
> >
> >         Attachments: README.md
> >
> >
> > Add documentation about the usage of the multi-layer perceptron on the
> > command line.
