[ https://issues.apache.org/jira/browse/MAHOUT-703?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13044365#comment-13044365 ]

Sean Owen commented on MAHOUT-703:
----------------------------------

Another good one, Hector, and hearing no grunts of objection from Ted, let's 
put it in. I have a few small style points on your patches.

- We'll need to use the standard Apache license header
- Class description can/should go in the class javadoc, not above the package 
statement
- Java variable naming convention is camelCase rather than camel_case
- Careful of the javadoc -- it has to start with /** to be read as such
- Go ahead and use braces and a newline with every control flow statement 
including ifs

- In train(), outputActivation is not used?
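For illustration, the conventions above might look like the following. This is a hypothetical class (the names and the decay logic are made up for the example, not taken from the patch):

```java
/**
 * Class description goes here, in the class javadoc, not above the
 * package statement. Note the comment opens with slash-star-star so
 * the javadoc tool actually picks it up.
 */
public class StyleExample {

  // camelCase, not camel_case
  private final double learningRate = 0.1;

  /** Returns the learning rate decayed by the number of steps taken. */
  public double decay(int numSteps) {
    double rate = learningRate;
    // braces and a newline with every control flow statement, including ifs
    if (numSteps > 0) {
      rate = learningRate / numSteps;
    }
    return rate;
  }

  public static void main(String[] args) {
    System.out.println(new StyleExample().decay(4)); // prints 0.025
  }
}
```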

> Implement Gradient machine
> --------------------------
>
>                 Key: MAHOUT-703
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-703
>             Project: Mahout
>          Issue Type: New Feature
>          Components: Classification
>    Affects Versions: 0.6
>            Reporter: Hector Yee
>            Priority: Minor
>              Labels: features
>             Fix For: 0.6
>
>         Attachments: MAHOUT-703.patch
>
>   Original Estimate: 72h
>  Remaining Estimate: 72h
>
> Implement a gradient machine (aka a 'neural network') that can be used for 
> classification or auto-encoding.
> It will just have an input layer, identity, sigmoid or tanh hidden layer and 
> an output layer.
> Training will be done by stochastic gradient descent (possibly mini-batch later).
> Sparsity will be optionally enforced by tweaking the bias in the hidden unit.
> For now it will go in classifier/sgd and the auto-encoder will wrap it in the 
> filter unit later on.

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
