[ https://issues.apache.org/jira/browse/MAHOUT-703?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hector Yee updated MAHOUT-703:
------------------------------

    Status: Patch Available  (was: Open)

Working ranking neural net with one hidden sigmoid layer.

> Implement Gradient machine
> --------------------------
>
>                 Key: MAHOUT-703
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-703
>             Project: Mahout
>          Issue Type: New Feature
>          Components: Classification
>    Affects Versions: 0.6
>            Reporter: Hector Yee
>            Priority: Minor
>              Labels: features
>             Fix For: 0.6
>
>         Attachments: MAHOUT-703.patch
>
>   Original Estimate: 72h
>  Remaining Estimate: 72h
>
> Implement a gradient machine (a.k.a. neural network) that can be used for 
> classification or auto-encoding.
> It will have just an input layer, a hidden layer with an identity, sigmoid, 
> or tanh activation, and an output layer.
> Training is done by stochastic gradient descent (possibly mini-batch later).
> Sparsity will optionally be enforced by tweaking the bias in the hidden units.
> For now it will go in classifier/sgd; the auto-encoder will wrap it in the 
> filter unit later on.
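
For readers following along, below is a minimal standalone sketch of the architecture the description outlines: an input layer, one sigmoid hidden layer, and a single linear output score, trained by plain stochastic gradient descent. This is not the MAHOUT-703 patch itself; the class name, squared-error loss, and Gaussian initialization are assumptions for illustration, and the real GradientMachine in classifier/sgd may differ (for example, by using a ranking loss).

    import java.util.Random;

    // Illustrative sketch only (not Mahout API): input layer, one sigmoid
    // hidden layer, linear output, trained by SGD on squared error.
    public class GradientMachineSketch {
      private final double[][] wIn;   // hidden x input weights
      private final double[] bHidden; // hidden-unit biases (tweaking these can encourage sparsity)
      private final double[] wOut;    // hidden -> output weights
      private double bOut;
      private final double rate;      // SGD learning rate

      public GradientMachineSketch(int inputs, int hidden, double rate, long seed) {
        Random rng = new Random(seed);
        wIn = new double[hidden][inputs];
        bHidden = new double[hidden];
        wOut = new double[hidden];
        this.rate = rate;
        // Small random initial weights so hidden units break symmetry.
        for (int h = 0; h < hidden; h++) {
          wOut[h] = 0.01 * rng.nextGaussian();
          for (int i = 0; i < inputs; i++) {
            wIn[h][i] = 0.01 * rng.nextGaussian();
          }
        }
      }

      private static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

      // Forward pass: fills hiddenOut and returns the linear output score.
      public double score(double[] x, double[] hiddenOut) {
        double out = bOut;
        for (int h = 0; h < wOut.length; h++) {
          double s = bHidden[h];
          for (int i = 0; i < x.length; i++) {
            s += wIn[h][i] * x[i];
          }
          hiddenOut[h] = sigmoid(s);
          out += wOut[h] * hiddenOut[h];
        }
        return out;
      }

      // One SGD step on squared error for a single (x, target) example.
      public void train(double[] x, double target) {
        double[] hidden = new double[wOut.length];
        double err = score(x, hidden) - target;   // d(0.5*err^2)/d(output)
        for (int h = 0; h < wOut.length; h++) {
          // Back-propagate through the sigmoid before touching wOut[h].
          double hGrad = err * wOut[h] * hidden[h] * (1.0 - hidden[h]);
          wOut[h] -= rate * err * hidden[h];
          bHidden[h] -= rate * hGrad;
          for (int i = 0; i < x.length; i++) {
            wIn[h][i] -= rate * hGrad * x[i];
          }
        }
        bOut -= rate * err;
      }
    }

Swapping the sigmoid for identity or tanh only changes the activation and its derivative in score() and train(); everything else stays the same.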

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
