[ https://issues.apache.org/jira/browse/MAHOUT-1990?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16027850#comment-16027850 ]

ASF GitHub Bot commented on MAHOUT-1990:
----------------------------------------

GitHub user rawkintrevo opened a pull request:

    https://github.com/apache/mahout/pull/323

    MAHOUT-1990 [WIP] Add Multilayer Perceptron

    ### Purpose of PR:
    Implements a multilayer perceptron (MLP) style neural network
    
    Also introduces some SGD functionality, namely learning strategies (planned, but not implemented yet)
    
    Also introduces a classifier trait
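    
    A rough sketch of what the classifier trait and pluggable SGD learning strategies could look like (all names below are illustrative assumptions, not necessarily the identifiers introduced in this PR):
    
        import org.apache.mahout.math.{Matrix, Vector}
        
        // Hypothetical classifier trait: fit a model on features/target, then
        // use the fitted model to predict labels for new rows.
        trait Classifier[M] {
          def fit(features: Matrix, target: Vector): M
          def predict(model: M, features: Matrix): Vector
        }
        
        // Hypothetical pluggable SGD learning strategy: supplies the learning
        // rate to use at a given iteration (e.g. fixed vs. decaying).
        trait LearningStrategy {
          def rate(iteration: Int): Double
        }
        
        case class FixedRate(eta: Double) extends LearningStrategy {
          def rate(iteration: Int): Double = eta
        }
        
        case class DecayingRate(eta0: Double, decay: Double) extends LearningStrategy {
          def rate(iteration: Int): Double = eta0 / (1.0 + decay * iteration)
        }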
    
    
    ### Important ToDos
    Please mark each with an "x"
    - [x] A JIRA ticket exists (if not, please create this first)[https://issues.apache.org/jira/browse/ZEPPELIN/]
    - [x] Title of PR is "MAHOUT-XXXX Brief Description of Changes" where XXXX is the JIRA number.
    - [ ] Created unit tests where appropriate
    - [ ] Added licenses correct on newly added files
    - [x] Assigned JIRA to self
    - [ ] Added documentation in scala docs/java docs, and to website
    - [ ] Successfully built and ran all unit tests, verified that all tests pass locally.
    
    Does this change break earlier versions?
    **Yes.** Linear regression was refactored into a new sub-package, and a nonlinear package was created.
    
    Is this the beginning of a larger project for which a feature branch should be made?
    Possibly, but I expect to have it all working quickly enough that it can stay on my branch; however, I am open to the idea if someone else wants to chip in.
    


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/rawkintrevo/mahout mahout-1990

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/mahout/pull/323.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #323
    
----
commit d15a9c2730888aeb0973c173ccdaeb12957b2c53
Author: rawkintrevo <trevor.d.gr...@gmail.com>
Date:   2017-05-25T06:49:10Z

    initial incoreMLP work

commit ab2b1213b76f1f94297a617d7f465479825d7377
Author: rawkintrevo <trevor.d.gr...@gmail.com>
Date:   2017-05-27T15:48:07Z

    Distributed MLP

commit 226a884c8e7d520cd4656ab9d8cfd71647a84b0f
Author: rawkintrevo <trevor.d.gr...@gmail.com>
Date:   2017-05-28T04:30:45Z

    Distributed works now

commit 7ea981e53af692a43bc4b75c176b46b7ff6ce328
Author: rawkintrevo <trevor.d.gr...@gmail.com>
Date:   2017-05-28T14:05:47Z

    Distributed works now

commit c32585be8d0260c0a6000ebd6944c0b1a2c95595
Author: rawkintrevo <trevor.d.gr...@gmail.com>
Date:   2017-05-28T16:03:24Z

    Added regression and classifier wrappers

----


> Implement Multilayer Perceptron
> -------------------------------
>
>                 Key: MAHOUT-1990
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1990
>             Project: Mahout
>          Issue Type: Improvement
>          Components: Algorithms
>    Affects Versions: 0.13.2
>            Reporter: Trevor Grant
>            Assignee: Trevor Grant
>
> Following this strategy, it should:
> 1. implement incoreMLPs which can be 'plugged together' for purposes of backpropagation (this makes for easy extension into more complex networks)
> 2. implement a common distributed MLP which maps out incoreMLPs and then averages their parameters (see the sketch after this list)
> 3. provide regression and classifier wrappers around the base MLP to reduce code duplication
> 4. ideally, provide distributed and incore neural network 'traits' for a consistent API across all future neural networks.
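
As a rough illustration of point 2 above (averaging the parameters of independently trained incoreMLPs), here is a minimal sketch using Mahout's Scala matrix bindings; averageWeights and the per-block training it assumes are hypothetical, not the PR's actual API:

    import org.apache.mahout.math.Matrix
    import org.apache.mahout.math.scalabindings._
    import RLikeOps._

    // Each data block trains its own incore MLP locally; the distributed MLP
    // then combines them by element-wise averaging their weight matrices.
    def averageWeights(perBlockWeights: Seq[Matrix]): Matrix = {
      require(perBlockWeights.nonEmpty, "need at least one block of weights")
      val summed = perBlockWeights.reduce(_ + _)  // element-wise sum via R-like ops
      summed / perBlockWeights.size.toDouble      // scale by 1/n
    }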



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
