GitHub user avulanov commented on the pull request:

    https://github.com/apache/spark/pull/1290#issuecomment-67712918
  
    I've just updated this branch with the matrix form of back-propagation,
adapted from my old code https://github.com/avulanov/spark/tree/neuralnetwork.
It was slower than the code with loops, and I thought the problem was slow
parameter roll/unroll. Apparently, the problem was that I did not take
advantage of the stride of Breeze matrices. With that fixed, the
back-propagation itself is now slightly faster (~25%, depending on the matrix
sizes). A complementary benefit is that the algorithm's code is more readable.
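
    For the record, the fix amounts to creating the per-layer weight matrices
as views over the flat parameter vector instead of copying in and out of it.
Here is a minimal, self-contained sketch of that idea with Breeze; the
`topology` and `weights` names are illustrative, not the PR's actual code,
which may also pass an explicit majorStride:

```scala
import breeze.linalg.{DenseMatrix, DenseVector}

object UnrollSketch extends App {
  // Hypothetical topology: topology(i) is the number of units in layer i.
  val topology = Array(4, 5, 3)

  // Flat parameter vector holding all weight matrices back to back,
  // which is the form the optimizer works with.
  val numParams =
    (0 until topology.length - 1).map(i => topology(i + 1) * topology(i)).sum
  val weights = DenseVector.zeros[Double](numParams)

  // "Unroll" without copying: each weight matrix is a view over a slice of
  // the same underlying array, via DenseMatrix's (rows, cols, data, offset)
  // constructor. Writing to a matrix updates the flat vector in place, so
  // no roll step is needed either.
  var offset = 0
  val layerWeights = (0 until topology.length - 1).map { i =>
    val (rows, cols) = (topology(i + 1), topology(i))
    val w = new DenseMatrix(rows, cols, weights.data, weights.offset + offset)
    offset += rows * cols
    w
  }

  println(s"Created ${layerWeights.size} weight views over $numParams parameters")
}
```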
    
    @bgreeven Could you write a brief description for the ANN test called
"Gradient of ANN", so that readers can understand more clearly what it tests?
    
    @bgreeven @jkbradley @Lewuathe I was thinking more about the parameters for
the ANN. My main problem with the current approach is that one needs to
duplicate the optimizer's parameters instead of setting them directly, and thus
one has to implement the Cartesian product of ANN and optimizer parameters. It
would be great to separate the ANN parameters from the optimization parameters,
at least; this does not fit the current `.train` method pattern. For example,
there might be a public constructor for the ANN (or any ML model) that accepts
a topology and an error function. After initialization, one could run a
`trainWithXXX` method. Or, one could ask the instance for its optimizer via
`getOptimizer`, set its parameters directly, and then simply run `train()`. A
rough sketch of the latter pattern follows.
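
    To make the proposal concrete, here is a minimal, self-contained sketch of
the API shape; all names here (`ErrorFunction`, `Optimizer`, `ANNModel`, the
setter names) are illustrative stand-ins, not the PR's actual classes:

```scala
// Stand-in error functions; the real ANN would dispatch on these.
trait ErrorFunction
case object SquaredError extends ErrorFunction

// Stand-in optimizer whose parameters are set directly on it.
class Optimizer {
  private var numIterations = 100
  private var convergenceTol = 1e-4
  def setNumIterations(n: Int): this.type = { numIterations = n; this }
  def setConvergenceTol(t: Double): this.type = { convergenceTol = t; this }
}

class ANNModel

// ANN parameters (topology, error function) go into the constructor...
class ArtificialNeuralNetwork(topology: Array[Int], error: ErrorFunction) {
  private val optimizer = new Optimizer

  // ...while optimizer parameters are exposed for direct configuration,
  // so the ANN does not have to mirror every optimizer setting.
  def getOptimizer: Optimizer = optimizer

  def train(): ANNModel = new ANNModel // actual training elided in this sketch
}

object Example extends App {
  val ann = new ArtificialNeuralNetwork(Array(4, 5, 3), SquaredError)
  ann.getOptimizer.setNumIterations(500).setConvergenceTol(1e-6)
  val model = ann.train()
}
```

    This keeps the ANN's public surface small: new optimizer settings become
available through `getOptimizer` without adding matching setters to the ANN.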
    
    @mengxr Could you also join our discussion? We would really appreciate
your opinion.

