[ https://issues.apache.org/jira/browse/MAHOUT-24?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12586829#action_12586829 ]

Isabel Drost commented on MAHOUT-24:
------------------------------------

It would be great if you added some unit tests to your code to show that it is 
working - and to prevent others from accidentally breaking it.

In addition, you should add a few comments to make clear which steps of the 
algorithm are implemented by which sections of the code. I just had a look at 
the code and compared it with what was published in the NIPS paper - maybe I am 
just a little blind today, but I have some trouble mapping your implementation 
back to the original algorithm description.

You could improve your code by using the new Mahout matrix package. It comes 
with basic matrix and vector operations that should make implementing the 
algorithm a lot easier.
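
For reference, the per-dimension sums the issue description mentions (ith line 
= sum(x[i]*x[ind]), gradient = 2nd line / 1st line in the classical 2D case) 
boil down to simple dot products. A minimal sketch in plain Java - class and 
method names here are illustrative, not from the attached patch, and this 
deliberately avoids assuming any particular Mahout matrix API:

```java
// Hypothetical sketch of the per-dimension sums described in the issue:
// line i = sum over all samples of x[i] * x[ind], where ind is the index
// of the independent variable. Names are illustrative only.
public class LwlrSums {

    // data[sample][dimension]; ind selects the independent variable.
    static double[] dimensionSums(double[][] data, int ind) {
        int dims = data[0].length;
        double[] sums = new double[dims];
        for (double[] row : data) {
            for (int i = 0; i < dims; i++) {
                sums[i] += row[i] * row[ind];
            }
        }
        return sums;
    }

    public static void main(String[] args) {
        // Classical 2D case: column 0 is x (independent), column 1 is y = 2x.
        double[][] data = { {1, 2}, {2, 4}, {3, 6} };
        double[] sums = dimensionSums(data, 0);
        // Gradient = 2nd line / 1st line, as the issue description states.
        double gradient = sums[1] / sums[0];
        System.out.println(gradient); // 2.0 for this data
    }
}
```

With the matrix package, each of these sums is just a vector dot product, 
which is exactly why switching to it should simplify the code.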

Isabel

> Skeletal LWLR implementation
> ----------------------------
>
>                 Key: MAHOUT-24
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-24
>             Project: Mahout
>          Issue Type: New Feature
>         Environment: n/a
>            Reporter: Samee Zahur
>         Attachments: LWLR.patch.tar.bz2
>
>
> This is a very skeletal but functional implementation of LWLR. It outputs n 
> lines, where n is the number of dimensions. The ith line = sum(x[i]*x[ind]), 
> where ind is the index of the independent variable. So the actual gradient = 
> 2nd line / 1st line for the classical 2D case.
> Contains a single small test case for demonstration.

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
