[ https://issues.apache.org/jira/browse/MAHOUT-702?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13035823#comment-13035823 ]

Ted Dunning commented on MAHOUT-702:
------------------------------------

Nice work, Hector.

I have a few comments.

First, doesn't your train method destroy the input vector? That seems like bad
manners. I think you can get the effect you want, without making an additional
copy, by accumulating the update directly into the two rows of the weights
matrix.
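
For concreteness, here is a minimal sketch of the non-destructive update I
have in mind, assuming a Matrix of per-label weights and a step size tau
already computed from the PA loss; the method name and signature are just for
illustration, not code from your patch:

import java.util.Iterator;
import org.apache.mahout.math.Matrix;
import org.apache.mahout.math.Vector;

// Fold the scaled example directly into the two affected rows instead of
// scaling the caller's input vector in place.
private static void applyUpdate(Matrix weights, Vector instance,
                                int goodLabel, int badLabel, double tau) {
  Vector goodRow = weights.viewRow(goodLabel);
  Vector badRow = weights.viewRow(badLabel);
  Iterator<Vector.Element> nonZeros = instance.iterateNonZero();
  while (nonZeros.hasNext()) {
    Vector.Element e = nonZeros.next();
    goodRow.setQuick(e.index(), goodRow.getQuick(e.index()) + tau * e.get());
    badRow.setQuick(e.index(), badRow.getQuick(e.index()) - tau * e.get());
  }
}

Iterating only the non-zero elements also keeps the update cheap for sparse
inputs.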

Secondly, I see why you put your test into the existing test class: it let you
reuse some of that framework. My preference, however, is to keep a bit of
separation. What do you think about factoring out the common structure and
having both kinds of test extend the same abstract class?
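
Something like the following shape is what I mean; every name here is
hypothetical, and the shared fixture is only indicated in comments:

import org.junit.Test;
import org.apache.mahout.classifier.OnlineLearner;

// The shared fixture and assertions live in the abstract base; each
// concrete test only supplies its learner under test.
public abstract class AbstractOnlineLearnerTest {
  protected abstract OnlineLearner createLearner();

  @Test
  public void trainingReachesTargetAccuracy() throws Exception {
    OnlineLearner learner = createLearner();
    // common code: read the shared test data, run the training passes,
    // and assert a minimum held-out accuracy for the learner
  }
}

public final class PassiveAggressiveTest extends AbstractOnlineLearnerTest {
  @Override
  protected OnlineLearner createLearner() {
    return new PassiveAggressive(2, 100); // hypothetical constructor args
  }
}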

Also, does your PA learner have any regularization other than early stopping?
What about annealing of the learning rate?
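
For reference, the standard PA-I variant (Crammer et al., 2006) regularizes
by capping the per-example step with an aggressiveness parameter C, and a 1/t
schedule is one common way to anneal a rate. A sketch with illustrative
names, not code from the patch:

// Illustrative helpers only.
final class PaSchedules {
  // PA-I style regularization: the aggressiveness parameter c caps how
  // far any single example can move the weights (Crammer et al., 2006).
  static double paOneStep(double loss, double normSquared, double c) {
    return Math.min(c, loss / normSquared);
  }

  // A common 1/t annealing schedule: later examples perturb the model
  // less and less as the step count grows.
  static double annealedRate(double rate0, double decay, long step) {
    return rate0 / (1.0 + decay * step);
  }
}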

Finally, what do you think about putting this under a framework similar to
AdaptiveLogisticRegression in order to get auto-tuning of the learning rate?
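
To make that suggestion concrete, here is a toy version of the pattern;
nothing below is the actual AdaptiveLogisticRegression machinery, and every
name is hypothetical. The idea is to keep a small pool of candidate rates,
score each on held-out data, and periodically replace the worst with a
jittered copy of the best:

import java.util.Random;

// Toy sketch of AdaptiveLogisticRegression-style auto-tuning: a pool of
// candidate learning rates evolves toward whatever scores best.
final class RatePool {
  private final double[] rates = {0.1, 0.5, 1.0, 2.0};
  private final double[] scores = new double[rates.length];
  private final Random random = new Random();

  double rate(int candidate) {
    return rates[candidate];
  }

  void recordScore(int candidate, double heldOutAccuracy) {
    scores[candidate] = heldOutAccuracy;
  }

  // Replace the worst-scoring rate with a log-normal jitter of the best.
  void evolve() {
    int best = 0;
    int worst = 0;
    for (int i = 1; i < rates.length; i++) {
      if (scores[i] > scores[best]) { best = i; }
      if (scores[i] < scores[worst]) { worst = i; }
    }
    rates[worst] = rates[best] * Math.exp(0.1 * random.nextGaussian());
  }
}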


> Implement Online Passive Aggressive learner
> -------------------------------------------
>
>                 Key: MAHOUT-702
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-702
>             Project: Mahout
>          Issue Type: New Feature
>          Components: Classification
>    Affects Versions: 0.6
>            Reporter: Hector Yee
>            Priority: Minor
>         Attachments: MAHOUT-702.patch
>
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> Implements an online passive-aggressive learner that minimizes label-ranking
> loss.

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
