[
https://issues.apache.org/jira/browse/MAHOUT-817?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13176388#comment-13176388
]
Dmitriy Lyubimov commented on MAHOUT-817:
-----------------------------------------
btw this patch doesn't address the "fold-in" and "fold-out" use cases, which
are basically special cases of SVD fold-in adjusted for row-wise input and
the PCA offset.
Do we want to leave those out of scope? Generally it doesn't make sense to do
this stuff in a batch; it's better done in real time, which requires an
indexing mechanism for V (and U). Other than that, fold-in is a simple
multiplication operation, so perhaps we could just engineer it using the
regular distributed matrix operations? I have never investigated batch
fold-in with Mahout.
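To make the "simple multiplication" concrete: given a truncated SVD A ~= U * Sigma * V^T, folding in a new row a amounts to u_new = a * V * Sigma^{-1}, with the PCA offset subtracted first when the decomposition was computed on mean-centered data. A minimal numpy sketch of that idea (function name and shapes are illustrative, not Mahout API):

```python
import numpy as np

def fold_in_row(a_new, V, sigma, xi=None):
    """Fold a new row into the left singular space: u_new = a V Sigma^{-1}.

    a_new : (n,) new observation row
    V     : (n, k) right singular vectors
    sigma : (k,) singular values
    xi    : (n,) optional PCA mean offset to subtract first
    """
    if xi is not None:
        a_new = a_new - xi          # center the row like the training data
    return (a_new @ V) / sigma

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Folding in a row that was already in A should reproduce its row of U.
u0 = fold_in_row(A[0], Vt.T, s)
print(np.allclose(u0, U[0]))  # True
```

The real-time version of this is just the same product served per request, which is where the need for an index over V (and U) comes from.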
> Add PCA options to SSVD code
> ----------------------------
>
> Key: MAHOUT-817
> URL: https://issues.apache.org/jira/browse/MAHOUT-817
> Project: Mahout
> Issue Type: New Feature
> Affects Versions: 0.6
> Reporter: Dmitriy Lyubimov
> Assignee: Dmitriy Lyubimov
> Fix For: Backlog
>
> Attachments: MAHOUT-817.patch, SSVD-PCA options.pdf, ssvd-tests.R,
> ssvd.R, ssvd.m
>
>
> It seems that a simple solution should exist to integrate PCA mean
> subtraction into the SSVD algorithm without making it a prerequisite step
> and without densifying the big input.
> Several approaches were suggested:
> 1) subtract mean off B
> 2) propagate mean vector deeper into algorithm algebraically where the data
> is already collapsed to smaller matrices
> 3) --?
> It needs some math done first. I'll take a stab at (1) and (2), but
> thoughts and math are welcome.
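One way to read option (1): the mean subtraction can be pushed into B algebraically, since B' = Q^T(A - 1*xi^T) = Q^T A - (Q^T 1)*xi^T, so the sparse input A is never densified. A small numpy sketch checking that identity (all names here are illustrative, not Mahout code):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, k = 8, 5, 3
A = rng.standard_normal((m, n))
xi = A.mean(axis=0)                     # column mean (PCA offset)
Q, _ = np.linalg.qr(rng.standard_normal((m, k)), mode='reduced')

# Naive: densify the input by subtracting the mean, then project.
B_dense = Q.T @ (A - xi)

# Algebraic: B' = Q^T A - (Q^T 1) xi^T, never touching A's sparsity.
s = Q.T @ np.ones(m)                    # the small k-vector Q^T * 1
B_corrected = Q.T @ A - np.outer(s, xi)

print(np.allclose(B_dense, B_corrected))  # True
```

Only the k-vector Q^T * 1 and the mean xi are needed for the correction, both of which are small, which is the appeal of subtracting the mean off B rather than off A.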
--
This message is automatically generated by JIRA.