[ https://issues.apache.org/jira/browse/SPARK-1405?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14222024#comment-14222024 ]
Debasish Das edited comment on SPARK-1405 at 11/22/14 4:22 PM:
---------------------------------------------------------------
We also need a larger dataset, one where the number of topics goes into the
10000+ range. That range will stress factorization-based LSA formulations,
since the factors are broadcast at each step. The NIPS dataset is small; let's
start with that, but we should test a large dataset such as Wikipedia as well.
If there is a pre-processed version from either Mahout or scikit-learn, we can
use that.
was (Author: debasish83):
We also need a larger dataset, one where the number of topics goes into the
10000+ range. That range will stress factorization-based LSA formulations,
since the factors are broadcast at each step. The NIPS dataset is small. Would
you guys be willing to test a Wikipedia dataset, for example? If there is a
pre-processed version from either Mahout or scikit-learn, we can use that.
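
To make the broadcast concern above concrete, here is a minimal sketch of the
per-iteration pattern that hurts at large k: a dense k x V topic-word factor
re-broadcast on every pass. This is illustrative only, not code from the
patch, and all names (BroadcastCostSketch, phi, run) are hypothetical.

{code:scala}
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// Illustrative sketch: a dense k x V topic-word factor broadcast each iteration.
// At k = 10000 topics and V = 100000 words, 10000 * 100000 * 8 bytes = 8 GB is
// shipped to every executor on every pass, which is the cost discussed above.
object BroadcastCostSketch {
  def run(sc: SparkContext, docs: RDD[Array[Int]],
          vocabSize: Int, k: Int, iterations: Int): Unit = {
    val phi = Array.ofDim[Double](k, vocabSize) // dense factor held on the driver
    for (_ <- 1 to iterations) {
      val bcPhi = sc.broadcast(phi)             // whole factor re-broadcast each step
      docs.foreachPartition { it =>
        val local = bcPhi.value                 // every executor pulls the full matrix
        it.foreach { doc => () }                // accumulate statistics against `local` (omitted)
      }
      bcPhi.unpersist()                         // release before the next broadcast
      // driver-side update of phi from the gathered statistics omitted
    }
  }
}
{code}

The 10000+ topic regime matters precisely because this shipped matrix grows
linearly in k, while the per-iteration compute grows with it.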
> parallel Latent Dirichlet Allocation (LDA) atop of spark in MLlib
> -----------------------------------------------------------------
>
> Key: SPARK-1405
> URL: https://issues.apache.org/jira/browse/SPARK-1405
> Project: Spark
> Issue Type: New Feature
> Components: MLlib
> Reporter: Xusen Yin
> Assignee: Guoqiang Li
> Priority: Critical
> Labels: features
> Attachments: performance_comparison.png
>
> Original Estimate: 336h
> Remaining Estimate: 336h
>
> Latent Dirichlet Allocation (a.k.a. LDA) is a topic model that extracts
> topics from a text corpus. Unlike the current machine learning algorithms
> in MLlib, which rely on optimization algorithms such as gradient descent,
> LDA uses inference algorithms such as Gibbs sampling.
> In this PR, I prepare an LDA implementation based on Gibbs sampling, with a
> wholeTextFiles API (already solved), a word segmentation step (imported from
> Lucene), and a Gibbs sampling core.
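
For readers following along, here is a minimal, self-contained sketch of the
collapsed Gibbs update the description refers to. It is illustrative only, not
the patch's code; all names (GibbsStepSketch, sampleTopic, nDK, nWK, nK) are
hypothetical.

{code:scala}
import scala.util.Random

// Illustrative collapsed Gibbs update for a single token (not the patch's code).
// nDK(d)(t): topic counts in document d; nWK(w)(t): topic counts for word w;
// nK(t): global topic counts; alpha, beta: Dirichlet hyperparameters.
object GibbsStepSketch {
  def sampleTopic(w: Int, d: Int, oldTopic: Int,
                  nDK: Array[Array[Int]], nWK: Array[Array[Int]], nK: Array[Int],
                  alpha: Double, beta: Double, vocabSize: Int, rng: Random): Int = {
    val k = nK.length
    // Remove the token's current assignment from all counts.
    nDK(d)(oldTopic) -= 1; nWK(w)(oldTopic) -= 1; nK(oldTopic) -= 1
    // Unnormalized conditional p(z = t | everything else) for each topic t.
    val p = Array.tabulate(k) { t =>
      (nDK(d)(t) + alpha) * (nWK(w)(t) + beta) / (nK(t) + beta * vocabSize)
    }
    // Draw a topic index proportional to p.
    var u = rng.nextDouble() * p.sum
    var t = 0
    while (t < k - 1 && u > p(t)) { u -= p(t); t += 1 }
    // Add the token back under its newly sampled topic.
    nDK(d)(t) += 1; nWK(w)(t) += 1; nK(t) += 1
    t
  }
}
{code}

A distributed version would shard these count tables across partitions and
synchronize the global nWK/nK counts between sweeps, which is where the
broadcast cost discussed in the comment comes in.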