Hi Evan,
Thank you very much for your quick response.
I am using ALS to create the model; here is my method:
def doCollab() {
  val sc = new SparkContext("local[2]", "Log Query")
  val mc = new MLContext(sc)
  val pairs = mc.load("user_song_pairs", 1 to 2)
  val ratings = mc.load("user_ratings", 1)
  val als = new ALS()
  als.setBlocks(-1)
  als.setIterations(15)
  als.setRank(10)
  val model = als.run(ratings)
}
But first of all, MLContext could not be resolved. Am I creating the
context wrongly?
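Based on your earlier pointer to the spark-mllib artifact, I am guessing the missing pom entry would look something like the following; the artifact name is my assumption, following the spark-core naming convention, so please correct me if it is wrong:

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-mllib_2.9.3</artifactId>
  <version>0.8.0-incubating</version>
</dependency>
```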
Secondly, ALS has parameters like:
- *rank* is the number of latent factors in our model.
- *iterations* is the number of iterations to run.
- *lambda* specifies the regularization parameter in ALS.
But I could not find any example values for these parameters. Can you
explain them a bit more and give some example values?
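In case it helps others reading the thread, here is a minimal sketch of what I understand the plain MLlib entry point to be in Spark 0.8.0, with some guessed starting values for the parameters. The input path and the comma-separated file format are hypothetical, and the rank/iterations/lambda values are just my assumed starting points, not recommendations from the docs:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.mllib.recommendation.{ALS, Rating}

object AlsSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext("local[2]", "ALS Sketch")

    // Hypothetical input file: one "userId,songId,rating" triple per line.
    val ratings = sc.textFile("user_ratings.csv").map { line =>
      val Array(user, song, rating) = line.split(',')
      Rating(user.toInt, song.toInt, rating.toDouble)
    }

    // Guessed starting values: rank = 10 latent factors,
    // 15 iterations, lambda = 0.01 for regularization.
    val model = ALS.train(ratings, 10, 15, 0.01)

    // Predict a single (user, song) rating.
    val predicted = model.predict(1, 42)
    println(predicted)

    sc.stop()
  }
}
```

Here `ALS.train` returns a `MatrixFactorizationModel` whose `predict` method gives a rating estimate for a (user, product) pair.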
BR,
Aslan
On Fri, Nov 29, 2013 at 9:03 PM, Evan Sparks <[email protected]> wrote:
> Hi Aslan,
>
> You'll need to link against the spark-mllib artifact. The method we have
> currently for collaborative filtering is ALS.
>
> Documentation is available here -
> http://spark.incubator.apache.org/docs/latest/mllib-guide.html
>
> We're working on a more complete ALS tutorial, and will link to it from
> that page when it's ready.
>
> - Evan
>
> > On Nov 29, 2013, at 10:33 AM, Aslan Bekirov <[email protected]>
> wrote:
> >
> > Hi All,
> >
> > I am trying to do collaborative filtering with MLbase. I am using Spark
> 0.8.0.
> >
> > I have some basic questions.
> >
> > 1) I am using Maven and added this dependency to my pom:
> > <dependency>
> >   <groupId>org.apache.spark</groupId>
> >   <artifactId>spark-core_2.9.3</artifactId>
> >   <version>0.8.0-incubating</version>
> > </dependency>
> >
> > I could not see any MLbase-related classes in the downloaded jar; that is
> why I could not import the mli libraries. Am I missing something? Do I have
> to add another dependency for mli?
> >
> > 2) Does a Java API exist for MLbase?
> >
> > Thanks in advance,
> >
> > BR,
> > Aslan
>