Re: org.apache.spark.ml.recommendation.ALS

2015-04-14 Thread Xiangrui Meng
Yes, I think the default Spark builds are on Scala 2.10. You need to follow the instructions at http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211 to build 2.11 packages. -Xiangrui On Mon, Apr 13, 2015 at 4:00 PM, Jay Katukuri jkatuk...@apple.com wrote: Hi Xiangrui,
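
[For reference, the page linked above describes the Spark 1.3-era procedure for producing Scala 2.11 artifacts: run dev/change-version-to-2.11.sh to switch the build's Scala version, then package with something like mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package. The exact Maven profiles depend on the target Hadoop version.]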

Re: org.apache.spark.ml.recommendation.ALS

2015-04-13 Thread Jay Katukuri
Hi Xiangrui, Here is the class: object ALSNew { def main (args: Array[String]) { val conf = new SparkConf() .setAppName("TrainingDataPurchase") .set("spark.executor.memory", "4g") conf.set("spark.shuffle.memoryFraction", "0.65") // default is 0.2
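
[The message is cut off by the archive, so here is a minimal, self-contained sketch of what an ALSNew driver on the new spark.ml API could look like in Spark 1.3. The input path, the comma-separated (user,item,rating) format, and the parameter values are illustrative assumptions, not the poster's actual code:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext
    import org.apache.spark.ml.recommendation.ALS

    object ALSNew {
      // Hypothetical rating record; the real schema is not shown in the thread.
      case class Rating(userId: Int, itemId: Int, rating: Float)

      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("TrainingDataPurchase")
          .set("spark.executor.memory", "4g")
          .set("spark.shuffle.memoryFraction", "0.65") // default is 0.2
        val sc = new SparkContext(conf)
        val sqlContext = new SQLContext(sc)
        import sqlContext.implicits._

        // Assumes comma-separated "user,item,rating" lines at args(0).
        val ratings = sc.textFile(args(0)).map { line =>
          val Array(u, i, r) = line.split(',')
          Rating(u.toInt, i.toInt, r.toFloat)
        }.toDF()

        // The ml ALS estimator reads its columns from a DataFrame.
        val model = new ALS()
          .setUserCol("userId")
          .setItemCol("itemId")
          .setRatingCol("rating")
          .setRank(10)
          .setMaxIter(10)
          .setRegParam(0.01)
          .fit(ratings)
      }
    }
]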

Re: org.apache.spark.ml.recommendation.ALS

2015-04-08 Thread Jay Katukuri
Some additional context: since I am using features of Spark 1.3.0, I downloaded Spark 1.3.0 and used spark-submit from there. The cluster is still on Spark 1.2.0. So it looks to me that, at runtime, the executors could not find some libraries of Spark 1.3.0, even though I ran
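
[One way to confirm this kind of compile-time/runtime mismatch is to print the Scala version the driver and the executors actually load. This is a diagnostic sketch assuming a live SparkContext named sc, not code from the original thread:

    // Driver-side Scala version.
    println("driver Scala: " + scala.util.Properties.versionString)

    // Executor-side Scala versions: run one tiny task per default partition
    // and collect the distinct version strings the executor JVMs report.
    val executorScala = sc.parallelize(1 to sc.defaultParallelism)
      .map(_ => scala.util.Properties.versionString)
      .distinct()
      .collect()
    println("executor Scala: " + executorScala.mkString(", "))
]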

Re: org.apache.spark.ml.recommendation.ALS

2015-04-06 Thread Xiangrui Meng
Please attach the full stack trace. -Xiangrui On Mon, Apr 6, 2015 at 12:06 PM, Jay Katukuri jkatuk...@apple.com wrote: Hi all, I got a runtime error while running the ALS. Exception in thread "main" java.lang.NoSuchMethodError:

Re: org.apache.spark.ml.recommendation.ALS

2015-04-06 Thread Jay Katukuri
Here is the command that I have used: spark-submit --class packagename.ALSNew --num-executors 100 --master yarn ALSNew.jar -jar spark-sql_2.11-1.3.0.jar hdfs://input_path Btw, I could run the old ALS in the mllib package. On Apr 6, 2015, at 12:32 PM, Xiangrui Meng men...@gmail.com wrote:
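
[Assuming standard spark-submit syntax, everything after the application jar is passed to the application's main method as arguments, so -jar spark-sql_2.11-1.3.0.jar here would reach ALSNew as an argument rather than land on the classpath; extra jars normally go in a --jars option placed before the application jar. More to the point, the _2.11 suffix on that dependency suggests the application was built against Scala 2.11 while the cluster's Spark build was on Scala 2.10, which is the mismatch Xiangrui's 2015-04-14 reply at the top of the thread resolves.]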

Re: org.apache.spark.ml.recommendation.ALS

2015-04-06 Thread Jay Katukuri
Hi, Here is the stack trace: Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror; at ALSNew$.main(ALSNew.scala:35) at ALSNew.main(ALSNew.scala) at
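
[This particular NoSuchMethodError on scala.reflect.api.JavaUniverse.runtimeMirror is the classic symptom of code compiled against Scala 2.11 running on a Scala 2.10 classpath, which matches the resolution later in the thread. A hypothetical build.sbt that would keep the application aligned with a Scala 2.10 cluster running Spark 1.3.0 might look like this (version numbers are illustrative):

    // Match the Scala version of the cluster's Spark build.
    scalaVersion := "2.10.4"

    // "provided" keeps Spark's own jars out of the assembly, so the
    // cluster's runtime versions are used instead of bundled copies.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-mllib" % "1.3.0" % "provided",
      "org.apache.spark" %% "spark-sql"   % "1.3.0" % "provided"
    )
]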

Re: org.apache.spark.ml.recommendation.ALS

2015-04-06 Thread Xiangrui Meng
So ALSNew.scala is your own application; did you run it with spark-submit or spark-shell? The correct command should look like spark-submit --class your.package.name.ALSNew ALSNew.jar [options] Please check the documentation: http://spark.apache.org/docs/latest/submitting-applications.html -Xiangrui