Yes, I think the default Spark builds are on Scala 2.10. You need to
follow instructions at
http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211
to build 2.11 packages. -Xiangrui
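For reference, the linked build instructions boil down to roughly the following (a sketch based on the Spark 1.3-era docs; the script and profile names may differ in other Spark versions):

```shell
# Assumption: run from the root of a Spark 1.3.x source checkout.
# Switch the build's Scala version from 2.10 to 2.11, then build
# with the scala-2.11 profile enabled.
dev/change-version-to-2.11.sh
mvn -Pyarn -Dscala-2.11 -DskipTests clean package
```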
On Mon, Apr 13, 2015 at 4:00 PM, Jay Katukuri jkatuk...@apple.com wrote:
Hi Xiangrui,
Here is the class:
object ALSNew {
  def main(args: Array[String]) {
    val conf = new SparkConf()
      .setAppName("TrainingDataPurchase")
      .set("spark.executor.memory", "4g")
    conf.set("spark.shuffle.memoryFraction", "0.65") // default is 0.2
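As an aside, the NoSuchMethodError on scala.reflect discussed in this thread is the classic symptom of a Scala binary-version mismatch (a jar compiled against 2.11 running on a 2.10 cluster). A minimal sketch to confirm which Scala version the JVM is actually running (object name is my own; `versionNumberString` is part of the standard library):

```scala
// Minimal version-mismatch diagnostic: prints the Scala version of the
// running JVM so it can be compared with the version the application jar
// was compiled against (e.g. the _2.11 suffix in spark-sql_2.11-1.3.0.jar).
object VersionCheck {
  def main(args: Array[String]): Unit = {
    // On a default Spark 1.3 build this would report a 2.10.x version.
    println(scala.util.Properties.versionNumberString)
  }
}
```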
Some additional context: since I am using features of Spark 1.3.0, I
downloaded Spark 1.3.0 and used spark-submit from there. The cluster is
still on Spark 1.2.0. So it looks to me like, at runtime, the executors
could not find some libraries of Spark 1.3.0, even though I ran
Please attach the full stack trace. -Xiangrui
On Mon, Apr 6, 2015 at 12:06 PM, Jay Katukuri jkatuk...@apple.com wrote:
Hi all,
I got a runtime error while running the ALS.
Exception in thread "main" java.lang.NoSuchMethodError:
Here is the command that I have used :
spark-submit --class packagename.ALSNew --num-executors 100 --master yarn
ALSNew.jar -jar spark-sql_2.11-1.3.0.jar hdfs://input_path
Btw - I could run the old ALS in mllib package.
On Apr 6, 2015, at 12:32 PM, Xiangrui Meng men...@gmail.com wrote:
Hi,
Here is the stack trace:
Exception in thread "main" java.lang.NoSuchMethodError:
scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
at ALSNew$.main(ALSNew.scala:35)
at ALSNew.main(ALSNew.scala)
at
So ALSNew.scala is your own application; did you add it with
spark-submit or spark-shell? The correct command should look like
spark-submit --class your.package.name.ALSNew ALSNew.jar [options]
Please check the documentation:
http://spark.apache.org/docs/latest/submitting-applications.html
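Putting that together, a hedged sketch of a corrected submit command (jar names and paths are taken from the thread; note that `--jars` is the standard spark-submit flag for shipping extra jars, rather than `-jar`, and the application jar comes after all options):

```shell
# Sketch only: paths and package name are the ones quoted in this thread.
spark-submit \
  --class packagename.ALSNew \
  --master yarn \
  --num-executors 100 \
  --jars spark-sql_2.11-1.3.0.jar \
  ALSNew.jar hdfs://input_path
```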
-Xiangrui