Thanks for the explanation!

Shing

On Monday, January 13, 2014 1:28 AM, Patrick Wendell <[email protected]> wrote:
You should launch with "java" and not "scala". The "scala"
command in newer versions manually adds a specific version of Akka to
the classpath, which conflicts with the version Spark is using. This
causes the error you are seeing. It's discussed in this thread on the
dev list:

http://apache-spark-developers-list.1001551.n3.nabble.com/Akka-problem-when-using-scala-command-to-launch-Spark-applications-in-the-current-0-9-0-SNAPSHOT-tp2.html
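Concretely, with the classpath from your message below, this fails:

    scala -cp $CLASSPATH org.apache.spark.examples.GroupByTest local

while this should work as-is, since plain "java" adds nothing to the
classpath beyond what you give it (the Spark assembly jar should already
bundle the Scala runtime and the Akka version Spark was built against):

    java -cp $CLASSPATH org.apache.spark.examples.GroupByTest local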

- Patrick


On Sun, Jan 12, 2014 at 4:21 AM, Shing Hing Man <[email protected]> wrote:
> Hi,
>
>   I am using the development version of Spark from git://github.com/apache/incubator-spark.git with Scala 2.10.3.
>
>
> The example GroupByTest runs successfully using:
> matmsh@gauss:~/Downloads/spark/github/incubator-spark> bin/run-example org.apache.spark.examples.GroupByTest local
>
>
> The script bin/run-example essentially does the following (sketched as a script below):
> 1) set up SPARK_HOME
> 2) set up the classpath
> 3) execute "java -cp $CLASSPATH org.apache.spark.examples.GroupByTest local"
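> As a rough sketch, those three steps as a standalone script (same jar paths as the classpath I export below):
>
>    # 1) point SPARK_HOME at the git checkout
>    export SPARK_HOME=/home/matmsh/Downloads/spark/github/incubator-spark
>    # 2) classpath = examples assembly jar + conf dir + Spark assembly jar
>    export CLASSPATH=$SPARK_HOME/examples/target/scala-2.10/spark-examples-assembly-0.9.0-incubating-SNAPSHOT.jar:$SPARK_HOME/conf:$SPARK_HOME/assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar
>    # 3) launch the example class with plain java
>    java -cp $CLASSPATH org.apache.spark.examples.GroupByTest local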
>
>
> I would have thought that if (3) above is replaced by
>       scala -cp $CLASSPATH org.apache.spark.examples.GroupByTest local
> it should still work. But it does not. Please see below.
>
> matmsh@gauss:~/Downloads/spark/github/incubator-spark> echo $SPARK_HOME
> /home/matmsh/Downloads/spark/github/incubator-spark
> matmsh@gauss:~/Downloads/spark/github/incubator-spark>
>
> matmsh@gauss:~/Downloads/spark/github/incubator-spark> export CLASSPATH=/home/matmsh/Downloads/spark/github/incubator-spark/examples/target/scala-2.10/spark-examples-assembly-0.9.0-incubating-SNAPSHOT.jar:/home/matmsh/Downloads/spark/github/incubator-spark/conf:/home/matmsh/Downloads/spark/github/incubator-spark/assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar
>
> matmsh@gauss:~/Downloads/spark/github/incubator-spark> scala -cp $CLASSPATH org.apache.spark.examples.GroupByTest local
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/home/matmsh/Downloads/spark/github/incubator-spark/examples/target/scala-2.10/spark-examples-assembly-0.9.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/matmsh/Downloads/spark/github/incubator-spark/assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> 14/01/12 12:01:25 INFO Utils: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> 14/01/12 12:01:25 WARN Utils: Your hostname, gauss.site resolves to a loopback address: 127.0.0.2; using 192.168.0.10 instead (on interface eth0)
> 14/01/12 12:01:25 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
> java.lang.NoSuchMethodException: akka.remote.RemoteActorRefProvider.<init>(java.lang.String, akka.actor.ActorSystem$Settings, akka.event.EventStream, akka.actor.Scheduler, akka.actor.DynamicAccess)
>   at java.lang.Class.getConstructor0(Class.java:2721)
>   at java.lang.Class.getDeclaredConstructor(Class.java:2002)
>   at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:77)
>   at scala.util.Try$.apply(Try.scala:161)
>   at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:74)
>   at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:85)
>   at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:85)
>   at scala.util.Success.flatMap(Try.scala:200)
>   at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:85)
>   at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:546)
>   at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
>   at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
>   at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:96)
>   at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126)
>   at org.apache.spark.SparkContext.<init>(SparkContext.scala:139)
>   at org.apache.spark.SparkContext.<init>(SparkContext.scala:100)
>   at org.apache.spark.examples.GroupByTest$.main(GroupByTest.scala:36)
>   at org.apache.spark.examples.GroupByTest.main(GroupByTest.scala)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:601)
>   at scala.tools.nsc.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:71)
>   at scala.tools.nsc.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
>   at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:139)
>   at scala.tools.nsc.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:71)
>   at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:139)
>   at scala.tools.nsc.CommonRunner$class.run(ObjectRunner.scala:28)
>   at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:45)
>   at scala.tools.nsc.CommonRunner$class.runAndCatch(ObjectRunner.scala:35)
>   at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:45)
>   at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:74)
>   at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:96)
>   at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:105)
>   at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
> matmsh@gauss:~/Downloads/spark/github/incubator-spark>
>
> What is wrong with the following?
>
>  scala -cp $CLASSPATH org.apache.spark.examples.GroupByTest local
>
>
> Thanks in advance for any assistance!
>
> Shing
