It's a Scala version conflict; can you paste your build.sbt file?
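
That NoSuchMethodError on scala.reflect.api.JavaUniverse.runtimeMirror is the
classic symptom of mixing Scala binary versions: your jar name
(weblog-analysis_2.11-1.0.jar) shows it was built for Scala 2.11, while the
stock prebuilt Spark 1.4.0 binaries are compiled against Scala 2.10. As a rough
sketch only (the project name, versions and "provided" scope below are
illustrative, not taken from your setup), a build.sbt that keeps the Scala
version consistent with a 2.10-based cluster would look like:

    // Sketch only -- adjust name/version to your project.
    // scalaVersion should match the Scala version your Spark cluster was
    // built with (2.10.x for the stock Spark 1.4.0 downloads).
    name := "weblog-analysis"
    version := "1.0"
    scalaVersion := "2.10.4"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.4.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "1.4.0" % "provided"
    )

The %% operator appends the Scala binary version to the artifact name, so this
pulls in spark-core_2.10 and spark-sql_2.10 automatically. Alternatively, keep
Scala 2.11 in build.sbt and run against a Spark build compiled for 2.11.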

Thanks
Best Regards

On Fri, Jun 26, 2015 at 7:05 AM, stati <srikanth...@gmail.com> wrote:

> Hello,
>
> When I run a spark job with spark-submit it fails with below exception for
> code line
>        /*val webLogDF = webLogRec.toDF().select("ip", "date", "name")*/
>
> I had a similar issue running from spark-shell, then realized that I needed
> import sqlContext.implicits._
> Now my code has the following imports
> /*
>      import org.apache.spark._
>      import org.apache.spark.sql._
>      import org.apache.spark.sql.functions._
>      val sqlContext = new SQLContext(sc)
>      import sqlContext.implicits._
> */
>
> Code works fine from spark-shell REPL. It also runs fine when run in local
> mode from Eclipse. I get this
> error only when I submit to cluster using spark-submit.
> bin/spark-submit /local/weblog-analysis_2.11-1.0.jar --class WebLogAnalysis
> --master spark://machu:7077
>
> I'm testing with Spark 1.4. My code was built with Scala 2.11, with Spark
> and Spark SQL 1.4.0 as dependencies in build.sbt.
>
> Exception in thread "main" java.lang.NoSuchMethodError:
> scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
>         at WebLogAnalysis$.readWebLogFiles(WebLogAnalysis.scala:38)
>         at WebLogAnalysis$.main(WebLogAnalysis.scala:62)
>         at WebLogAnalysis.main(WebLogAnalysis.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
> I can provide more code or log if that will help. Let me know.
>
> Srikanth
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-1-4-RDD-to-DF-fails-with-toDF-tp23499.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
