[ 
https://issues.apache.org/jira/browse/SPARK-17596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15509164#comment-15509164
 ] 

Sean Owen commented on SPARK-17596:
-----------------------------------

It's going to be a build issue one way or the other. NoSuchMethodError means 
the code was compiled against a different version of a library than the one 
present at runtime. Here it's coming from the Scala standard library.

http://stackoverflow.com/questions/25089852/what-is-the-reason-for-java-lang-nosuchmethoderror-scala-predef-arrowassoc-upo
 makes me suspect you're using a Spark distribution compiled for Scala 2.10.
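
A minimal driver like the following (an illustrative sketch, not code from the 
reporter's repo) hits the same call site and prints which scala-library is 
actually loaded at runtime:

```scala
object ArrowAssocCheck {
  def main(args: Array[String]): Unit = {
    // `->` is sugar for Predef.ArrowAssoc(key).->(value). The ArrowAssoc
    // accessor on Predef changed shape between Scala 2.10 and 2.11, so a
    // driver compiled against one version but run against the other fails
    // at exactly this call site with NoSuchMethodError.
    val pair = "spark" -> "2.0.0"
    println(pair)

    // Report the scala-library version actually on the runtime classpath,
    // to compare against the version the job was compiled with.
    println(scala.util.Properties.versionNumberString)
  }
}
```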

> Streaming job lacks Scala runtime methods
> -----------------------------------------
>
>                 Key: SPARK-17596
>                 URL: https://issues.apache.org/jira/browse/SPARK-17596
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 2.0.0
>         Environment: Linux 4.4.20 x86_64 GNU/Linux
> openjdk version "1.8.0_102"
> Scala 2.11.8
>            Reporter: Evgeniy Tsvigun
>              Labels: kafka-0.8, streaming
>
> When using -> in a Spark Streaming 2.0.0 job, or when depending on 
> spark-streaming-kafka-0-8_2.11 v2.0.0, submitting the job with spark-submit 
> produces the following error:
>     Exception in thread "main" org.apache.spark.SparkException: Job aborted 
> due to stage failure: Task 0 in stage 72.0 failed 1 times, most recent 
> failure: Lost task 0.0 in stage 72.0 (TID 37, localhost): 
> java.lang.NoSuchMethodError: 
> scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
> This only happens with spark-streaming; using ArrowAssoc in plain 
> non-streaming Spark jobs works fine.
> I put a brief illustration of this phenomenon to a GitHub repo: 
> https://github.com/utgarda/spark-2-streaming-nosuchmethod-arrowassoc
> With only provided dependencies in build.sbt:
> "org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
> "org.apache.spark" %% "spark-streaming" % "2.0.0" % "provided"
> using -> anywhere in the driver code, packing it with sbt-assembly, and 
> submitting the job results in this error. That isn't a big problem by 
> itself, since using ArrowAssoc can be avoided, but 
> spark-streaming-kafka-0-8_2.11 v2.0.0 uses it somewhere internally and 
> triggers the same error.
> When packing scala-library into the assembly, the class is visible in the 
> jar, but the method is still reported missing at runtime.
> The issue reported on StackOverflow: 
> http://stackoverflow.com/questions/39395521/spark-2-0-0-streaming-job-packed-with-sbt-assembly-lacks-scala-runtime-methods
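
If a 2.10/2.11 mismatch is indeed the cause, pinning the build so every Spark 
artifact carries the same Scala binary suffix should clear it. A sketch of the 
relevant build.sbt settings, with versions assumed from the report above:

```scala
// build.sbt -- align the Scala binary version with the Spark artifacts.
// The %% operator appends the Scala binary suffix (_2.11 here), so every
// Spark module resolves against the same scala-library the driver uses.
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"                % "2.0.0" % "provided",
  "org.apache.spark" %% "spark-streaming"           % "2.0.0" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka-0-8" % "2.0.0"
)
```

The same check applies to the cluster side: the spark-submit installation must 
itself be a _2.11 build, since its scala-library wins at runtime over anything 
marked "provided".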



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
