Evgeniy Tsvigun created SPARK-17596:
---------------------------------------

             Summary: Streaming job lacks Scala runtime methods
                 Key: SPARK-17596
                 URL: https://issues.apache.org/jira/browse/SPARK-17596
             Project: Spark
          Issue Type: Bug
          Components: Streaming
    Affects Versions: 2.0.0
         Environment: Linux 4.4.20 x86_64 GNU/Linux
openjdk version "1.8.0_102"
Scala 2.11.8
            Reporter: Evgeniy Tsvigun


When using -> in a Spark Streaming 2.0.0 job, or depending on 
spark-streaming-kafka-0-8_2.11 v2.0.0, and submitting the assembled jar with 
spark-submit, I get the following error:

    Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure:
    Task 0 in stage 72.0 failed 1 times, most recent failure: Lost task 0.0 in stage 72.0 (TID 37, localhost):
    java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;

This only happens with spark-streaming; using ArrowAssoc in plain non-streaming 
Spark jobs works fine.

I put a brief illustration of this phenomenon in a GitHub repo: 
https://github.com/utgarda/spark-2-streaming-nosuchmethod-arrowassoc
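
A minimal sketch of the kind of driver code that hits this (the repo above has 
the actual reproducer; the socket source and names below are illustrative only):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object ArrowAssocRepro {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("arrow-assoc-repro")
        val ssc = new StreamingContext(conf, Seconds(1))

        // -> desugars to scala.Predef.ArrowAssoc, the method named in the
        // NoSuchMethodError above
        val pairs = ssc.socketTextStream("localhost", 9999)
          .flatMap(_.split(" "))
          .map(word => word -> 1)

        pairs.reduceByKey(_ + _).print()

        ssc.start()
        ssc.awaitTermination()
      }
    }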

With only provided dependencies in build.sbt:

"org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
"org.apache.spark" %% "spark-streaming" % "2.0.0" % "provided"

using -> anywhere in the driver code, packing the jar with sbt-assembly, and 
submitting the job results in the error above. This isn't a big problem by 
itself, since using ArrowAssoc can be avoided, but spark-streaming-kafka-0-8_2.11 
v2.0.0 uses it internally and produces the same error.
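
For reference, the Kafka case only needs the connector in build.sbt (not marked 
provided here, on the assumption that it is not part of the Spark distribution, 
so it ends up in the assembly):

    // coordinates of the artifact named above
    libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-8" % "2.0.0"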

When packing the assembly with scala-library, I can see the ArrowAssoc class in 
the jar, but the method is still reported missing at runtime.
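
One way to bundle scala-library is sbt-assembly's includeScala option; a sketch 
assuming sbt-assembly 0.14.x, the exact setting used in the repo may differ:

    // build.sbt: keep scala-library in the fat jar
    // (includeScala defaults to true in sbt-assembly 0.14.x)
    assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = true)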

The issue is also reported on StackOverflow: 
http://stackoverflow.com/questions/39395521/spark-2-0-0-streaming-job-packed-with-sbt-assembly-lacks-scala-runtime-methods
