Evgeniy Tsvigun created SPARK-17596:

             Summary: Streaming job lacks Scala runtime methods
                 Key: SPARK-17596
                 URL: https://issues.apache.org/jira/browse/SPARK-17596
             Project: Spark
          Issue Type: Bug
          Components: Streaming
    Affects Versions: 2.0.0
         Environment: Linux 4.4.20 x86_64 GNU/Linux
openjdk version "1.8.0_102"
Scala 2.11.8
            Reporter: Evgeniy Tsvigun

When using -> in Spark Streaming 2.0.0 jobs, or when using 
spark-streaming-kafka-0-8_2.11 v2.0.0, and submitting the job with spark-submit, 
I get the following error:

    Exception in thread "main" org.apache.spark.SparkException: Job aborted due 
to stage failure: Task 0 in stage 72.0 failed 1 times, most recent failure: 
Lost task 0.0 in stage 72.0 (TID 37, localhost): java.lang.NoSuchMethodError: 

This only happens with spark-streaming; using ArrowAssoc in plain, non-streaming 
Spark jobs works fine.
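For context, `->` is not a language primitive: it desugars to a call on `scala.Predef.ArrowAssoc`, which lives in scala-library. So a NoSuchMethodError here means the scala-library visible at runtime does not match the one the jar was compiled against. A minimal, Spark-free sketch of the desugaring (object name is mine, not from the report):

```scala
// `a -> b` is sugar for Predef.ArrowAssoc(a).->(b), provided by scala-library.
// If the scala-library on the runtime classpath differs from the compile-time
// one, such calls can fail with java.lang.NoSuchMethodError.
object ArrowAssocDemo {
  def main(args: Array[String]): Unit = {
    val sugared: (String, Int) = "answer" -> 42            // uses ArrowAssoc
    val explicit = Predef.ArrowAssoc("answer").->(42)      // the desugared form
    println(sugared == explicit)                           // prints "true"
  }
}
```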

I put a brief illustration of this phenomenon in a GitHub repo: 

With only provided-scope Spark dependencies in build.sbt:

"org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
"org.apache.spark" %% "spark-streaming" % "2.0.0" % "provided"

using -> anywhere in the driver code, packing it with sbt-assembly, and 
submitting the job results in an error. This isn't a big problem by itself, 
since using ArrowAssoc can be avoided, but spark-streaming-kafka-0-8_2.11 
v2.0.0 uses it somewhere internally and triggers the same error.

When packing with scala-library included, I can see the class in the jar after 
packing, but it is still reported missing at runtime.
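One way to double-check both sides of that mismatch from the command line (jar name and paths are hypothetical, not from the report):

```shell
# Confirm ArrowAssoc really is inside the assembly jar
# (the jar path is hypothetical).
jar tf target/scala-2.11/streaming-job-assembly-1.0.jar | grep ArrowAssoc

# Compare against the scala-library bundled with the Spark distribution
# that spark-submit actually uses:
ls "$SPARK_HOME"/jars/scala-library-*.jar
```

If the two Scala versions disagree, the class can be present in the jar yet the specific method signature the bytecode expects can still be missing at runtime.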

The issue is also reported on StackOverflow: 

This message was sent by Atlassian JIRA
