[ 
https://issues.apache.org/jira/browse/SPARK-17596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15509146#comment-15509146
 ] 

Evgeniy Tsvigun edited comment on SPARK-17596 at 9/21/16 8:07 AM:
------------------------------------------------------------------

Thanks for the hint. I did include scala-library in the package, but I just 
tried excluding it, and I get the same error. I checked that my package 
contains only driver classes compiled with Scala 2.11.8, and the Spark 2.0.0 
distribution ships scala-library-2.11.8.jar.

Here's my build file: 
{code:title=build.sbt|borderStyle=solid}
name := "spark-2-streaming-nosuchmethod-arrowassoc"

version := "1.1"

scalaVersion := "2.11.8"

libraryDependencies ++= {
  val sparkV = "2.0.0"
  Seq(
    "org.apache.spark" %% "spark-core" % sparkV % "provided",
    "org.apache.spark" %% "spark-streaming" % sparkV % "provided"
  )
}

assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
{code}
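For reference, the driver itself can be as small as the sketch below: any use of -> compiles to a call to scala.Predef.ArrowAssoc, which is exactly the method the NoSuchMethodError points at. The object name, app name, and socket source here are my own placeholders, not taken from the linked repo:

{code:title=Main.scala (sketch)|borderStyle=solid}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object Main {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("arrowassoc-repro")
    val ssc = new StreamingContext(conf, Seconds(1))

    // Any text source works for the repro; a socket source keeps it minimal.
    val lines = ssc.socketTextStream("localhost", 9999)

    // The -> below desugars to scala.Predef.ArrowAssoc(l).->(l.length),
    // which is the call that fails at runtime when the assembly's
    // scala-library resolution goes wrong.
    lines.map(l => l -> l.length).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
{code}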



> Streaming job lacks Scala runtime methods
> -----------------------------------------
>
>                 Key: SPARK-17596
>                 URL: https://issues.apache.org/jira/browse/SPARK-17596
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 2.0.0
>         Environment: Linux 4.4.20 x86_64 GNU/Linux
> openjdk version "1.8.0_102"
> Scala 2.11.8
>            Reporter: Evgeniy Tsvigun
>              Labels: kafka-0.8, streaming
>
> When using -> in a Spark Streaming 2.0.0 job, or when using 
> spark-streaming-kafka-0-8_2.11 v2.0.0, submitting the job with spark-submit 
> produces the following error:
>     Exception in thread "main" org.apache.spark.SparkException: Job aborted 
> due to stage failure: Task 0 in stage 72.0 failed 1 times, most recent 
> failure: Lost task 0.0 in stage 72.0 (TID 37, localhost): 
> java.lang.NoSuchMethodError: 
> scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
> This only happens with spark-streaming; using ArrowAssoc in plain, 
> non-streaming Spark jobs works fine.
> I put a brief illustration of this phenomenon to a GitHub repo: 
> https://github.com/utgarda/spark-2-streaming-nosuchmethod-arrowassoc
> With only provided dependencies in build.sbt:
> "org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
> "org.apache.spark" %% "spark-streaming" % "2.0.0" % "provided"
> using -> anywhere in the driver code, packing it with sbt-assembly and 
> submitting the job results in an error. This isn't a big problem by itself, 
> since using ArrowAssoc can be avoided, but spark-streaming-kafka-0-8_2.11 
> v2.0.0 uses it somewhere inside and triggers the same error.
> When packing with scala-library included, I can see the class in the jar, 
> but it is still reported missing at runtime.
> The issue reported on StackOverflow: 
> http://stackoverflow.com/questions/39395521/spark-2-0-0-streaming-job-packed-with-sbt-assembly-lacks-scala-runtime-methods



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
