You can find the Spark version that spark-submit is using in its log output.
Could you check whether it is consistent with the version you built against?
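For reference, a quick way to print it directly (assuming `$SPARK_HOME` points at the installation whose spark-submit you use for submission):

```shell
# Print the version of the Spark distribution behind spark-submit.
# $SPARK_HOME is assumed to point at the install used for submission;
# compare the reported version against the 1.6.1 the app was built with.
$SPARK_HOME/bin/spark-submit --version
```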
On Thu, Jan 12, 2017 at 7:35 AM Ramkumar Venkataraman <
ram.the.m...@gmail.com> wrote:
> Spark: 1.6.1
>
> I am trying to use the new mapWithState API and I am getting the following
> error:
>
> Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/spark/streaming/StateSpec$
> Caused by: java.lang.ClassNotFoundException:
> org.apache.spark.streaming.StateSpec$
>
> Build.sbt
>
> scalaVersion := "2.10.6"
> typelevelDefaultSettings
> val sparkVersion = "1.6.1"
>
> resolvers ++= Seq(
>   "Sonatype OSS Snapshots" at
>     "https://oss.sonatype.org/content/repositories/snapshots"
> )
>
> libraryDependencies ++= Seq(
>   "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
>   "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
>   "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion,
>   "com.fasterxml.jackson.core" % "jackson-databind" % "2.3.3" // Needed by spark-core
> )
> ==
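Worth noting: StateSpec first appeared in Spark 1.6.0, and with spark-streaming marked "provided" the class must come from the cluster's Spark at runtime, so an older cluster install would produce exactly this NoClassDefFoundError even though compilation succeeds. If that turns out to be the case, one workaround (a hedged sketch, not a confirmed fix; the only change from the build above is the removed "provided" scope on spark-streaming) is to bundle spark-streaming into the assembly jar:

```scala
// Hypothetical build.sbt variant: ship spark-streaming inside the
// application assembly instead of relying on the cluster's (possibly
// older) Spark distribution to provide it at runtime.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"            % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming"       % sparkVersion, // no "provided": bundled with the app
  "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion,
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.3.3"
)
```

Mixing a bundled 1.6.1 spark-streaming with an older provided spark-core can cause binary incompatibilities of its own, so aligning the cluster version is the cleaner fix if possible.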
>
> This is what my spark-submit invocation looks like:
>
> ./bin/spark-submit --verbose --master yarn-client --num-executors 50
> --driver-memory=4G --executor-memory=8G --conf
> "spark.driver.extraJavaOptions=-XX:MaxPermSize=6G -XX:+UseConcMarkSweepGC"
> --conf "spark.executor.extraJavaOptions=-XX:+UseConcMarkSweepGC -verbose:gc
> -XX:+PrintGCDetails -XX:+PrintGCTimeStamps" --class MY_DRIVER
> ~/project-assembly-0.0.1-SNAPSHOT.jar
>
> ==
>
> Is there anything I am missing here? I understand that NoClassDefFoundError
> means the required jars aren't on the classpath; I just can't understand why
> this class alone is missing when others, such as the window-related classes,
> are found. Do I have to pass additional jars to make this API work?
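For context on what the API expects: mapWithState applies a per-key update function of shape (key, Option[value], State[S]) => mapped output. The core fold logic can be sketched in plain Scala, with Spark's State[S] modeled by an Option so the example runs with no Spark on the classpath (names here are illustrative, not Spark's actual API):

```scala
// Plain-Scala sketch of the per-key update logic mapWithState applies.
// State[S] is stood in for by Option[Int]; this is a model, not Spark code.
object MapWithStateSketch {
  // Mirrors (key, Option[value], State[sum]) => emitted value: fold the new
  // value into the running sum, returning both the output and the new state.
  def updateSum(value: Option[Int], state: Option[Int]): (Int, Option[Int]) = {
    val sum = state.getOrElse(0) + value.getOrElse(0)
    (sum, Some(sum)) // emit mapped value, persist updated state
  }

  def main(args: Array[String]): Unit = {
    var state: Option[Int] = None
    val outputs = Seq(1, 2, 3).map { v =>
      val (out, next) = updateSum(Some(v), state)
      state = next
      out
    }
    println(outputs.mkString(",")) // prints 1,3,6
  }
}
```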
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-NoClassDefFoundError-StateSpec-tp28301.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>