[ https://issues.apache.org/jira/browse/TOREE-244?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15347127#comment-15347127 ]
Keith Kraus commented on TOREE-244:
-----------------------------------
I ran into this bug in dev9 and resolved it by adding the kernel assembly to
the driver classpath in spark-submit. I did this by changing the last line of
run.sh to:
{noformat}
eval exec "${SPARK_HOME}/bin/spark-submit" "${SPARK_OPTS}" \
  --driver-class-path "$PROG_HOME/lib/${KERNEL_ASSEMBLY}" \
  --class org.apache.toree.Main \
  "${TOREE_ASSEMBLY}" "${TOREE_OPTS}" "$@"
{noformat}
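The NoSuchMethodError usually means an older jopt-simple (one that predates allowsUnrecognizedOptions) is shadowing the copy bundled in the assembly. A quick way to confirm is to scan the classpath jars for the class and see which ones provide it; the helper below is a minimal sketch of that check (the function name and example paths are mine for illustration, not part of Toree or Spark):

```python
# Hedged sketch: scan jar files for a class entry to spot duplicate
# providers of joptsimple.OptionParser on the classpath. Helper name
# and any example paths are illustrative, not from the kernel itself.
import os
import zipfile


def find_class_providers(jar_paths, entry="joptsimple/OptionParser.class"):
    """Return the jars that contain the given class file entry."""
    hits = []
    for jar in jar_paths:
        if jar.endswith(".jar") and os.path.isfile(jar):
            with zipfile.ZipFile(jar) as zf:
                if entry in zf.namelist():
                    hits.append(jar)
    return hits
```

Running this over the jars in ${SPARK_HOME}/lib and the kernel's lib directory should show two different jopt-simple providers when the conflict is present; putting the assembly first via --driver-class-path lets its newer copy win.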
> Exception caused by joptsimple.OptionParser.allowsUnrecognizedOptions
> ---------------------------------------------------------------------
>
> Key: TOREE-244
> URL: https://issues.apache.org/jira/browse/TOREE-244
> Project: TOREE
> Issue Type: Bug
> Reporter: Emre Safak
>
> When I run spark-kernel I keep getting the following:
> {code:java}
> Starting Spark Kernel with SPARK_HOME=/usr/lib/spark
> Exception in thread "main" java.lang.NoSuchMethodError: joptsimple.OptionParser.allowsUnrecognizedOptions()V
> 	at com.ibm.spark.boot.CommandLineOptions.<init>(CommandLineOptions.scala:30)
> 	at com.ibm.spark.SparkKernel$delayedInit$body.apply(SparkKernel.scala:24)
> 	at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
> 	at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
> 	at scala.App$$anonfun$main$1.apply(App.scala:71)
> 	at scala.App$$anonfun$main$1.apply(App.scala:71)
> 	at scala.collection.immutable.List.foreach(List.scala:318)
> 	at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
> 	at scala.App$class.main(App.scala:71)
> 	at com.ibm.spark.SparkKernel$.main(SparkKernel.scala:23)
> 	at com.ibm.spark.SparkKernel.main(SparkKernel.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> {code}
> I followed the [integration guide|https://github.com/ibm-et/spark-kernel/wiki/Guide-to-Integrating-the-Spark-Kernel-with-Jupyter] after modifying the Makefile so it would run on my system (Ubuntu 14.04). I am using Cloudera Spark 1.5.1 and Scala 2.11.1.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)