Hi Akhil,

Thanks for your reply.
Here are the launch command and the driver stacktrace:

Launch Command: "/usr/lib/jvm/java-7-oracle/jre/bin/java" "-cp" 
"/etc/spark/:/opt/spark/lib/spark-assembly-1.4.0-hadoop2.3.0.jar:/opt/spark-1.4.0-bin-hadoop2.3/lib/datanucleus-rdbms-3.2.9.jar:/opt/spark-1.4.0-bin
-hadoop2.3/lib/datanucleus-api-jdo-3.2.6.jar:/opt/spark-1.4.0-bin-hadoop2.3/lib/datanucleus-core-3.2.10.jar"
 "-Xms1024M" "-Xmx1024M" "-Dspark.eventLog.enabled=true" 
"-Dakka.loglevel=WARNING" 
"-Dspark.serializer=org.apache.spark.serializer.KryoSerializer" 
"-Dspark.executor.memory=1024m" "-Dspark.master=spark://1.2.3.4:7077" 
"-Dspark.rpc.askTimeout=10" "-Dspark.app.name=com.scala.algorithm" 
"-Dspark.driver.supervise=true" 
"-Dspark.jars=,file:/root/libs/commons-codec-1.2.jar,file:/root/libs/commons-lang3-3.1.jar,file:/root/libs/commons-logging-1.1.1.jar,file:/root/libs/guava-16.0.1.jar,file:/root/libs/httpclient-4.2.5.jar,file:/root/libs/httpcore-4.2.4.jar,file:/root/libs/ivy-2.3.0.jar,file:/root/libs/joda-convert-1.6.jar,file:/root/libs/joda-time-2.3.jar,file:/root/libs/jsr166e-1.1.0.jar,file:/root/libs/kryo-3.0.2.jar,file:/root/libs/lift-json_2.10-2.6.2.jar,file:/root/libs/lz4-1.2.0.jar,file:/root/libs/metrics-core-3.0.2.jar,file:/root/libs/netty-3.9.0.Final.jar,file:/root/libs/reflectasm-1.09.jar,file:/root/libs/slf4j-api-1.7.5.jar,file:/root/libs/snappy-java-1.0.5.jar,file:/root/spark-algorithm-build/0.1.1/algorithm.jar"
 "-Dspark.driver.memory=1g" "-XX:MaxPermSize=128m" 
"org.apache.spark.deploy.worker.DriverWrapper" 
"akka.tcp://sparkWorker@1.2.3.4:50911/user/Worker" 
"/srv/spark/work/driver-20150628164945-0020/analytics-0.1.1.jar" 
"com.scala.algorithm"
========================================

Exception in thread "main" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:58)
        at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
Caused by: java.lang.NoClassDefFoundError: net/liftweb/json/DefaultFormats$
        at com.scala.utilities.dal.IOUtil$.parseConfigFile(IOUtil.scala:64)
        at com.scala.algorithmFramework$.main(algorithmfm.scala:20)
        at com.scala.algorithmFramework.main(algorithmfm.scala)
        ... 6 more
Caused by: java.lang.ClassNotFoundException: net.liftweb.json.DefaultFormats$
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 9 more
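
For context, the failing call in IOUtil.parseConfigFile is just the standard lift-json pattern, roughly like the sketch below (the class, object and field names here are illustrative, not the actual code). Note that lift-json_2.10-2.6.2.jar does appear in the spark.jars list above, yet the driver still cannot load the class:

// Illustrative sketch only (not the real IOUtil code): the usual lift-json
// config-parsing pattern the stacktrace points at. The line
// "implicit val formats = DefaultFormats" references the companion object
// net.liftweb.json.DefaultFormats$, so lift-json_2.10-2.6.2.jar has to be
// visible on the driver's classpath at runtime.
import net.liftweb.json._
import scala.io.Source

case class AppConfig(master: String, inputPath: String)   // hypothetical fields

object IOUtilSketch {
  def parseConfigFile(path: String): AppConfig = {
    implicit val formats: Formats = DefaultFormats         // NoClassDefFoundError is raised here
    val raw = Source.fromFile(path).mkString
    JsonParser.parse(raw).extract[AppConfig]
  }
}

In client mode this same code runs fine, so the jar itself does not seem to be the problem.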



________________________________________
From: Akhil Das [ak...@sigmoidanalytics.com]
Sent: 29 June 2015 09:43
To: Hisham Mohamed
Cc: user@spark.apache.org
Subject: Re: spark-submit in deployment mode with the "--jars" option

Can you paste the stacktrace? Looks like you are missing a few jars.

Thanks
Best Regards

On Sun, Jun 28, 2015 at 11:14 PM, hishamm <hisham.moha...@unige.ch> wrote:
Hi,

I want to deploy my application on a standalone cluster.
spark-submit acts in a strange way: when I deploy the application in
*"client"* mode, everything works well and my application can see the
additional jar files.

Here is the command:
>   spark-submit --master spark://1.2.3.4:7077 --deploy-mode client
> --supervise --jars $(echo /myjars/*.jar | tr ' ' ',') --class
> com.algorithm /my/path/algorithm.jar

However, when I submit the command in *"cluster"* deployment mode, the
driver cannot see the additional jars and I always get
*java.lang.ClassNotFoundException*.

Here is the command:
>   spark-submit --master spark://1.2.3.4:7077 --deploy-mode cluster
> --supervise --jars $(echo /myjars/*.jar | tr ' ' ',') --class
> com.algorithm /my/path/algorithm.jar


Am I missing something?

Thanks,
Hisham





