[ https://issues.apache.org/jira/browse/SPARK-12622?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15091792#comment-15091792 ]

Adrian Bridgett commented on SPARK-12622:
-----------------------------------------

Hi Ajesh - I'm not sure what more I can add that's not already here:

- jar file in /tmp/, e.g. /tmp/f oo.jar
- spark-submit --class Foo "/tmp/f oo.jar" fails on the executors (ClassNotFoundException) - a minimal sketch of such a job is below
- mv "/tmp/f oo.jar" /tmp/foo.jar
- spark-submit --class Foo "/tmp/foo.jar" works
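
The job itself doesn't matter - any Scala job with a closure will do, since each lambda compiles to an anonymous class (the Foo$$anonfun$... in the stack trace) that the executors have to load from the submitted jar. A minimal sketch along these lines (the names are illustrative, not the actual code):

{noformat}
// Hypothetical minimal job - stands in for the real Foo, which isn't shown here.
// On Scala 2.10/2.11 each closure below compiles to a class like Foo$$anonfun$N,
// which executors can only load if the submitted jar actually reaches them.
import org.apache.spark.{SparkConf, SparkContext}

object Foo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("Foo"))
    // These closures are the "various functions" the executors fail to find
    // when the jar path contains a space.
    val count = sc.parallelize(1 to 1000).map(_ * 2).filter(_ % 3 == 0).count()
    println(s"count = $count")
    sc.stop()
  }
}
{noformat}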

spark-defaults.conf contains (amongst tuning lines):
spark.master mesos://zk://mesos-1.example.net:2181,mesos-2.example.net:2181,mesos-3.example.net:2181/mesos
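
For what it's worth, the %20 form in spark.jars looks like standard java.io.File / java.net.URI behaviour rather than anything Mesos-specific. A quick standard-library illustration (just a sketch of the encode/decode round-trip, not Spark's actual fetch path):

{noformat}
// Standard-library behaviour only - not Spark's code path.
import java.io.File
import java.net.URI

object SpaceEncoding {
  def main(args: Array[String]): Unit = {
    val f = new File("/tmp/f oo.jar")         // assumes this file exists locally
    val uri: URI = f.toURI
    println(uri)          // file:/tmp/f%20oo.jar - matches the spark.jars line
    println(uri.getPath)  // /tmp/f oo.jar        - decoded back to the real path
    println(new File(uri.getRawPath).exists)  // false: "/tmp/f%20oo.jar" taken literally
    println(new File(uri).exists)             // true:  decoding the URI finds the file
  }
}
{noformat}

So whichever side treats the encoded name as a literal file name ends up looking for a jar that isn't there, which would line up with the executors never getting the classes.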

> spark-submit fails on executors when jar has a space in it
> ----------------------------------------------------------
>
>                 Key: SPARK-12622
>                 URL: https://issues.apache.org/jira/browse/SPARK-12622
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.6.0
>         Environment: Linux, Mesos 
>            Reporter: Adrian Bridgett
>            Priority: Minor
>
> spark-submit --class foo "Foo.jar"  works
> but when using "f oo.jar" it starts to run and then breaks on the executors 
> as they cannot find the various functions.
> Out of interest (as HDFS CLI uses this format) I tried f%20oo.jar - this 
> fails immediately.
> {noformat}
> spark-submit --class Foo /tmp/f\ oo.jar
> ...
> spark.jars=file:/tmp/f%20oo.jar
> 16/01/04 14:56:47 INFO spark.SparkContext: Added JAR file:/tmp/f%20oo.jar at 
> http://10.1.201.77:43888/jars/f%20oo.jar with timestamp 1451919407769
> 16/01/04 14:57:48 WARN scheduler.TaskSetManager: Lost task 4.0 in stage 0.0 
> (TID 2, ip-10-1-200-232.ec2.internal): java.lang.ClassNotFoundException: 
> Foo$$anonfun$46
> {noformat}
> SPARK-6568 is related but maybe specific to the Windows environment


