[ https://issues.apache.org/jira/browse/SPARK-10481?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-10481:
------------------------------
    Priority: Minor  (was: Major)

> SPARK_PREPEND_CLASSES prevents the spark-yarn related jar from being found
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-10481
>                 URL: https://issues.apache.org/jira/browse/SPARK-10481
>             Project: Spark
>          Issue Type: Improvement
>          Components: YARN
>    Affects Versions: 1.4.1
>            Reporter: Jeff Zhang
>            Priority: Minor
>
> This happens when SPARK_PREPEND_CLASSES is set and Spark is run on YARN.
> If SPARK_PREPEND_CLASSES is set, the spark-yarn related jar cannot be found, because
> org.apache.spark.deploy.yarn.Client is detected as an individual class file rather than a
> class inside a jar (see the sketch after the stack trace below).
> {code}
> 15/09/08 08:57:10 ERROR SparkContext: Error initializing SparkContext.
> java.util.NoSuchElementException: head of empty list
>       at scala.collection.immutable.Nil$.head(List.scala:337)
>       at scala.collection.immutable.Nil$.head(List.scala:334)
>       at org.apache.spark.deploy.yarn.Client$.org$apache$spark$deploy$yarn$Client$$sparkJar(Client.scala:1048)
>       at org.apache.spark.deploy.yarn.Client$.populateClasspath(Client.scala:1159)
>       at org.apache.spark.deploy.yarn.Client.setupLaunchEnv(Client.scala:534)
>       at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:645)
>       at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:119)
>       at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
>       at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:514)
>       at com.zjffdu.tutorial.spark.WordCount$.main(WordCount.scala:24)
>       at com.zjffdu.tutorial.spark.WordCount.main(WordCount.scala)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:680)
>       at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>       at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> {code}
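> Below is a minimal sketch (not the actual Spark source; the object and method names are
> hypothetical) of how a jar lookup based on a class's resource URL comes back empty when the
> class is loaded from a compiled-classes directory, which is what SPARK_PREPEND_CLASSES puts on
> the classpath. Calling .head on that empty result raises the NoSuchElementException shown above.
> {code}
> import java.net.URL
>
> object JarLookupSketch {
>   // Hypothetical stand-in for a lookup like SparkContext.jarOfClass: returns the jar a
>   // class was loaded from, or None when the class comes from a plain directory on the classpath.
>   def jarOfClass(cls: Class[_]): Option[String] = {
>     val resource: URL = cls.getResource("/" + cls.getName.replace('.', '/') + ".class")
>     Option(resource)
>       .map(_.toString)
>       // Only a URL of the form jar:file:/path/to/foo.jar!/pkg/Cls.class yields a jar path;
>       // a file: URL (classes directory) is filtered out, leaving None.
>       .filter(_.startsWith("jar:file:"))
>       .map(s => s.substring("jar:file:".length, s.indexOf('!')))
>   }
>
>   def main(args: Array[String]): Unit = {
>     // With SPARK_PREPEND_CLASSES the yarn Client class resolves to a file: URL, so the
>     // lookup is empty when run from a build output directory.
>     val jar = jarOfClass(getClass)
>     println(jar)   // None when this class is not loaded from a jar
>     // jar.head    // would throw java.util.NoSuchElementException, as in the trace above
>   }
> }
> {code}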



