[ 
https://issues.apache.org/jira/browse/SPARK-7504?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14536734#comment-14536734
 ] 

Sean Owen commented on SPARK-7504:
----------------------------------

I don't think that's supported; in any event, the error suggests you tried to 
run the AM directly rather than through main()? This doesn't sound like 
something that should work.

> NullPointerException when initializing SparkContext in YARN-cluster mode
> ------------------------------------------------------------------------
>
>                 Key: SPARK-7504
>                 URL: https://issues.apache.org/jira/browse/SPARK-7504
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy, YARN
>            Reporter: Zoltán Zvara
>              Labels: deployment, yarn, yarn-client
>
> It is not clear to most users that, while running Spark on YARN, a 
> {{SparkContext}} with a given execution plan can be run locally as 
> {{yarn-client}}, but cannot deploy itself to the cluster. Deployment is 
> currently performed using {{org.apache.spark.deploy.yarn.Client}}. 
> {color:gray} I think we should support deployment through {{SparkContext}}, 
> but that is not the point I wish to make here. {color}
> Configuring a {{SparkContext}} to deploy itself currently yields an 
> {{ERROR}} while accessing {{spark.yarn.app.id}} in 
> {{YarnClusterSchedulerBackend}}, followed by a {{NullPointerException}} 
> while referencing the {{ApplicationMaster}} instance.
> Spark should clearly inform the user that it may be running in 
> {{yarn-cluster}} mode without a proper submission through {{Client}}, and 
> that deploying directly from a {{SparkContext}} is not supported.
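For reference, the supported path into {{yarn-cluster}} mode is to let {{spark-submit}} drive {{org.apache.spark.deploy.yarn.Client}}, rather than constructing the {{SparkContext}} yourself. A minimal sketch (the application JAR and main class are placeholders, not from the report):

```shell
# Submit the driver to the YARN cluster. Client sets up the ApplicationMaster
# first, so spark.yarn.app.id is populated before the SparkContext starts,
# avoiding the NPE described above.
# Spark 1.x syntax; newer releases use: --master yarn --deploy-mode cluster
spark-submit \
  --master yarn-cluster \
  --class com.example.MyApp \
  my-app.jar
```

Setting {{spark.master}} to {{yarn-cluster}} inside the application itself is what triggers the failure described above, because no {{ApplicationMaster}} has been started.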



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
