[ https://issues.apache.org/jira/browse/SPARK-6506?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14484250#comment-14484250 ]

Kostas Sakellis commented on SPARK-6506:
----------------------------------------

Here is the exception I saw when I ran the above job:
{code}
Traceback (most recent call last):
  File "pi.py", line 29, in <module>
    sc = SparkContext(appName="PythonPi")
  File "/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.23/jars/spark-assembly-1.3.0-cdh5.4.0-hadoop2.6.0-cdh5.4.0.jar/pyspark/context.py", line 108, in __init__
  File "/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.23/jars/spark-assembly-1.3.0-cdh5.4.0-hadoop2.6.0-cdh5.4.0.jar/pyspark/context.py", line 222, in _ensure_initialized
  File "/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.23/jars/spark-assembly-1.3.0-cdh5.4.0-hadoop2.6.0-cdh5.4.0.jar/pyspark/java_gateway.py", line 32, in launch_gateway
  File "/usr/lib64/python2.6/UserDict.py", line 22, in __getitem__
    raise KeyError(key)
KeyError: 'SPARK_HOME'
{code}
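
For context, the traceback dies on a plain dictionary lookup of SPARK_HOME in launch_gateway. As a rough sketch (not the actual Spark source), the failing pattern versus a more forgiving one would look like this:
{code}
import os

# Failing pattern: a bare lookup raises KeyError when the variable is absent,
# which matches the KeyError: 'SPARK_HOME' in the traceback above.
spark_home = os.environ["SPARK_HOME"]

# More forgiving pattern: fall back to a placeholder, since per the issue
# description SPARK_HOME does not need to point at a real installation
# in yarn cluster mode.
spark_home = os.environ.get("SPARK_HOME", "placeholder")
{code}
In the meantime, one possible workaround is to inject a dummy SPARK_HOME into the AM and executor environments via the spark.yarn.appMasterEnv.SPARK_HOME and spark.executorEnv.SPARK_HOME configs.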

> python support yarn cluster mode requires SPARK_HOME to be set
> --------------------------------------------------------------
>
>                 Key: SPARK-6506
>                 URL: https://issues.apache.org/jira/browse/SPARK-6506
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.3.0
>            Reporter: Thomas Graves
>
> We added support for Python running in YARN cluster mode in 
> https://issues.apache.org/jira/browse/SPARK-5173, but it requires that 
> SPARK_HOME be set in the environment variables for the application master and 
> executors. It doesn't have to be set to anything real, but it fails if it's not 
> set. See the command at the end of: https://github.com/apache/spark/pull/3976


