[ https://issues.apache.org/jira/browse/SPARK-1550?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14049918#comment-14049918 ]

Matthew Farrellee commented on SPARK-1550:
------------------------------------------

This issue as reported is no longer present in Spark 1.0, where defaults are 
provided for the app name and master.

{code}
$ SPARK_HOME=dist PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.1-src.zip python
Python 2.7.5 (default, Feb 19 2014, 13:47:28) 
[GCC 4.8.2 20131212 (Red Hat 4.8.2-7)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> from pyspark import SparkContext
>>> sc=SparkContext('local')
[successful creation of context]
{code}

I believe this should be closed as resolved. /cc: [~pwendell]

> Successive creation of spark context fails in pyspark, if the previous 
> initialization of spark context had failed.
> ------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-1550
>                 URL: https://issues.apache.org/jira/browse/SPARK-1550
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>            Reporter: Prabin Banka
>              Labels: pyspark, sparkcontext
>
> For example:
> In PySpark, if we try to initialize a Spark context with insufficient 
> arguments, >>>sc=SparkContext('local')
> it fails with the exception:
> Exception: An application name must be set in your configuration
> This is all fine. 
> However, any subsequent creation of a Spark context with correct arguments 
> also fails:
> >>>s1=SparkContext('local', 'test1')
> AttributeError: 'SparkContext' object has no attribute 'master'
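
The failure mode quoted above can be illustrated without PySpark. The following is a minimal, hypothetical sketch (FakeContext and its _active guard are stand-ins loosely modeled on a SparkContext-style singleton guard, not PySpark's actual code): if the guard is registered before initialization finishes, a failed init leaves a half-built context behind, and the next, correctly-configured creation trips over its missing attribute.

```python
# A minimal, self-contained sketch of the reported failure mode.
# FakeContext and _active are hypothetical stand-ins, not PySpark code.

class FakeContext:
    _active = None  # module-level singleton guard

    def __init__(self, master=None, app_name=None):
        if FakeContext._active is not None:
            # A previous (possibly half-built) context exists: stop it.
            # If its own __init__ failed before setting .master, this
            # raises AttributeError -- the symptom in the report.
            FakeContext._active.stop()
        FakeContext._active = self  # guard set before validation finishes...
        if app_name is None:
            raise Exception(
                "An application name must be set in your configuration")
        self.master = master        # ...so a failed init never reaches this

    def stop(self):
        print("stopping context for master %s" % self.master)
        FakeContext._active = None

# First attempt fails (no app name) but leaves the stale guard behind.
try:
    FakeContext('local')
except Exception as e:
    print("first attempt:", e)

# Second attempt, with correct arguments, now also fails.
try:
    FakeContext('local', 'test1')
except AttributeError as e:
    print("second attempt:", e)
```

A fix along the lines of Spark 1.0's behavior would either validate arguments before registering the guard or clear the guard when initialization fails, so that a later, valid creation starts from a clean slate.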



--
This message was sent by Atlassian JIRA
(v6.2#6252)