Sankar Mittapally created SPARK-17907:

             Summary: Not allowing more than one spark console
                 Key: SPARK-17907
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 2.0.0
            Reporter: Sankar Mittapally

We are exploring PySpark on a Spark cluster. We are able to initiate a single
Spark console connection, but while trying to establish a new connection we
get the following error.

ValueError                                Traceback (most recent call last)
<ipython-input-15-05f9533b85b9> in <module>()
      4         .set("spark.executor.memory", "1g")
      5         .set("spark.cores.max","1").set("spark.driver.allowMultipleContexts", "true") )
----> 6 sc = SparkContext(conf = conf)

/opt/spark-2.0.0-bin-hadoop2.7/python/pyspark/context.py in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
    110         """
    111         self._callsite = first_spark_call() or CallSite(None, None, None)
--> 112         SparkContext._ensure_initialized(self, gateway=gateway)
    113         try:
    114             self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,

/opt/spark-2.0.0-bin-hadoop2.7/python/pyspark/context.py in _ensure_initialized(cls, instance, gateway)
    257                         " created by %s at %s:%s "
    258                         % (currentAppName, currentMaster,
--> 259                             callsite.function, callsite.file, callsite.linenum))
    260                 else:
    261                     SparkContext._active_spark_context = instance

ValueError: Cannot run multiple SparkContexts at once; existing
SparkContext(app=PYSPARK, master=spark://...) created by
__init__ at <ipython-input-2-c7c8de510121>:6

The command we are using:

from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .set("spark.executor.memory", "1g")
        .set("spark.cores.max", "1")
        .set("spark.driver.allowMultipleContexts", "true"))
sc = SparkContext(conf = conf)
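
A minimal sketch of a possible workaround, assuming one SparkContext per
Python process is acceptable (PySpark tracks a single active context on the
Python side regardless of spark.driver.allowMultipleContexts): reuse the
running context via SparkContext.getOrCreate, or stop it before creating a
new one.

from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .set("spark.executor.memory", "1g")
        .set("spark.cores.max", "1"))

# Reuse the already-active context if one exists, instead of
# constructing a second one (which raises the ValueError above).
sc = SparkContext.getOrCreate(conf=conf)

# Alternatively, tear down the existing context first and then
# create a fresh one with the new configuration.
sc.stop()
sc = SparkContext(conf=conf)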
