Teppei Daito created SPARK-19301:
------------------------------------

             Summary: SparkContext is ignoring SparkConf when _jvm is not 
initialized on spark-submit
                 Key: SPARK-19301
                 URL: https://issues.apache.org/jira/browse/SPARK-19301
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 2.1.0
            Reporter: Teppei Daito
            Priority: Critical


When using spark-submit with the code below
{code}
SparkContext(conf=SparkConf().setAppName('foo'))
{code}
SparkContext ignores the conf argument.
This bug was introduced by this commit:
https://github.com/apache/spark/commit/5b77e66dd6a128c5992ab3bde418613f84be7009
*the conf is ignored when it does not have a _jconf*
https://github.com/apache/spark/blob/5b77e66dd6a128c5992ab3bde418613f84be7009/python/pyspark/context.py#L125

As a workaround, you have to call SparkContext._ensure_initialized() 
before instantiating SparkConf().
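
The failure mode can be modeled with a small sketch (plain Python, no Spark 
needed; {{FakeConf}} and {{merge_conf}} are hypothetical stand-ins for 
SparkConf and the guard at context.py#L125, not real PySpark APIs): when a 
conf is built before the JVM gateway exists, its _jconf is None, and the 
guard silently drops the user's settings.
{code}
# Hypothetical, simplified model of the conf-handling guard introduced by
# commit 5b77e66: user settings are copied only when _jconf exists.

class FakeConf:
    """Stand-in for SparkConf: _jconf stays None until the JVM is up."""
    def __init__(self, jvm_ready):
        self._jconf = object() if jvm_ready else None
        self._entries = {}

    def set(self, key, value):
        self._entries[key] = value
        return self

def merge_conf(user_conf, base):
    # Mirrors the buggy guard: settings apply only if _jconf is present.
    if user_conf is not None and user_conf._jconf is not None:
        base.update(user_conf._entries)
    return base

# Before _ensure_initialized(): _jconf is None, so the app name is dropped.
dropped = merge_conf(FakeConf(jvm_ready=False).set("spark.app.name", "foo"), {})
# After the JVM is initialized, the same conf is honored.
kept = merge_conf(FakeConf(jvm_ready=True).set("spark.app.name", "foo"), {})
{code}
Calling SparkContext._ensure_initialized() first corresponds to the 
jvm_ready=True case, which is why the workaround above restores the setting.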
I could not find test code for the commit above, and the SparkContext 
initialization process was too complicated for me to write a patch 
fixing this problem.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
