Xianjin YE created SPARK-28203:
----------------------------------

             Summary: PythonRDD should respect SparkContext's conf when passing 
user confMap
                 Key: SPARK-28203
                 URL: https://issues.apache.org/jira/browse/SPARK-28203
             Project: Spark
          Issue Type: Bug
          Components: PySpark, Spark Core
    Affects Versions: 2.4.3
            Reporter: Xianjin YE


PythonRDD has several APIs that accept user configs from the Python side. The 
parameter is called confAsMap and is intended to be merged with the RDD's 
hadoop configuration.


 However, confAsMap is first mapped to a Configuration and only then merged 
into SparkContext's hadoop configuration. The mapped Configuration loads 
default key-value pairs from core-default.xml etc., and some of those keys may 
have been updated in SparkContext's hadoop configuration. During the merge, 
the stale default values override the updated values.
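The effect can be illustrated with a minimal sketch using plain Python dicts 
in place of Hadoop Configuration objects (the key name and values below are 
hypothetical, chosen only to show the merge order):

```python
# Minimal sketch of the merge-order bug, using plain dicts in place of
# Hadoop Configuration objects. Key and values are illustrative only.

CORE_DEFAULTS = {"io.file.buffer.size": "4096"}  # stands in for core-default.xml

# SparkContext's hadoop configuration: defaults plus a user-updated value.
sc_hadoop_conf = dict(CORE_DEFAULTS)
sc_hadoop_conf["io.file.buffer.size"] = "65536"  # updated by the user

# confAsMap passed from the Python side does NOT mention that key.
conf_as_map = {"mapreduce.input.fileinputformat.split.minsize": "1"}

# Buggy path: confAsMap is first turned into a fresh Configuration,
# which re-loads the defaults...
mapped_conf = dict(CORE_DEFAULTS)
mapped_conf.update(conf_as_map)

# ...and then every entry of that Configuration is merged into the
# SparkContext conf, so the stale default clobbers the updated value.
merged = dict(sc_hadoop_conf)
merged.update(mapped_conf)
assert merged["io.file.buffer.size"] == "4096"  # default won: the bug

# Expected behavior: merge only the keys the user actually passed.
fixed = dict(sc_hadoop_conf)
fixed.update(conf_as_map)
assert fixed["io.file.buffer.size"] == "65536"  # updated value respected
```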

I will submit a PR to fix this.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
