[ https://issues.apache.org/jira/browse/SPARK-28203?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-28203:
------------------------------------

    Assignee: Apache Spark

> PythonRDD should respect SparkContext's conf when passing user confMap
> ----------------------------------------------------------------------
>
>                 Key: SPARK-28203
>                 URL: https://issues.apache.org/jira/browse/SPARK-28203
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, Spark Core
>    Affects Versions: 2.4.3
>            Reporter: Xianjin YE
>            Assignee: Apache Spark
>            Priority: Minor
>
> PythonRDD has several APIs that accept user configs from the Python side. The 
> parameter is called confAsMap, and it is intended to be merged with the RDD's 
> Hadoop configuration.
>  However, confAsMap is first converted to a Configuration and only then merged 
> into SparkContext's Hadoop configuration. The newly created Configuration loads 
> default key values from core-default.xml etc., and some of those keys may have 
> been updated in SparkContext's Hadoop configuration. During the merge, the 
> default values override the updated ones.
> I will submit a PR to fix this.
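
The merge-order problem described above can be sketched with plain Python dicts. This is purely illustrative, not Spark's actual code: the names (CORE_DEFAULTS, new_configuration, the config keys, and the chosen values) are hypothetical stand-ins for Hadoop's Configuration behavior, where constructing a fresh Configuration pre-loads defaults from core-default.xml.

```python
# Hypothetical stand-in for the defaults loaded from core-default.xml.
CORE_DEFAULTS = {"io.file.buffer.size": "4096"}

def new_configuration(conf_as_map):
    # Mimics `new Configuration()` seeded with user entries: it starts
    # from the loaded defaults, then applies the user-supplied keys.
    conf = dict(CORE_DEFAULTS)
    conf.update(conf_as_map)
    return conf

# SparkContext's Hadoop configuration, where the user raised the buffer size.
sc_hadoop_conf = dict(CORE_DEFAULTS)
sc_hadoop_conf["io.file.buffer.size"] = "65536"

# The user only passes one unrelated key from the Python side.
conf_as_map = {"mapreduce.input.fileinputformat.split.maxsize": "134217728"}

# Buggy merge: confAsMap -> Configuration (defaults included) -> merged on top,
# so the default "4096" clobbers the user's "65536".
merged = dict(sc_hadoop_conf)
merged.update(new_configuration(conf_as_map))

# Fixed merge: start from SparkContext's conf and apply only the user's keys.
fixed = dict(sc_hadoop_conf)
fixed.update(conf_as_map)

print(merged["io.file.buffer.size"])  # "4096"  (updated value lost)
print(fixed["io.file.buffer.size"])   # "65536" (updated value preserved)
```

The sketch shows why the fix is to merge only the user-provided keys, rather than a full Configuration whose implicitly loaded defaults ride along into the merge.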



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
