Ngone51 opened a new pull request #24530: [SPARK-27520][CORE][WIP] Introduce a 
global config system to replace hadoopConfiguration
URL: https://github.com/apache/spark/pull/24530
 
 
   ## What changes were proposed in this pull request?
   
   `hadoopConf` can be accessed via `SparkContext.hadoopConfiguration` from both user code and Spark internals. The configuration is mainly used to read files from Hadoop-supported file systems (e.g. getting a URI, getting a `FileSystem`, adding security credentials, getting the metastore connection URL, etc.).
   We should keep a global config that users can set, and use it to track the Hadoop configurations.
   
   This PR implements it with the three main features shown below:
   
   * use a ThreadLocal to track the Hadoop Configuration, so that concurrent jobs can use their own Hadoop Configurations
   
   * provide a set method, wrapping the Hadoop Configuration, to allow users to modify the Hadoop Configuration globally
   
   * provide a `SparkContext.withHadoopConf(){}` method to allow users to modify the Hadoop Configuration temporarily
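   The three features above could be sketched roughly as follows. This is a hypothetical illustration, not the PR's actual code: `SimpleConf` stands in for `org.apache.hadoop.conf.Configuration` to keep the example self-contained, and `HadoopConfHolder` and `withHadoopConf` are assumed names for the proposed global config holder.
   
   ```scala
   import scala.collection.mutable
   
   // Stand-in for org.apache.hadoop.conf.Configuration (hypothetical).
   class SimpleConf(defaults: Map[String, String] = Map.empty) {
     private val entries = mutable.Map[String, String]() ++= defaults
     def set(key: String, value: String): Unit = entries(key) = value
     def get(key: String): Option[String] = entries.get(key)
     def copy(): SimpleConf = new SimpleConf(entries.toMap)
   }
   
   object HadoopConfHolder {
     // Global configuration that users can set (feature 2).
     private val global = new SimpleConf()
   
     // Per-thread override, so concurrent jobs see their own copy (feature 1).
     private val local = new ThreadLocal[SimpleConf]
   
     def set(key: String, value: String): Unit = global.set(key, value)
   
     // The configuration visible to the current thread: its override if one
     // is in scope, otherwise the global one.
     def current: SimpleConf = Option(local.get).getOrElse(global)
   
     // Scoped, temporary modification (feature 3): overrides applied inside
     // `body` are visible only on this thread and are discarded afterwards.
     def withHadoopConf[T](overrides: (String, String)*)(body: => T): T = {
       val prev = local.get
       val scoped = current.copy()
       overrides.foreach { case (k, v) => scoped.set(k, v) }
       local.set(scoped)
       try body finally local.set(prev)
     }
   }
   ```
   
   With a holder like this, a job could call `HadoopConfHolder.withHadoopConf("fs.defaultFS" -> "file:///") { ... }` to read from a different file system without affecting other concurrently running jobs, while `HadoopConfHolder.set` still changes the default seen by everyone.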
   
   ## How was this patch tested?
   
   TODO
   

