yaooqinn commented on pull request #32144:
URL: https://github.com/apache/spark/pull/32144#issuecomment-818550752


   > If you want this done, you will need to introduce hadoop configurations at the session level.
   
   The current behavior of the PR is equivalent to `sparkSession.sessionState.newHadoopConf().get(key)`, which I believe meets your point about "the session level".
   
   ```scala
   import org.apache.hadoop.conf.Configuration
   import org.apache.spark.sql.internal.SQLConf

   private[sql] object SessionState {
     // Copy the shared Hadoop conf, then overlay every session-level SQL conf
     // so that session settings take precedence over the global ones.
     def newHadoopConf(hadoopConf: Configuration, sqlConf: SQLConf): Configuration = {
       val newHadoopConf = new Configuration(hadoopConf)
       sqlConf.getAllConfs.foreach { case (k, v) => if (v ne null) newHadoopConf.set(k, v) }
       newHadoopConf
     }
   }
   ```
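   For illustration, here is a minimal sketch (not part of the PR; the key name is made up) of how a session-level setting propagates through `newHadoopConf`. Note that `sessionState` is `private[sql]`, so this would only compile from within the `org.apache.spark.sql` package:
   
   ```scala
   import org.apache.spark.sql.SparkSession

   val spark = SparkSession.builder().master("local[*]").getOrCreate()

   // Set a Hadoop key at the session level (the key is illustrative).
   spark.conf.set("fs.illustrative.endpoint", "http://localhost:9000")

   // newHadoopConf() copies sharedState.hadoopConf and overlays the session
   // confs, so the session-level value wins for this session only.
   assert(spark.sessionState.newHadoopConf()
     .get("fs.illustrative.endpoint") == "http://localhost:9000")
   ```
   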
   Using the global `sharedState.hadoopConf` here is only to avoid an unnecessary copy.
   
   As I don't change the write side, I believe it is also at the session level.
   

