MrBago opened a new pull request #24353: [SPARK-27446] Use existing spark conf if available.
URL: https://github.com/apache/spark/pull/24353
 
 
   ## What changes were proposed in this pull request?
   
The RBackend and RBackendHandler create new SparkConf objects that do not pick up conf values from the existing SparkSession, so they always use the default conf values instead of the values specified by the user.
   
This fix first checks whether a SparkEnv already exists and, if so, takes the conf from it, falling back to creating a new SparkConf only when no env is available. This follows the pattern used in other places, including here:
   https://github.com/apache/spark/blob/3725b1324f731d57dc776c256bc1a100ec9e6cd0/core/src/main/scala/org/apache/spark/api/r/BaseRRunner.scala#L261
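   The fallback described above can be sketched as follows. This is a minimal, self-contained illustration of the pattern, not the actual patch: `Env`, `EnvInstance`, and `Conf` are hypothetical stand-ins for Spark's `SparkEnv` and `SparkConf`, whose `SparkEnv.get` similarly returns `null` when no env is running.

   ```scala
   // Hypothetical stand-in for SparkConf; `source` just records where it came from.
   class Conf(val source: String)

   // Hypothetical stand-in for a running SparkEnv holding a conf.
   class EnvInstance(val conf: Conf)

   // Hypothetical stand-in for SparkEnv: `get` is null when no env exists.
   object Env {
     @volatile var get: EnvInstance = null
   }

   // The pattern: reuse the existing env's conf when available,
   // otherwise fall back to constructing a fresh default conf.
   def resolveConf(): Conf =
     Option(Env.get).map(_.conf).getOrElse(new Conf("default"))
   ```

   With no env present, `resolveConf()` yields the default conf; once an env exists, its conf (and thus any user-specified values) is reused instead.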
   
   ## How was this patch tested?
   
   
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
