HyukjinKwon edited a comment on pull request #30735:
URL: https://github.com/apache/spark/pull/30735#issuecomment-744662981


   Thanks @tgravescs. If guiding users is a concern, we can still do that by updating the migration guide, or even by showing a warning that the configuration was removed.
   
   Given that this is experimental, I wanted to treat it a bit differently, as removal and behaviour changes are expected, as noted in the tags and as discussed on the mailing list in the past.
   
   I feel like we're treating a GA-ed feature and an experimental feature the same way.
   
   In any event, I agree that this is a more conservative and possibly smoother approach. I'm okay with deprecating.
   
   For setting environment variables on executors, I believe `PYSPARK_DRIVER_PYTHON` is for the driver Python and `PYSPARK_PYTHON` is for the executor Python (and the driver too if `PYSPARK_DRIVER_PYTHON` is not set). So, if different Python executables are used, users can set both environment variables (or the equivalent Spark configurations such as `spark.pyspark.python`).
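   For illustration, a minimal sketch of what that could look like, assuming client deploy mode and hypothetical interpreter paths:

   ```python
   import os

   from pyspark.sql import SparkSession

   # These variables are normally exported before launching pyspark/spark-submit;
   # setting them here only takes effect if the JVM has not been started yet
   # (e.g. a plain `python` script in client mode). Paths below are hypothetical.
   os.environ["PYSPARK_DRIVER_PYTHON"] = "/usr/bin/python3"          # driver Python
   os.environ["PYSPARK_PYTHON"] = "/opt/conda/envs/job/bin/python"   # executor Python

   spark = SparkSession.builder.appName("python-exec-example").getOrCreate()

   # Alternatively, the equivalent Spark configurations can be passed at submit
   # time, e.g. --conf spark.pyspark.python=... and
   # --conf spark.pyspark.driver.python=...
   ```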

