ggarciabas opened a new pull request #31997:
URL: https://github.com/apache/spark/pull/31997


   Check whether PYSPARK_PYTHON is defined in the configuration passed to the 
context.
   
   ### What changes were proposed in this pull request?
   
   Searches for the PYSPARK_PYTHON environment variable in the configuration 
passed to the SparkContext in the Python interface.
   
   ### Why are the changes needed?
   
   When the variable is not defined in the local OS environment, the 
corresponding block falls back to `python3`, [line 
230](https://github.com/ggarciabas/spark/blob/master/python/pyspark/context.py#L230):
   ```python
   self.pythonExec = os.environ.get("PYSPARK_PYTHON", 'python3')
   ```
   However, if a specific virtual environment is shipped to the executors 
and/or the Python path is changed on the executors, the configuration 
`spark.executorEnv.PYSPARK_PYTHON` is not considered.
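   A minimal sketch of the lookup order this change implies (not the actual 
PR diff), assuming a SparkConf-like object exposing `get(key, default)` as 
`pyspark.conf.SparkConf` does; the helper name `resolve_python_exec` is 
hypothetical:
   ```python
   import os

   def resolve_python_exec(conf, environ=os.environ):
       """Resolve the executor Python binary (illustrative sketch).

       Precedence, as proposed:
         1. spark.executorEnv.PYSPARK_PYTHON from the configuration
         2. the PYSPARK_PYTHON environment variable
         3. the existing default, 'python3'
       """
       return (
           conf.get("spark.executorEnv.PYSPARK_PYTHON", None)
           or environ.get("PYSPARK_PYTHON")
           or "python3"
       )
   ```
   With this order, a virtual-environment interpreter set via 
`spark.executorEnv.PYSPARK_PYTHON` wins over the driver's local environment, 
and the behaviour is unchanged when neither source defines the variable.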


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]



---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
