Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/19840
  
    I'm trying to understand what 
https://github.com/apache/spark/blob/master/python/pyspark/context.py#L191 
is really achieving. It seems pretty broken to me, and it feels like the whole 
`pythonExec` tracking in the various places should be removed.
    
    It causes this problem because it forces the executors to use the driver's 
Python even when the user has set a different path.
    
    It uses `python` instead of `sys.executable` as the default value.
    
    And it ignores the `spark.pyspark.python` config value if it's set.
    
    Instead, shouldn't the logic at 
https://github.com/apache/spark/blob/master/launcher/src/main/java/org/apache/spark/launcher/SparkSubmitCommandBuilder.java#L304
 be used in `PythonRunner` (minus the driver-only python config) to determine 
which Python the executors should use?
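
    For illustration only, a minimal sketch of the resolution order being 
suggested (config first, then env var, then a sensible default), with the 
driver-only settings left out. The function name `resolve_executor_python` and 
the plain-dict `conf` are hypothetical, not actual Spark APIs:

    ```python
    import os
    import sys

    def resolve_executor_python(conf):
        """Pick the executor's Python: the spark.pyspark.python config
        wins, then the PYSPARK_PYTHON env var, then fall back to the
        current interpreter instead of a hard-coded "python"."""
        return (conf.get("spark.pyspark.python")
                or os.environ.get("PYSPARK_PYTHON")
                or sys.executable)
    ```

    With something like this, a user-provided `spark.pyspark.python` would no 
longer be silently overridden by the driver's `pythonExec`.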
