Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19840
Oh, I see, you're running in client mode, so `--conf
spark.yarn.appMasterEnv.PYSPARK_PYTHON=py3.zip/py3/bin/python` has no effect
there (in client mode the YARN AM does not launch the driver). So I guess the
behavior is expected: the driver honors the `PYSPARK_PYTHON` env variable and
ships it to the executors, so the whole cluster uses the same Python
executable.
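A minimal sketch of what I mean, assuming a client-mode submission (`app.py` and the paths are placeholders):

```sh
# Hypothetical client-mode submission; paths are placeholders.
export PYSPARK_PYTHON=/path/to/python   # picked up by the local driver...
spark-submit \
  --master yarn \
  --deploy-mode client \
  --conf spark.yarn.appMasterEnv.PYSPARK_PYTHON=py3.zip/py3/bin/python \
  app.py
# ...while the appMasterEnv conf above never takes effect in client mode;
# the driver's PYSPARK_PYTHON value is what gets shipped to the executors.
```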
In your test above, `/path/to/python` differs between the driver and the
executors: the driver uses `PYSPARK_PYTHON` while the executors use
`spark.executorEnv.PYSPARK_PYTHON`, which points to a different path. Will
that bring in issues?
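For illustration, a hypothetical submission where the two sides diverge (both paths are placeholders):

```sh
# Driver picks up the local env variable...
export PYSPARK_PYTHON=/path/to/driver/python
spark-submit \
  --master yarn \
  --deploy-mode client \
  --conf spark.executorEnv.PYSPARK_PYTHON=/path/to/executor/python \
  app.py
# ...while the executors use the executorEnv value, so the driver and
# the executors end up on different Python executables.
```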
Normally I don't think we need to set `PYSPARK_PYTHON` on the executor side.
Please correct me if I'm wrong.