sunchao commented on pull request #29843:
URL: https://github.com/apache/spark/pull/29843#issuecomment-702520516


   Just one last test failure:
   
   "Exception: Python in worker has different version 3.6 than that in driver 
3.8, PySpark cannot run with different minor versions. Please check environment 
variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are correctly set."
   
   @HyukjinKwon do you happen to know the reason for this? Looking at the CI script, it seems it should install either Python 3.6 or 3.8, but not both.
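   
   In case it helps anyone reproducing this locally: pinning both variables to the same interpreter before the session starts keeps the driver and worker minor versions in sync. A minimal sketch, assuming a Python 3.8 interpreter at an illustrative path:
   
   ```python
   import os
   
   # Illustrative path; substitute whatever 3.8 interpreter the environment has.
   os.environ["PYSPARK_PYTHON"] = "/usr/bin/python3.8"
   os.environ["PYSPARK_DRIVER_PYTHON"] = "/usr/bin/python3.8"
   
   from pyspark.sql import SparkSession
   
   spark = SparkSession.builder.master("local[2]").getOrCreate()
   print(spark.version)  # driver and workers now share one interpreter
   spark.stop()
   ```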
   
   I'm also not sure which part of this PR could affect the YARN/Python tests. I tried the same suite on my own Spark fork with a dummy change in `YarnClusterSuite` (just to trigger the `ExtendedYarnTest` tests), and they all passed, so the failure does seem related to this PR.

