Github user AzureQ commented on the issue:
https://github.com/apache/spark/pull/23037
> I see this customization to be specific towards how you build your custom
> Docker image. Unless it is relevant towards testing, we are trying to keep the
> default Docker image as lightweight as possible (as long as it passes our test
> cases). Unless one of the committers sees this as an important thing to include
> in the default image, I believe it to be a customization.
This change is needed to make the default Docker image work properly; it is not a
customization at all. Without it, the pyspark Docker image does not work properly
in client mode, while the spark and rspark images do.