oulenz commented on issue #23715: [SPARK-26803][PYTHON] Add sbin subdirectory 
to pyspark
URL: https://github.com/apache/spark/pull/23715#issuecomment-459662638
 
 
   The client/server side distinction isn't relevant when you run Spark 
locally. The pyspark package contains everything needed to `spark-submit` a job 
locally. It automatically serves the web UI at `localhost:4040`. It already 
contains everything needed to launch the history server, *except the actual 
script*.
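   For illustration, here is what the local workflow looks like with the 
pip-installed package (the application script `my_job.py` is hypothetical; 
this is a sketch, not part of the PR):

   ```shell
   # Install the pyspark package from PyPI; it bundles bin/ scripts
   # such as spark-submit alongside the Python library.
   pip install pyspark

   # Submit a job locally. While it runs, the web UI is served
   # automatically at http://localhost:4040
   spark-submit --master "local[2]" my_job.py
   ```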
   
   I get the impression your real issue is with what's already contained in the 
pyspark package. If you want to change that, that's another discussion. It is 
not a valid argument against this PR.
   
   This PR adds 41KB of scripts to the pyspark package to actually make use of 
functionality that's already there. Have a look at 
https://spark.apache.org/docs/latest/monitoring.html and tell me how a user is 
supposed to know that it doesn't apply to pyspark.
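   
   For reference, this is the workflow those monitoring docs describe (the 
event-log directory below is the documented default; adjust as needed). It 
is exactly the part that is impossible with the current pip package, because 
the launch script is missing:

   ```shell
   # In conf/spark-defaults.conf, enable event logging so the history
   # server has something to replay (values shown are the doc defaults):
   #
   #   spark.eventLog.enabled  true
   #   spark.eventLog.dir      file:/tmp/spark-events

   # Then launch the history server with the sbin script this PR ships;
   # it serves its UI at http://localhost:18080
   ./sbin/start-history-server.sh
   ```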
