oulenz commented on issue #23715: [SPARK-26803][PYTHON] Add sbin subdirectory to pyspark
URL: https://github.com/apache/spark/pull/23715#issuecomment-459713631
 
 
   > > I get the impression your real issue is with what's already contained in 
the pyspark package. If you want to change that, that's another discussion. It 
is not a valid argument against this PR
   > 
   > It is a valid argument since you're referring to what's already contained in the PySpark package. The current argument can be a subset of that argument.
   
   It's only valid if you submit an alternative PR that takes out everything you feel doesn't belong in the pyspark package, and argue that it supersedes this one. Otherwise the objection is impossible to engage with.
   
   > Practically I understand it can be useful. But this is `pip install pyspark`, not Spark. It looks like we're going ahead in a weird way. I can see a one-time thing can be done if other committers prefer, but this looks like something we shouldn't do in principle. Adding @felixcheung and @holdenk to ask for more opinions.
   
   If you look at [@holdenk's PR](https://github.com/apache/spark/pull/15659) that created the pip package, you can see that the intention was to include everything required to run Spark locally. That is the principle that should guide the decision on whether to add the history-server script.
   
   The history server is more than just 'useful'. It's essential for anyone new to Spark who doesn't yet have a clue how their job is actually executed, or whether they wrote their script correctly.
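   For context, here is a minimal sketch of what a pip-only user would set up once the sbin scripts ship; the event-log path is illustrative, but the property names are the standard Spark history-server settings:

   ```properties
   # spark-defaults.conf — enable event logging so the history server has something to display
   spark.eventLog.enabled           true
   spark.eventLog.dir               file:///tmp/spark-events
   spark.history.fs.logDirectory    file:///tmp/spark-events
   ```

   With `sbin/` included in the wheel, `start-history-server.sh` would then serve the UI (port 18080 by default), which is exactly the feedback loop newcomers need.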

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
