Github user sryza commented on the pull request:

    https://github.com/apache/spark/pull/5478#issuecomment-93061403
  
    IIUC, the motivation for this change is that the assembly jar distribution 
mechanism doesn't work for some Java versions.
    
    I agree with Andrew that, if at all possible, we should avoid deployment 
models that expect PySpark (or anything else) to be pre-installed on every node.  
Even if we advise against it, supporting that deployment model still increases 
the number of places one needs to check when debugging why something does or 
does not appear on the executor PYTHONPATH.
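
    As a concrete illustration of that debugging step, here is a minimal sketch 
(not part of this PR; it assumes a SparkContext can be created against the 
cluster in question) that reports the PYTHONPATH an executor task actually sees:

    ```python
    import os
    from pyspark import SparkContext

    # Run a single one-partition task and collect the PYTHONPATH the executor sees.
    sc = SparkContext(appName="pythonpath-check")
    executor_pythonpath = (
        sc.parallelize([0], 1)
          .map(lambda _: os.environ.get("PYTHONPATH", "<unset>"))
          .collect()
    )
    print(executor_pythonpath)
    sc.stop()
    ```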
    
    Are there workarounds for the Java version issue that don't require Python 
to be installed on the NodeManagers?

