GitHub user mariusvniekerk commented on the issue:

    https://github.com/apache/spark/pull/9313
  
    So since py4j now uses the context classloader, we can remove the Python-side
    pieces that load a class by name.
    
    @holdenk If you want, I can revisit this PR.
    
    This case comes up for me specifically because I have Python modules that
    bundle their jars with them, and when using spark-submit it is rather tedious
    to have to manually muck around with the classloader from the Python side.
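
    For reference, a rough sketch of the kind of manual classloader dance I mean;
    the class name `com.example.MyHelper` and the bundled jar are made up for
    illustration and are not part of this PR:

    ```python
    # Sketch only: assumes a Python package ships a jar that is passed to
    # spark-submit via --jars, exposing a JVM-side class "com.example.MyHelper".
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    jvm = spark.sparkContext._jvm  # py4j view into the driver JVM

    # Today the class may not be visible to py4j's default class lookup, so we
    # reach through the JVM's context classloader by hand:
    loader = jvm.java.lang.Thread.currentThread().getContextClassLoader()
    helper_cls = loader.loadClass("com.example.MyHelper")
    helper = helper_cls.newInstance()

    # With py4j resolving classes through the context classloader, the plain
    # py4j access should work instead:
    # helper = jvm.com.example.MyHelper()
    ```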
    
    We can probably also add it to SparkR, since I assume it has requirements
    similar to the PySpark side.


