Github user GrahamDennis commented on the pull request:
https://github.com/apache/spark/pull/1890#issuecomment-57956359
@rxin, @ash211 It would be good to have a conversation about whether this
is the best approach.
My approach is a somewhat brute-force one: it simply adds the
application jar to the classpath. Another approach might be to load the user
application jar in an isolated classloader that only shares the
org.apache.spark.* classes, which would reduce conflicts with transitive
dependencies. I'm not familiar with that kind of classloader magic, so I don't
know how feasible it would be. A rough sketch of what I mean is below.
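
To make the isolated-classloader idea concrete, here is a minimal, hypothetical sketch (not existing Spark code; the class name and the shared-package list are purely illustrative). The idea is a classloader for the application jar that delegates only org.apache.spark.* (plus the JDK and Scala runtime) to the parent loader, and resolves everything else from the application jar itself:

```scala
import java.net.{URL, URLClassLoader}

// Hypothetical example: isolate the user application jar so that only
// Spark/JDK/Scala classes are shared with Spark's own classloader.
class SparkIsolatingClassLoader(appJarUrls: Array[URL], sparkLoader: ClassLoader)
    extends URLClassLoader(appJarUrls, null) {

  // Packages that must come from the parent so user code and Spark
  // agree on the same Class instances.
  private val sharedPrefixes =
    Seq("java.", "javax.", "scala.", "org.apache.spark.")

  override def loadClass(name: String, resolve: Boolean): Class[_] = {
    if (sharedPrefixes.exists(name.startsWith)) {
      // Shared classes are delegated to Spark's classloader.
      val cls = sparkLoader.loadClass(name)
      if (resolve) resolveClass(cls)
      cls
    } else {
      // Everything else, including transitive dependencies bundled in the
      // application jar, is resolved in isolation from Spark's classpath.
      super.loadClass(name, resolve)
    }
  }
}
```

The user's main class would then be loaded and invoked through this loader instead of being appended to the shared classpath, so its transitive dependencies never collide with Spark's own.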