GitHub user vanzin commented on the pull request:
https://github.com/apache/spark/pull/3238#issuecomment-63130947
I wonder if using `XmlAppendingTransformer` in the shade plugin would help;
it doesn't look like the different plugin.xml files actually conflict with one
another, so it seems like you just need all of their content available. (There
are some conflicts in the actual root `<plugin>` tag, though, which is the part
that might not work.)
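For reference, the shade-plugin configuration I have in mind would look roughly
like the sketch below; the `plugin.xml` resource path is my assumption about
which files are colliding:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <transformers>
      <!-- Append the contents of the matching XML files into one merged
           resource instead of keeping only the first one encountered. -->
      <transformer implementation="org.apache.maven.plugins.shade.resource.XmlAppendingTransformer">
        <resource>plugin.xml</resource>
      </transformer>
    </transformers>
  </configuration>
</plugin>
```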
Anyway, that's probably better left for a different change, if it's even desired.
As for the dist cache option, you could have a
`spark.yarn.datanucleus.cache.dir` config option, and add all the jars in that
directory to the dist cache instead of looking for them locally. The launcher
could populate the directory if no jars are there, although that could result
in races (multiple jobs launching at the same time trying to upload files).
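To make that concrete, here's a rough sketch of the launcher-side lookup,
assuming the jars are already sitting in the configured directory. The object
and method names are hypothetical, not existing Spark code:

```scala
import scala.collection.mutable

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.hadoop.yarn.api.records.{LocalResource, LocalResourceType, LocalResourceVisibility}
import org.apache.hadoop.yarn.util.{ConverterUtils, Records}

// Hypothetical sketch: register every jar found under the directory named by
// spark.yarn.datanucleus.cache.dir as a YARN local resource (i.e. put it in
// the dist cache) instead of uploading local copies of the datanucleus jars.
object DataNucleusCacheSketch {

  def addCachedJars(
      cacheDir: String,
      hadoopConf: Configuration,
      localResources: mutable.Map[String, LocalResource]): Unit = {
    val dir = new Path(cacheDir)
    val fs = dir.getFileSystem(hadoopConf)

    // Each jar already in the shared directory becomes a public local
    // resource, so NodeManagers can cache it across applications.
    fs.listStatus(dir).filter(_.getPath.getName.endsWith(".jar")).foreach { status =>
      val resource = Records.newRecord(classOf[LocalResource])
      resource.setType(LocalResourceType.FILE)
      resource.setVisibility(LocalResourceVisibility.PUBLIC)
      resource.setResource(ConverterUtils.getYarnUrlFromPath(status.getPath))
      resource.setTimestamp(status.getModificationTime)
      resource.setSize(status.getLen)
      localResources(status.getPath.getName) = resource
    }
  }
}
```

The sketch only covers the read side; populating the directory when it's empty
is where the race mentioned above would come in.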