Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/2615#issuecomment-57513757
Hey @ScrapCodes thanks for posting the whole thing here. In terms of how
this interacts with the build - I really think we'll need to write a Maven
plugin to make this all work. The reason is that we need to publish separate
artifacts for Scala 2.11 and 2.10, and we can't publish a single artifact that
relies on profile activation: most build tools (Maven, SBT) won't respect
profiles from other poms you are linking against. To test this yourself, you
can publish locally for Scala 2.11 and then try to write a project that links
against it and see whether it resolves. My guess is that it won't work.
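To make the profile problem concrete, here is an illustrative (not actual Spark) pom fragment that gates the Scala dependency behind a profile. When a downstream build resolves this published pom, it never activates the profile, so the dependency is simply invisible to it:

```xml
<!-- Illustrative sketch only; version numbers are assumptions. A published
     pom like this looks fine locally, but a consumer's Maven or SBT build
     does not activate the "scala-2.11" profile when resolving it, so the
     scala-library dependency never appears on the consumer's classpath. -->
<profiles>
  <profile>
    <id>scala-2.11</id>
    <dependencies>
      <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.11.2</version>
      </dependency>
    </dependencies>
  </profile>
</profiles>
```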
Really I think what we need is a build plugin that rewrites our published
poms to do two things:
1. Set the correct Scala version in the artifact ID.
2. Advertise the correct set of dependencies for 2.10 vs 2.11.
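As a sketch of what the rewritten pom for the 2.11 artifact might contain after both steps (artifact IDs and versions here are illustrative assumptions, not the actual published output):

```xml
<!-- Step 1: Scala binary version encoded in the artifact ID. -->
<artifactId>spark-core_2.11</artifactId>

<!-- Step 2: dependencies matching that Scala version, declared directly
     rather than behind a profile, so consumers resolve them correctly. -->
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.11.2</version>
</dependency>
```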
You said earlier that it's not possible for Maven build plugins to modify
the build, but I'm pretty sure it is - because the `maven-shade-plugin` that we
use does exactly this. It modifies our published pom to exclude Guava (as
@vanzin can tell you, since he wrote this). I would take a look and see if you
can mimic how that works. I looked quickly and there is a bunch of logic
related to pom re-writing:
https://github.com/apache/maven-plugins/blob/trunk/maven-shade-plugin/src/main/java/org/apache/maven/plugins/shade/pom/MavenJDOMWriter.java#L1668
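The core of such a plugin would be string/DOM surgery on the pom at publish time. A minimal, hypothetical sketch of step 1 (rewriting the Scala suffix on an artifact ID); this is not the shade plugin's code, just an illustration of the transformation:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch, not the actual Spark or maven-shade-plugin code.
// Rewrites the Scala binary-version suffix on a Maven artifactId, e.g.
// "spark-core_2.10" -> "spark-core_2.11", appending a suffix if absent.
public class ScalaVersionRewriter {

    // Matches a trailing Scala binary-version suffix such as "_2.10".
    private static final Pattern SUFFIX = Pattern.compile("_2\\.\\d+$");

    public static String rewriteArtifactId(String artifactId, String scalaBinaryVersion) {
        Matcher m = SUFFIX.matcher(artifactId);
        if (m.find()) {
            // Replace the existing suffix with the target Scala version.
            return m.replaceFirst("_" + scalaBinaryVersion);
        }
        // No suffix present: append one.
        return artifactId + "_" + scalaBinaryVersion;
    }

    public static void main(String[] args) {
        System.out.println(rewriteArtifactId("spark-core_2.10", "2.11")); // spark-core_2.11
    }
}
```

A real plugin would apply this while serializing the whole pom model (as `MavenJDOMWriter` does in the shade plugin), and would also swap the dependency section per Scala version.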