GitHub user JoshRosen opened a pull request:

    https://github.com/apache/spark/pull/12563

    [SPARK-14797] [BUILD] Spark SQL POM should not hardcode spark-sketch_2.11 
dep.

    Spark SQL's POM hardcodes a dependency on `spark-sketch_2.11`, which causes 
Scala 2.10 builds to include the `_2.11` dependency. This is harmless since 
`spark-sketch` is a pure-Java module (see #12334 for a discussion of dropping 
the Scala version suffixes from these modules' artifactIds), but it's confusing 
to people looking at the published POMs.
    
    This patch fixes the issue by using `${scala.binary.version}` to substitute 
the correct suffix, and also adds a set of Maven Enforcer rules to ensure that 
`_2.11` artifacts are not used in 2.10 builds (and vice versa).
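    For illustration, a sketch of the two pieces described above (the artifact 
name matches the PR; the exact placement of the enforcer rule in Spark's POMs 
is an assumption, not taken from the patch itself):

    ```xml
    <!-- Dependency declared with the Scala binary version substituted,
         instead of a hardcoded _2.11 suffix: -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sketch_${scala.binary.version}</artifactId>
      <version>${project.version}</version>
    </dependency>

    <!-- Sketch of a Maven Enforcer rule that would fail the build if any
         _2.11 artifact leaks into a Scala 2.10 build (a mirror-image rule
         banning _2.10 would guard the 2.11 build): -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-enforcer-plugin</artifactId>
      <executions>
        <execution>
          <id>enforce-no-2.11-artifacts</id>
          <goals>
            <goal>enforce</goal>
          </goals>
          <configuration>
            <rules>
              <bannedDependencies>
                <excludes>
                  <!-- groupId:artifactId wildcard pattern -->
                  <exclude>*:*_2.11</exclude>
                </excludes>
              </bannedDependencies>
            </rules>
          </configuration>
        </execution>
      </executions>
    </plugin>
    ```

    With this in place, a `-Dscala-2.10` build that transitively pulls in any 
`_2.11` artifact fails fast at the enforce phase rather than publishing a 
mismatched POM.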
    
    /cc @ahirreddy, who spotted this issue.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/JoshRosen/spark fix-sketch-scala-version

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/12563.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #12563
    
----
commit 313512b5db5c884fd1591e95069c8c1af39fce41
Author: Josh Rosen <[email protected]>
Date:   2016-04-21T06:01:59Z

    Add enforcer rule to demonstrate problem under -Dscala-2.10 profile

commit 9dad944ddde583fc0c52f516f350deb844911c45
Author: Josh Rosen <[email protected]>
Date:   2016-04-21T06:02:58Z

    Fix POM.

----


