Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/12334#issuecomment-209616832
> regarding not supporting cross builds for Spark, is that really something we
> don't want to support? I've hacked together something (non-Spark-related) in
> maven to allow cross-compiling in the same build, and I think that if we're
> going to support multiple versions of Scala, that might be a good thing to have
> in Spark; it's not unusual for the 2.10 build to be broken these days because
> it's not exercised by the PR builders.
Whether we compile for multiple versions of Scala as part of the same CI
build does not necessarily imply usage of [SBT's
cross-build](http://www.scala-sbt.org/0.13/docs/Cross-Build.html) feature (e.g.
being able to run `sbt "+ package"` to build for all Scala versions in a single
invocation). I think it's going to be prohibitively difficult to support SBT's
`+` operator given our use of the POM reader for declaring dependencies. The
TL;DR is that I think the best way to build for multiple Scala versions in CI
is to kick off multiple CI builds in parallel, or to have our bash / Python CI
scripts orchestrate that at a higher level, rather than trying to push that
logic into Maven or SBT so that it can all be done in a single build-tool
invocation.
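
For illustration, here's a rough sketch of what that higher-level orchestration
could look like in Python. This is not an existing Spark script; the
`dev/change-scala-version.sh` helper path and the `-Dscala-<version>` flag are
assumptions about the Spark build used only to make the shape of the idea
concrete:

```python
#!/usr/bin/env python
# Sketch of a CI-level driver that runs one full, independent build per Scala
# version, rather than pushing cross-build logic into Maven or SBT itself.
# The helper script path and the -Dscala-<version> flag are assumptions about
# the Spark build, not verified against it.
import subprocess
import sys

SCALA_VERSIONS = ["2.10", "2.11"]

def build_for(scala_version):
    """Run a complete, single-version build for one Scala version."""
    # Rewrite the POMs for the target Scala version (assumed helper script).
    subprocess.check_call(["./dev/change-scala-version.sh", scala_version])
    # Plain single-version Maven build; no cross-build logic in the build tool.
    subprocess.check_call(
        ["build/mvn", "-Dscala-" + scala_version,
         "-DskipTests", "clean", "package"])

def main():
    for version in SCALA_VERSIONS:
        print("Building for Scala %s" % version)
        build_for(version)
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

In a real CI setup each version would more likely be its own parallel build
job, but either way the per-version fan-out lives in the driver script, not in
the build definition.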