(This should fork off as its own thread; it began during the discussion of whether to continue Java 7 support in Spark 2.x.)
Simply put: I would like to take the temperature of all interested parties more clearly on whether to support Scala 2.10 in the Spark 2.x lifecycle.

Some of the arguments appear to be:

Pro
- Some third-party dependencies do not yet support Scala 2.11+, and so would not be usable in a Spark app

Con
- Lower maintenance overhead -- no separate 2.10 build, cross-building (sketched below), or extra tests to check, especially considering that 2.12 support will be needed as well
- Can use 2.11+ features freely
- 2.10 reached end of life in late 2014, and the Spark 2.x lifecycle will run for years to come

I would like to not support 2.10 for Spark 2.x, myself.
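For anyone less familiar with what the cross-building overhead looks like in practice, here is a rough sbt sketch (the project name, dependency, and version numbers are only illustrative): each entry in crossScalaVersions is another binary-incompatible artifact that has to be built, tested, and published for every release, and that list would grow again once 2.12 is added.

  // build.sbt -- illustrative cross-build setup, not Spark's actual build
  lazy val exampleApp = (project in file("."))
    .settings(
      name := "example-spark-app",
      scalaVersion := "2.11.8",
      // Dropping 2.10 would shrink this list to a single entry (until 2.12)
      crossScalaVersions := Seq("2.10.6", "2.11.8"),
      // "%%" appends the Scala binary version suffix to the artifact name,
      // so each Scala version here means a separate published artifact
      libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0" % "provided"
    )

  // Compile and test against every listed Scala version:
  //   sbt +compile +test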