[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-14 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/22967


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-14 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r233490571
  
--- Diff: pom.xml ---
@@ -2717,7 +2717,6 @@
   
 
   
-    <exclude>*:*_2.11</exclude>
     <exclude>*:*_2.10</exclude>
--- End diff --

Yeah that looks right to me.


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-13 Thread dbtsai
Github user dbtsai commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r22593
  
--- Diff: pom.xml ---
@@ -2718,7 +2710,6 @@
 
   
     <exclude>*:*_2.11</exclude>
-    <exclude>*:*_2.10</exclude>
--- End diff --

Thanks for the suggestion; I agree this will make the default Scala 2.12 profile cleaner.


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-12 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r232916947
  
--- Diff: pom.xml ---
@@ -2718,7 +2710,6 @@
 
   
     <exclude>*:*_2.11</exclude>
-    <exclude>*:*_2.10</exclude>
--- End diff --

@dbtsai sorry for the late idea here -- this isn't essential for the change, and you don't have to make it here -- but I thought of a better way. Really we want the default `maven-enforcer-plugin` config above to exclude _2.10 and _2.11 dependencies, and to remove everything from the `scala-2.12` profile (or else one still has to enable the profile to get all the Scala 2.12 config). Then, move this `maven-enforcer-plugin` config to the `scala-2.11` profile; that copy should only exclude _2.10 dependencies. However, to make sure Maven doesn't also merge in the parent's _2.11 exclusion, the `combine.children="append"` attribute here can become `combine.self="override"`. That should get the desired effect.
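
Concretely, that would look something like this (a sketch of the idea, not the exact final diff; the excludes live in Maven's enforcer `bannedDependencies` rule, and the exact placement here is illustrative):

    <!-- default build (parent pom): ban both old Scala binary versions -->
    <excludes combine.children="append">
      <exclude>*:*_2.10</exclude>
      <exclude>*:*_2.11</exclude>
    </excludes>

    <!-- scala-2.11 profile: replace the inherited list so only 2.10 is banned -->
    <excludes combine.self="override">
      <exclude>*:*_2.10</exclude>
    </excludes>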


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-11 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r232510861
  
--- Diff: docs/sparkr.md ---
@@ -133,7 +133,7 @@ specifying `--packages` with `spark-submit` or `sparkR` commands, or if initiali
 
 
 {% highlight r %}
-sparkR.session(sparkPackages = "com.databricks:spark-avro_2.11:3.0.0")
+sparkR.session()
--- End diff --

That seems fine. A dummy package is also fine.


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-11 Thread dbtsai
Github user dbtsai commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r232510439
  
--- Diff: pom.xml ---
@@ -2717,7 +2717,6 @@
   
 
   
-    <exclude>*:*_2.11</exclude>
     <exclude>*:*_2.10</exclude>
--- End diff --

Makes sense. I changed the parent rule to exclude 2.10 and moved the 2.11 exclusion to the 2.12 profile. Thanks.


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-11 Thread dbtsai
Github user dbtsai commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r232505402
  
--- Diff: docs/sparkr.md ---
@@ -133,7 +133,7 @@ specifying `--packages` with `spark-submit` or `sparkR` commands, or if initiali
 
 
 {% highlight r %}
-sparkR.session(sparkPackages = "com.databricks:spark-avro_2.11:3.0.0")
+sparkR.session()
--- End diff --

I just did a text search-and-replace, and I didn't read the context of having `sparkPackages = "com.databricks:spark-avro_2.11:3.0.0"` here. My bad.

Although Avro is now part of the Spark codebase, it lives in an external package that is not on the classpath by default. How about I change it to `sparkPackages = "org.apache.spark:spark-avro_2.12:3.0.0"` here?


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-09 Thread felixcheung
Github user felixcheung commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r232178323
  
--- Diff: docs/sparkr.md ---
@@ -133,7 +133,7 @@ specifying `--packages` with `spark-submit` or `sparkR` commands, or if initiali
 
 
 {% highlight r %}
-sparkR.session(sparkPackages = "com.databricks:spark-avro_2.11:3.0.0")
+sparkR.session()
--- End diff --

yes!
(although let's not use "spark" here - we don't want to encourage naming packages with "spark" in the name)


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-08 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r232034372
  
--- Diff: docs/sparkr.md ---
@@ -133,7 +133,7 @@ specifying `--packages` with `spark-submit` or `sparkR` commands, or if initiali
 
 
 {% highlight r %}
-sparkR.session(sparkPackages = "com.databricks:spark-avro_2.11:3.0.0")
+sparkR.session()
--- End diff --

It can be anything. Rather than make this an example that requires maintenance, why not change the surrounding text so it doesn't necessarily refer to an Avro connector, and make this a dummy package like `com.acme:spark-foo_2.12`?


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-08 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r232033587
  
--- Diff: pom.xml ---
@@ -2717,7 +2717,6 @@
   
 
   
-    <exclude>*:*_2.11</exclude>
     <exclude>*:*_2.10</exclude>
--- End diff --

Not quite -- the rule has to stay on the 2.12 profile, because that's the one that needs to exclude _2.11 dependencies. The exclusion for _2.10 is already in the parent rule, so it can be removed from the profile; that's the only change here.
Yes, the version changes look right.
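
In pom terms, roughly this (a sketch of the intended end state; the excludes belong to the enforcer's `bannedDependencies` rule, and placement is illustrative):

    <!-- parent enforcer rule: bans Scala 2.10 artifacts for every build -->
    <excludes>
      <exclude>*:*_2.10</exclude>
    </excludes>

    <!-- scala-2.12 profile: its own rule, appended to the parent's,
         additionally bans Scala 2.11 artifacts -->
    <excludes combine.children="append">
      <exclude>*:*_2.11</exclude>
    </excludes>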


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-08 Thread dbtsai
Github user dbtsai commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r232024914
  
--- Diff: docs/sparkr.md ---
@@ -133,7 +133,7 @@ specifying `--packages` with `spark-submit` or `sparkR` commands, or if initiali
 
 
 {% highlight r %}
-sparkR.session(sparkPackages = "com.databricks:spark-avro_2.11:3.0.0")
+sparkR.session()
--- End diff --

I thought `com.databricks:spark-avro_2.12` is deprecated and no longer exists.


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-08 Thread dbtsai
Github user dbtsai commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r232024557
  
--- Diff: pom.xml ---
@@ -1998,7 +1998,7 @@
   -->
   <exclude>org.jboss.netty</exclude>
   <exclude>org.codehaus.groovy</exclude>
-  <exclude>*:*_2.10</exclude>
+  <exclude>*:*_2.11</exclude>
 
--- End diff --

@srowen Can you take a look and check whether this looks right now? Thanks!


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-08 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r231899226
  
--- Diff: pom.xml ---
@@ -154,8 +154,8 @@
 <commons.math3.version>3.4.1</commons.math3.version>
 
 <commons.collections.version>3.2.2</commons.collections.version>
-<scala.version>2.11.12</scala.version>
-<scala.binary.version>2.11</scala.binary.version>
+<scala.version>2.12.7</scala.version>
--- End diff --

The definition of the `scala-2.11` profile needs to change to set these back to the 2.11 versions. The `scala-2.12` profile then doesn't need to set these anymore. You can keep the enforcer rule in the 2.12 profile to ban 2.11 dependencies. Actually, the line banning 2.10 dependencies can be removed, as that's already set in the main build's config.
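
In other words, roughly this (a sketch; `scala.version` and `scala.binary.version` are the existing pom properties, with the values taken from this diff):

    <profile>
      <id>scala-2.11</id>
      <properties>
        <scala.version>2.11.12</scala.version>
        <scala.binary.version>2.11</scala.binary.version>
      </properties>
    </profile>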


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-08 Thread felixcheung
Github user felixcheung commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r231819016
  
--- Diff: docs/sparkr.md ---
@@ -133,7 +133,7 @@ specifying `--packages` with `spark-submit` or `sparkR` commands, or if initiali
 
 
 {% highlight r %}
-sparkR.session(sparkPackages = "com.databricks:spark-avro_2.11:3.0.0")
+sparkR.session()
--- End diff --

I'd not worry about this example too much - this could be 
`com.databricks:spark-avro_2.12:3.0.0`


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-07 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r231783302
  
--- Diff: docs/sparkr.md ---
@@ -133,7 +133,7 @@ specifying `--packages` with `spark-submit` or `sparkR` commands, or if initiali
 
 
 {% highlight r %}
-sparkR.session(sparkPackages = "com.databricks:spark-avro_2.11:3.0.0")
+sparkR.session()
--- End diff --

Let me try to take a look as well this weekend.


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-07 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r231783339
  
--- Diff: docs/sparkr.md ---
@@ -133,7 +133,7 @@ specifying `--packages` with `spark-submit` or `sparkR` commands, or if initiali
 
 
 {% highlight r %}
-sparkR.session(sparkPackages = "com.databricks:spark-avro_2.11:3.0.0")
+sparkR.session()
--- End diff --

adding @JoshRosen 


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-07 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r231783212
  
--- Diff: docs/sparkr.md ---
@@ -133,7 +133,7 @@ specifying `--packages` with `spark-submit` or `sparkR` commands, or if initiali
 
 
 {% highlight r %}
-sparkR.session(sparkPackages = "com.databricks:spark-avro_2.11:3.0.0")
+sparkR.session()
--- End diff --

I am not an expert, but I know a bit. The MiMa changes look right from a cursory look.


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-07 Thread dbtsai
Github user dbtsai commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r231781938
  
--- Diff: docs/sparkr.md ---
@@ -133,7 +133,7 @@ specifying `--packages` with `spark-submit` or `sparkR` commands, or if initiali
 
 
 {% highlight r %}
-sparkR.session(sparkPackages = "com.databricks:spark-avro_2.11:3.0.0")
+sparkR.session()
--- End diff --

Got it.

BTW, are you familiar with MiMa? I still cannot figure out why it's failing.


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-07 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r231781880
  
--- Diff: docs/sparkr.md ---
@@ -133,7 +133,7 @@ specifying `--packages` with `spark-submit` or `sparkR` commands, or if initiali
 
 
 {% highlight r %}
-sparkR.session(sparkPackages = "com.databricks:spark-avro_2.11:3.0.0")
+sparkR.session()
--- End diff --

I mean the example is about using an external package; it looks that way, but Avro is effectively an internal source now, so the example is out of date.


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-07 Thread dbtsai
Github user dbtsai commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r231781635
  
--- Diff: docs/sparkr.md ---
@@ -133,7 +133,7 @@ specifying `--packages` with `spark-submit` or `sparkR` commands, or if initiali
 
 
 {% highlight r %}
-sparkR.session(sparkPackages = "com.databricks:spark-avro_2.11:3.0.0")
+sparkR.session()
--- End diff --

I am not familiar with R. Can you elaborate? Thanks.


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-07 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r231781676
  
--- Diff: docs/sparkr.md ---
@@ -133,7 +133,7 @@ specifying `--packages` with `spark-submit` or `sparkR` commands, or if initiali
 
 
 {% highlight r %}
-sparkR.session(sparkPackages = "com.databricks:spark-avro_2.11:3.0.0")
+sparkR.session()
--- End diff --

Oh, but the problem is that other packages probably wouldn't have a _2.12 distribution yet. Hm, I think this can be left as it was for now.

At least I am going to release spark-xml before Spark 3.0.0 anyway. I can try to include a 2.12 distribution as well and fix this here later.


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-07 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/22967#discussion_r231780839
  
--- Diff: docs/sparkr.md ---
@@ -133,7 +133,7 @@ specifying `--packages` with `spark-submit` or `sparkR` commands, or if initiali
 
 
 {% highlight r %}
-sparkR.session(sparkPackages = "com.databricks:spark-avro_2.11:3.0.0")
+sparkR.session()
--- End diff --

Eh, @dbtsai, I think you can just switch this to another data source like `spark-redshift` or `spark-xml`, and fix the description above (`you can find data source connectors for popular file formats like Avro`).


---




[GitHub] spark pull request #22967: [SPARK-25956] Make Scala 2.12 as default Scala ve...

2018-11-07 Thread dbtsai
GitHub user dbtsai opened a pull request:

https://github.com/apache/spark/pull/22967

[SPARK-25956] Make Scala 2.12 as default Scala version in Spark 3.0 

## What changes were proposed in this pull request?

This PR makes Scala 2.12 the default Scala version in Spark, with Scala 2.11 as the alternative version. This implies that Scala 2.12 will be used by our CI builds, including pull request builds.

We'll update Jenkins to include a new compile-only job for Scala 2.11 to ensure the code can still be compiled with Scala 2.11.
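
At a high level, the parent pom's Scala properties flip from 2.11 to 2.12 (a sketch using the existing property names; the profile and enforcer-rule details are worked out in the review comments):

    <!-- parent pom: Scala 2.12 becomes the default -->
    <scala.version>2.12.7</scala.version>
    <scala.binary.version>2.12</scala.binary.version>

Building against Scala 2.11 then means activating the `scala-2.11` profile instead.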

## How was this patch tested?

Existing tests.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/dbtsai/spark scala2.12

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/22967.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #22967


commit 635e6e23c5066018fd656738c51d02df8130585e
Author: DB Tsai 
Date:   2018-11-06T22:13:11Z

make scala 2.12 as default

commit 5011dc07c6462e7f5a9974a0b9b28f937d678297
Author: DB Tsai 
Date:   2018-11-06T23:11:34Z

sbt change

commit b4b9cb95df35b754432fb74361c32f563d1661b0
Author: DB Tsai 
Date:   2018-11-07T00:02:22Z

address feedback

commit 292adb111750cfe98593f12f64ebe11067482b44
Author: DB Tsai 
Date:   2018-11-07T00:35:58Z

address feedback




---
