Re: Java 11 support

2018-11-06 Thread shane knapp
cool, i was wondering when we were going to forge ahead into the great future of jdk8++... i went ahead and created a sub-task of installing a newer version of java on the build nodes (https://issues.apache.org/jira/browse/SPARK-25953), and once we figure out exactly what version we want i'll go

Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

2018-11-06 Thread Shivaram Venkataraman
Right - I think we should move on with 2.4.0. In terms of what can be done to avoid this error, there are two strategies - Felix had this other thread about JDK 11 that should at least let Spark run on the CRAN instance. In general this strategy isn't foolproof because the JDK version and other

Re: Java 11 support

2018-11-06 Thread DB Tsai
Scala 2.11 is EOL, and only Scala 2.12 will support JDK 11 (https://github.com/scala/scala-dev/issues/559#issuecomment-436160166), so we might need to make Scala 2.12 the default version in Spark 3.0 to move forward. Given

Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread DB Tsai
We made Scala 2.11 the default Scala version in Spark 2.0. Now, the next Spark version will be 3.0, so it's a great time to discuss whether we should make Scala 2.12 the default Scala version in Spark 3.0. Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, it's unlikely to support JDK 11 in Scala

Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

2018-11-06 Thread Felix Cheung
I’d rather not mess with 2.4.0 at this point. Being on CRAN is nice, but users can also install from an Apache mirror. Also, I had attempted and failed to get vignettes not to build; it was non-trivial and I couldn’t get it to work. But I have an idea. As for tests, I don’t know exactly why it is not skipped.

Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread Sean Owen
I think we should make Scala 2.12 the default in Spark 3.0. I would also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping 2.11 support means we'd support Scala 2.11 for years, the lifetime of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or 3.2.0 release, kind

Re: Removing non-deprecated R methods that were deprecated in Python, Scala?

2018-11-06 Thread Reynold Xin
Maybe deprecate and remove in next version? It is bad to just remove a method without deprecation notice. On Tue, Nov 6, 2018 at 5:44 AM Sean Owen wrote: > See https://github.com/apache/spark/pull/22921#discussion_r230568058 > > Methods like toDegrees, toRadians, approxCountDistinct were

Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

2018-11-06 Thread Sean Owen
I think the second option, to skip the tests, is best right now, if the alternative is to have no SparkR release at all! Can we monkey-patch the 2.4.0 release for SparkR in this way, bless it from the PMC, and release that? It's drastic but so is not being able to release, I think. Right? Or is

Re: Test and support only LTS JDK release?

2018-11-06 Thread Marcelo Vanzin
+1, that's always been my view. Although, to be fair, and as Sean mentioned, the jump from jdk8 is probably the harder part. After that it's less likely (hopefully?) that we'll run into issues in non-LTS releases. And even if we don't officially support them, trying to keep up with breaking

Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

2018-11-06 Thread Wenchen Fan
Do you mean we should have a 2.4.0 release without CRAN and then do a 2.4.1 immediately? On Wed, Nov 7, 2018 at 12:34 AM Felix Cheung wrote: > Shivaram and I were discussing. > Actually we worked with them before. Another possible approach is to > remove the vignettes eval and all tests from the

Re: Removing non-deprecated R methods that were deprecated in Python, Scala?

2018-11-06 Thread Sean Owen
Sounds good, remove in 3.1? I can update accordingly. On Tue, Nov 6, 2018, 10:46 AM Reynold Xin Maybe deprecate and remove in next version? It is bad to just remove a > method without deprecation notice. > > On Tue, Nov 6, 2018 at 5:44 AM Sean Owen wrote: > >> See

Re: Removing non-deprecated R methods that were deprecated in Python, Scala?

2018-11-06 Thread Shivaram Venkataraman
Yep. That sounds good to me. On Tue, Nov 6, 2018 at 11:06 AM Sean Owen wrote: > > Sounds good, remove in 3.1? I can update accordingly. > > On Tue, Nov 6, 2018, 10:46 AM Reynold Xin > >> Maybe deprecate and remove in next version? It is bad to just remove a >> method without deprecation notice.

Re: need assistance debugging a strange build failure...

2018-11-06 Thread shane knapp
btw, this is a compilation error in the SBT build that only shows up on the ubuntu workers. On Mon, Nov 5, 2018 at 5:07 PM shane knapp wrote: > the maven build is quite happy: > > https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-maven-hadoop-2.7-ubuntu-testing/ > > i've wiped all

Test and support only LTS JDK release?

2018-11-06 Thread DB Tsai
Given Oracle's new 6-month release model, I feel the only realistic option is to test and support only LTS JDK releases, such as JDK 11 and future LTS releases. I would like to have a discussion on this in the Spark community. Thanks, DB Tsai | Siri Open Source Technologies [not a contribution] | 

Re: Java 11 support

2018-11-06 Thread Sean Owen
I think that Java 9 support basically gets Java 10, 11 support. But the jump from 8 to 9 is unfortunately more breaking than usual because of the total revamping of the internal JDK classes. I think it will be mostly a matter of dependencies needing updates to work. I agree this is probably pretty
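For context, a minimal sketch of the kind of internal-JDK probing this jump tends to require, assuming the well-known pre- and post-JDK-9 locations of the direct-buffer Cleaner class (the object name here is hypothetical; Spark's actual handling lives in its own utility code):

    object JdkCleanerProbe {
      def main(args: Array[String]): Unit = {
        // JDK 9 moved sun.misc.Cleaner to jdk.internal.ref.Cleaner, so code that
        // frees direct buffers must probe for whichever class is present.
        val cleaner =
          try Class.forName("jdk.internal.ref.Cleaner") // JDK 9+ location
          catch {
            case _: ClassNotFoundException =>
              Class.forName("sun.misc.Cleaner") // JDK 8 location
          }
        println(s"Found Cleaner at: ${cleaner.getName}")
      }
    }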

Re: Test and support only LTS JDK release?

2018-11-06 Thread DB Tsai
OpenJDK will follow Oracle's release cycle, https://openjdk.java.net/projects/jdk/ , a strict six-month model. I'm not familiar with other non-Oracle VMs or Redhat's support. DB Tsai | Siri Open Source Technologies [not a contribution] |  Apple, Inc

Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

2018-11-06 Thread Felix Cheung
We have not been able to publish to CRAN for quite some time (since 2.3.0 was archived - the cause is Java 11). I think it’s ok to announce the release of 2.4.0. From: Wenchen Fan Sent: Tuesday, November 6, 2018 8:51 AM To: Felix Cheung Cc: Matei Zaharia; Sean

Java 11 support

2018-11-06 Thread Felix Cheung
Speaking of, can we work to support Java 11? That will fix all the problems below. From: Felix Cheung Sent: Tuesday, November 6, 2018 8:57 AM To: Wenchen Fan Cc: Matei Zaharia; Sean Owen; Spark dev list; Shivaram Venkataraman Subject: Re:

Re: Java 11 support

2018-11-06 Thread Felix Cheung
+1 for Spark 3, definitely. Thanks for the updates. From: Sean Owen Sent: Tuesday, November 6, 2018 9:11 AM To: Felix Cheung Cc: dev Subject: Re: Java 11 support I think that Java 9 support basically gets Java 10, 11 support. But the jump from 8 to 9 is

Re: Test and support only LTS JDK release?

2018-11-06 Thread Reynold Xin
What do OpenJDK and other non-Oracle VMs do? I know there were a lot of discussions from Redhat etc. about support. On Tue, Nov 6, 2018 at 11:24 AM DB Tsai wrote: > Given Oracle's new 6-month release model, I feel the only realistic option > is to test and support only LTS JDK releases, such as JDK 11 and

Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

2018-11-06 Thread Felix Cheung
Shivaram and I were discussing. Actually we worked with them before. Another possible approach is to remove the vignettes eval and all tests from the source package... in the next release. From: Matei Zaharia Sent: Tuesday, November 6, 2018 12:07 AM To: Felix

Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread Dongjoon Hyun
+1 for making Scala 2.12 the default for Spark 3.0. Bests, Dongjoon. On Tue, Nov 6, 2018 at 11:13 AM DB Tsai wrote: > We made Scala 2.11 the default Scala version in Spark 2.0. Now, the next > Spark version will be 3.0, so it's a great time to discuss whether we should > make Scala 2.12 the default

Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread DB Tsai
Supporting only Scala 2.12 in Spark 3 would be ideal. DB Tsai | Siri Open Source Technologies [not a contribution] |  Apple, Inc > On Nov 6, 2018, at 2:55 PM, Felix Cheung wrote: > > So to clarify, only scala 2.12 is supported in Spark 3? > > > From: Ryan Blue > Sent: Tuesday,

Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread Wenchen Fan
We made Scala 2.11 the default in Spark 2.0, then dropped Scala 2.10 in Spark 2.3. Shall we follow that and drop Scala 2.11 at some point in Spark 3.x? On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin wrote: > Have we deprecated Scala 2.11 already in an existing release? > > On Tue, Nov 6, 2018 at

Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread DB Tsai
+1 on dropping Scala 2.11 in Spark 3.0 to simplify the build. As Scala 2.11 will not support Java 11 unless we make a significant investment, if we decide not to drop Scala 2.11 in Spark 3.0, what we can do is have only the Scala 2.12 build support Java 11 while the Scala 2.11 build supports Java 8. But I
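For reference, a minimal build.sbt sketch of the split described here, assuming an sbt cross-build (illustrative only; Spark's real build is more involved, and the version numbers are just examples):

    // Cross-compile against both Scala versions, but only let the 2.12 build
    // target JDK 11; the 2.11 build stays pinned to JDK 8 bytecode.
    crossScalaVersions := Seq("2.11.12", "2.12.7")

    javacOptions ++= (scalaBinaryVersion.value match {
      case "2.12" => Seq("--release", "11")                  // built on JDK 11
      case _      => Seq("-source", "1.8", "-target", "1.8") // stays on JDK 8
    })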

Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread Sean Owen
That's possible here, sure. The issue is: would you exclude Scala 2.13 support in 3.0 for this, if it were otherwise ready to go? I think it's not a hard rule that something has to be deprecated previously to be removed in a major release. The notice is helpful, sure, but there are lots of ways to

Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread Reynold Xin
Have we deprecated Scala 2.11 already in an existing release? On Tue, Nov 6, 2018 at 4:43 PM DB Tsai wrote: > Supporting only Scala 2.12 in Spark 3 would be ideal. > > DB Tsai | Siri Open Source Technologies [not a contribution] |  > Apple, Inc > > > On Nov 6, 2018, at 2:55 PM,

Re: Test and support only LTS JDK release?

2018-11-06 Thread Ryan Blue
+1 for supporting LTS releases. On Tue, Nov 6, 2018 at 11:48 AM Robert Stupp wrote: > +1 on supporting LTS releases. > > VM distributors (RedHat, Azul - to name two) want to provide patches to > LTS versions (i.e. into http://hg.openjdk.java.net/jdk-updates/jdk11u/). > How that will play out in

Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread Felix Cheung
So to clarify, only scala 2.12 is supported in Spark 3? From: Ryan Blue Sent: Tuesday, November 6, 2018 1:24 PM To: d_t...@apple.com Cc: Sean Owen; Spark Dev List; cdelg...@apple.com Subject: Re: Make Scala 2.12 as default Scala version in Spark 3.0 +1 to Scala

Re: Test and support only LTS JDK release?

2018-11-06 Thread Felix Cheung
Is there a list of LTS releases that I can reference? From: Ryan Blue Sent: Tuesday, November 6, 2018 1:28 PM To: sn...@snazy.de Cc: Spark Dev List; cdelg...@apple.com Subject: Re: Test and support only LTS JDK release? +1 for supporting LTS releases. On Tue,

Re: Test and support only LTS JDK release?

2018-11-06 Thread Robert Stupp
+1 on supporting LTS releases. VM distributors (RedHat, Azul - to name two) want to provide patches to LTS versions (i.e. into http://hg.openjdk.java.net/jdk-updates/jdk11u/). How that will play out in reality ... I don't know. Whether Oracle will contribute to that repo for 8 after it's EOL

Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread Ryan Blue
+1 to Scala 2.12 as the default in Spark 3.0. On Tue, Nov 6, 2018 at 11:50 AM DB Tsai wrote: > +1 on dropping Scala 2.11 in Spark 3.0 to simplify the build. > > As Scala 2.11 will not support Java 11 unless we make a significant > investment, if we decide not to drop Scala 2.11 in Spark 3.0,

Re: Test and support only LTS JDK release?

2018-11-06 Thread Marcelo Vanzin
https://www.oracle.com/technetwork/java/javase/eol-135779.html On Tue, Nov 6, 2018 at 2:56 PM Felix Cheung wrote: > > Is there a list of LTS releases that I can reference? > > > > From: Ryan Blue > Sent: Tuesday, November 6, 2018 1:28 PM > To: sn...@snazy.de > Cc:

Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

2018-11-06 Thread Matei Zaharia
Maybe it’s worth contacting the CRAN maintainers to ask for help? Perhaps we aren’t disabling it correctly, or perhaps they can ignore this specific failure. +Shivaram who might have some ideas. Matei > On Nov 5, 2018, at 9:09 PM, Felix Cheung wrote: > > I don’t know what the cause is yet.

Removing non-deprecated R methods that were deprecated in Python, Scala?

2018-11-06 Thread Sean Owen
See https://github.com/apache/spark/pull/22921#discussion_r230568058 Methods like toDegrees, toRadians, approxCountDistinct were 'renamed' in Spark 2.1: deprecated, and replaced with an identical method with different name. However, these weren't actually deprecated in SparkR. Is it an oversight
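For reference, the rename-with-deprecation pattern described above looks roughly like this on the Scala side (a minimal sketch, not Spark's actual source; the wrapper object is hypothetical, while degrees is the Spark 2.1 replacement for toDegrees):

    import org.apache.spark.sql.Column
    import org.apache.spark.sql.functions.degrees

    object RenamedFunctions {
      // The old name survives as a deprecated alias that simply forwards to the
      // new name, giving callers a release cycle to migrate.
      @deprecated("Use degrees", "2.1.0")
      def toDegrees(e: Column): Column = degrees(e)
    }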