Re: Dropping support for earlier Hadoop versions in Spark 2.0?

2015-11-21 Thread Steve Loughran
> On 20 Nov 2015, at 21:39, Reynold Xin wrote: OK I'm not exactly asking for a vote here :) I don't think we should look at it from only a maintenance point of view -- because in that case the answer is clearly supporting as few versions as possible (or just rm -rf spark source code ...

Re: Dropping support for earlier Hadoop versions in Spark 2.0?

2015-11-21 Thread Sean Owen
On Fri, Nov 20, 2015 at 10:39 PM, Reynold Xin wrote: > I don't think we should look at it from only a maintenance point of view -- because in that case the answer is clearly supporting as few versions as possible (or just rm -rf spark source code and call it a day). It is a tradeoff between the ...

Re: Dropping support for earlier Hadoop versions in Spark 2.0?

2015-11-20 Thread Chester Chen
For #1-3, the answer is likely no. Recently we upgraded to Spark 1.5.1 with CDH5.3, CDH5.4, HDP2.2 and others. We were using the CDH5.3 client to talk to CDH5.4, to see whether we could support many different Hadoop cluster versions without changing the build. This was OK for yarn-cluster ...
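
As a side note for readers, a minimal Scala sketch of this kind of runtime check -- logging which Hadoop client version an application was actually launched with, useful when mixing client and cluster versions as described above -- might look like the following. It assumes hadoop-common is on the classpath and is illustrative only, not code from the thread:

    // Print the Hadoop version the client libraries were built from, so it
    // can be compared against the version running on the cluster.
    import org.apache.hadoop.util.VersionInfo

    object HadoopClientVersionCheck {
      def main(args: Array[String]): Unit = {
        println(s"Hadoop client version: ${VersionInfo.getVersion}")
        println(s"Built from revision:   ${VersionInfo.getRevision}")
      }
    }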

Re: Dropping support for earlier Hadoop versions in Spark 2.0?

2015-11-20 Thread Sandy Ryza
To answer your fourth question from Cloudera's perspective: we would never support a customer running Spark 2.0 on a Hadoop version < 2.6. -Sandy. On Fri, Nov 20, 2015 at 1:39 PM, Reynold Xin wrote: > OK I'm not exactly asking for a vote here :) I don't think we should look at it from only a ...

Re: Dropping support for earlier Hadoop versions in Spark 2.0?

2015-11-20 Thread Reynold Xin
OK I'm not exactly asking for a vote here :) I don't think we should look at it from only a maintenance point of view -- because in that case the answer is clearly supporting as few versions as possible (or just rm -rf spark source code and call it a day). It is a tradeoff between the number of users ...

Re: Dropping support for earlier Hadoop versions in Spark 2.0?

2015-11-20 Thread Steve Loughran
On 20 Nov 2015, at 14:28, ches...@alpinenow.com wrote: Assuming we have 1.6 and 1.7 releases, Spark 2.0 is about 9 months away. Customers will need to upgrade their Hadoop clusters to Apache 2.6 or later to leverage Spark 2.0 within a year. I think this is possible ...

Re: Dropping support for earlier Hadoop versions in Spark 2.0?

2015-11-20 Thread chester
Assuming we have 1.6 and 1.7 releases, Spark 2.0 is about 9 months away. Customers will need to upgrade their Hadoop clusters to Apache 2.6 or later to leverage Spark 2.0 within a year. I think this is possible, as the latest CDH 5.x and HDP 2.x releases are both on Apache 2.6.0 already. Company ...

Re: Dropping support for earlier Hadoop versions in Spark 2.0?

2015-11-20 Thread Steve Loughran
On 19 Nov 2015, at 22:14, Reynold Xin <r...@databricks.com> wrote: I proposed dropping support for Hadoop 1.x in the Spark 2.0 email, and I think everybody is for that. https://issues.apache.org/jira/browse/SPARK-11807 Sean suggested also dropping support for Hadoop 2.2, 2.3, and 2.4. ...

Re: Dropping support for earlier Hadoop versions in Spark 2.0?

2015-11-20 Thread Saisai Shao
+1. Hadoop 2.6 would be a good choice, with many features added (like support for long-running services and label-based scheduling). Currently there's a lot of reflection code to support multiple versions of YARN, so upgrading to a newer version will really ease the pain :). Thanks, Saisai. On Fri, Nov 20 ...
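
To illustrate the kind of reflection-based guard Saisai is describing, a minimal Scala sketch might look like the following: it invokes a YARN setter only if the Hadoop version on the classpath provides it. The specific method used here (setNodeLabelExpression on ResourceRequest, which appeared around Hadoop 2.6) is an assumed example for illustration, not a quote of Spark's actual YARN code:

    import org.apache.hadoop.yarn.api.records.ResourceRequest

    // Call a newer YARN API via reflection so the same build still runs
    // against older clusters where the method does not exist.
    def trySetNodeLabel(request: ResourceRequest, expression: String): Unit = {
      try {
        val setter = request.getClass.getMethod("setNodeLabelExpression", classOf[String])
        setter.invoke(request, expression)
      } catch {
        case _: NoSuchMethodException =>
          // Older YARN on the classpath: quietly skip the optional feature.
      }
    }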

Re: Dropping support for earlier Hadoop versions in Spark 2.0?

2015-11-19 Thread Jean-Baptiste Onofré
+1. Regards, JB. On 11/19/2015 11:14 PM, Reynold Xin wrote: I proposed dropping support for Hadoop 1.x in the Spark 2.0 email, and I think everybody is for that. https://issues.apache.org/jira/browse/SPARK-11807 Sean suggested also dropping support for Hadoop 2.2, 2.3, and 2.4. That is to say, keep ...

Re: Dropping support for earlier Hadoop versions in Spark 2.0?

2015-11-19 Thread Henri Dubois-Ferriere
+1. On 19 November 2015 at 14:14, Reynold Xin wrote: > I proposed dropping support for Hadoop 1.x in the Spark 2.0 email, and I think everybody is for that. https://issues.apache.org/jira/browse/SPARK-11807 Sean suggested also dropping support for Hadoop 2.2, 2.3, and 2.4. That is to ...

Re: Dropping support for earlier Hadoop versions in Spark 2.0?

2015-11-19 Thread Ted Yu
... To: dev@spark.apache.org; Sean Owen; Thomas Graves. Subject: Dropping support for earlier Hadoop versions in Spark 2.0? > I proposed dropping support for Hadoop 1.x in the Spark 2.0 email, and I think everybody is for that. > https://issues.apache.org/jira/browse/SPARK-11807 ...

Re: Dropping support for earlier Hadoop versions in Spark 2.0?

2015-11-19 Thread 张志强(旺轩)
I agree, +1. -- From: Reynold Xin. Date: 2015-11-20 06:14:44. To: dev@spark.apache.org; Sean Owen; Thomas Graves. Subject: Dropping support for earlier Hadoop versions in Spark 2.0? I proposed dropping support for Hadoop 1.x in the Spark 2.0 email ...

Dropping support for earlier Hadoop versions in Spark 2.0?

2015-11-19 Thread Reynold Xin
I proposed dropping support for Hadoop 1.x in the Spark 2.0 email, and I think everybody is for that. https://issues.apache.org/jira/browse/SPARK-11807 Sean suggested also dropping support for Hadoop 2.2, 2.3, and 2.4. That is to say, keep only Hadoop 2.6 and greater. What are the community's thoughts ...