Should a new job be set up under Spark-Master-Maven-with-YARN for Hadoop 2.6.x?
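For reference, such a job would presumably build against the Hadoop 2.6 line with the YARN profile enabled. A minimal sketch of the Maven invocation, assuming the hadoop-2.6 and yarn profiles described in the Spark 1.x "Building Spark" docs (the exact Jenkins job configuration here is an assumption):

    # Sketch: build Spark against Hadoop 2.6.x with YARN support.
    # Profile names are from the Spark 1.x build docs; the Jenkins job
    # wiring itself is hypothetical.
    build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package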
Cheers

On Thu, Nov 19, 2015 at 5:16 PM, 张志强(旺轩) <zzq98...@alibaba-inc.com> wrote:
> Agreed, +1
>
> ------------------------------------------------------------------
> From: Reynold Xin <r...@databricks.com>
> Date: November 20, 2015, 06:14:44
> To: dev@spark.apache.org <dev@spark.apache.org>; Sean Owen <sro...@gmail.com>;
> Thomas Graves <tgra...@apache.org>
> Subject: Dropping support for earlier Hadoop versions in Spark 2.0?
>
> I proposed dropping support for Hadoop 1.x in the Spark 2.0 email, and I
> think everybody is for that.
>
> https://issues.apache.org/jira/browse/SPARK-11807
>
> Sean suggested also dropping support for Hadoop 2.2, 2.3, and 2.4. That is
> to say, keep only Hadoop 2.6 and greater.
>
> What are the community's thoughts on that?