Yes, the Spark download page does mention that 2.2.1 is "Pre-Built for Apache Hadoop 2.7 and later", but my confusion comes from the release dates: Spark 2.2.1 was released on 1st Dec, while the stable Hadoop 3.0.0 was released on 13th Dec. And in reply to my similar question on stackoverflow.com <https://stackoverflow.com/questions/47920005/how-is-hadoop-3-0-0-s-compatibility-with-older-versions-of-hive-pig-sqoop-and>, Jacek Laskowski <https://stackoverflow.com/users/1305344/jacek-laskowski> said that Spark 2.2.1 doesn't support Hadoop 3. So I am just looking for more clarity on this doubt before moving on to upgrades.
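For what it's worth, one way to see which Hadoop line a pre-built Spark tarball actually targets is to look at the Hadoop client jars it bundles. This is just a sketch, not an authoritative check: the path below is hypothetical, so point SPARK_HOME at your own unpacked distribution.

```shell
#!/bin/sh
# Sketch (assumption): SPARK_HOME points at an unpacked pre-built Spark
# distribution; /opt/spark-2.2.1-bin-hadoop2.7 is a hypothetical default.
SPARK_HOME="${SPARK_HOME:-/opt/spark-2.2.1-bin-hadoop2.7}"

# Pre-built Spark ships its Hadoop client jars under $SPARK_HOME/jars.
# Their version suffixes (e.g. hadoop-common-2.7.3.jar) show which Hadoop
# line this build was compiled against, regardless of download-page wording.
bundled=$(ls "$SPARK_HOME/jars" 2>/dev/null | grep '^hadoop-' \
  || echo "no Spark distribution found at $SPARK_HOME")
echo "$bundled"
```

Note that this only tells you what the build links against (2.7.x client libraries), not whether those clients work against a Hadoop 3.0.0 cluster.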
Thanks all for help.
Akshay.

On Mon, Jan 8, 2018 at 8:47 AM, Saisai Shao <sai.sai.s...@gmail.com> wrote:

> AFAIK, there's no large scale test for Hadoop 3.0 in the community. So it
> is not clear whether it is supported or not (or has some issues). I think
> in the download page "Pre-Built for Apache Hadoop 2.7 and later" mostly
> means that it supports Hadoop 2.7+ (2.8...), but not 3.0 (IIUC).
>
> Thanks
> Jerry
>
> 2018-01-08 4:50 GMT+08:00 Raj Adyanthaya <raj...@gmail.com>:
>
>> Hi Akshay
>>
>> On the Spark Download page when you select Spark 2.2.1 it gives you an
>> option to select package type. In that, there is an option to select
>> "Pre-Built for Apache Hadoop 2.7 and later". I am assuming it means that
>> it does support Hadoop 3.0.
>>
>> http://spark.apache.org/downloads.html
>>
>> Thanks,
>> Raj A.
>>
>> On Sat, Jan 6, 2018 at 8:23 PM, akshay naidu <akshaynaid...@gmail.com>
>> wrote:
>>
>>> Hello Users,
>>> I need to know whether we can run the latest Spark on the latest Hadoop
>>> version, i.e., spark-2.2.1 released on 1st Dec and hadoop-3.0.0 released
>>> on 13th Dec.
>>> Thanks.