Just run: mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.2 -DskipTests clean package
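To answer the MAVEN_OPTS question from the thread: it is exported in the shell before invoking mvn, not in spark-env.sh or hadoop-env.sh. A minimal sketch, using the heap settings suggested in the linked building guide (the exact values are the guide's recommendation for Java 7; adjust as needed):

```shell
# MAVEN_OPTS is a plain environment variable read by the mvn launcher,
# so export it in the shell session that runs the build.
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

# Then run the build from the top-level Spark source directory
# (where the root pom.xml lives), not from spark/yarn:
#   mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package
echo "$MAVEN_OPTS"
```

Note the build is launched from the Spark source root so Maven picks up the top-level pom.xml and builds all modules, including the YARN one.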
Thanks
Best Regards

On Wed, Mar 25, 2015 at 3:08 PM, sandeep vura <sandeepv...@gmail.com> wrote:

> Where do I export MAVEN_OPTS — in spark-env.sh or hadoop-env.sh?
>
> I am running the below command in the spark/yarn directory, where the pom.xml file is available:
>
> mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package
>
> Please correct me if I am wrong.
>
> On Wed, Mar 25, 2015 at 12:55 PM, Saisai Shao <sai.sai.s...@gmail.com> wrote:
>
>> Looks like you have to build Spark against the matching Hadoop version, otherwise you will hit the exception you mentioned. You can follow this doc:
>> http://spark.apache.org/docs/latest/building-spark.html
>>
>> 2015-03-25 15:22 GMT+08:00 sandeep vura <sandeepv...@gmail.com>:
>>
>>> Hi Sparkers,
>>>
>>> I am trying to load data in Spark with the following command:
>>>
>>> sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt' INTO TABLE src");
>>>
>>> Getting the exception below:
>>>
>>> Server IPC version 9 cannot communicate with client version 4
>>>
>>> Note: I am using Hadoop 2.2, Spark 1.2, and Hive 0.13.
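For readers hitting the same error: "Server IPC version 9 cannot communicate with client version 4" means the server speaks the Hadoop 2.x IPC protocol (version 9) while the client was linked against Hadoop 1.x libraries (version 4), i.e. the Spark build bundled Hadoop 1 client jars. A quick way to check is the assembly jar name, which encodes the Hadoop version Spark was built for. A sketch (the jar name and path below are illustrative assumptions, not output from this thread):

```shell
# On Spark 1.x the built assembly lives under assembly/target/scala-2.10/
# and is named like spark-assembly-<spark-ver>-hadoop<hadoop-ver>.jar, e.g.:
#   ls assembly/target/scala-2.10/ | grep spark-assembly
jar_name="spark-assembly-1.2.0-hadoop1.0.4.jar"   # hypothetical mismatched build

# A hadoop1 jar explains the IPC 9-vs-4 error against a Hadoop 2.2 cluster.
case "$jar_name" in
  *hadoop1*) verdict="built against Hadoop 1.x - rebuild with -Phadoop-2.2 -Dhadoop.version=2.2.0" ;;
  *hadoop2*) verdict="built against Hadoop 2.x - version mismatch is elsewhere" ;;
  *)         verdict="cannot tell from jar name" ;;
esac
echo "$verdict"
```

If the jar name shows hadoop1, rebuilding Spark with the Hadoop 2.2 profile, as suggested above, resolves the protocol mismatch.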