Re: Re: Failed to APPEND_FILE, hadoop.hdfs.protocol.AlreadyBeingCreatedException

2017-01-20 Thread Ravindra Pesala
Hi,

Please use "mvn clean -DskipTests -Pspark-1.5 -Dspark.version=1.5.2 -Phadoop-2.7.2 package"

Regards,
Ravindra

On 20 January 2017 at 15:42, manish gupta wrote:
> Can you try compiling with hadoop-2.7.2 version and use it and let us know
> if the issue still …

Re: Re: Failed to APPEND_FILE, hadoop.hdfs.protocol.AlreadyBeingCreatedException

2017-01-20 Thread manish gupta
Can you try compiling with the hadoop-2.7.2 version, use it, and let us know if the issue still persists?

"mvn package -DskipTests -Pspark-1.5.2 -Phadoop-2.7.2 -DskipTests"

Regards,
Manish Gupta

On Fri, Jan 20, 2017 at 1:30 PM, 彭 wrote:
> I build the jar with hadoop2.6, like …

Re: Re: Failed to APPEND_FILE, hadoop.hdfs.protocol.AlreadyBeingCreatedException

2017-01-20 Thread Liang Chen
Hi,

mvn -DskipTests -Pspark-1.5 -Dspark.version=1.5.2 clean package

Please refer to the build doc: https://github.com/apache/incubator-carbondata/tree/master/build

Regards,
Liang

2017-01-20 16:00 GMT+08:00 彭:
> I build the jar with hadoop2.6, like "mvn package -DskipTests …

Re: Failed to APPEND_FILE, hadoop.hdfs.protocol.AlreadyBeingCreatedException

2017-01-19 Thread manish gupta
Hi,

Which version of Hadoop are you using while compiling the CarbonData jar? If you are using hadoop-2.2.0, then please go through the link below, which says that there is a known issue with hadoop-2.2.0 when writing a file in append mode. …
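For context, the operation that fails here is an HDFS append. Below is a minimal Scala sketch of that kind of append call using the plain Hadoop FileSystem API; the NameNode URI and file path are hypothetical, and CarbonData's own writer code is more involved than this.

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

object AppendCheck {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()
    // Hypothetical NameNode URI; point this at your cluster's fs.defaultFS.
    conf.set("fs.defaultFS", "hdfs://localhost:9000")

    val fs = FileSystem.get(conf)
    val path = new Path("/tmp/append-check/sample.log") // hypothetical path

    // Create the file on the first run.
    if (!fs.exists(path)) {
      val out = fs.create(path)
      out.writeBytes("first line\n")
      out.close()
    }

    // Re-open the file in append mode. With a hadoop-2.2.0 client this is the
    // kind of call that can fail with AlreadyBeingCreatedException when the
    // previous lease on the file was not released cleanly.
    val appendStream = fs.append(path)
    appendStream.writeBytes("appended line\n")
    appendStream.close()

    fs.close()
  }
}

Running this twice against the same path exercises the append code path that the exception in this thread comes from.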

Re: Failed to APPEND_FILE, hadoop.hdfs.protocol.AlreadyBeingCreatedException

2017-01-19 Thread ffpeng90
I have met the same problem. I load data three times, and this exception is always thrown on the third load. I use the branch-1.0 version from git. Table:

cc.sql(s"create table if not exists flightdb15(ID Int, date string, country string, name string, phonetype string, serialname string, …
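For reference, here is a hypothetical minimal reproduction of the load sequence described above, assuming a CarbonContext named cc has already been created (as in the snippet) and a sample CSV sits at a placeholder HDFS path; the schema is cut short the same way it is in the mail, and the LOAD DATA statement is illustrative rather than copied from the original report.

// Assumes: val cc = new CarbonContext(sc, storePath) already exists.
// The HDFS paths below are placeholders.
cc.sql(
  """create table if not exists flightdb15(
    |  ID Int,
    |  date string,
    |  country string,
    |  name string,
    |  phonetype string,
    |  serialname string)
    |stored by 'carbondata'""".stripMargin)

// Load the same file three times; in the report above the
// AlreadyBeingCreatedException shows up on the third load.
for (i <- 1 to 3) {
  println(s"load #$i")
  cc.sql("load data inpath 'hdfs://localhost:9000/tmp/sample.csv' into table flightdb15")
}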