Re: Re: Getting [Problem in loading segment blocks] error after doing multi update operations

2018-03-30 Thread BabuLal
Hi yixu2001, can you please verify your issue with PR https://github.com/apache/carbondata/pull/2097? The PR is raised against branch-1.3 because you are using CarbonData 1.3. Let me know if the issue still exists. Thanks, Babu

Re: Re: Getting [Problem in loading segment blocks] error after doing multi update operations

2018-03-23 Thread BabuLal
Hi, the issue is fixed and a PR is raised.
1. PR: https://github.com/apache/carbondata/pull/2097
2. The following situations are handled in the PR (a rough sketch of both follows below):
   a. Skip 0-byte delete delta files.
   b. On OutputStream close/flush, if any error is thrown from HDFS (space quota, no lease, etc.) then the exception was not thrown to
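This is not the actual patch code, only a minimal Scala sketch of the two behaviours described above, assuming hypothetical helper names (readableDeleteDeltas, writeDeleteDelta) and plain Hadoop FileSystem APIs rather than CarbonData's internal writers:

import org.apache.hadoop.fs.{FileSystem, Path}

object DeleteDeltaGuards {

  // (a) Ignore empty (0-byte) delete delta files instead of trying to decode them.
  def readableDeleteDeltas(fs: FileSystem, deltaPaths: Seq[Path]): Seq[Path] =
    deltaPaths.filter(p => fs.getFileStatus(p).getLen > 0)

  // (b) Let HDFS errors (space quota, lease loss, ...) raised on flush/close
  //     reach the caller instead of being swallowed, so a failed write is not
  //     treated as a successfully written delete delta.
  def writeDeleteDelta(fs: FileSystem, target: Path, bytes: Array[Byte]): Unit = {
    val out = fs.create(target, true)   // overwrite any partial file
    try {
      out.write(bytes)
      out.hflush()                      // an IOException here must propagate
    } finally {
      out.close()                       // close failures must propagate too
    }
  }
}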

Re: Getting [Problem in loading segment blocks] error after doing multi update operations

2018-03-22 Thread BabuLal
Hi all, I am able to reproduce the same exception in my cluster (the trace is listed below).
--
scala> carbon.sql("select count(*) from public.c_compact4").show
2018-03-22 20:40:33,105 | WARN | main | main spark.sql.sources.options.keys expected, but read nothing |
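For reference, a compact way to re-run the failing check from the spark-shell and to look for empty delete delta files. The table directory (store path + database + table) and the .deletedelta suffix below are assumptions about the on-disk layout, not taken from the trace:

// Re-run the query that exercises the segment-loading path.
carbon.sql("select count(*) from public.c_compact4").show()

// Walk the table directory and print file sizes, to spot any 0-byte
// delete delta left behind by a failed update.
import org.apache.hadoop.fs.{FileSystem, Path}
val fs = FileSystem.get(carbon.sparkContext.hadoopConfiguration)
val it = fs.listFiles(new Path("hdfs://ns1/user/ip_crm/public/c_compact4"), true) // recursive
while (it.hasNext) {
  val f = it.next()
  if (f.getPath.getName.endsWith(".deletedelta"))
    println(s"${f.getPath} -> ${f.getLen} bytes")
}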

Re: Getting [Problem in loading segment blocks] error after doing multi update operations

2018-03-20 Thread BabuLal
Hi yixu2001, we have tried the same code given in the mail below but are not able to reproduce the issue. Please find some analysis points: 1. As per your Exception

Re: Getting [Problem in loading segment blocks] error after doing multi update operations

2018-03-20 Thread Liang Chen
Hi, thanks for your feedback. Let me first reproduce this issue and check the details. Regards, Liang

yixu2001 wrote
> I'm using carbondata1.3+spark2.1.1+hadoop2.7.1 to do multi update
> operations
> here is the replay step:
>
> import org.apache.spark.sql.SparkSession
> import

Re: Re: Getting [Problem in loading segment blocks] error after doing multi update operations

2018-02-26 Thread yixu2001
eout=1s" \ --conf "spark.yarn.executor.memoryOverhead=2048" \ --conf "spark.yarn.driver.memoryOverhead=1024" \ --conf "spark.speculation=true" \ --conf "spark.sql.warehouse.dir=/apps/hive/warehouse" \ --conf "spark.rpc.askTimeout=300" \ --co

Re: Getting [Problem in loading segment blocks] error after doing multi update operations

2018-02-26 Thread sounak
Hi, I tried to reproduce the issue but it runs fine. Are you running this script in a cluster, and have you set any special configuration in carbon.properties? The script ran almost 200 times but no problem was observed. On Sun, Feb 25, 2018 at 1:59 PM, 杨义 wrote: >

Getting [Problem in loading segment blocks] error after doing multi update operations

2018-02-25 Thread 杨义
I'm using carbondata1.3 + spark2.1.1 + hadoop2.7.1 to do multi update operations. Here are the replay steps:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.CarbonSession._
val cc = SparkSession.builder().config(sc.getConf).getOrCreateCarbonSession("hdfs://ns1/user/ip_crm")
//
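The rest of the original script is cut off in this preview, so the following is only a rough reconstruction of the kind of multi-update loop described: the session creation is copied from above, while the table schema, row values, and update count are placeholders rather than the reporter's actual script.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.CarbonSession._

val cc = SparkSession.builder().config(sc.getConf)
  .getOrCreateCarbonSession("hdfs://ns1/user/ip_crm")

// Placeholder table; the real script's DDL is not visible in the preview.
cc.sql("create table if not exists public.c_compact4 (id string, c1 string) stored by 'carbondata'")
cc.sql("insert into public.c_compact4 select '1', 'a'")

// Repeated updates against the same rows, i.e. the multi-update pattern
// that eventually led to the 'Problem in loading segment blocks' error.
(1 to 50).foreach { i =>
  cc.sql(s"update public.c_compact4 set (c1) = ('v$i') where id = '1'")
}

cc.sql("select count(*) from public.c_compact4").show()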