Re: Block B-tree loading failed add debug information
dev, when is a patch expected to be released? We can test it.

From: yixu2001
Sent: 2017-10-30 18:11
To: dev
Subject: Block B-tree loading failed add debug information
Block B-tree loading failed add debug information

dev

Environment: Spark 2.1.1, CarbonData 1.1.1, Hadoop 2.7.2

I added debug information for the "Block B-tree loading failed" error. Why does the CarbonUtil.calculateMetaSize calculation result in getBlockLength=0 and getBlockOffset=8301549?

Caused by: org.apache.carbondata.core.datastore.exception.IndexBuilderException: Invalid carbon data file: hdfs://ns1/user/e_carbon/public/carbon.store/e_carbon/prod_inst_his1023c/Fact/Part0/Segment_1.1/part-0-172_batchno0-0-1508833127408.carbondata :getBlockLength=0 getBlockOffset=8301549 requiredMetaSize=-8301549 isV1=false getVersion=ColumnarFormatV3

1. Debug information:

scala> cc.sql("select prod_inst_id,count(*) from e_carbon.prod_inst_his1023c group by prod_inst_id having count(*)>1").show
[Stage 0:=>(157 + 50) / 283]17/10/30 10:39:24 WARN scheduler.TaskSetManager: Lost task 252.0 in stage 0.0 (TID 201, HDD010, executor 22): org.apache.carbondata.core.datastore.exception.IndexBuilderException: Block B-tree loading failed
    at org.apache.carbondata.core.datastore.BlockIndexStore.fillLoadedBlocks(BlockIndexStore.java:264)
    at org.apache.carbondata.core.datastore.BlockIndexStore.getAll(BlockIndexStore.java:189)
    at org.apache.carbondata.core.scan.executor.impl.AbstractQueryExecutor.initQuery(AbstractQueryExecutor.java:131)
    at org.apache.carbondata.core.scan.executor.impl.AbstractQueryExecutor.getBlockExecutionInfos(AbstractQueryExecutor.java:186)
    at org.apache.carbondata.core.scan.executor.impl.VectorDetailQueryExecutor.execute(VectorDetailQueryExecutor.java:36)
    at org.apache.carbondata.spark.vectorreader.VectorizedCarbonRecordReader.initialize(VectorizedCarbonRecordReader.java:112)
    at org.apache.carbondata.spark.rdd.CarbonScanRDD.compute(CarbonScanRDD.scala:204)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
    at org.apache.spark.scheduler.Task.run(Task.scala:99)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.ExecutionException: org.apache.carbondata.core.datastore.exception.IndexBuilderException: Invalid carbon data file: hdfs://ns1/user/e_carbon/public/carbon.store/e_carbon/prod_inst_his1023c/Fact/Part0/Segment_1.1/part-0-172_batchno0-0-1508833127408.carbondata getBlockLength=0 getBlockOffset=8301549 requiredMetaSize=-8301549 getVersion=ColumnarFormatV3
    at java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.util.concurrent.FutureTask.get(FutureTask.java:192)
    at org.apache.carbondata.core.datastore.BlockIndexStore.fillLoadedBlocks(BlockIndexStore.java:254)
    ... 21 more
Caused by: org.apache.carbondata.core.datastore.exception.IndexBuilderException: Invalid carbon data file: hdfs://ns1/user/e_carbon/public/carbon.store/e_carbon/prod_inst_his1023c/Fact/Part0/Segment_1.1/part-0-172_batchno0-0-1508833127408.carbondata=lianch:getBlockLength=0 getBlockOffset=8301549 requiredMetaSize=-8301549 isV1=false getVersion=ColumnarFormatV3
    at org.apache.carbondata.core.datastore.AbstractBlockIndexStoreCache.checkAndLoadTableBlocks(AbstractBlockIndexStoreCache.java:116)
    at org.apache.carbondata.core.datastore.BlockIndexStore.loadBlock(BlockIndexStore.java:304)
    at org.apache.carbondata.core.datastore.BlockIndexStore.get(BlockIndexStore.java:109)
    at org.apache.carbondata.core.datastore.BlockIndexStore$BlockLoaderThread.call(BlockIndexStore.java:294)
    at org.apache.carbondata.core.datastore.BlockIndexStore$BlockLoaderThread.call(BlockIndexStore.java:284)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    ... 3 more
[Stage 0:==> (223 + 50) / 283]17/10/30 10:39:26 ERROR scheduler.TaskSetManager: Task 252 in stage 0.0 failed 10 times; aborting job
17/10/30 10:39:26 WARN scheduler.TaskSetManager: Lost task 61.0 in stage 0.0 (TID 184, HDD012, executor 7): TaskKilled (killed intentionally)
17/10/30 10:39:26 WARN scheduler.TaskSetManager: Lost task 71.0 in stage 0.0 (TID 212, HDD008, executor 18): TaskKilled (killed intentionally)
17/10/30 10:39:26 WARN scheduler.TaskSetManager: Lost task 27.0 in stage 0.0 (TID 83, HDD007, executor 8): TaskKilled (killed intentionally)
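For what it is worth, the numbers in the exception are self-consistent: requiredMetaSize appears to be getBlockLength minus getBlockOffset, so a block length of 0 recorded in the index forces a negative required meta size, and the file is rejected as invalid. A minimal sketch of that arithmetic follows, with hypothetical method and class names; this is not the actual CarbonUtil code, only an illustration of why length 0 at offset 8301549 yields -8301549.

```java
// Sketch of the consistency check implied by the error message.
// Hypothetical names (BlockMetaCheck, requiredMetaSize, isValidBlock);
// not the actual CarbonData implementation.
public class BlockMetaCheck {

    // Size of the metadata region between the block offset and the block end.
    static long requiredMetaSize(long blockLength, long blockOffset) {
        return blockLength - blockOffset;
    }

    // A block entry is only readable if the meta region has positive size.
    static boolean isValidBlock(long blockLength, long blockOffset) {
        return requiredMetaSize(blockLength, blockOffset) > 0;
    }

    public static void main(String[] args) {
        // Values from the failing segment in the log above.
        long len = 0L, off = 8301549L;
        System.out.println("requiredMetaSize=" + requiredMetaSize(len, off)); // -8301549
        System.out.println("valid=" + isValidBlock(len, off));                // false
    }
}
```

So the interesting question is not the subtraction itself but why the index records getBlockLength=0 for this block in the first place, e.g. a corrupt or incompletely written index entry for that segment.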
Re: Block B-tree loading failed
Hi, it looks like the path is invalid. Can you provide the full script showing how you created the CarbonSession?

Caused by: org.apache.carbondata.core.datastore.exception.IndexBuilderException: Invalid carbon data file: hdfs://ns1/user/e_carbon/public/carbon.store/e_carbon/prod_inst_cold/Fact/Part0/Segment_0/part-0-30_batchno0-0-1505272524271.carbondata

--
Sent from: http://apache-carbondata-dev-mailing-list-archive.1130556.n5.nabble.com/
Block B-tree loading failed
dev

Environment: Spark 2.1.1, CarbonData 1.1.1

scala> cc.sql("select area_code,count(*) from e_carbon.prod_inst_cold group by area_code").show;
[Stage 0:> (0 + 18) / 243]17/09/13 17:48:33 WARN scheduler.TaskSetManager: Lost task 8.0 in stage 0.0 (TID 17, HDD008, executor 3): org.apache.carbondata.core.datastore.exception.IndexBuilderException: Block B-tree loading failed
    at org.apache.carbondata.core.datastore.BlockIndexStore.fillLoadedBlocks(BlockIndexStore.java:264)
    at org.apache.carbondata.core.datastore.BlockIndexStore.getAll(BlockIndexStore.java:189)
    at org.apache.carbondata.core.scan.executor.impl.AbstractQueryExecutor.initQuery(AbstractQueryExecutor.java:131)
    at org.apache.carbondata.core.scan.executor.impl.AbstractQueryExecutor.getBlockExecutionInfos(AbstractQueryExecutor.java:186)
    at org.apache.carbondata.core.scan.executor.impl.VectorDetailQueryExecutor.execute(VectorDetailQueryExecutor.java:36)
    at org.apache.carbondata.spark.vectorreader.VectorizedCarbonRecordReader.initialize(VectorizedCarbonRecordReader.java:112)
    at org.apache.carbondata.spark.rdd.CarbonScanRDD.compute(CarbonScanRDD.scala:204)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoi