Hi,

Are you getting this exception for every load, or only occasionally? It usually occurs when data is loaded concurrently into the same table. Please make sure that no other instance of Carbon is running and that no other data load on the same table is in progress. Also check whether any lock file exists under the system temp folder at <databasename>/<tablename>/lockfile; if it exists, please delete it.
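As a rough sketch, the lock-file check above could look like the following shell snippet. The directory layout (<tmpdir>/<databasename>/<tablename>/lockfile) follows the description above, and the database/table names here are placeholders you would replace with your own; only delete the file after confirming no load is actually running.

```shell
# Hedged sketch: locate and remove a stale CarbonData lock file.
# DB and TABLE are hypothetical names -- substitute your own.
TMP_DIR="${TMPDIR:-/tmp}"
DB="default"
TABLE="my_table"
LOCK_PATH="$TMP_DIR/$DB/$TABLE/lockfile"

if [ -e "$LOCK_PATH" ]; then
    echo "Stale lock found: $LOCK_PATH"
    # Only remove after verifying no other load is in progress.
    rm -f "$LOCK_PATH"
else
    echo "No lock file at $LOCK_PATH"
fi
```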
Thanks & Regards,
Ravi

On Mon, 29 Aug 2016 1:27 pm Zen Wellon, <[email protected]> wrote:

> Hi guys,
> When I tried to load some data into carbondata table with carbon 0.1.0, I
> met a problem below.
>
> WARN 29-08 15:40:17,535 - Lost task 10.0 in stage 2.1 (TID 365,
> amlera-30-6.gtj): java.lang.RuntimeException: Dictionary file ***(sensitive
> column) is locked for updation. Please try after some time
>         at scala.sys.package$.error(package.scala:27)
>         at org.apache.carbondata.spark.rdd.CarbonGlobalDictionaryGenerateRDD$$anon$1.<init>(CarbonGlobalDictionaryRDD.scala:354)
>         at org.apache.carbondata.spark.rdd.CarbonGlobalDictionaryGenerateRDD.compute(CarbonGlobalDictionaryRDD.scala:294)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
>         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
>         at org.apache.spark.scheduler.Task.run(Task.scala:89)
>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:745)
>
> --
>
> Best regards,
> William Zen
