[
https://issues.apache.org/jira/browse/CARBONDATA-1290?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16082104#comment-16082104
]
sehriff commented on CARBONDATA-1290:
-------------------------------------
max() works correctly because the column id is of string type, but the delete problem is valid.
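The point above can be illustrated outside Spark: on a string column, max() uses lexicographic ordering, which compares character by character, so "9" sorts after "1" regardless of numeric value. A minimal sketch with hypothetical id values (the cast-based fix shown is an assumption, not from the issue):

```scala
// Why max(id) returns "9999999" when id is a STRING column:
// string comparison is lexicographic, so "9..." > "1...".
val ids = Seq("19999999", "9999999", "100")

// Lexicographic max: "9999999" wins because '9' > '1'.
val stringMax = ids.max

// Casting to a numeric type gives the intuitive answer: 19999999.
val numericMax = ids.map(_.toLong).max
```

If numeric semantics are wanted on a string column, a cast in the query (e.g. `select max(cast(id as bigint)) from qqdata2.fullappend`) should give the expected result.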
> [branch-1.1]-max() and delete problem
> -------------------------------------
>
> Key: CARBONDATA-1290
> URL: https://issues.apache.org/jira/browse/CARBONDATA-1290
> Project: CarbonData
> Issue Type: Bug
> Reporter: sehriff
>
> 1. The max function does not return the right result:
> scala> cc.sql("select * from qqdata2.fullappend where id=19999999").show(false)
> +--------+------------------------+----------+-----------------+------+----+----------------------+----+
> |id      |qqnum                   |nick      |age              |gender|auth|qunnum                |mvcc|
> +--------+------------------------+----------+-----------------+------+----+----------------------+----+
> |19999999|19999999aaaaaaaa19999999|2009-05-27|19999999c19999999|1     |1   |19999999dddddd19999999|1   |
> +--------+------------------------+----------+-----------------+------+----+----------------------+----+
> scala> cc.sql("select max(id) from qqdata2.fullappend ").show(false)
> +-------+
> |max(id)|
> +-------+
> |9999999|
> +-------+
> 2. Delete fails with an ArrayIndexOutOfBoundsException:
> scala> cc.sql("delete from qqdata2.fullappend where id>1 and id<100000").show
> 17/07/11 17:32:33 AUDIT ProjectForDeleteCommand:[Thread-1] Delete data request has been received for qqdata2.fullappend.
> [Stage 21:> (0 + 2) / 2]17/07/11 17:32:52 WARN TaskSetManager: Lost task 1.0 in stage 21.0 (TID 40, executor 2): java.lang.ArrayIndexOutOfBoundsException: 1
> at org.apache.carbondata.core.mutate.CarbonUpdateUtil.getRequiredFieldFromTID(CarbonUpdateUtil.java:67)
> at org.apache.carbondata.core.mutate.CarbonUpdateUtil.getSegmentWithBlockFromTID(CarbonUpdateUtil.java:76)
> at org.apache.spark.sql.execution.command.deleteExecution$$anonfun$4.apply(IUDCommands.scala:555)
> at org.apache.spark.sql.execution.command.deleteExecution$$anonfun$4.apply(IUDCommands.scala:552)
> at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
> at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:150)
> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
> at org.apache.spark.scheduler.Task.run(Task.scala:99)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)