[ https://issues.apache.org/jira/browse/CARBONDATA-1141?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16073607#comment-16073607 ]

Jatin commented on CARBONDATA-1141:
-----------------------------------

I tried the same scenario with the latest code, but I was not able to reproduce 
it. Please provide more details.

> Data load is partially successful  but delete error
> ---------------------------------------------------
>
>                 Key: CARBONDATA-1141
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1141
>             Project: CarbonData
>          Issue Type: Bug
>          Components: spark-integration, sql
>    Affects Versions: 1.2.0
>         Environment: spark on 
> yarn,carbondata1.2.0,hadoop2.7,spark2.1.0,hive2.1.0
>            Reporter: zhuzhibin
>             Fix For: 1.2.0
>
>         Attachments: error1.png, error.png
>
>
> When I tried to load data into a table (about 300 million rows), the log 
> showed "Data load is partially successful for table",
> but when I then executed a delete operation on the table, an error appeared:
> "java.lang.ArrayIndexOutOfBoundsException: 1
> at 
> org.apache.carbondata.core.mutate.CarbonUpdateUtil.getRequiredFieldFromTID(CarbonUpdateUtil.java:67)".
> When I executed another delete operation with a where condition, it was 
> successful, but a subsequent select operation failed with 
> "java.lang.ArrayIndexOutOfBoundsException Driver stacktrace:
>   at 
> org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1435)"
>  
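For anyone triaging this, here is a minimal sketch (not the actual CarbonData source) of the failure mode the first stack trace suggests: getRequiredFieldFromTID extracts a field from a "/"-delimited tuple ID (TID) string, and indexing a fixed position into a TID that has fewer parts than expected throws ArrayIndexOutOfBoundsException: 1. The TID layout and the helper below are assumptions for illustration only.

```java
// Hypothetical illustration of the reported ArrayIndexOutOfBoundsException: 1.
// Assumption: a tuple ID is a "/"-delimited string and fields are read by
// fixed index, as the CarbonUpdateUtil.getRequiredFieldFromTID frame implies.
public class TidSplitDemo {

    // Returns the field at the given position of a "/"-delimited tuple ID.
    // Throws ArrayIndexOutOfBoundsException if the TID has too few parts.
    static String getField(String tid, int index) {
        return tid.split("/")[index];
    }

    public static void main(String[] args) {
        // A well-formed TID has enough parts, so index 1 is valid:
        System.out.println(getField("0/part-0-0/0/0", 1)); // prints "part-0-0"

        // A malformed or truncated TID does not, reproducing the symptom:
        try {
            getField("badtid", 1);
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("caught AIOOBE");
        }
    }
}
```

If a partially successful load leaves behind malformed TIDs (or rows from failed loads that were never cleaned up), a later delete that parses those TIDs would fail exactly like this, which may be why the scenario is hard to reproduce on a clean build.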



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)