[ 
https://issues.apache.org/jira/browse/CARBONDATA-4048?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chetan Bhat closed CARBONDATA-4048.
-----------------------------------
    Resolution: Won't Fix

The subquery used within the update query fetches more than one row, hence 
the exception. The exception message is handled properly, and Carbon 
surfaces the message from Spark directly.
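For reference, a minimal rewrite (an untested sketch, not part of the 
original report) forces the subquery to return exactly one row by 
aggregating, after which the update should no longer hit the Spark error:

Update uniqdata set (CUST_ID) =
  ((select MAX(ASCII(CUST_NAME)) from uniqdata
    where cust_name='CUST_NAME_00060'));
-- MAX() collapses the duplicate matches into a single scalar value;
-- any single-value aggregate (e.g. MIN or FIRST) would serve the same
-- purpose here.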

> Update fails after continuous update operations with error
> ----------------------------------------------------------
>
>                 Key: CARBONDATA-4048
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-4048
>             Project: CarbonData
>          Issue Type: Bug
>          Components: data-query
>    Affects Versions: 2.1.0
>         Environment: Spark 2.3.2
>            Reporter: Chetan Bhat
>            Priority: Minor
>
> Create table, load data and perform continuous update operations on the table.
> 0: jdbc:hive2://10.20.255.171:23040> CREATE TABLE uniqdata (CUST_ID 
> int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ 
> timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 
> decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, 
> Double_COLUMN2 double,INTEGER_COLUMN1 int) stored as carbondata TBLPROPERTIES 
> ("TABLE_BLOCKSIZE"= "256 MB",'flat_folder'='true');
> +---------+--+
> | Result |
> +---------+--+
> +---------+--+
> No rows selected (0.177 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> LOAD DATA inpath 
> 'hdfs://hacluster/chetan/2000_UniqData.csv' INTO table uniqdata 
> options('DELIMITER'=',', 'FILEHEADER'='CUST_ID, CUST_NAME, 
> ACTIVE_EMUI_VERSION, DOB, DOJ, BIGINT_COLUMN1, BIGINT_COLUMN2, 
> DECIMAL_COLUMN1, DECIMAL_COLUMN2, Double_COLUMN1, Double_COLUMN2, 
> INTEGER_COLUMN1','TIMESTAMPFORMAT'='yyyy-MM-dd 
> HH:mm:ss','BAD_RECORDS_ACTION'='FORCE');
> +---------+--+
> | Result |
> +---------+--+
> +---------+--+
> No rows selected (1.484 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (bigint_column1)=(100) where bigint_column1<123372036856;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 2 |
> +--------------------+--+
> 1 row selected (3.294 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (bigint_column1)=(100) where bigint_column1>123372038852;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 1 |
> +--------------------+--+
> 1 row selected (3.467 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (bigint_column1)=(100) where bigint_column1<=123372036859;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 9 |
> +--------------------+--+
> 1 row selected (3.349 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (bigint_column1)=(100) where bigint_column1>=123372038846;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 8 |
> +--------------------+--+
> 1 row selected (3.259 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (bigint_column1)=(100) where bigint_column1 == 123372038845;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 1 |
> +--------------------+--+
> 1 row selected (4.164 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (bigint_column1)=(100) where bigint_column1 like '123%';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 2000 |
> +--------------------+--+
> 1 row selected (3.695 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (bigint_column1)=(100) where bigint_column1 between 123372038849 AND 
> 123372038855;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 5 |
> +--------------------+--+
> 1 row selected (3.228 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (bigint_column1)=(100) where bigint_column1 = 123372038845 OR false;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 1 |
> +--------------------+--+
> 1 row selected (3.548 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (bigint_column1)=(100) where bigint_column1 = 123372038849 AND true;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 1 |
> +--------------------+--+
> 1 row selected (3.321 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (bigint_column1)=(100) where bigint_column1 not between (123372038849) AND 
> (12337203885);
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 4025 |
> +--------------------+--+
> 1 row selected (3.718 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=('deepti') where cust_name<'CUST_NAME_01990';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 5978 |
> +--------------------+--+
> 1 row selected (4.109 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=('deepti') where cust_name>'CUST_NAME_01990';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 6022 |
> +--------------------+--+
> 1 row selected (3.643 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=('deepti') where cust_name<='CUST_NAME_01990';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 5981 |
> +--------------------+--+
> 1 row selected (3.713 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=('deepti') where cust_name >='CUST_NAME_01990';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 12050 |
> +--------------------+--+
> 1 row selected (4.019 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=('deepti') where cust_name=='CUST_NAME_01990';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 3 |
> +--------------------+--+
> 1 row selected (3.327 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=('deepti') where cust_name like 'CUST_NAME_0199%';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 47 |
> +--------------------+--+
> 1 row selected (3.47 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=('deepti') where cust_name between 'CUST_NAME_01990' AND 
> 'CUST_NAME_01995';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 26 |
> +--------------------+--+
> 1 row selected (3.512 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=('deepti') where cust_name = 'CUST_NAME_01990' OR false;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 3 |
> +--------------------+--+
> 1 row selected (3.589 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=('deepti') where cust_name='cust_00033' AND false;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 0 |
> +--------------------+--+
> 1 row selected (0.462 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=('deepti') where cust_name not between 'CUST_NAME_01990' AND 
> 'CUST_NAME_1999';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 18107 |
> +--------------------+--+
> 1 row selected (4.353 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (decimal_column1)=(100.00) where decimal_column1<12345678917.1234000000;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 224 |
> +--------------------+--+
> 1 row selected (3.494 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (decimal_column1)=(100.00) where decimal_column1>12345.6808982656;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 23964 |
> +--------------------+--+
> 1 row selected (4.677 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (decimal_column1)=(100.00) where decimal_column1<=12345678917.1234000000;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 24247 |
> +--------------------+--+
> 1 row selected (4.717 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (decimal_column1)=(100.00) where decimal_column1>=12345.6807932656;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 6025 |
> +--------------------+--+
> 1 row selected (3.754 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (decimal_column1)=(100.00) where decimal_column1=12345.6807332656;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 0 |
> +--------------------+--+
> 1 row selected (2.198 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (decimal_column1)=(100) where decimal_column1 like '123%';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 6025 |
> +--------------------+--+
> 1 row selected (4.173 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (decimal_column1)=(100) where decimal_column1 between 12345.6808882656 AND 
> 12345.6809002656;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 0 |
> +--------------------+--+
> 1 row selected (2.113 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (decimal_column1)=(100) where (decimal_column1) = (12345678920.1234000000) OR 
> false;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 3 |
> +--------------------+--+
> 1 row selected (3.798 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (decimal_column1)=(100) where decimal_column1 = 12345.6808782656 OR false;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 0 |
> +--------------------+--+
> 1 row selected (2.126 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (decimal_column1)=(100.0) where decimal_column1 not between 12345.6808882656 
> AND 12345.6809002656;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 42325 |
> +--------------------+--+
> 1 row selected (5.33 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set (cust_id)=(10000) 
> where cust_id<10987;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 47870 |
> +--------------------+--+
> 1 row selected (5.907 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set (cust_id)=(1000) 
> where cust_id>10987;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 456 |
> +--------------------+--+
> 1 row selected (3.596 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set (cust_id)=(100) 
> where cust_id<=10987;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 96220 |
> +--------------------+--+
> 1 row selected (7.291 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set (cust_id)=(100) 
> where cust_id>=10987;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 480 |
> +--------------------+--+
> 1 row selected (3.597 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set (cust_id)=(100) 
> where cust_id==10987;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 24 |
> +--------------------+--+
> 1 row selected (3.534 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set (cust_id)=(100) 
> where cust_id like '109%';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 2568 |
> +--------------------+--+
> 1 row selected (3.909 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set (cust_id)=(100) 
> where cust_id between 10987 AND 10999;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 480 |
> +--------------------+--+
> 1 row selected (3.717 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set (cust_id)=(100) 
> where cust_id = 10987 OR false;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 24 |
> +--------------------+--+
> 1 row selected (3.535 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set (cust_id)=(100) 
> where cust_id = 10987 AND false;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 0 |
> +--------------------+--+
> 1 row selected (0.467 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set (cust_id)=(100) 
> where cust_id not between 10987 AND 10999;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 147666 |
> +--------------------+--+
> 1 row selected (9.295 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=(ASCII(cust_name)) where cust_id=10903;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 24 |
> +--------------------+--+
> 1 row selected (3.562 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=(CONCAT(cust_name,ACTIVE_EMUI_VERSION)) where cust_id=10903;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 48 |
> +--------------------+--+
> 1 row selected (3.785 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=(CONCAT('Mr.', cust_name)) where cust_id=10903;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 72 |
> +--------------------+--+
> 1 row selected (3.509 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=(concat_ws('/',cust_name)) where cust_id=10902;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 24 |
> +--------------------+--+
> 1 row selected (3.525 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=(concat_ws('/',cust_name,active_emui_version)) where 
> cust_id=10902;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 48 |
> +--------------------+--+
> 1 row selected (3.537 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=(LENGTH(cust_name)) where cust_id=10903;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 96 |
> +--------------------+--+
> 1 row selected (3.627 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (bigint_column1)=(LENGTH(bigint_column1)) where cust_id=10903;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 120 |
> +--------------------+--+
> 1 row selected (3.61 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=(LTRIM(cust_name)) where cust_id=10903;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 144 |
> +--------------------+--+
> 1 row selected (3.705 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=(FORMAT_NUMBER(bigint_column1, 2)) where cust_id=10903;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 168 |
> +--------------------+--+
> 1 row selected (3.63 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=(REVERSE(cust_name)) where cust_id=10903;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 192 |
> +--------------------+--+
> 1 row selected (3.655 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=(REPEAT(cust_name,2)) where cust_id=10903;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 216 |
> +--------------------+--+
> 1 row selected (3.677 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set (cust_name)=('A J 
> Styles') where cust_id=10902;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 72 |
> +--------------------+--+
> 1 row selected (3.674 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_name)=(cust_id) where cust_id=10902;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 96 |
> +--------------------+--+
> 1 row selected (3.643 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (bigint_column1)=('deepti') where cust_id=10903;
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 240 |
> +--------------------+--+
> 1 row selected (3.756 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_id)=(21474836479999999) where cust_name='CUST_NAME_01999';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 150 |
> +--------------------+--+
> 1 row selected (3.727 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (cust_id)=(-214748364867) where cust_name='CUST_NAME_01998';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 125 |
> +--------------------+--+
> 1 row selected (3.741 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (bigint_column1)=(-9223372036854770000) where cust_name='CUST_NAME_01998';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 250 |
> +--------------------+--+
> 1 row selected (3.696 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (bigint_column2)=(922337203685479999999999) where cust_name='CUST_NAME_01998';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 375 |
> +--------------------+--+
> 1 row selected (3.688 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (double_column1)=(4.9E-32) where cust_name='CUST_NAME_01998';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 500 |
> +--------------------+--+
> 1 row selected (3.889 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set 
> (decimal_column1)=(1234567890123456789012.123456789012345) where 
> cust_name='CUST_NAME_01998';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 625 |
> +--------------------+--+
> 1 row selected (11.982 seconds)
> 0: jdbc:hive2://10.20.255.171:23040> update uniqdata set (dob)=('2016-13-31 
> 12:00:00') where cust_name='CUST_NAME_01998';
> +--------------------+--+
> | Updated Row Count |
> +--------------------+--+
> | 750 |
> +--------------------+--+
> 1 row selected (5.614 seconds)
>  
> After some time, the update fails:
> 0: jdbc:hive2://10.20.255.171:23040> Update uniqdata set (CUST_ID) = ((select 
> ASCII(CUST_NAME) from uniqdata where cust_name='CUST_NAME_00060'));
> Error: java.lang.RuntimeException: Update operation failed. more than one row 
> returned by a subquery used as an expression:
> Subquery subquery980
> +- *(1) Project [ascii(CUST_NAME#45) AS ascii(CUST_NAME)#982]
>  +- *(1) Filter (isnotnull(cust_name#45) && (cust_name#45 = CUST_NAME_00060))
>  +- *(1) FileScan carbondata chetan.uniqdata[cust_name#45] PushedFilters: 
> [IsNotNull(cust_name), EqualTo(cust_name,CUST_NAME_00060)], ReadSchema: 
> struct<CUST_NAME:string> (state=,code=0)
>  
> Log:
> 2020-11-07 00:30:09,167 | ERROR | [HiveServer2-Background-Pool: Thread-363] | 
> Exception in update operation | 
> org.apache.spark.sql.execution.command.mutation.CarbonProjectForUpdateCommand.processData(CarbonProjectForUpdateCommand.scala:228)2020-11-07
>  00:30:09,167 | ERROR | [HiveServer2-Background-Pool: Thread-363] | Exception 
> in update operation | 
> org.apache.spark.sql.execution.command.mutation.CarbonProjectForUpdateCommand.processData(CarbonProjectForUpdateCommand.scala:228)java.lang.RuntimeException:
>  more than one row returned by a subquery used as an expression:Subquery 
> subquery980+- *(1) Project [ascii(CUST_NAME#45) AS ascii(CUST_NAME)#982]   +- 
> *(1) Filter (isnotnull(cust_name#45) && (cust_name#45 = CUST_NAME_00060))     
>  +- *(1) FileScan carbondata chetan.uniqdata[cust_name#45] PushedFilters: 
> [IsNotNull(cust_name), EqualTo(cust_name,CUST_NAME_00060)], ReadSchema: 
> struct<CUST_NAME:string>
>  at scala.sys.package$.error(package.scala:27) at 
> org.apache.spark.sql.execution.ScalarSubquery.updateResult(subquery.scala:69) 
> at 
> org.apache.spark.sql.execution.SparkPlan$$anonfun$waitForSubqueries$1.apply(SparkPlan.scala:185)
>  at 
> org.apache.spark.sql.execution.SparkPlan$$anonfun$waitForSubqueries$1.apply(SparkPlan.scala:184)
>  at 
> scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48) at 
> org.apache.spark.sql.execution.SparkPlan.waitForSubqueries(SparkPlan.scala:184)
>  at 
> org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:154)
>  at 
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>  at 
> org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152) at 
> org.apache.spark.sql.execution.CodegenSupport$class.produce(WholeStageCodegenExec.scala:83)
>  at 
> org.apache.spark.sql.execution.ProjectExec.produce(basicPhysicalOperators.scala:35)
>  at 
> org.apache.spark.sql.execution.WholeStageCodegenExec.doCodeGen(WholeStageCodegenExec.scala:524)
>  at 
> org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:576)
>  at 
> org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
>  at 
> org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
>  at 
> org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
>  at 
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>  at 
> org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152) at 
> org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127) at 
> org.apache.spark.sql.execution.columnar.InMemoryRelation.buildBuffers(InMemoryRelation.scala:107)
>  at 
> org.apache.spark.sql.execution.columnar.InMemoryRelation.<init>(InMemoryRelation.scala:102)
>  at 
> org.apache.spark.sql.execution.columnar.InMemoryRelation$.apply(InMemoryRelation.scala:43)
>  at 
> org.apache.spark.sql.execution.CacheManager$$anonfun$cacheQuery$1.apply(CacheManager.scala:97)
>  at 
> org.apache.spark.sql.execution.CacheManager.writeLock(CacheManager.scala:67) 
> at 
> org.apache.spark.sql.execution.CacheManager.cacheQuery(CacheManager.scala:91) 
> at org.apache.spark.sql.Dataset.persist(Dataset.scala:2929) at 
> org.apache.spark.sql.execution.command.mutation.CarbonProjectForUpdateCommand.processData(CarbonProjectForUpdateCommand.scala:140)
>  at 
> org.apache.spark.sql.execution.command.DataCommand$$anonfun$run$2.apply(package.scala:132)
>  at 
> org.apache.spark.sql.execution.command.DataCommand$$anonfun$run$2.apply(package.scala:132)
>  at 
> org.apache.spark.sql.execution.command.Auditable$class.runWithAudit(package.scala:104)
>  at 
> org.apache.spark.sql.execution.command.DataCommand.runWithAudit(package.scala:130)
>  at org.apache.spark.sql.execution.command.DataCommand.run(package.scala:132) 
> at 
> org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
>  at 
> org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
>  at 
> org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
>  at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:190) at 
> org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:190) at 
> org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3259) at 
> org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
>  at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3258) at 
> org.apache.spark.sql.Dataset.<init>(Dataset.scala:190) at 
> org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:75) at 
> org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642) at 
> org.apache.spark.sql.SQLContext.sql(SQLContext.scala:694) at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:232)
>  at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:175)
>  at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:171)
>  at java.security.AccessController.doPrivileged(Native Method) at 
> javax.security.auth.Subject.doAs(Subject.java:422) at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
>  at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1.run(SparkExecuteStatementOperation.scala:185)
>  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
> at java.util.concurrent.FutureTask.run(FutureTask.java:266) at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>  at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>  at java.lang.Thread.run(Thread.java:745)2020-11-07 00:30:09,173 | INFO  | 
> [HiveServer2-Background-Pool: Thread-363] | updateLock unlocked successfully 
> after update uniqdata | 
> org.apache.spark.sql.execution.command.mutation.CarbonProjectForUpdateCommand.processData(CarbonProjectForUpdateCommand.scala:248)2020-11-07
>  00:30:09,178 | INFO  | [HiveServer2-Background-Pool: Thread-363] | 
> compactionLock unlocked successfully after update uniqdata | 
> org.apache.spark.sql.execution.command.mutation.CarbonProjectForUpdateCommand.processData(CarbonProjectForUpdateCommand.scala:254)2020-11-07
>  00:30:09,182 | INFO  | [HiveServer2-Background-Pool: Thread-363] | Metadata 
> lock has been successfully released | 
> org.apache.carbondata.core.locks.CarbonLockUtil.fileUnlock(CarbonLockUtil.java:49)2020-11-07
>  00:30:09,189 | AUDIT | [HiveServer2-Background-Pool: Thread-363] | 
> \{"time":"November 7, 2020 12:30:09 AM 
> CST","username":"anonymous","opName":"UPDATE 
> DATA","opId":"2805904551130888","opStatus":"FAILED","opTime":"1313 
> ms","table":"chetan.uniqdata","extraInfo":{"Exception":"java.lang.RuntimeException","Message":"Update
>  operation failed. more than one row returned by a subquery used as an 
> expression:\nSubquery subquery980\n+- *(1) Project [ascii(CUST_NAME#45) AS 
> ascii(CUST_NAME)#982]\n   +- *(1) Filter (isnotnull(cust_name#45) && 
> (cust_name#45 = CUST_NAME_00060))\n      +- *(1) FileScan carbondata 
> chetan.uniqdata[cust_name#45] PushedFilters: [IsNotNull(cust_name), 
> EqualTo(cust_name,CUST_NAME_00060)], ReadSchema: 
> struct<CUST_NAME:string>\n"}} | 
> org.apache.carbondata.processing.util.Auditor.logOperationEnd(Auditor.java:97)2020-11-07
>  00:30:09,191 | ERROR | [HiveServer2-Background-Pool: Thread-363] | Error 
> executing query, currentState RUNNING,  | 
> org.apache.spark.internal.Logging$class.logError(Logging.scala:91)java.lang.RuntimeException:
>  Update operation failed. more than one row returned by a subquery used as an 
> expression:Subquery subquery980+- *(1) Project [ascii(CUST_NAME#45) AS 
> ascii(CUST_NAME)#982]   +- *(1) Filter (isnotnull(cust_name#45) && 
> (cust_name#45 = CUST_NAME_00060))      +- *(1) FileScan carbondata 
> chetan.uniqdata[cust_name#45] PushedFilters: [IsNotNull(cust_name), 
> EqualTo(cust_name,CUST_NAME_00060)], ReadSchema: struct<CUST_NAME:string>
>  at scala.sys.package$.error(package.scala:27) at 
> org.apache.spark.sql.execution.command.mutation.CarbonProjectForUpdateCommand.processData(CarbonProjectForUpdateCommand.scala:232)
>  at 
> org.apache.spark.sql.execution.command.DataCommand$$anonfun$run$2.apply(package.scala:132)
>  at 
> org.apache.spark.sql.execution.command.DataCommand$$anonfun$run$2.apply(package.scala:132)
>  at 
> org.apache.spark.sql.execution.command.Auditable$class.runWithAudit(package.scala:104)
>  at 
> org.apache.spark.sql.execution.command.DataCommand.runWithAudit(package.scala:130)
>  at org.apache.spark.sql.execution.command.DataCommand.run(package.scala:132) 
> at 
> org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
>  at 
> org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
>  at 
> org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
>  at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:190) at 
> org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:190) at 
> org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3259) at 
> org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
>  at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3258) at 
> org.apache.spark.sql.Dataset.<init>(Dataset.scala:190) at 
> org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:75) at 
> org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642) at 
> org.apache.spark.sql.SQLContext.sql(SQLContext.scala:694) at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:232)
>  at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:175)
>  at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:171)
>  at java.security.AccessController.doPrivileged(Native Method) at 
> javax.security.auth.Subject.doAs(Subject.java:422) at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
>  at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1.run(SparkExecuteStatementOperation.scala:185)
>  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
> at java.util.concurrent.FutureTask.run(FutureTask.java:266) at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>  at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>  at java.lang.Thread.run(Thread.java:745)2020-11-07 00:30:09,192 | ERROR | 
> [HiveServer2-Background-Pool: Thread-363] | Error running hive query:  | 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:179)org.apache.hive.service.cli.HiveSQLException:
>  java.lang.RuntimeException: Update operation failed. more than one row 
> returned by a subquery used as an expression:Subquery subquery980+- *(1) 
> Project [ascii(CUST_NAME#45) AS ascii(CUST_NAME)#982]   +- *(1) Filter 
> (isnotnull(cust_name#45) && (cust_name#45 = CUST_NAME_00060))      +- *(1) 
> FileScan carbondata chetan.uniqdata[cust_name#45] PushedFilters: 
> [IsNotNull(cust_name), EqualTo(cust_name,CUST_NAME_00060)], ReadSchema: 
> struct<CUST_NAME:string>
>  at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:269)
>  at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:175)
>  at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:171)
>  at java.security.AccessController.doPrivileged(Native Method) at 
> javax.security.auth.Subject.doAs(Subject.java:422) at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
>  at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1.run(SparkExecuteStatementOperation.scala:185)
>  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
> at java.util.concurrent.FutureTask.run(FutureTask.java:266) at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>  at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>  at java.lang.Thread.run(Thread.java:745)2020-11-07 00:30:09,195 | INFO  | 
> [HiveServer2-Handler-Pool: Thread-83] | Asked to cancel job group 
> 60a4f869-1300-4a8a-a575-196f32184ed1 | 
> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
