Hi Mahender

 

How have you created your ORC table?

 

Use

 

show create table <TABLE_NAME>;

 

and send the output
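For an ACID-capable table, the output should look roughly like this (an illustrative sketch, not your actual output; table name, columns and bucket count will differ):

```sql
CREATE TABLE `testtablenew`(
  `id` int,
  `name` string)
CLUSTERED BY (id)
INTO 2 BUCKETS
ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
TBLPROPERTIES ('transactional'='true')
```

The key things to check are the ORC storage formats, the CLUSTERED BY ... INTO n BUCKETS clause, and 'transactional'='true' in TBLPROPERTIES; update and delete are only supported when all of these are present.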

 

Mich Talebzadeh

 

Sybase ASE 15 Gold Medal Award 2008

A Winning Strategy: Running the most Critical Financial Data on ASE 15

 
http://login.sybase.com/files/Product_Overviews/ASE-Winning-Strategy-091908.pdf

Author of the books "A Practitioner's Guide to Upgrading to Sybase ASE 15",
ISBN 978-0-9563693-0-7. 

Co-author of "Sybase Transact SQL Guidelines Best Practices", ISBN
978-0-9759693-0-4

Publications due shortly:

Complex Event Processing in Heterogeneous Environments, ISBN:
978-0-9563693-3-8

Oracle and Sybase, Concepts and Contrasts, ISBN: 978-0-9563693-1-4, volume
one out shortly

 

http://talebzadehmich.wordpress.com

 

NOTE: The information in this email is proprietary and confidential. This
message is for the designated recipient only, if you are not the intended
recipient, you should destroy it immediately. Any information in this
message shall not be understood as given or endorsed by Peridale Technology
Ltd, its subsidiaries or their employees, unless expressly so stated. It is
the responsibility of the recipient to ensure that this email is virus free,
therefore neither Peridale Ltd, its subsidiaries nor their employees accept
any responsibility.

 

From: Mahender SNPC [mailto:[email protected]] 
Sent: 26 November 2015 22:09
To: [email protected]; [email protected]
Subject: RE: Need you Expert inputs

 

Hi Team,
 
We need your expert input in resolving an issue while updating records in a
Hive ORC bucketed table. I'm getting "FAILED: SemanticException [Error 10294]:
Attempt to do update or delete using transaction manager that does not
support these operations". I'm on HDP 2.3.



 Here are the steps I followed for the update:


     STEP 1:

    SET hive.support.concurrency=true;
    SET hive.enforce.bucketing=true;
    SET hive.exec.dynamic.partition.mode=nonstrict;
    SET hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
    SET hive.compactor.initiator.on=true;
    SET hive.compactor.worker.threads=2;
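One caveat worth noting (an assumption on my part, not something stated in this thread): a per-session SET only affects the client side, and the compactor settings (hive.compactor.*) take effect on the metastore/HiveServer2 process, so they normally belong in hive-site.xml rather than in the session. You can at least confirm what the current session actually sees:

```sql
-- Print the effective values for the current session
SET hive.txn.manager;
SET hive.support.concurrency;
```

If hive.txn.manager still reports the default DummyTxnManager, the session override did not take effect, which produces exactly the Error 10294 message above.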

 

 

 

STEP 2:

create table testTableNew(id int, name string) clustered by (id) into 2
buckets stored as orc TBLPROPERTIES('transactional'='true');

 

 STEP 3:

insert into table testTableNew values (1,'row1'),(2,'row2'),(3,'row3'); -- 3
rows inserted successfully

 

 STEP 4:

 delete from testTable where id = 1;

FAILED: SemanticException [Error 10294]: Attempt to do update or delete
using transaction manager that does not support these operations.

 

STEP 5:

insert into table testTable values (1,'row1'),(2,'row2');

FAILED: SemanticException [Error 10294]: Attempt to do update or delete
using transaction manager that does not support these operations.

 

 

 

*         Is there any setting I might be missing here? One small hunch: is
it necessary to have ZooKeeper running to perform updates or deletes?


 

 The other question we are looking at is the performance of updates.

 

 

We have a scenario where we need to update a single row or a couple of rows
(say 10) in a Hive ORC table that contains almost 200 million records. Can
you share your thoughts on whether it is advisable to use "update" or
"delete" in Hive 0.14 (HDInsight 3.2)?
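For context, a point update on a transactional ORC table is a plain UPDATE statement (illustrative sketch; table and column names are placeholders from the steps above):

```sql
-- Update a single row by its bucketing key. Note that the bucketed
-- column (id) itself cannot be updated, only the other columns.
UPDATE testTableNew SET name = 'row1_updated' WHERE id = 1;
```

Each UPDATE or DELETE writes a small delta file alongside the base ORC data, and the background compactor later merges the deltas into the base files. Updating a handful of rows in a 200-million-row table is therefore workable, but each statement still runs as a job that scans the affected buckets, so it is not an OLTP-style point update.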

 

After the update, will the Hive ORC table remain as it was before?

 

Regards,

Mahender    
