Hi,

hadoop version
Hadoop 3.0.3

Hive version
Apache Hive (version 3.0.0)

The ORC transactional table is created as follows:

create table t (
 owner                   varchar(30)
,object_name             varchar(30)
,subobject_name          varchar(30)
,object_id               bigint
,data_object_id          bigint
,object_type             varchar(19)
,created                 timestamp
,last_ddl_time           timestamp
,timestamp2               varchar(19)
,status                  varchar(7)
,temporary2              varchar(1)
,generated               varchar(1)
,secondary               varchar(1)
,namespace               bigint
,edition_name            varchar(30)
,padding1                varchar(4000)
,padding2                varchar(3500)
,attribute               varchar(32)
,op_type                 int
,op_time                 timestamp
)
CLUSTERED BY (object_id) INTO 256 BUCKETS
STORED AS ORC
TBLPROPERTIES ( "orc.compress"="SNAPPY",
"transactional"="true",
"orc.create.index"="true",
"orc.bloom.filter.columns"="object_id",
"orc.bloom.filter.fpp"="0.05",
"orc.stripe.size"="268435456",
"orc.row.index.stride"="10000" )
;

The following is a simple update:

use asehadoop;
set hive.support.concurrency=true;
set hive.enforce.bucketing=true;
set hive.exec.dynamic.partition.mode=nonstrict;
set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
set hive.compactor.initiator.on=true;
set hive.compactor.worker.threads=20;
UPDATE t set object_name = 'Mich' WHERE object_id = 594688;

And this is the error I get at the end:

Error: Error while processing statement: FAILED: Execution Error, return
code -101 from org.apache.hadoop.hive.ql.exec.StatsTask.
org.apache.hadoop.fs.FileStatus.compareTo(Lorg/apache/hadoop/fs/FileStatus;)I
(state=08S01,code=-101)

Appreciate any info.

Regards,

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.
