> Or, is this an artifact of an incompatibility between ORC files written by
> the Hive 2.x ORC serde not being readable by the Hive 1.x ORC serde?
> 3. Is there a difference in the ORC file format spec. at play here?
Nope, we're still defaulting to hive-0.12 format ORC files in Hive-2.x.
6. Any similar issues with Parquet format in Hive 1.x and 2.x?
From: Aviral Agarwal [mailto:aviral12...@gmail.com]
Sent: Wednesday, August 23, 2017 10:34 PM
To: user@hive.apache.org
Subject: Re: ORC Transaction Table - Spark
So, there is no way right now for Spark to read Hive ORC transaction tables?
From: Aviral Agarwal <aviral12...@gmail.com>
Reply-To: "user@hive.apache.org" <user@hive.apache.org>
Date: Wednesday, August 23, 2017 at 12:24 AM
To: "user@hive.apache.org" <user@hive.apache.org>
Subject: Re: ORC Transaction Table - Spark
To: "user@hive.apache.org" <user@hive.apache.org>
Subject: Re: ORC Transaction Table - Spark
Hi,
Yes, it is caused by the wrong naming convention of the delta directory:
/apps/hive/warehouse/foo.db/bar/year=2017/month=5/delta_0645253_0645253_0001
How do I solve this ?
Thanks !
Aviral Agarwal
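The directory in question follows Hive 2.x's three-part delta naming, delta_<min txn>_<max txn>_<statement id>, whereas older readers expect only the two-part delta_<min txn>_<max txn> form. A minimal sketch (not Hive's actual parser, just an illustration of the naming scheme) that splits such a name into its components:

```python
# Illustrative only: split a Hive ACID delta directory name into its parts.
# Hive 2.x appends a statement id (the third number); readers that expect
# the older two-part form can choke on names like the one in this thread.
def parse_delta_dir(name):
    parts = name.split("_")
    if parts[0] != "delta":
        raise ValueError("not a delta directory: " + name)
    min_txn = int(parts[1])
    max_txn = int(parts[2])
    stmt_id = int(parts[3]) if len(parts) > 3 else None
    return min_txn, max_txn, stmt_id

print(parse_delta_dir("delta_0645253_0645253_0001"))  # → (645253, 645253, 1)
```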
On Tuesday, August 22, 2017 at 5:39 AM
To: "user@hive.apache.org" <user@hive.apache.org>
Subject: ORC Transaction Table - Spark
Hi,
I am trying to read a Hive ORC transaction table through Spark but I am
getting the following error:
Caused by: java.lang.RuntimeException: serious problem
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.generateSplitsInfo(OrcInputFormat.java:1021)
at
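For readers landing on this thread: one workaround often suggested for this class of error (an assumption on my part, not stated in this excerpt) is to run a major compaction, which rewrites the accumulated deltas into a base_ directory that plain ORC readers handle. Using the table and partition from the path quoted above:

```sql
-- Hedged sketch: table/partition names taken from the path in this thread.
ALTER TABLE foo.bar PARTITION (year=2017, month=5) COMPACT 'major';
```

Compaction runs asynchronously; SHOW COMPACTIONS can be used to check its progress before retrying the Spark read.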