FYI, it looks like this has to do with the representation of timestamps during
vectorized execution - timestamps are stored as a single long value holding
nanoseconds since the epoch, so the maximum timestamp representable in that
format falls on 2262-04-11.
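A quick illustration of that arithmetic (plain java.time on Java 8+, not the actual Hive code path) shows where a long of nanoseconds tops out, and what 2999-12-31 wraps around to:

import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneOffset;

public class NanosOverflowDemo {
    public static void main(String[] args) {
        // Largest instant a signed 64-bit count of nanoseconds since the epoch can hold:
        System.out.println(Instant.ofEpochSecond(0, Long.MAX_VALUE));
        // prints 2262-04-11T23:47:16.854775807Z

        // What 2999-12-31 23:59:59 (UTC) becomes if forced into that representation:
        long seconds = LocalDateTime.parse("2999-12-31T23:59:59")
                                    .toEpochSecond(ZoneOffset.UTC);
        long wrapped = seconds * 1_000_000_000L;   // silently overflows the long
        System.out.println(Instant.ofEpochSecond(
                Math.floorDiv(wrapped, 1_000_000_000L),
                Math.floorMod(wrapped, 1_000_000_000L)));
        // prints 1830-11-23T00:50:51.580896768Z - the same bogus value reported further down the thread
    }
}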
On Sep 22, 2014, at 4:13 PM, Peyman Mohajerian wrote:
So I found out more detail about this issue. If, in:
select cast('2999-12-31 23:59:59' as timestamp) from table;
the table holds ORC data, you are using Hive 0.13, and you have set
hive.vectorized.execution.enabled = true;
then this issue occurs. It may be related to HIVE-6656, but I'm not certain of
that.
The JVM is either 1.6 or 1.7, but I tested
System.out.println(" " + Timestamp.valueOf("2999-12-31 23:59:59"));
on both the 1.6 and 1.7 versions and it works in both cases.
On Wed, Sep 10, 2014 at 10:12 PM, Jason Dere wrote:
Hmm that's odd .. it looks like this works for me:
hive> select cast('2999-12-31 23:59:59' as timestamp);
OK
2999-12-31 23:59:59
Time taken: 0.212 seconds, Fetched: 1 row(s)
For string to timestamp conversion, it should be using
java.sql.Timestamp.valueOf(). What version of JVM are you using?
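For what it's worth, that path can be checked outside of Hive; java.sql.Timestamp keeps epoch milliseconds plus a separate nanos field, so a year-2999 value is no problem for valueOf() itself:

import java.sql.Timestamp;

public class ValueOfCheck {
    public static void main(String[] args) {
        // java.sql.Timestamp stores epoch milliseconds plus a separate nanos field,
        // so a year-2999 value fits comfortably in a long of milliseconds.
        Timestamp ts = Timestamp.valueOf("2999-12-31 23:59:59");
        System.out.println(ts);            // 2999-12-31 23:59:59.0
        System.out.println(ts.getTime());  // epoch millis, nowhere near Long.MAX_VALUE
    }
}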
The point of overflow is:
2262-04-11 20:00:00
if you go earlier than that it works fine, e.g.:
2262-04-11 19:23:59
On Wed, Sep 10, 2014 at 5:38 PM, Peyman Mohajerian wrote:
Hi Guys,
I use Hive 0.13 for this conversion:
select cast('2999-12-31 23:59:59' as timestamp)
I get:
1830-11-23 00:50:51.580896768
Up to around the year 2199 it works fine. The workaround is to convert the
string to a Unix timestamp (a bigint of seconds) and then back again:
from_unixtime(unix_timestamp('2999-12-31 23:59:59.00'))
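If it helps explain why that workaround behaves, unix_timestamp() deals in whole seconds since the epoch, and second counts stay far below what a long can hold - it is only the nanosecond-based representation that runs out in 2262 (and second granularity also means any fractional seconds are dropped). A rough sanity check of the numbers, in plain Java just for the arithmetic:

import java.time.LocalDateTime;
import java.time.ZoneOffset;

public class SecondsVsNanos {
    public static void main(String[] args) {
        // seconds since the epoch for 2999-12-31 23:59:59 (UTC): roughly 3.25e10,
        // comfortably below Long.MAX_VALUE (about 9.22e18)
        long seconds = LocalDateTime.parse("2999-12-31T23:59:59")
                                    .toEpochSecond(ZoneOffset.UTC);
        System.out.println(seconds);
        // whereas a long of nanoseconds runs out at about 9.22e9 seconds,
        // i.e. during the year 2262
        System.out.println(Long.MAX_VALUE / 1_000_000_000L);
    }
}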