I will take a look. Thanks Bryan!
On Thu, Apr 1, 2010 at 12:38 AM, Bryan Talbot btal...@aeriagames.com wrote:
I guess most places are running their clusters with UTC time zones or these
functions are not widely used.
Any chance of getting a committer to look at the patch with unit tests?
Hi all,
I want to upgrade Hive from 0.5 to 0.6 because I want to use the
CREATE VIEW feature.
Any help would be appreciated.
thanks,
prakash
Hi Prakash,
Since 0.6 hasn't been released yet, you'll need to build from source on trunk.
This also means you might want to use it only on a test cluster until you are
confident in its stability level.
You'll also need to upgrade your existing metastore:
hive> describe ut;
OK
time	bigint
day	string
Time taken: 0.128 seconds
hive> select * from ut;
OK
1270145333155	tuesday
Time taken: 0.085 seconds
When I run this simple query, I'm getting a NULL for the time column with
data type bigint.
hive> select unix_timestamp(time), day from ut;
Total
unix_timestamp() returns the unix time given a date string:
e.g. unix_timestamp('2009-03-20 11:30:01') = 1237573801
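What `unix_timestamp()` does for a date string can be sketched outside Hive. A minimal Python sketch, assuming the string is interpreted as UTC; note that Hive actually interprets it in the cluster's local time zone, which is why the figure quoted above (1237573801) is 7 hours ahead of the UTC result, consistent with a US/Pacific (DST) cluster:

```python
import calendar
import time

# Parse a date string and convert it to epoch seconds, treating it as UTC.
# Hive's unix_timestamp() uses the cluster's local time zone instead, so its
# result is shifted by that zone's UTC offset (e.g. +25200 s for PDT).
def to_unix_timestamp(date_string):
    parsed = time.strptime(date_string, "%Y-%m-%d %H:%M:%S")
    return calendar.timegm(parsed)

print(to_unix_timestamp("2009-03-20 11:30:01"))  # 1237548601 (UTC)
```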
please see:
http://wiki.apache.org/hadoop/Hive/LanguageManual/UDF#Date_Functions
Cheers,
Paul
From: tom kersnick [mailto:hiveu...@gmail.com]
Sent: Thursday, April 01, 2010 1:12
ok thanks
I should have caught that.
/tom
On Thu, Apr 1, 2010 at 2:13 PM, Carl Steinbach c...@cloudera.com wrote:
Hi Tom,
Unix time is defined as the number of *seconds* since January 1, 1970. It
looks like the data you have in cola is in milliseconds. You need to divide
this value by 1000.
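The conversion Carl describes is a plain integer division. A quick sketch using the sample value from the table above:

```python
# The bigint values in the table are epoch *milliseconds*; Hive's unix-time
# functions expect epoch *seconds*, so divide by 1000 first.
millis = 1270145333155  # sample value from the `ut` table above
seconds = millis // 1000
print(seconds)  # 1270145333
```

In HiveQL this would correspond to something like `from_unixtime(cast(time / 1000 as bigint))`; the exact cast needed (division yields a double) may depend on the Hive version, so treat that expression as a sketch.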
So it's working, but I'm having a time zone issue.
My servers are located in EST, but I need this data in PST.
So when it converts this:
hive> select from_unixtime(1270145333,'yyyy-MM-dd HH:mm:ss') from
ut2;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since
Setting TZ in your .bash_profile won't work because the map/reduce tasks
run on the Hadoop cluster.
If you start your hadoop tasktracker with that TZ setting, it will probably
work.
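Zheng's point can be illustrated outside Hive: the same epoch second renders differently depending on the zone where formatting happens, which for a Hive query is the tasktracker's zone, not the submitting shell's. A sketch using Python's `zoneinfo` module (Python 3.9+):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # requires Python 3.9+ and system tzdata

# The same epoch second formatted in three zones. On a Hadoop cluster the
# effective zone is the one the tasktracker JVM runs under, so exporting TZ
# in the submitting shell's .bash_profile has no effect on the output.
ts = 1270145333  # epoch seconds, from the thread above
for zone in ("US/Eastern", "US/Pacific", "UTC"):
    local = datetime.fromtimestamp(ts, ZoneInfo(zone))
    print(zone, local.strftime("%Y-%m-%d %H:%M:%S"))
```

For this timestamp the Eastern and Pacific renderings differ by three hours (EDT vs. PDT), which is exactly the discrepancy Tom is seeing.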
Zheng
On Thu, Apr 1, 2010 at 3:32 PM, tom kersnick hiveu...@gmail.com wrote:
Ok thanks!
I will try it out.
/tom
On Thu, Apr 1, 2010 at 4:31 PM, Zheng Shao zsh...@gmail.com wrote: