I lowered 1073741824 to half of that value but am still getting the same issue.
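As a quick sanity check of the numbers in this thread (just arithmetic, not Hive behavior): 1073741824 bytes is exactly 1 GiB, so halving the threshold sets it to 512 MiB, still far below the 24G heap mentioned later.

```python
# Sanity-check the byte values discussed in this thread.
default_threshold = 1073741824      # reported hive.fetch.task.conversion.threshold
halved = default_threshold // 2     # the lowered value tried above

assert default_threshold == 2**30   # exactly 1 GiB
print(halved)                       # 536870912 bytes = 512 MiB
```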
On Wed, Sep 21, 2016 at 6:44 PM, Sanjeev Verma
wrote:
> it is 1073741824 now, but I can't see anything running on the client side; the job
> kicked off by the query completed, but HS2 is crashing
>
>
It is 1073741824 now, but I can't see anything running on the client side; the job
kicked off by the query completed, but HS2 is crashing.
On Wed, Sep 21, 2016 at 6:40 PM, Prasanth Jayachandran <
pjayachand...@hortonworks.com> wrote:
> FetchOperator will run client side. What is the value for
> hive.fetch.task.conversion.threshold?
FetchOperator will run client side. What is the value for
hive.fetch.task.conversion.threshold?
Thanks
Prasanth
> On Sep 21, 2016, at 6:37 PM, Sanjeev Verma wrote:
>
> I am getting a HiveServer2 OOM even after increasing the heap size from 8G
> to 24G; no clue why
I am getting a HiveServer2 OOM even after increasing the heap size from 8G
to 24G; no clue why it is still going OOM with that much heap:
"HiveServer2-HttpHandler-Pool: Thread-58026" prio=5 tid=58026 RUNNABLE
at java.lang.OutOfMemoryError.&lt;init&gt;(OutOfMemoryError.java:48)
at
select to_date(ts), year(ts), month(ts), day(ts), hour(ts), minute(ts), second(ts)
from (select from_unixtime(unix_timestamp
('2016-09-15T23:45:22.943762Z', "yyyy-MM-dd'T'HH:mm:ss")) as ts) as t;
OK
2016-09-15	2016	9	15	23	45	22
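Outside Hive, the same extraction can be sketched in Python (an illustration of the parsing, not part of the original answer); note that `strptime` with `%f` also keeps the fractional seconds that the Hive pattern above drops:

```python
from datetime import datetime

# Parse the ISO-8601 timestamp, including microseconds and the trailing Z.
ts = datetime.strptime("2016-09-15T23:45:22.943762Z", "%Y-%m-%dT%H:%M:%S.%fZ")

# Same fields as the Hive query: date, year, month, day, hour, minute, second.
print(ts.date(), ts.year, ts.month, ts.day, ts.hour, ts.minute, ts.second)
# 2016-09-15 2016 9 15 23 45 22
```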
Dudu
From: Manish Rangari
I ran MSCK REPAIR TABLE mytable; and got:
Error while processing statement: FAILED: Execution Error, return code 1
from org.apache.hadoop.hive.ql.exec.DDLTask
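One common cause of this DDLTask error is directories under the table path that do not follow the `key=value` partition layout MSCK expects. A hypothetical pre-check (the helper and the `dt` key are my own illustration, not a Hive API):

```python
import re

def bad_partition_dirs(dirs, partition_key="dt"):
    """Return directory names that do not match the key=value layout
    MSCK REPAIR TABLE expects (illustrative helper, not part of Hive)."""
    ok = re.compile(rf"^{re.escape(partition_key)}=[^/]+$")
    return [d for d in dirs if not ok.match(d)]

print(bad_partition_dirs(["dt=2016-09-15", "2016-09-16", "dt=2016-09-17"]))
# ['2016-09-16']
```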
On Mon, Sep 12, 2016 at 6:56 PM, Lefty Leverenz
wrote:
> Here's a list of the wikidocs about dynamic partitions
Hello,
Could I please have write access to the Hive wiki so that I can help with
fixes? My Confluence username is *icook*.
Thanks,
Ian Cook
Cloudera
Guys,
I am trying to extract the date, time, month, minute, etc. from the timestamp
format below but could not find a function for this. Can anyone help me
extract these details?
2016-09-15T23:45:22.943762Z
2016-09-15T23:45:22.948829Z
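For what it's worth, a regex sketch of splitting such a timestamp into its components (an illustration outside Hive; the Hive-side answer uses unix_timestamp/from_unixtime):

```python
import re

# Split an ISO-8601 timestamp into its components with a regex.
pattern = r"(\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2}):(\d{2})\.(\d+)Z"
m = re.match(pattern, "2016-09-15T23:45:22.943762Z")
year, month, day, hour, minute, second, micros = m.groups()
print(year, month, day, hour, minute, second)
# 2016 09 15 23 45 22
```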
--Manish