Re: hiveserver2 oom

2016-09-22 Thread Tale Firefly
Hey Sanjeev.

Can you post the /tmp/hive/hive.log (on the HiveServer2 host) from when you
launch the query?

Best regards.

Tale

On Thu, Sep 22, 2016 at 5:03 AM, Sanjeev Verma 
wrote:

> I lowered 1073741824 to half of that, but I am still getting the same issue.
>
> On Wed, Sep 21, 2016 at 6:44 PM, Sanjeev Verma 
> wrote:
>
>> It's 1073741824 now, but I can't see anything running on the client side; the
>> job kicked off by the query completed, but HS2 is crashing.
>>
>> On Wed, Sep 21, 2016 at 6:40 PM, Prasanth Jayachandran <
>> pjayachand...@hortonworks.com> wrote:
>>
>>> FetchOperator will run client side. What is the value for
>>> hive.fetch.task.conversion.threshold?
>>>
>>> Thanks
>>> Prasanth
>>> > On Sep 21, 2016, at 6:37 PM, Sanjeev Verma 
>>> wrote:
>>> >
>>> > I am getting a HiveServer2 OOM even after increasing the heap size
>>> > from 8G to 24G; no clue why it is still going OOM with enough heap.
>>> >
>>> > "HiveServer2-HttpHandler-Pool: Thread-58026" prio=5 tid=58026 RUNNABLE
>>> >  at java.lang.OutOfMemoryError.<init>(OutOfMemoryError.java:48)
>>> >  at org.apache.hadoop.util.LineReader.<init>(LineReader.java:140)
>>> >  at org.apache.hadoop.mapreduce.lib.input.SplitLineReader.<init>(SplitLineReader.java:37)
>>> >  at org.apache.hadoop.mapreduce.lib.input.UncompressedSplitLineReader.<init>(UncompressedSplitLineReader.java:46)
>>> >  at org.apache.hadoop.mapred.LineRecordReader.<init>(LineRecordReader.java:128)
>>> >  at org.apache.hadoop.mapred.TextInputFormat.getRecordReader(TextInputFormat.java:67)
>>> >  at org.apache.hadoop.hive.ql.exec.FetchOperator$FetchInputFormatSplit.getRecordReader(FetchOperator.java:682)
>>> >  at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:328)
>>> >  at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:450)
>>> >  at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:419)
>>> >  at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:143)
>>> >  at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1745)
>>> >  at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:347)
>>> >  at org.apache.hive.service.cli.operation.OperationManager.getOperationNextRowSet(OperationManager.java:223)
>>> >  at org.apache.hive.service.cli.session.HiveSessionImpl.fetchResults(HiveSessionImpl.java:716)
>>> >  at sun.reflect.GeneratedMethodAccessor15.invoke()
>>> >  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> >  at java.lang.reflect.Method.invoke(Method.java:606)
>>> >  at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
>>> >  at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
>>> >  at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
>>> >  at java.security.AccessController.doPrivileged(Native Method)
>>> >  at javax.security.auth.Subject.doAs(Subject.java:415)
>>> >  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
>>>
>>>
>>
>
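For readers hitting the same OOM: the stack trace above shows the fetch-task path, where FetchOperator reads results inside the HiveServer2 process itself. A hedged sketch of session settings that push such queries back to a cluster job instead; the property names are from the Hive configuration docs, and the values are only examples to tune:

```sql
-- Disable client-side fetch conversion entirely, so even simple
-- SELECTs run as a regular job rather than inside HiveServer2:
SET hive.fetch.task.conversion=none;

-- Or keep conversion but lower the input-size threshold (in bytes)
-- above which Hive falls back to a cluster job:
SET hive.fetch.task.conversion.threshold=134217728; -- 128 MB
```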


Re: iso 8601 to utc with timezone conversion

2016-09-22 Thread Manish R
Hi Andres,

No, that is not in UTC format. Please see the description of that field below.
If the timezone of table2 is Europe/Amsterdam, then we have to convert the
request_date of table1 from UTC to the Europe/Amsterdam timezone (for example
2016-09-18 23:30:52). We have a lot of timezone entries in table2, and I
wonder how I am going to convert all the request_date fields according to the
timezone field. Do I have to maintain a separate table for that?

timestamp

The time when the load balancer received the request from the client, in
ISO 8601 format.

On Fri, Sep 23, 2016 at 1:26 AM, Andres Koitmäe 
wrote:

> Hi!
>
> It seems that in Table1 you already have request_date in UTC format. The *Z* at
> the end of the timestamp is the zone designator for the zero UTC offset.
>
> Now all you have to do is to use standard Hive functions which you can
> find from Hive wiki https://cwiki.apache.org/confluence/display/Hive/
> LanguageManual+UDF#LanguageManualUDF-TypeConversionFunctions
>
> Use from_utc_timestamp to convert request_date to a timestamp in the timezone
> specified in Table 2 (join the two tables using the aid column).
>
> Regards,
>
> Andres Koitmäe
>
> On 22 September 2016 at 20:05, Manish R 
> wrote:
>
>> Hi Guys,
>>
>> There is a scenario here that I am trying to implement
>>
>> I have a table, say table1, which contains aid and request_date in ISO 8601
>> format. I have one more table, say table2, which contains aid and timezone
>> details. Now I want to convert request_date from table1 to UTC and apply
>> the timezone that is in table2 for that corresponding aid.
>>
>> Table 1 example data
>> *2016-09-15T23:45:22.943762Z abs123*
>> *2016-09-16T22:48:12.943762Z erty456*
>>
>> Table 2 example data
>> *abs123   Asia/Kolkata*
>> *erty456  Europe/Amsterdam*
>>
>
>


Re: iso 8601 to utc with timezone conversion

2016-09-22 Thread Andres Koitmäe
Hi!

It seems that in Table1 you already have request_date in UTC format. The *Z* at
the end of the timestamp is the zone designator for the zero UTC offset.

Now all you have to do is to use standard Hive functions which you can find
from Hive wiki
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF#LanguageManualUDF-TypeConversionFunctions

Use from_utc_timestamp to convert request_date to a timestamp in the timezone
specified in Table 2 (join the two tables using the aid column).

Regards,

Andres Koitmäe

On 22 September 2016 at 20:05, Manish R 
wrote:

> Hi Guys,
>
> There is a scenario here that I am trying to implement
>
> I have a table, say table1, which contains aid and request_date in ISO 8601
> format. I have one more table, say table2, which contains aid and timezone
> details. Now I want to convert request_date from table1 to UTC and apply
> the timezone that is in table2 for that corresponding aid.
>
> Table 1 example data
> *2016-09-15T23:45:22.943762Z abs123*
> *2016-09-16T22:48:12.943762Z erty456*
>
> Table 2 example data
> *abs123   Asia/Kolkata*
> *erty456  Europe/Amsterdam*
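Putting the advice above together, a hedged HiveQL sketch; the column names (request_date, aid, timezone) are taken or assumed from the example data, and the format pattern relies on unix_timestamp ignoring the trailing fractional seconds and Z:

```sql
-- Sketch: parse the ISO 8601 string, then shift the UTC value into
-- each row's own timezone looked up from table2 via the aid join.
SELECT t1.aid,
       from_utc_timestamp(
         from_unixtime(unix_timestamp(t1.request_date,
                                      "yyyy-MM-dd'T'HH:mm:ss")),
         t2.timezone) AS local_request_date
FROM table1 t1
JOIN table2 t2
  ON t1.aid = t2.aid;
```

Because the timezone comes from a joined column, no separate per-timezone table should be needed; whether your Hive version accepts a non-constant timezone argument to from_utc_timestamp is worth verifying on your build.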
>


Running multiple hive queries in the same jvm

2016-09-22 Thread rahul challapalli
Team,

I want to know whether there is any way I can run 3 Hive queries
sequentially in a single JVM. From the docs, I found that setting
"mapreduce.framework.name=local" might achieve what I am looking for. Can
someone confirm?

- Rahul
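For reference, a hedged sketch of the settings involved; both properties are documented, but whether local mode satisfies the "same JVM" requirement for all three queries depends on the Hive version and should be verified:

```sql
-- Force the local job runner, which executes tasks in the client JVM:
SET mapreduce.framework.name=local;

-- Or let Hive choose local mode automatically for small jobs:
SET hive.exec.mode.local.auto=true;
```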


Hive query fails with error "expecting dummy store operator but found: FS[26]"

2016-09-22 Thread Tale Firefly
Hello !

I am sending you this mail because I ran a Hive query with Tez and it failed
with a strange error:
###
ERROR : Vertex failed, vertexName=Reducer 2,
vertexId=vertex_1473870963805_157168_11_02, diagnostics=[Task failed,
taskId=task_1473870963805_157168_11_02_35, diagnostics=[TaskAttempt 0
failed, info=[Error: Failure while running task:java.lang.RuntimeException:
java.lang.IllegalStateException: Was expecting dummy store operator but
found: FS[26]
at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:173)
at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:139)
at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:344)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:181)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:172)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:172)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:168)
at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
###

I wanted to know if someone has already seen this kind of error.
It happens when running a query through Beeline with Tez as the execution engine.

I found some information here :
http://svn.apache.org/viewvc/hive/trunk/ql/src/java/org/apache/hadoop/hive/ql/exec/DummyStoreOperator.java?view=markup&pathrev=1423731

But I do not fully understand yet why the error mentions FS. Is FS
the file separator described here?
http://www.aivosto.com/vbtips/control-characters.html#FS

Thank you in advance for your help o/

Best regards.

Tale


Re: Write access to Hive wiki please

2016-09-22 Thread Lefty Leverenz
Done.  Welcome to the Hive wiki team, Ian!

-- Lefty


On Wed, Sep 21, 2016 at 10:03 AM, Ian Cook  wrote:

> Hello,
>
> Could I please have write access to the Hive wiki so that I can help with
> fixes? My Confluence username is *icook*.
>
> Thanks,
> Ian Cook
> Cloudera
>


iso 8601 to utc with timezone conversion

2016-09-22 Thread Manish R
Hi Guys,

There is a scenario here that I am trying to implement

I have a table, say table1, which contains aid and request_date in ISO 8601
format. I have one more table, say table2, which contains aid and timezone
details. Now I want to convert request_date from table1 to UTC and apply
the timezone that is in table2 for that corresponding aid.

Table 1 example data
*2016-09-15T23:45:22.943762Z abs123*
*2016-09-16T22:48:12.943762Z erty456*

Table 2 example data
*abs123   Asia/Kolkata*
*erty456  Europe/Amsterdam*


Re: Need help with query

2016-09-22 Thread Andrew Sears

Hi there,

The detailed error should be in the hiveserver2.log


Cheers, Andrew

On Wed, Sep 21, 2016 at 3:36 PM, Igor Kravzov <igork.ine...@gmail.com> wrote:
I ran MSCK REPAIR TABLE mytable; and got: Error while processing statement:
FAILED: Execution Error, return code 1 from 
org.apache.hadoop.hive.ql.exec.DDLTask


On Mon, Sep 12, 2016 at 6:56 PM, Lefty Leverenz <leftylever...@gmail.com> wrote:
Here's a list of the wikidocs about dynamic partitions:
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-DynamicPartitions


-- Lefty


On Mon, Sep 12, 2016 at 3:25 PM, Devopam Mittra <devo...@gmail.com> wrote:
Kindly read about dynamic partitioning on the cwiki. That will be the perfect
solution for your requirement, in my opinion.

Regards
Dev


On 13 Sep 2016 12:49 am, "Igor Kravzov" <igork.ine...@gmail.com> wrote:

Hi,
I have a query like this one
alter table my_table add if not exists partition (mmdd=20160912)
location '/mylocation/20160912';
Is it possible to make it so I don't have to change the date every day? Something
with CURRENT_DATE?

Thanks in advance.
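One common answer to the question above, sketched with hivevar substitution; the variable name dt and the script filename are made up for illustration:

```sql
-- Invoke as, e.g.:  hive --hivevar dt=$(date +%Y%m%d) -f add_partition.hql
-- so the partition value tracks the current date without editing the script.
alter table my_table add if not exists partition (mmdd='${hivevar:dt}')
location '/mylocation/${hivevar:dt}';
```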

Re: Extracting data from ELB log date format

2016-09-22 Thread Manish Rangari
Thanks Nichole and Dudu for the reply. I got what I was looking for.

--Manish

On Thu, Sep 22, 2016 at 2:52 AM, Markovitz, Dudu 
wrote:

> select to_date(ts), year(ts), month(ts), day(ts), hour(ts), minute(ts), second(ts)
> from (select from_unixtime(unix_timestamp('2016-09-15T23:45:22.943762Z',
> "yyyy-MM-dd'T'HH:mm:ss")) as ts) as t;
>
> OK
>
> 2016-09-15  2016  9  15  23  45  22
>
>
>
> Dudu
>
>
>
> *From:* Manish Rangari [mailto:linuxtricksfordev...@gmail.com]
> *Sent:* Wednesday, September 21, 2016 4:23 PM
> *To:* user@hive.apache.org
> *Subject:* Extracting data from ELB log date format
>
>
>
> Guys,
>
>
>
> I am trying to extract the date, time, month, minute, etc. from the timestamp
> format below but did not find any function for this. Can anyone help me
> extract the details?
>
>
>
> 2016-09-15T23:45:22.943762Z
>
> 2016-09-15T23:45:22.948829Z
>
>
>
> --Manish
>