[jira] [Commented] (HIVE-9423) HiveServer2: Provide the user with different error messages depending on the Thrift client exception code

2016-10-01 Thread Chaoyu Tang (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-9423?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15539510#comment-15539510
 ] 

Chaoyu Tang commented on HIVE-9423:
---

Thanks [~leftylev] for catching this. My fault. I have updated errata.txt 
for the JIRA number that was missing from the commit message. See HIVE-14874.

> HiveServer2: Provide the user with different error messages depending on the 
> Thrift client exception code
> -
>
> Key: HIVE-9423
> URL: https://issues.apache.org/jira/browse/HIVE-9423
> Project: Hive
>  Issue Type: Bug
>  Components: HiveServer2
>Affects Versions: 0.12.0, 0.13.0, 0.14.0, 0.15.0
>Reporter: Vaibhav Gumashta
>Assignee: Peter Vary
> Fix For: 2.2.0, 2.1.1
>
> Attachments: HIVE-9423.2.patch, HIVE-9423.3.patch, HIVE-9423.4.patch, 
> HIVE-9423.5-branch-2.1.patch, HIVE-9423.5.patch, 
> HIVE-9423.6-branch-2.1.patch, HIVE-9423.patch
>
>
> After verifying that the original problem is mostly solved by the Thrift 
> upgrade, I created a patch to provide a better error message when possible.
> Original description for reference:
> ---
> An example of where it is needed: it has been reported that when the number of 
> client connections is greater than {{hive.server2.thrift.max.worker.threads}}, 
> HiveServer2 stops accepting new connections and ends up having to be 
> restarted. This should be handled more gracefully by the server and the JDBC 
> driver, so that the end user becomes aware of the problem and can take 
> appropriate steps (either close existing connections, bump up the config 
> value, or use multiple server instances with dynamic service discovery 
> enabled). Similarly, we should also review the behaviour of the background 
> thread pool so that it behaves in a well-defined way when the pool gets 
> exhausted. Ideally, implementing some form of general admission control would 
> be a better solution, so that we do not accept new work unless sufficient 
> resources are available and the server degrades gracefully under overload.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Resolved] (HIVE-14874) Master: Update errata.txt for the missing JIRA number in HIVE-9423 commit msg

2016-10-01 Thread Chaoyu Tang (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-14874?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chaoyu Tang resolved HIVE-14874.

   Resolution: Fixed
Fix Version/s: 2.2.0

> Master: Update errata.txt for the missing JIRA number in HIVE-9423 commit msg
> -
>
> Key: HIVE-14874
> URL: https://issues.apache.org/jira/browse/HIVE-14874
> Project: Hive
>  Issue Type: Bug
>Reporter: Chaoyu Tang
>Assignee: Chaoyu Tang
>Priority: Trivial
> Fix For: 2.2.0
>
>
> Missing the JIRA number in the commit msg for the master branch; see 
> https://issues.apache.org/jira/browse/HIVE-9423?focusedCommentId=15537841=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-15537841



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HIVE-14874) Master: Update errata.txt for the missing JIRA number in HIVE-9423 commit msg

2016-10-01 Thread Chaoyu Tang (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-14874?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chaoyu Tang updated HIVE-14874:
---
Summary: Master: Update errata.txt for the missing JIRA number in HIVE-9423 
commit msg  (was: Master: Update errata.txt for missing JIRA nubmer in 
HIVE-9423 commit msg)

> Master: Update errata.txt for the missing JIRA number in HIVE-9423 commit msg
> -
>
> Key: HIVE-14874
> URL: https://issues.apache.org/jira/browse/HIVE-14874
> Project: Hive
>  Issue Type: Bug
>Reporter: Chaoyu Tang
>Assignee: Chaoyu Tang
>Priority: Trivial
>
> Missing the JIRA number in the commit msg for the master branch; see 
> https://issues.apache.org/jira/browse/HIVE-9423?focusedCommentId=15537841=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-15537841



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HIVE-12458) remove identity_udf.jar from source

2016-10-01 Thread Vaibhav Gumashta (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-12458?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vaibhav Gumashta updated HIVE-12458:

Target Version/s: 2.2.0  (was: 1.3.0, 2.2.0)

> remove identity_udf.jar from source
> ---
>
> Key: HIVE-12458
> URL: https://issues.apache.org/jira/browse/HIVE-12458
> Project: Hive
>  Issue Type: Bug
>  Components: Test
>Reporter: Thejas M Nair
>Assignee: Vaibhav Gumashta
> Fix For: 2.2.0
>
>
> We should not be checking in jars into the source repo.
> We could use the hive-contrib jar like it's used in 
> ./ql/src/test/queries/clientpositive/add_jar_pfile.q 
> add jar 
> pfile://${system:test.tmp.dir}/hive-contrib-${system:hive.version}.jar;
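For illustration, a hedged sketch of how a test could exercise a UDF via the contrib jar instead of a binary checked into the repo; the contrib class and function name below are just an example, not necessarily the approach taken in HIVE-14721:

{code}
-- pull in the contrib jar built from source rather than a jar checked into the repo
add jar pfile://${system:test.tmp.dir}/hive-contrib-${system:hive.version}.jar;
-- register and use an example UDF shipped in hive-contrib
create temporary function example_add as 'org.apache.hadoop.hive.contrib.udf.example.UDFExampleAdd';
select example_add(1, 2);
{code}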



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Resolved] (HIVE-12458) remove identity_udf.jar from source

2016-10-01 Thread Vaibhav Gumashta (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-12458?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vaibhav Gumashta resolved HIVE-12458.
-
Resolution: Duplicate

Fixing this as part of HIVE-14721.

> remove identity_udf.jar from source
> ---
>
> Key: HIVE-12458
> URL: https://issues.apache.org/jira/browse/HIVE-12458
> Project: Hive
>  Issue Type: Bug
>  Components: Test
>Reporter: Thejas M Nair
>Assignee: Vaibhav Gumashta
>
> We should not be checking in jars into the source repo.
> We could use the hive-contrib jar like it's used in 
> ./ql/src/test/queries/clientpositive/add_jar_pfile.q 
> add jar 
> pfile://${system:test.tmp.dir}/hive-contrib-${system:hive.version}.jar;



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HIVE-12458) remove identity_udf.jar from source

2016-10-01 Thread Vaibhav Gumashta (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-12458?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vaibhav Gumashta updated HIVE-12458:

Fix Version/s: 2.2.0

> remove identity_udf.jar from source
> ---
>
> Key: HIVE-12458
> URL: https://issues.apache.org/jira/browse/HIVE-12458
> Project: Hive
>  Issue Type: Bug
>  Components: Test
>Reporter: Thejas M Nair
>Assignee: Vaibhav Gumashta
> Fix For: 2.2.0
>
>
> We should not be checking in jars into the source repo.
> We could use the hive-contrib jar like it's used in 
> ./ql/src/test/queries/clientpositive/add_jar_pfile.q 
> add jar 
> pfile://${system:test.tmp.dir}/hive-contrib-${system:hive.version}.jar;



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-6425) Unable to create external table with 3000+ columns

2016-10-01 Thread Miklos Szurap (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-6425?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15538571#comment-15538571
 ] 

Miklos Szurap commented on HIVE-6425:
-

HIVE-9815 and HIVE-12274 are also addressing this issue.

> Unable to create external table with 3000+ columns
> --
>
> Key: HIVE-6425
> URL: https://issues.apache.org/jira/browse/HIVE-6425
> Project: Hive
>  Issue Type: Bug
>  Components: Metastore
>Affects Versions: 0.10.0
> Environment: Linux, CDH 4.2.0
>Reporter: Anurag
>  Labels: patch
> Attachments: Hive_Script.txt
>
>
> While creating an external table in Hive mapped to an HBase table with 3000+ 
> columns, Hive reports an error:
> FAILED: Error in metadata: 
> MetaException(message:javax.jdo.JDODataStoreException: Put request failed : 
> INSERT INTO "SERDE_PARAMS" ("PARAM_VALUE","SERDE_ID","PARAM_KEY") VALUES 
> (?,?,?)
> NestedThrowables:
> org.datanucleus.store.rdbms.exceptions.MappedDatastoreException: INSERT INTO 
> "SERDE_PARAMS" ("PARAM_VALUE","SERDE_ID","PARAM_KEY") VALUES (?,?,?) )
> FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.DDLTask
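For context, a hedged sketch (hypothetical table and column names, only a few columns shown) of the kind of DDL involved; with 3000+ columns the hbase.columns.mapping serde property becomes a very long string, and that is the value the failing SERDE_PARAMS insert has to store:

{code}
-- illustrative only: a wide HBase-backed external table (hypothetical names);
-- with 3000+ columns the hbase.columns.mapping string grows very large
CREATE EXTERNAL TABLE wide_hbase_table (rowkey STRING, c1 STRING, c2 STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,f:c1,f:c2')
TBLPROPERTIES ('hbase.table.name' = 'wide_table');
{code}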



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-14873) Add UDF for extraction of 'day of week'

2016-10-01 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-14873?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15538473#comment-15538473
 ] 

Hive QA commented on HIVE-14873:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12831216/HIVE-14873.01.patch

{color:green}SUCCESS:{color} +1 due to 2 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 6 failed/errored test(s), 10652 tests 
executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[acid_mapjoin]
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[ctas]
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[vector_join_part_col_char]
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainuser_3]
org.apache.hadoop.hive.metastore.TestMetaStoreMetrics.testMetaDataCounts
org.apache.hive.jdbc.TestJdbcWithMiniHS2.testAddJarConstructorUnCaching
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/1371/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/1371/console
Test logs: 
http://ec2-204-236-174-241.us-west-1.compute.amazonaws.com/logs/PreCommit-HIVE-Build-1371/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 6 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12831216 - PreCommit-HIVE-Build

> Add UDF for extraction of 'day of week'
> ---
>
> Key: HIVE-14873
> URL: https://issues.apache.org/jira/browse/HIVE-14873
> Project: Hive
>  Issue Type: Bug
>  Components: Parser, UDF
>Affects Versions: 2.1.0
>Reporter: Jesus Camacho Rodriguez
>Assignee: Jesus Camacho Rodriguez
> Attachments: HIVE-14873.01.patch, HIVE-14873.patch
>
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-14873) Add UDF for extraction of 'day of week'

2016-10-01 Thread Jesus Camacho Rodriguez (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-14873?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15538353#comment-15538353
 ] 

Jesus Camacho Rodriguez commented on HIVE-14873:


[~ashutoshc], could you take a look? Thanks

> Add UDF for extraction of 'day of week'
> ---
>
> Key: HIVE-14873
> URL: https://issues.apache.org/jira/browse/HIVE-14873
> Project: Hive
>  Issue Type: Bug
>  Components: Parser, UDF
>Affects Versions: 2.1.0
>Reporter: Jesus Camacho Rodriguez
>Assignee: Jesus Camacho Rodriguez
> Attachments: HIVE-14873.01.patch, HIVE-14873.patch
>
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HIVE-14873) Add UDF for extraction of 'day of week'

2016-10-01 Thread Jesus Camacho Rodriguez (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-14873?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jesus Camacho Rodriguez updated HIVE-14873:
---
Attachment: HIVE-14873.01.patch

> Add UDF for extraction of 'day of week'
> ---
>
> Key: HIVE-14873
> URL: https://issues.apache.org/jira/browse/HIVE-14873
> Project: Hive
>  Issue Type: Bug
>  Components: Parser, UDF
>Affects Versions: 2.1.0
>Reporter: Jesus Camacho Rodriguez
>Assignee: Jesus Camacho Rodriguez
> Attachments: HIVE-14873.01.patch, HIVE-14873.patch
>
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-14873) Add UDF for extraction of 'day of week'

2016-10-01 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-14873?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15538263#comment-15538263
 ] 

Hive QA commented on HIVE-14873:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12831212/HIVE-14873.patch

{color:green}SUCCESS:{color} +1 due to 2 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 9 failed/errored test(s), 10652 tests 
executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[acid_mapjoin]
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[ctas]
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[show_functions]
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[vector_join_part_col_char]
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[vectorized_date_funcs]
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainuser_3]
org.apache.hadoop.hive.metastore.TestMetaStoreMetrics.testMetaDataCounts
org.apache.hive.jdbc.TestJdbcWithMiniHS2.testAddJarConstructorUnCaching
org.apache.hive.service.cli.TestEmbeddedThriftBinaryCLIService.testTaskStatus
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/1370/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/1370/console
Test logs: 
http://ec2-204-236-174-241.us-west-1.compute.amazonaws.com/logs/PreCommit-HIVE-Build-1370/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 9 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12831212 - PreCommit-HIVE-Build

> Add UDF for extraction of 'day of week'
> ---
>
> Key: HIVE-14873
> URL: https://issues.apache.org/jira/browse/HIVE-14873
> Project: Hive
>  Issue Type: Bug
>  Components: Parser, UDF
>Affects Versions: 2.1.0
>Reporter: Jesus Camacho Rodriguez
>Assignee: Jesus Camacho Rodriguez
> Attachments: HIVE-14873.patch
>
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-14765) metrics - gauge overwritten messages

2016-10-01 Thread Barna Zsombor Klara (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-14765?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15538250#comment-15538250
 ] 

Barna Zsombor Klara commented on HIVE-14765:


Hi [~sershe]
I think I found a possible reason for this. Would you mind if I take a look at 
it?

> metrics - gauge overwritten messages
> 
>
> Key: HIVE-14765
> URL: https://issues.apache.org/jira/browse/HIVE-14765
> Project: Hive
>  Issue Type: Bug
>Reporter: Sergey Shelukhin
>
> {noformat}
> 2016-09-14T21:09:55,553 WARN  [HiveServer2-HttpHandler-Pool: Thread-48]: 
> metrics2.CodahaleMetrics (CodahaleMetrics.java:addGauge(304)) - A Gauge with 
> name [init_total_count_dbs] already exists.  The old gauge will be 
> overwritten, but this is not recommended
> 2016-09-14T21:09:55,553 WARN  [HiveServer2-HttpHandler-Pool: Thread-48]: 
> metrics2.CodahaleMetrics (CodahaleMetrics.java:addGauge(304)) - A Gauge with 
> name [init_total_count_tables] already exists.  The old gauge will be 
> overwritten, but this is not recommended
> 2016-09-14T21:09:55,554 WARN  [HiveServer2-HttpHandler-Pool: Thread-48]: 
> metrics2.CodahaleMetrics (CodahaleMetrics.java:addGauge(304)) - A Gauge with 
> name [init_total_count_partitions] already exists.  The old gauge will be 
> overwritten, but this is not recommended
> {noformat}
> Might have something to do with metastore being a threadlocal (just shooting 
> in the dark)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-14810) Failing test: TestMetaStoreMetrics.testMetaDataCounts

2016-10-01 Thread Barna Zsombor Klara (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-14810?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15538242#comment-15538242
 ] 

Barna Zsombor Klara commented on HIVE-14810:


Hi [~sseth], sorry for "stealing" this issue, but I looked into it because I was 
working on metrics and wanted to make sure I didn't cause the regression. 
Please let me know if you are fine with the fix.


> Failing test: TestMetaStoreMetrics.testMetaDataCounts
> -
>
> Key: HIVE-14810
> URL: https://issues.apache.org/jira/browse/HIVE-14810
> Project: Hive
>  Issue Type: Sub-task
>Reporter: Siddharth Seth
>Assignee: Barna Zsombor Klara
> Attachments: HIVE-14810.patch
>
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-14810) Failing test: TestMetaStoreMetrics.testMetaDataCounts

2016-10-01 Thread Barna Zsombor Klara (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-14810?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15538237#comment-15538237
 ] 

Barna Zsombor Klara commented on HIVE-14810:


The fixed test has passed; there are no new regressions in the report.

> Failing test: TestMetaStoreMetrics.testMetaDataCounts
> -
>
> Key: HIVE-14810
> URL: https://issues.apache.org/jira/browse/HIVE-14810
> Project: Hive
>  Issue Type: Sub-task
>Reporter: Siddharth Seth
>Assignee: Barna Zsombor Klara
> Attachments: HIVE-14810.patch
>
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HIVE-14873) Add UDF for extraction of 'day of week'

2016-10-01 Thread Jesus Camacho Rodriguez (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-14873?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jesus Camacho Rodriguez updated HIVE-14873:
---
Attachment: HIVE-14873.patch

> Add UDF for extraction of 'day of week'
> ---
>
> Key: HIVE-14873
> URL: https://issues.apache.org/jira/browse/HIVE-14873
> Project: Hive
>  Issue Type: Bug
>  Components: Parser, UDF
>Affects Versions: 2.1.0
>Reporter: Jesus Camacho Rodriguez
>Assignee: Jesus Camacho Rodriguez
> Attachments: HIVE-14873.patch
>
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HIVE-14873) Add UDF for extraction of 'day of week'

2016-10-01 Thread Jesus Camacho Rodriguez (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-14873?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jesus Camacho Rodriguez updated HIVE-14873:
---
Status: Patch Available  (was: In Progress)

> Add UDF for extraction of 'day of week'
> ---
>
> Key: HIVE-14873
> URL: https://issues.apache.org/jira/browse/HIVE-14873
> Project: Hive
>  Issue Type: Bug
>  Components: Parser, UDF
>Affects Versions: 2.1.0
>Reporter: Jesus Camacho Rodriguez
>Assignee: Jesus Camacho Rodriguez
> Attachments: HIVE-14873.patch
>
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Work started] (HIVE-14873) Add UDF for extraction of 'day of week'

2016-10-01 Thread Jesus Camacho Rodriguez (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-14873?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on HIVE-14873 started by Jesus Camacho Rodriguez.
--
> Add UDF for extraction of 'day of week'
> ---
>
> Key: HIVE-14873
> URL: https://issues.apache.org/jira/browse/HIVE-14873
> Project: Hive
>  Issue Type: Bug
>  Components: Parser, UDF
>Affects Versions: 2.1.0
>Reporter: Jesus Camacho Rodriguez
>Assignee: Jesus Camacho Rodriguez
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-14855) test patch

2016-10-01 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-14855?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15538157#comment-15538157
 ] 

Hive QA commented on HIVE-14855:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12831209/HIVE-14855.2.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/1369/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/1369/console
Test logs: 
http://ec2-204-236-174-241.us-west-1.compute.amazonaws.com/logs/PreCommit-HIVE-Build-1369/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2016-10-01 08:19:51.655
+ [[ -n /usr/java/jdk1.8.0_25 ]]
+ export JAVA_HOME=/usr/java/jdk1.8.0_25
+ JAVA_HOME=/usr/java/jdk1.8.0_25
+ export 
PATH=/usr/java/jdk1.8.0_25/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/java/jdk1.8.0_25/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost 
-Dhttp.proxyPort=3128'
+ M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost 
-Dhttp.proxyPort=3128'
+ cd /data/hive-ptest/working/
+ tee /data/hive-ptest/logs/PreCommit-HIVE-Build-1369/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2016-10-01 08:19:51.657
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at 297b443 HIVE-14865 Fix comments after HIVE-14350 (Eugene 
Koifman, reviewed by Alan Gates)
+ git clean -f -d
Removing b/
Removing beeline/src/java/org/apache/hive/beeline/hs2connection/
Removing beeline/src/test/org/apache/hive/beeline/hs2connection/
Removing beeline/src/test/resources/test-hs2-conn-conf-kerberos-http.xml
Removing beeline/src/test/resources/test-hs2-conn-conf-kerberos-nossl.xml
Removing beeline/src/test/resources/test-hs2-conn-conf-kerberos-ssl.xml
Removing beeline/src/test/resources/test-hs2-connection-conf-list.xml
Removing beeline/src/test/resources/test-hs2-connection-config-noauth.xml
Removing beeline/src/test/resources/test-hs2-connection-multi-conf-list.xml
Removing beeline/src/test/resources/test-hs2-connection-zookeeper-config.xml
Removing itests/hive-unit/src/test/java/org/apache/hive/beeline/hs2connection/
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at 297b443 HIVE-14865 Fix comments after HIVE-14350 (Eugene 
Koifman, reviewed by Alan Gates)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2016-10-01 08:19:52.736
+ patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hive-ptest/working/scratch/build.patch
+ [[ -f /data/hive-ptest/working/scratch/build.patch ]]
+ chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
+ /data/hive-ptest/working/scratch/smart-apply-patch.sh 
/data/hive-ptest/working/scratch/build.patch
error: ql/src/test/org/apache/hadoop/hive/ql/parse/TestMergeStatement.java: No 
such file or directory
The patch does not appear to apply with p0, p1, or p2
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12831209 - PreCommit-HIVE-Build

> test patch
> --
>
> Key: HIVE-14855
> URL: https://issues.apache.org/jira/browse/HIVE-14855
> Project: Hive
>  Issue Type: Bug
>  Components: Transactions
>Reporter: Eugene Koifman
>Assignee: Eugene Koifman
> Attachments: HIVE-14855.2.patch, HIVE-14855.patch
>
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-14063) beeline to auto connect to the HiveServer2

2016-10-01 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-14063?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15538150#comment-15538150
 ] 

Hive QA commented on HIVE-14063:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12831194/HIVE-14063.02.patch

{color:green}SUCCESS:{color} +1 due to 5 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 6 failed/errored test(s), 10671 tests 
executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[acid_mapjoin]
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[ctas]
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[vector_join_part_col_char]
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainuser_3]
org.apache.hadoop.hive.metastore.TestMetaStoreMetrics.testMetaDataCounts
org.apache.hive.jdbc.TestJdbcWithMiniHS2.testAddJarConstructorUnCaching
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/1368/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/1368/console
Test logs: 
http://ec2-204-236-174-241.us-west-1.compute.amazonaws.com/logs/PreCommit-HIVE-Build-1368/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 6 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12831194 - PreCommit-HIVE-Build

> beeline to auto connect to the HiveServer2
> --
>
> Key: HIVE-14063
> URL: https://issues.apache.org/jira/browse/HIVE-14063
> Project: Hive
>  Issue Type: Improvement
>  Components: Beeline
>Reporter: Vihang Karajgaonkar
>Assignee: Vihang Karajgaonkar
>Priority: Minor
> Attachments: HIVE-14063.01.patch, HIVE-14063.02.patch, 
> beeline.conf.template
>
>
> Currently one has to give a jdbc:hive2 url in order for Beeline to connect to 
> a hiveserver2 instance. It would be great if Beeline could get the info somehow 
> (from a properties file at a well-known location?) and connect automatically 
> if the user doesn't specify such a url. If the properties file is not present, 
> then beeline would expect the user to provide the url and credentials using 
> !connect or ./beeline -u .. commands.
> While Beeline is flexible (being a mere JDBC client), most environments would 
> have just a single HS2. Requiring users to manually connect to it via either 
> "beeline ~/.propsfile" or -u or !connect statements degrades the user 
> experience.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-14100) Adding a new logged_in_user() UDF which returns the user provided when connecting

2016-10-01 Thread Lefty Leverenz (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-14100?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15538069#comment-15538069
 ] 

Lefty Leverenz commented on HIVE-14100:
---

Doc note:  The new UDF logged_in_user() needs to be documented in the wiki for 
release 2.2.0.

* [Hive Operators and UDFs -- Misc. Functions | 
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF#LanguageManualUDF-Misc.Functions]

Added a TODOC2.2 label.
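A minimal usage sketch for that wiki entry, assuming the new function takes no arguments as described in the issue (output values depend on the session):

{code}
-- current_user() reflects the configured authenticator manager;
-- logged_in_user() returns the user name supplied when the session connected
SELECT current_user(), logged_in_user();
{code}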

> Adding a new logged_in_user() UDF which returns the user provided when 
> connecting
> -
>
> Key: HIVE-14100
> URL: https://issues.apache.org/jira/browse/HIVE-14100
> Project: Hive
>  Issue Type: Bug
>  Components: Authentication, Beeline
>Reporter: Peter Vary
>Assignee: Peter Vary
>Priority: Minor
>  Labels: TODOC2.2
> Fix For: 2.2.0
>
> Attachments: HIVE-14100.2.patch, HIVE-14100.2.patch, 
> HIVE-14100.2.patch, HIVE-14100.patch
>
>
> There is an existing current_user() UDF which returns the user provided by 
> the configured {{hive.security.authenticator.manager}}. This is often the 
> same as the user provided on connection, but in some cases, like 
> HadoopDefaultAuthenticator, this could be different.
> In some cases we need the logged-in user independently of the configured 
> authenticator, so a new UDF is created which provides this - it returns 
> {{SessionState.get().getUserName()}}.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HIVE-14100) Adding a new logged_in_user() UDF which returns the user provided when connecting

2016-10-01 Thread Lefty Leverenz (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-14100?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lefty Leverenz updated HIVE-14100:
--
Labels: TODOC2.2  (was: )

> Adding a new logged_in_user() UDF which returns the user provided when 
> connecting
> -
>
> Key: HIVE-14100
> URL: https://issues.apache.org/jira/browse/HIVE-14100
> Project: Hive
>  Issue Type: Bug
>  Components: Authentication, Beeline
>Reporter: Peter Vary
>Assignee: Peter Vary
>Priority: Minor
>  Labels: TODOC2.2
> Fix For: 2.2.0
>
> Attachments: HIVE-14100.2.patch, HIVE-14100.2.patch, 
> HIVE-14100.2.patch, HIVE-14100.patch
>
>
> There is an existing current_user() UDF which returns the user provided by 
> the configured {{hive.security.authenticator.manager}}. This is often the 
> same as the user provided on connection, but in some cases, like 
> HadoopDefaultAuthenticator, this could be different.
> In some cases we need the logged-in user independently of the configured 
> authenticator, so a new UDF is created which provides this - it returns 
> {{SessionState.get().getUserName()}}.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-14863) Decimal to int conversion produces incorrect values

2016-10-01 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-14863?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15538057#comment-15538057
 ] 

Hive QA commented on HIVE-14863:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12831185/HIVE-14863.patch

{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 11 failed/errored test(s), 10653 tests 
executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[acid_mapjoin]
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[ctas]
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[decimal_11]
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[decimal_2]
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[vector_join_part_col_char]
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[schema_evol_text_nonvec_part_all_primitive]
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[schema_evol_text_vec_part_all_primitive]
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[schema_evol_text_vecrow_part_all_primitive]
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainuser_3]
org.apache.hadoop.hive.metastore.TestMetaStoreMetrics.testMetaDataCounts
org.apache.hive.jdbc.TestJdbcWithMiniHS2.testAddJarConstructorUnCaching
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/1367/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/1367/console
Test logs: 
http://ec2-204-236-174-241.us-west-1.compute.amazonaws.com/logs/PreCommit-HIVE-Build-1367/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 11 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12831185 - PreCommit-HIVE-Build

> Decimal to int conversion produces incorrect values
> ---
>
> Key: HIVE-14863
> URL: https://issues.apache.org/jira/browse/HIVE-14863
> Project: Hive
>  Issue Type: Bug
>Reporter: Sergey Shelukhin
>Assignee: Sergey Shelukhin
>Priority: Critical
> Attachments: HIVE-14863.patch
>
>
> {noformat}
> > select cast(cast ('111' as decimal(38,0)) as int);
> OK
> 307163591
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-12222) Define port range in property for RPCServer

2016-10-01 Thread Lefty Leverenz (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-1?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15538046#comment-15538046
 ] 

Lefty Leverenz commented on HIVE-1:
---

Doc note:  This adds *hive.spark.client.rpc.server.port* to HiveConf.java, so 
it needs to be documented in the wiki for release 2.2.0.

* [Configuration Properties -- Spark | 
https://cwiki.apache.org/confluence/display/Hive/Configuration+Properties#ConfigurationProperties-Spark]

Added a TODOC2.2 label.
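A hedged sketch of how the new property might be set; the property name comes from this issue, but whether it is session-settable and the exact value format (single port, list, or range) are assumptions to confirm against HiveConf.java before documenting:

{code}
-- illustrative only; confirm the accepted value format in HiveConf.java
set hive.spark.client.rpc.server.port=30000-30010;
{code}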

> Define port range in property for RPCServer
> ---
>
> Key: HIVE-1
> URL: https://issues.apache.org/jira/browse/HIVE-1
> Project: Hive
>  Issue Type: Improvement
>  Components: CLI, Spark
>Affects Versions: 1.2.1
> Environment: Apache Hadoop 2.7.0
> Apache Hive 1.2.1
> Apache Spark 1.5.1
>Reporter: Andrew Lee
>Assignee: Aihua Xu
>  Labels: TODOC2.2
> Fix For: 2.2.0
>
> Attachments: HIVE-1.1.patch, HIVE-1.2.patch, 
> HIVE-1.3.patch
>
>
> Creating this JIRA after discussing with Xuefu on the dev mailing list. Would 
> need some help to review and update the fields in this JIRA ticket, thanks.
> I notice that in 
> ./spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java
> the port number is assigned 0, which means it will be a random port every 
> time the RPC Server is created to talk to Spark in the same session.
> This causes problems when configuring a firewall between the 
> HiveCLI RPC Server and Spark due to the unpredictable port numbers. In other 
> words, users need to open the whole Hive port range 
> from Data Node => HiveCLI (edge node).
> {code}
>  this.channel = new ServerBootstrap()
>   .group(group)
>   .channel(NioServerSocketChannel.class)
>   .childHandler(new ChannelInitializer() {
>   @Override
>   public void initChannel(SocketChannel ch) throws Exception {
> SaslServerHandler saslHandler = new SaslServerHandler(config);
> final Rpc newRpc = Rpc.createServer(saslHandler, config, ch, 
> group);
> saslHandler.rpc = newRpc;
> Runnable cancelTask = new Runnable() {
> @Override
> public void run() {
>   LOG.warn("Timed out waiting for hello from client.");
>   newRpc.close();
> }
> };
> saslHandler.cancelTask = group.schedule(cancelTask,
> RpcServer.this.config.getServerConnectTimeoutMs(),
> TimeUnit.MILLISECONDS);
>   }
>   })
> {code}
> 2 main reasons:
> - Most users (what I see and encounter) use HiveCLI as a command line tool, 
> and in order to use that, they need to log in to the edge node (via SSH). Now, 
> here comes the interesting part.
> Could be true or not, but this is what I observe and encounter from time to 
> time. Most users will abuse the resources on that edge node (increasing 
> HADOOP_HEAPSIZE, dumping output to local disk, running huge python workflows, 
> etc); this may cause the HS2 process to run into OOME, choke and die, or hit 
> various other resource issues, including login problems.
> - Analysts connect to Hive via HS2 + ODBC, so HS2 needs to be highly 
> available. It makes sense to run it on the gateway node or a service node, 
> separated from the HiveCLI.
> The logs are located in a different location, and monitoring and auditing are 
> easier when HS2 runs with a daemon user account, etc., so we don't want users 
> to run HiveCLI where HS2 is running.
> It's better to isolate the resources this way to avoid any memory, file 
> handle, or disk space issues.
> From a security standpoint, 
> - Since users can log in to the edge node (via SSH), the security on the edge 
> node needs to be fortified and enhanced. This is where the firewall and 
> auditing come in.
> - Regulation/compliance auditing is another requirement to monitor all 
> traffic; specifying and locking down the ports makes it easier since we 
> can focus 
> on a range to monitor and audit.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HIVE-12222) Define port range in property for RPCServer

2016-10-01 Thread Lefty Leverenz (JIRA)

 [ 
https://issues.apache.org/jira/browse/HIVE-1?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lefty Leverenz updated HIVE-1:
--
Labels: TODOC2.2  (was: )

> Define port range in property for RPCServer
> ---
>
> Key: HIVE-1
> URL: https://issues.apache.org/jira/browse/HIVE-1
> Project: Hive
>  Issue Type: Improvement
>  Components: CLI, Spark
>Affects Versions: 1.2.1
> Environment: Apache Hadoop 2.7.0
> Apache Hive 1.2.1
> Apache Spark 1.5.1
>Reporter: Andrew Lee
>Assignee: Aihua Xu
>  Labels: TODOC2.2
> Fix For: 2.2.0
>
> Attachments: HIVE-1.1.patch, HIVE-1.2.patch, 
> HIVE-1.3.patch
>
>
> Creating this JIRA after discussing with Xuefu on the dev mailing list. Would 
> need some help to review and update the fields in this JIRA ticket, thanks.
> I notice that in 
> ./spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java
> the port number is assigned 0, which means it will be a random port every 
> time the RPC Server is created to talk to Spark in the same session.
> This causes problems when configuring a firewall between the 
> HiveCLI RPC Server and Spark due to the unpredictable port numbers. In other 
> words, users need to open the whole Hive port range 
> from Data Node => HiveCLI (edge node).
> {code}
>  this.channel = new ServerBootstrap()
>   .group(group)
>   .channel(NioServerSocketChannel.class)
>   .childHandler(new ChannelInitializer() {
>   @Override
>   public void initChannel(SocketChannel ch) throws Exception {
> SaslServerHandler saslHandler = new SaslServerHandler(config);
> final Rpc newRpc = Rpc.createServer(saslHandler, config, ch, 
> group);
> saslHandler.rpc = newRpc;
> Runnable cancelTask = new Runnable() {
> @Override
> public void run() {
>   LOG.warn("Timed out waiting for hello from client.");
>   newRpc.close();
> }
> };
> saslHandler.cancelTask = group.schedule(cancelTask,
> RpcServer.this.config.getServerConnectTimeoutMs(),
> TimeUnit.MILLISECONDS);
>   }
>   })
> {code}
> 2 main reasons:
> - Most users (what I see and encounter) use HiveCLI as a command line tool, 
> and in order to use that, they need to log in to the edge node (via SSH). Now, 
> here comes the interesting part.
> Could be true or not, but this is what I observe and encounter from time to 
> time. Most users will abuse the resources on that edge node (increasing 
> HADOOP_HEAPSIZE, dumping output to local disk, running huge python workflows, 
> etc); this may cause the HS2 process to run into OOME, choke and die, or hit 
> various other resource issues, including login problems.
> - Analysts connect to Hive via HS2 + ODBC, so HS2 needs to be highly 
> available. It makes sense to run it on the gateway node or a service node, 
> separated from the HiveCLI.
> The logs are located in a different location, and monitoring and auditing are 
> easier when HS2 runs with a daemon user account, etc., so we don't want users 
> to run HiveCLI where HS2 is running.
> It's better to isolate the resources this way to avoid any memory, file 
> handle, or disk space issues.
> From a security standpoint, 
> - Since users can log in to the edge node (via SSH), the security on the edge 
> node needs to be fortified and enhanced. This is where the firewall and 
> auditing come in.
> - Regulation/compliance auditing is another requirement to monitor all 
> traffic; specifying and locking down the ports makes it easier since we 
> can focus 
> on a range to monitor and audit.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-14579) Add support for date extract

2016-10-01 Thread Jesus Camacho Rodriguez (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-14579?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15538029#comment-15538029
 ] 

Jesus Camacho Rodriguez commented on HIVE-14579:


[~cartershanklin], I have created HIVE-14873 as a follow-up; I checked quickly 
and there is no UDF for that now, but the implementation (based on 'day of 
month') should be straightforward.
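
For illustration, a hedged sketch of what the new extraction might look like once HIVE-14873 lands; the exact field keyword is an assumption, not the committed syntax:

{code}
-- hypothetical usage of the new 'day of week' extraction (keyword name assumed)
SELECT extract(dayofweek FROM CAST('2016-10-01' AS DATE));
{code}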

> Add support for date extract
> 
>
> Key: HIVE-14579
> URL: https://issues.apache.org/jira/browse/HIVE-14579
> Project: Hive
>  Issue Type: Sub-task
>  Components: UDF
>Reporter: Ashutosh Chauhan
>Assignee: Jesus Camacho Rodriguez
>  Labels: TODOC2.2
> Fix For: 2.2.0
>
> Attachments: HIVE-14579.01.patch, HIVE-14579.patch, HIVE-14579.patch
>
>
> https://www.postgresql.org/docs/9.1/static/functions-datetime.html#FUNCTIONS-DATETIME-EXTRACT



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)