[ https://issues.apache.org/jira/browse/HIVE-15090?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15651643#comment-15651643 ]

Hive QA commented on HIVE-15090:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12838189/HIVE-15090.3-branch-2.1.patch

{color:red}ERROR:{color} -1 due to no test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 20 failed/errored test(s), 10462 tests executed
*Failed tests:*
{noformat}
TestJdbcWithMiniHA - did not produce a TEST-*.xml file (likely timed out) (batchId=494)
TestJdbcWithMiniMr - did not produce a TEST-*.xml file (likely timed out) (batchId=491)
TestMsgBusConnection - did not produce a TEST-*.xml file (likely timed out) (batchId=362)
TestOperationLoggingAPIWithTez - did not produce a TEST-*.xml file (likely timed out) (batchId=484)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_acid_table_stats (batchId=92)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_insert_values_orig_table_use_metadata (batchId=109)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_list_bucket_dml_12 (batchId=87)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_orc_ppd_schema_evol_3a (batchId=97)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_stats_null_optimizer (batchId=154)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_between_in (batchId=99)
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver_orc_ppd_basic (batchId=521)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver_constprog_partitioner (batchId=539)
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_orc_ppd_basic (batchId=187)
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_orc_ppd_schema_evol_3a (batchId=198)
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_vector_between_in (batchId=199)
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_vector_cast_constant (batchId=183)
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_vector_complex_all (batchId=200)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_vector_between_in (batchId=233)
org.apache.hive.jdbc.TestJdbcWithMiniHS2.testAddJarConstructorUnCaching (batchId=492)
org.apache.hive.jdbc.TestJdbcWithMiniLlap.testLlapInputFormatEndToEnd (batchId=487)
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/2049/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/2049/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-2049/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 20 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12838189 - PreCommit-HIVE-Build

> Temporary DB failure can stop ExpiredTokenRemover thread
> --------------------------------------------------------
>
>                 Key: HIVE-15090
>                 URL: https://issues.apache.org/jira/browse/HIVE-15090
>             Project: Hive
>          Issue Type: Bug
>          Components: Metastore
>    Affects Versions: 1.3.0, 2.1.0, 2.0.1, 2.2.0
>            Reporter: Peter Vary
>            Assignee: Peter Vary
>             Fix For: 2.2.0
>
>         Attachments: HIVE-15090.2-branch-2.1.patch, HIVE-15090.2.patch, HIVE-15090.2.patch, HIVE-15090.3-branch-2.1.patch, HIVE-15090.patch
>
>
> In HIVE-13090 we decided that the metastore should not be shut down when an 
> unexpected exception occurs during the expired token removal process, but that 
> fix leaves the metastore running without an ExpiredTokenRemover thread.
> To fix this, I will move the catch inside the running loop, in the hope that 
> the thread can recover from the exception.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
