[ https://issues.apache.org/jira/browse/HIVE-15381?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15731089#comment-15731089 ]

Hive QA commented on HIVE-15381:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12842227/HIVE-15381.patch

{color:red}ERROR:{color} -1 due to no test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 10 failed/errored test(s), 10752 tests executed
*Failed tests:*
{noformat}
TestMiniLlapLocalCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=141)
        
[acid_vectorization_missing_cols.q,orc_merge9.q,vector_acid3.q,delete_where_no_match.q,vector_reduce1.q,stats_only_null.q,vectorization_part_project.q,vectorization_6.q,count.q,tez_vector_dynpart_hashjoin_2.q,parallel.q,delete_all_non_partitioned.q,delete_all_partitioned.q,vectorization_10.q,insert1.q,custom_input_output_format.q,vectorized_bucketmapjoin1.q,cbo_rp_windowing_2.q,vector_reduce3.q,smb_cache.q,hybridgrace_hashjoin_1.q,vector_count_distinct.q,schema_evol_orc_acid_part.q,hybridgrace_hashjoin_2.q,cross_join.q,parquet_predicate_pushdown.q,vector_varchar_mapjoin1.q,tez_smb_main.q,quotedid_smb.q,vector_bucket.q]
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample2] (batchId=5)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample4] (batchId=15)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample6] (batchId=61)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample7] (batchId=60)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample9] (batchId=38)
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[transform_ppr2] 
(batchId=134)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[stats_based_fetch_decision]
 (batchId=150)
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainanalyze_2] 
(batchId=92)
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainanalyze_4] 
(batchId=92)
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/2478/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/2478/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-2478/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 10 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12842227 - PreCommit-HIVE-Build

> don't log the callstack for reduce.xml-doesn't-exist
> ----------------------------------------------------
>
>                 Key: HIVE-15381
>                 URL: https://issues.apache.org/jira/browse/HIVE-15381
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Sergey Shelukhin
>            Assignee: Sergey Shelukhin
>            Priority: Trivial
>         Attachments: HIVE-15381.patch
>
>
> Pointless exception in the logs:
> {noformat}
> java.io.FileNotFoundException: File 
> file:[snip]/itests/qtest/target/tmp/localscratchdir/bcc7fce3-b9a3-4d5a-bf52-4e3b70ad9fed/hive_2016-12-07_09-53-18_167_8716888773328063866-1/-mr-10002/3fb6d7bd-d8b6-4238-bc08-90b2f0217197/reduce.xml
>  does not exist
>       at 
> org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:609)
>  ~[hadoop-common-2.7.2.jar:?]
>       at 
> org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:822)
>  ~[hadoop-common-2.7.2.jar:?]
>       at 
> org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:599)
>  ~[hadoop-common-2.7.2.jar:?]
>       at 
> org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:421)
>  ~[hadoop-common-2.7.2.jar:?]
>       at 
> org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:140)
>  ~[hadoop-common-2.7.2.jar:?]
>       at 
> org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:341) 
> ~[hadoop-common-2.7.2.jar:?]
>       at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:767) 
> ~[hadoop-common-2.7.2.jar:?]
>       at 
> org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:421) 
> [hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at 
> org.apache.hadoop.hive.ql.exec.Utilities.getReduceWork(Utilities.java:313) 
> [hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at 
> org.apache.hadoop.hive.ql.exec.Utilities.getMapRedWork(Utilities.java:292) 
> [hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at 
> org.apache.hadoop.hive.ql.io.HiveOutputFormatImpl.checkOutputSpecs(HiveOutputFormatImpl.java:61)
>  [hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at 
> org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:268) 
> [hadoop-mapreduce-client-core-2.7.2.jar:?]
>       at 
> org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)
>  [hadoop-mapreduce-client-core-2.7.2.jar:?]
>       at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290) 
> [hadoop-mapreduce-client-core-2.7.2.jar:?]
>       at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287) 
> [hadoop-mapreduce-client-core-2.7.2.jar:?]
>       at java.security.AccessController.doPrivileged(Native Method) 
> ~[?:1.8.0_102]
>       at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_102]
>       at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>  [hadoop-common-2.7.2.jar:?]
>       at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287) 
> [hadoop-mapreduce-client-core-2.7.2.jar:?]
>       at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:575) 
> [hadoop-mapreduce-client-core-2.7.2.jar:?]
>       at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:570) 
> [hadoop-mapreduce-client-core-2.7.2.jar:?]
>       at java.security.AccessController.doPrivileged(Native Method) 
> ~[?:1.8.0_102]
>       at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_102]
>       at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>  [hadoop-common-2.7.2.jar:?]
>       at 
> org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:570) 
> [hadoop-mapreduce-client-core-2.7.2.jar:?]
>       at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:561) 
> [hadoop-mapreduce-client-core-2.7.2.jar:?]
>       at 
> org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:401) 
> [hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at 
> org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:151) 
> [hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199) 
> [hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at 
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) 
> [hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2166) 
> [hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1822) 
> [hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1510) 
> [hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1221) 
> [hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1211) 
> [hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at 
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233) 
> [hive-cli-2.2.0-SNAPSHOT.jar:?]
>       at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184) 
> [hive-cli-2.2.0-SNAPSHOT.jar:?]
>       at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:400) 
> [hive-cli-2.2.0-SNAPSHOT.jar:?]
>       at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336) 
> [hive-cli-2.2.0-SNAPSHOT.jar:?]
>       at 
> org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:1131) 
> [hive-it-util-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at 
> org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:1108) 
> [hive-it-util-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at 
> org.apache.hadoop.hive.cli.control.CoreCliDriver$3.invokeInternal(CoreCliDriver.java:81)
>  [hive-it-util-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at 
> org.apache.hadoop.hive.cli.control.CoreCliDriver$3.invokeInternal(CoreCliDriver.java:78)
>  [hive-it-util-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at 
> org.apache.hadoop.hive.util.ElapsedTimeLoggingWrapper.invoke(ElapsedTimeLoggingWrapper.java:33)
>  [hive-it-util-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at 
> org.apache.hadoop.hive.cli.control.CoreCliDriver.beforeClass(CoreCliDriver.java:84)
>  [hive-it-util-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at 
> org.apache.hadoop.hive.cli.control.CliAdapter$1$1.evaluate(CliAdapter.java:71)
>  [hive-it-util-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>       at org.junit.rules.RunRules.evaluate(RunRules.java:20) 
> [junit-4.11.jar:?]
>       at org.junit.runners.ParentRunner.run(ParentRunner.java:309) 
> [junit-4.11.jar:?]
>       at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:367)
>  [surefire-junit4-2.19.1.jar:2.19.1]
>       at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:274)
>  [surefire-junit4-2.19.1.jar:2.19.1]
>       at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
>  [surefire-junit4-2.19.1.jar:2.19.1]
>       at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:161)
>  [surefire-junit4-2.19.1.jar:2.19.1]
>       at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>  [surefire-booter-2.19.1.jar:2.19.1]
>       at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>  [surefire-booter-2.19.1.jar:2.19.1]
>       at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121) 
> [surefire-booter-2.19.1.jar:2.19.1]
> {noformat}
> This is not even an error, so it should be logged at INFO or WARN without the 
> stack trace.
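> The intended change can be sketched as follows. This is a minimal, hypothetical illustration (not the actual patch): in the catch block for the missing {{reduce.xml}}, emit a single one-line message instead of passing the exception to the logger, which is what triggers the full stack trace in the log output. The {{describeMissingPlan}} helper name is invented for this sketch.
>
> ```java
> import java.io.FileNotFoundException;
>
> public class MissingPlanLogSketch {
>
>     // Hypothetical helper: summarize a routine "plan file does not exist"
>     // condition as one log line, with no stack trace attached.
>     static String describeMissingPlan(FileNotFoundException e) {
>         return "No plan file found: " + e.getMessage();
>     }
>
>     public static void main(String[] args) {
>         FileNotFoundException e = new FileNotFoundException(
>                 "reduce.xml does not exist");
>         // Before: LOG.error("...", e) would print the whole call stack.
>         // After (the intent of this issue): one INFO/WARN line like this.
>         System.out.println(describeMissingPlan(e));
>     }
> }
> ```
>
> The key point is passing only the message string to the logger; most logging frameworks (including the SLF4J API Hive uses) print the stack trace only when the {{Throwable}} itself is supplied as the final argument.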



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)