[
https://issues.apache.org/jira/browse/HIVE-13509?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15243964#comment-15243964
]
Hive QA commented on HIVE-13509:
--------------------------------
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12798886/HIVE-13509.1.patch
{color:green}SUCCESS:{color} +1 due to 2 test(s) being added or modified.
{color:red}ERROR:{color} -1 due to 2 failed/errored test(s), 9977 tests executed
*Failed tests:*
{noformat}
TestMiniTezCliDriver-cte_4.q-orc_merge5.q-vectorization_limit.q-and-12-more - did not produce a TEST-*.xml file
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver_index_bitmap3
{noformat}
Test results:
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/7611/testReport
Console output:
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/7611/console
Test logs:
http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-7611/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 2 tests failed
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12798886 - PreCommit-HIVE-TRUNK-Build
> HCatalog getSplits should ignore the partition with invalid path
> ----------------------------------------------------------------
>
> Key: HIVE-13509
> URL: https://issues.apache.org/jira/browse/HIVE-13509
> Project: Hive
> Issue Type: Improvement
> Components: HCatalog
> Reporter: Chaoyu Tang
> Assignee: Chaoyu Tang
> Attachments: HIVE-13509.1.patch, HIVE-13509.patch
>
>
> It is quite common for a discrepancy to exist between a partition directory
> and its HMS metadata, simply because the directory can be added or deleted
> externally with HDFS shell commands. Technically this should be fixed with
> MSCK or ALTER TABLE ... ADD/DROP commands etc., but sometimes that is not
> practical, especially in a multi-tenant environment. The discrepancy does not
> cause any problem for Hive itself, which simply returns no rows for a
> partition with an invalid (e.g. non-existing) path, but it fails a Pig load
> through HCatLoader, because HCatBaseInputFormat getSplits throws an error
> when computing a split for a non-existing path. The error looks like:
> {code}
> Caused by: org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://xyz.com:8020/user/hive/warehouse/xyz/date=2016-01-01/country=BR
>     at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:287)
>     at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:229)
>     at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:315)
>     at org.apache.hive.hcatalog.mapreduce.HCatBaseInputFormat.getSplits(HCatBaseInputFormat.java:162)
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:274)
> {code}
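> A minimal sketch of the kind of guard the summary describes: check that each
> partition directory still exists on the file system and drop the ones that do
> not before handing the paths to FileInputFormat. This is only an illustration,
> not the actual HIVE-13509 patch; the class and method names below
> (PartitionPathFilter, filterExistingPaths) are hypothetical.
> {code}
> import java.io.IOException;
> import java.util.ArrayList;
> import java.util.List;
> 
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
> 
> // Hypothetical helper, not part of the actual patch.
> public class PartitionPathFilter {
>   // Returns only the partition paths that still exist on the file system,
>   // so a stale HMS entry (directory removed via the hdfs shell) is skipped
>   // instead of failing the whole getSplits() call.
>   public static List<Path> filterExistingPaths(Configuration conf,
>       List<Path> partitionPaths) throws IOException {
>     List<Path> existing = new ArrayList<Path>();
>     for (Path p : partitionPaths) {
>       FileSystem fs = p.getFileSystem(conf);
>       if (fs.exists(p)) {
>         existing.add(p);
>       }
>       // else: skip silently; Hive itself returns no rows for such partitions
>     }
>     return existing;
>   }
> }
> {code}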
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)