[
https://issues.apache.org/jira/browse/HIVE-19941?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16521051#comment-16521051
]
Hive QA commented on HIVE-19941:
--------------------------------
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12928673/HIVE-19941.1.patch
{color:green}SUCCESS:{color} +1 due to 2 test(s) being added or modified.
{color:red}ERROR:{color} -1 due to 12 failed/errored test(s), 14591 tests executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestMiniDruidCliDriver.testCliDriver[druidmini_masking] (batchId=258)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[infer_bucket_sort_reducers_power_two] (batchId=184)
org.apache.hive.jdbc.TestTriggersWorkloadManager.testMultipleTriggers2 (batchId=244)
org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerCustomCreatedDynamicPartitions (batchId=244)
org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerCustomCreatedDynamicPartitionsMultiInsert (batchId=244)
org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerCustomCreatedDynamicPartitionsUnionAll (batchId=244)
org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerCustomCreatedFiles (batchId=244)
org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerCustomNonExistent (batchId=244)
org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerCustomReadOps (batchId=244)
org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerHighBytesRead (batchId=244)
org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerHighShuffleBytes (batchId=244)
org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerVertexRawInputSplitsNoKill (batchId=244)
{noformat}
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/12014/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/12014/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-12014/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 12 tests failed
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12928673 - PreCommit-HIVE-Build
> Row based Filters added via Hive Ranger policies are not pushed to druid
> ------------------------------------------------------------------------
>
> Key: HIVE-19941
> URL: https://issues.apache.org/jira/browse/HIVE-19941
> Project: Hive
> Issue Type: Bug
> Reporter: Nishant Bangarwa
> Assignee: Nishant Bangarwa
> Priority: Major
> Attachments: HIVE-19941.1.patch, HIVE-19941.patch
>
>
> The issue is that when applying a table mask we add virtual columns; however,
> non-native tables do not have virtual columns, so we need to skip adding them
> when generating the masking query.
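> A minimal sketch of that skip logic, assuming Table.isNonNative() and
> VirtualColumn.getRegistry(conf) from hive-exec; the class and method names
> below are illustrative only, and the actual change is in the attached patch:
> {code}
> import java.util.ArrayList;
> import java.util.List;
>
> import org.apache.hadoop.hive.conf.HiveConf;
> import org.apache.hadoop.hive.metastore.api.FieldSchema;
> import org.apache.hadoop.hive.ql.metadata.Table;
> import org.apache.hadoop.hive.ql.metadata.VirtualColumn;
>
> public class MaskingColumnListSketch {
>   // Build the column list for the generated masking query, appending Hive's
>   // virtual columns only when the table is native.
>   static List<String> columnsForMaskingQuery(Table table, HiveConf conf) {
>     List<String> cols = new ArrayList<>();
>     for (FieldSchema fs : table.getCols()) {
>       cols.add(fs.getName());
>     }
>     // Non-native tables (e.g. Druid-backed) expose no virtual columns such as
>     // BLOCK__OFFSET__INSIDE__FILE, so skip them to avoid the SemanticException below.
>     if (!table.isNonNative()) {
>       for (VirtualColumn vc : VirtualColumn.getRegistry(conf)) {
>         cols.add(vc.getName());
>       }
>     }
>     return cols;
>   }
> }
> {code}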
> Stack Trace -
> {code}
> org.apache.hadoop.hive.ql.parse.SemanticException: Line 1:79 Invalid table alias or column reference 'BLOCK__OFFSET__INSIDE__FILE'
> : (possible column names are: __time, yearmonth, year, month, dayofmonth, dayofweek, weekofyear, hour, minute, second, payment_type, fare_amount, surcharge, mta_tax, tip_amount, tolls_amount, total_amount, trip_time)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genAllExprNodeDesc(SemanticAnalyzer.java:11830) ~[hive-exec-2.1.0.2.6.4.0-91.jar:2.1.0.2.6.4.0-91]
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:11778) ~[hive-exec-2.1.0.2.6.4.0-91.jar:2.1.0.2.6.4.0-91]
> at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.genSelectLogicalPlan(CalcitePlanner.java:3780) ~[hive-exec-2.1.0.2.6.4.0-91.jar:2.1.0.2.6.4.0-91]
> at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.genLogicalPlan(CalcitePlanner.java:4117) ~[hive-exec-2.1.0.2.6.4.0-91.jar:2.1.0.2.6.4.0-91]
> at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.genLogicalPlan(CalcitePlanner.java:4016) ~[hive-exec-2.1.0.2.6.4.0-91.jar:2.1.0.2.6.4.0-91]
> at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.genLogicalPlan(CalcitePlanner.java:4060) ~[hive-exec-2.1.0.2.6.4.0-91.jar:2.1.0.2.6.4.0-91]
> at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.apply(CalcitePlanner.java:1340) ~[hive-exec-2.1.0.2.6.4.0-91.jar:2.1.0.2.6.4.0-91]
> at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.apply(CalcitePlanner.java:1277) ~[hive-exec-2.1.0.2.6.4.0-91.jar:2.1.0.2.6.4.0-91]
> at org.apache.calcite.tools.Frameworks$1.apply(Frameworks.java:113) ~[calcite-core-1.10.0.2.6.4.0-91.jar:1.10.0.2.6.4.0-91]
> at org.apache.calcite.prepare.CalcitePrepareImpl.perform(CalcitePrepareImpl.java:997) ~[calcite-core-1.10.0.2.6.4.0-91.jar:1.10.0.2.6.4.0-91]
> at org.apache.calcite.tools.Frameworks.withPrepare(Frameworks.java:149) ~[calcite-core-1.10.0.2.6.4.0-91.jar:1.10.0.2.6.4.0-91]
> at org.apache.calcite.tools.Frameworks.withPlanner(Frameworks.java:106) ~[calcite-core-1.10.0.2.6.4.0-91.jar:1.10.0.2.6.4.0-91]
> at org.apache.hadoop.hive.ql.parse.CalcitePlanner.logicalPlan(CalcitePlanner.java:1082) ~[hive-exec-2.1.0.2.6.4.0-91.jar:2.1.0.2.6.4.0-91]
> {code}