[ https://issues.apache.org/jira/browse/HIVE-18778?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16625179#comment-16625179 ]
Hive QA commented on HIVE-18778:
--------------------------------
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12940952/HIVE-18778.7.patch
{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.
{color:red}ERROR:{color} -1 due to 35 failed/errored test(s), 14992 tests executed
*Failed tests:*
{noformat}
TestMiniDruidCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=194)
	[druidmini_masking.q,druidmini_test1.q,druidkafkamini_basic.q,druidmini_joins.q,druid_timestamptz.q]
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[authorization_show_grant] (batchId=18)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[constant_prop_2] (batchId=30)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[nonmr_fetch] (batchId=21)
org.apache.hadoop.hive.cli.TestEncryptedHDFSCliDriver.testCliDriver[encryption_join_with_different_encryption_keys] (batchId=186)
org.apache.hadoop.hive.cli.TestMiniDruidCliDriver.testCliDriver[druidmini_dynamic_partition] (batchId=193)
org.apache.hadoop.hive.cli.TestMiniDruidCliDriver.testCliDriver[druidmini_expressions] (batchId=193)
org.apache.hadoop.hive.cli.TestMiniDruidCliDriver.testCliDriver[druidmini_extractTime] (batchId=192)
org.apache.hadoop.hive.cli.TestMiniDruidCliDriver.testCliDriver[druidmini_floorTime] (batchId=192)
org.apache.hadoop.hive.cli.TestMiniDruidCliDriver.testCliDriver[druidmini_mv] (batchId=192)
org.apache.hadoop.hive.cli.TestMiniDruidCliDriver.testCliDriver[druidmini_test_insert] (batchId=193)
org.apache.hadoop.hive.cli.TestMiniDruidCliDriver.testCliDriver[druidmini_test_ts] (batchId=193)
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainuser_3] (batchId=107)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[groupby9] (batchId=112)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[groupby_complex_types] (batchId=140)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[groupby_complex_types_multi_single_reducer] (batchId=150)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[groupby_cube1] (batchId=110)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[groupby_map_ppr] (batchId=113)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[groupby_map_ppr_multi_distinct] (batchId=132)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[groupby_multi_insert_common_distinct] (batchId=143)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[groupby_multi_single_reducer2] (batchId=118)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[groupby_multi_single_reducer3] (batchId=131)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[parquet_vectorization_7] (batchId=148)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[parquet_vectorization_8] (batchId=115)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[parquet_vectorization_9] (batchId=123)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[parquet_vectorization_decimal_date] (batchId=123)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[parquet_vectorization_div0] (batchId=145)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[parquet_vectorization_limit] (batchId=120)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[parquet_vectorization_offset_limit] (batchId=124)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[parquet_vectorization_part_project] (batchId=125)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[parquet_vectorization_pushdown] (batchId=124)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[union] (batchId=110)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[windowing] (batchId=133)
org.apache.hadoop.hive.cli.TestSparkNegativeCliDriver.testCliDriver[spark_task_failure] (batchId=266)
org.apache.hive.jdbc.TestJdbcWithMiniLlapArrow.testKillQuery (batchId=251)
{noformat}
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/14002/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/14002/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-14002/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 35 tests failed
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12940952 - PreCommit-HIVE-Build
> Needs to capture input/output entities in explain
> -------------------------------------------------
>
> Key: HIVE-18778
> URL: https://issues.apache.org/jira/browse/HIVE-18778
> Project: Hive
> Issue Type: Bug
> Reporter: Daniel Dai
> Assignee: Daniel Dai
> Priority: Major
> Attachments: HIVE-18778-SparkPositive.patch, HIVE-18778.1.patch,
> HIVE-18778.2.patch, HIVE-18778.3.patch, HIVE-18778.4.patch,
> HIVE-18778.5.patch, HIVE-18778.6.patch, HIVE-18778.7.patch,
> HIVE-18778_TestCliDriver.patch, HIVE-18788_SparkNegative.patch,
> HIVE-18788_SparkPerf.patch
>
>
> With Sentry enabled, commands like {{explain drop table foo;}} fail with the
> following error:
> {code}
> Error: Error while compiling statement: FAILED: SemanticException No valid privileges
> Required privilege( Table) not available in input privileges
> The required privileges: (state=42000,code=40000)
> {code}
> Sentry fails to authorize because the ExplainSemanticAnalyzer uses an
> instance of DDLSemanticAnalyzer to analyze the explain query.
> {code}
> BaseSemanticAnalyzer sem = SemanticAnalyzerFactory.get(conf, input);
> sem.analyze(input, ctx);
> sem.validate();
> {code}
> The input/output entities for this query are set by the code above. However,
> they are never set on the instance of ExplainSemanticAnalyzer itself, and thus
> are never propagated into the HookContext in the calling Driver code.
> {code}
> // This ends up calling the code above, which uses DDLSemanticAnalyzer:
> sem.analyze(tree, ctx);
> // sem is an instance of ExplainSemanticAnalyzer; this attempts to update the
> // HookContext with the input/output info from ESA, which is never set:
> hookCtx.update(sem);
> {code}
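> One way to close the gap (a hypothetical sketch, not necessarily what the
> attached patches do) is for ExplainSemanticAnalyzer to copy the entities from
> the inner analyzer back onto itself once analysis completes, so that
> hookCtx.update(sem) sees them:
> {code}
> // Inside ExplainSemanticAnalyzer.analyzeInternal(), after delegating to the
> // inner analyzer chosen by the factory:
> BaseSemanticAnalyzer sem = SemanticAnalyzerFactory.get(conf, input);
> sem.analyze(input, ctx);
> sem.validate();
> // Propagate the inner analyzer's entities into the inputs/outputs fields this
> // class inherits from BaseSemanticAnalyzer, so the Driver's HookContext picks
> // them up:
> inputs.addAll(sem.getInputs());
> outputs.addAll(sem.getOutputs());
> {code}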
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)