[ https://issues.apache.org/jira/browse/HIVE-15850?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15858094#comment-15858094 ]
Hive QA commented on HIVE-15850:
--------------------------------
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12851635/HIVE-15850.patch
{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.
{color:red}ERROR:{color} -1 due to 5 failed/errored test(s), 10226 tests executed
*Failed tests:*
{noformat}
TestDerbyConnector - did not produce a TEST-*.xml file (likely timed out) (batchId=235)
TestSparkCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=121)
	[stats12.q,groupby4.q,union_top_level.q,stats2.q,groupby10.q,mapjoin_filter_on_outerjoin.q,auto_sortmerge_join_4.q,limit_partition_metadataonly.q,load_dyn_part4.q,union3.q,groupby_multi_single_reducer.q,smb_mapjoin_14.q,groupby3_noskew_multi_distinct.q,stats18.q,union_remove_21.q]
org.apache.hadoop.hive.cli.TestEncryptedHDFSCliDriver.testCliDriver[encryption_join_with_different_encryption_keys] (batchId=159)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query14] (batchId=223)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query23] (batchId=223)
{noformat}
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/3440/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/3440/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-3440/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 5 tests failed
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12851635 - PreCommit-HIVE-Build
> Proper handling of timezone in Druid storage handler
> ----------------------------------------------------
>
> Key: HIVE-15850
> URL: https://issues.apache.org/jira/browse/HIVE-15850
> Project: Hive
> Issue Type: Bug
> Components: Druid integration
> Affects Versions: 2.2.0
> Reporter: Jesus Camacho Rodriguez
> Assignee: Jesus Camacho Rodriguez
> Priority: Critical
> Attachments: HIVE-15850.patch
>
>
> We need to make sure that filters on timestamps are represented with a
> timezone when we go into Calcite and converted back when we return from
> Calcite to Hive. That would allow us to 1) push the correct filters to Druid,
> and 2) represent the filters correctly in Hive if they are not pushed at all
> (i.e., they remain in the Calcite plan). I have checked and, AFAIK, this is
> currently done correctly (ASTBuilder.java, ExprNodeConverter.java, and
> RexNodeConverter.java).
> Secondly, we need to make sure we read/write timestamp data correctly from/to
> Druid.
> - When we write timestamps to Druid, we should include the timezone, which
> allows Druid to handle them properly. We already do that.
> - When we read timestamps from Druid, we should transform them so that they
> are based on the Hive client timezone. This gives us consistent behavior
> between Druid-on-Hive and standalone Hive, since timestamps in Hive are
> presented to the user in the Hive client timezone. Currently we do not do that.
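For illustration only, here is a minimal Java sketch of the read/write conversion described in the quoted text. It is not the code in HIVE-15850.patch; the class and method names are hypothetical, and it assumes java.time is available and that the Hive session timezone is known to the caller.
{code:java}
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class DruidTimezoneSketch {

  // Write side: render the event time as ISO 8601 with an explicit offset,
  // so Druid can interpret the instant unambiguously.
  static String toDruidTimestamp(ZonedDateTime hiveClientTime) {
    return hiveClientTime.format(DateTimeFormatter.ISO_OFFSET_DATE_TIME);
  }

  // Read side: Druid hands back instants in UTC (epoch millis); shift them
  // into the Hive client timezone before rendering, so a Druid-backed table
  // shows the same wall-clock value as standalone Hive would.
  static String toHiveClientTimestamp(long druidUtcMillis, ZoneId hiveClientZone) {
    return Instant.ofEpochMilli(druidUtcMillis)
        .atZone(hiveClientZone)
        .format(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss"));
  }

  public static void main(String[] args) {
    // Hypothetical session timezone for the example.
    ZoneId clientZone = ZoneId.of("America/Los_Angeles");

    System.out.println(toDruidTimestamp(
        ZonedDateTime.of(2017, 2, 9, 16, 0, 0, 0, clientZone)));
    // 2017-02-09T16:00:00-08:00 -- the offset lets Druid store the right instant.

    System.out.println(toHiveClientTimestamp(1486684800000L, clientZone));
    // 2017-02-09 16:00:00 -- the wall-clock value a standalone Hive client in
    // that timezone would show for 2017-02-10T00:00:00Z.
  }
}
{code}
The point of the sketch is that the stored instant never changes; only the rendering step on the read path is shifted into the client timezone.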
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)