[
https://issues.apache.org/jira/browse/HIVE-15277?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15747102#comment-15747102
]
Hive QA commented on HIVE-15277:
--------------------------------
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12843141/HIVE-15277.patch
{color:green}SUCCESS:{color} +1 due to 6 test(s) being added or modified.
{color:red}ERROR:{color} -1 due to 17 failed/errored test(s), 10814 tests
executed
*Failed tests:*
{noformat}
TestDerbyConnector - did not produce a TEST-*.xml file (likely timed out)
(batchId=234)
TestVectorizedColumnReaderBase - did not produce a TEST-*.xml file (likely
timed out) (batchId=251)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[druid_basic2]
(batchId=10)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample2] (batchId=5)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample4] (batchId=15)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample6] (batchId=61)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample7] (batchId=60)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample9] (batchId=38)
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[transform_ppr2]
(batchId=135)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[cbo_rp_lineage2]
(batchId=139)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[lineage2]
(batchId=148)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[lineage3]
(batchId=146)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[metadataonly1]
(batchId=150)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[stats_based_fetch_decision]
(batchId=151)
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainanalyze_2]
(batchId=93)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[druid_external]
(batchId=85)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[druid_location]
(batchId=85)
{noformat}
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/2567/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/2567/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-2567/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 17 tests failed
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12843141 - PreCommit-HIVE-Build
> Teach Hive how to create/delete Druid segments
> -----------------------------------------------
>
> Key: HIVE-15277
> URL: https://issues.apache.org/jira/browse/HIVE-15277
> Project: Hive
> Issue Type: Sub-task
> Components: Druid integration
> Affects Versions: 2.2.0
> Reporter: slim bouguerra
> Assignee: slim bouguerra
> Attachments: HIVE-15277.2.patch, HIVE-15277.patch, HIVE-15277.patch,
> file.patch
>
>
> We want to extend the DruidStorageHandler to support CTAS queries.
> In this implementation Hive will generate druid segment files and insert the
> metadata to signal the handoff to druid.
> The syntax will be as follows:
> {code:sql}
> CREATE TABLE druid_table_1
> STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
> TBLPROPERTIES ("druid.datasource" = "datasourcename")
> AS <select `timecolumn` as `__time`, `dimension1`, `dimension2`, `metric1`,
> `metric2`...>;
> {code}
> This statement stores the results of the query <input_query> in a Druid
> datasource named 'datasourcename'. One of the columns of the query needs to
> be the time dimension, which is mandatory in Druid. In particular, we use the
> same convention used by Druid: there must be a column named '__time' in the
> result of the executed query, which will act as the time dimension column in
> Druid. Currently, the time dimension column needs to be of 'timestamp' type.
> Metrics can be of type long, double, or float, while dimensions are strings.
> Keep in mind that Druid has a clear separation between dimensions and
> metrics; therefore, if you have a Hive column that contains numbers but needs
> to be treated as a dimension, use the cast operator to cast it to string.
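> As a minimal sketch of that cast (the table `sales` and its columns
> `sale_time`, `store_id`, and `revenue` are hypothetical names used for
> illustration, not part of this patch):
> {code:sql}
> CREATE TABLE druid_sales
> STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
> TBLPROPERTIES ("druid.datasource" = "sales_datasource")
> AS SELECT
>   `sale_time` AS `__time`,             -- mandatory Druid time dimension
>   CAST(`store_id` AS string) AS `store_id`, -- numeric column cast so Druid treats it as a dimension
>   `revenue`                            -- float column, stored as a Druid metric
> FROM sales;
> {code}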
> This initial implementation interacts with the Druid metadata storage to
> add/remove the table in Druid; the user needs to supply the metadata config as
> --hiveconf hive.druid.metadata.password=XXX --hiveconf
> hive.druid.metadata.username=druid --hiveconf
> hive.druid.metadata.uri=jdbc:mysql://host/druid
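> For example, a full CLI invocation passing these properties might look as
> follows (host, credentials, and the script name are placeholders, not real
> values; the password stays elided as XXX):
> {noformat}
> hive --hiveconf hive.druid.metadata.username=druid \
>      --hiveconf hive.druid.metadata.password=XXX \
>      --hiveconf hive.druid.metadata.uri=jdbc:mysql://host/druid \
>      -f create_druid_table.sql
> {noformat}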
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)