[jira] [Commented] (HIVE-15439) Support INSERT OVERWRITE for internal druid datasources.
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16101155#comment-16101155 ]

Lefty Leverenz commented on HIVE-15439:
---
Doc note: This should be documented in the Druid Integration wikidoc.
* [Druid Integration | https://cwiki.apache.org/confluence/display/Hive/Druid+Integration]

> Support INSERT OVERWRITE for internal druid datasources.
>
> Key: HIVE-15439
> URL: https://issues.apache.org/jira/browse/HIVE-15439
> Project: Hive
> Issue Type: Sub-task
> Components: Druid integration
> Affects Versions: 2.2.0
> Reporter: slim bouguerra
> Assignee: slim bouguerra
> Labels: TODOC2.2
> Fix For: 2.2.0
> Attachments: HIVE-15439.3.patch, HIVE-15439.4.patch, HIVE-15439.5.patch, HIVE-15439.6.patch, HIVE-15439.patch, HIVE-15439.patch, HIVE-15439.patch, HIVE-15439.patch, HIVE-15439.patch, HIVE-15439.patch
>
> Add support for the SQL statement INSERT OVERWRITE TABLE druid_internal_table.
> In order to add this support, we will need to add a new post-insert hook to update the Druid metadata. Creation of the segments will be the same as for CTAS.

-- This message was sent by Atlassian JIRA (v6.4.14#64029)
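For the wiki documentation, the statement this issue enables can be sketched roughly as below. This is a hedged illustration rather than the committed syntax: the table and column names (druid_events, src_events, page, added, ts) are hypothetical, and the CTAS step follows the pattern described in the Druid Integration wiki (a Druid-backed table exposes its timestamp as the `__time` column).

```sql
-- Hypothetical sketch: create an internal Druid datasource via CTAS,
-- then replace its contents with the INSERT OVERWRITE added by HIVE-15439.
CREATE TABLE druid_events
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES ("druid.segment.granularity" = "DAY")
AS
SELECT CAST(ts AS timestamp) AS `__time`, page, added
FROM src_events;

-- The overwrite path builds segments as in CTAS; a post-insert hook
-- then updates the Druid metadata to point at the new segments.
INSERT OVERWRITE TABLE druid_events
SELECT CAST(ts AS timestamp) AS `__time`, page, added
FROM src_events
WHERE ts >= '2017-01-01';
```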
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15838387#comment-15838387 ]

ASF GitHub Bot commented on HIVE-15439:
---
Github user b-slim closed the pull request at: https://github.com/apache/hive/pull/124
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15837866#comment-15837866 ]

slim bouguerra commented on HIVE-15439:
---
[~leftylev] Yes, it needs to be added, I guess.
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15837366#comment-15837366 ]

Lefty Leverenz commented on HIVE-15439:
---
Does this need to be documented in the wiki?
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15833944#comment-15833944 ]

Hive QA commented on HIVE-15439:
---
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12848830/HIVE-15439.6.patch

{color:green}SUCCESS:{color} +1 due to 3 test(s) being added or modified.
{color:red}ERROR:{color} -1 due to 6 failed/errored test(s), 10990 tests executed

*Failed tests:*
{noformat}
TestDerbyConnector - did not produce a TEST-*.xml file (likely timed out) (batchId=235)
org.apache.hadoop.hive.cli.TestEncryptedHDFSCliDriver.testCliDriver[encryption_join_with_different_encryption_keys] (batchId=159)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[offset_limit_ppd_optimizer] (batchId=151)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[schema_evol_text_vec_part] (batchId=149)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[vector_char_simple] (batchId=147)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query23] (batchId=223)
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/3123/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/3123/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-3123/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 6 tests failed
{noformat}

This message is automatically generated.
ATTACHMENT ID: 12848830 - PreCommit-HIVE-Build
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15833015#comment-15833015 ]

Hive QA commented on HIVE-15439:
---
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12848660/HIVE-15439.5.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/3103/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/3103/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-3103/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output:
+ date '+%Y-%m-%d %T.%3N'
2017-01-21 14:53:01.912
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-3103/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2017-01-21 14:53:01.915
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at d9343f6 HIVE-15544 : Support scalar subqueries (Vineet Garg via Ashutosh Chauhan)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at d9343f6 HIVE-15544 : Support scalar subqueries (Vineet Garg via Ashutosh Chauhan)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2017-01-21 14:53:02.796
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
Going to apply patch with: patch -p0
patching file accumulo-handler/src/test/results/positive/accumulo_queries.q.out
patching file accumulo-handler/src/test/results/positive/accumulo_single_sourced_multi_insert.q.out
patching file druid-handler/pom.xml
patching file druid-handler/src/java/org/apache/hadoop/hive/druid/DruidStorageHandler.java
patching file druid-handler/src/test/org/apache/hadoop/hive/druid/DruidStorageHandlerTest.java
patching file druid-handler/src/test/org/apache/hadoop/hive/druid/TestDerbyConnector.java
patching file druid-handler/src/test/org/apache/hadoop/hive/ql/io/DruidRecordWriterTest.java
patching file hbase-handler/src/test/results/positive/hbase_queries.q.out
patching file hbase-handler/src/test/results/positive/hbase_single_sourced_multi_insert.q.out
patching file hbase-handler/src/test/results/positive/hbasestats.q.out
patching file metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaHookV2.java
patching file metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java
patching file metastore/src/java/org/apache/hadoop/hive/metastore/IMetaStoreClient.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java
patching file ql/src/java/org/apache/hadoop/hive/ql/optimizer/SortedDynPartitionTimeGranularityOptimizer.java
patching file ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
Hunk #7 succeeded at 10770 (offset 3 lines).
Hunk #8 succeeded at 10795 (offset 3 lines).
Hunk #9 succeeded at 10917 (offset 3 lines).
patching file ql/src/java/org/apache/hadoop/hive/ql/plan/DDLWork.java
patching file ql/src/java/org/apache/hadoop/hive/ql/plan/InsertTableDesc.java
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
ANTLR Parser Generator Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/metastore/target/generated-sources/antlr3/org/apache/hadoop/hive/metastore/parser/FilterParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/metastore/src/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
{noformat}
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15832810#comment-15832810 ]

Hive QA commented on HIVE-15439:
---
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12848660/HIVE-15439.5.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/3084/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/3084/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-3084/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output:
+ date '+%Y-%m-%d %T.%3N'
2017-01-21 04:33:46.755
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-3084/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2017-01-21 04:33:46.758
+ cd apache-github-source-source
+ git fetch origin
From https://github.com/apache/hive
   0e78add..d9343f6  master     -> origin/master
+ git reset --hard HEAD
HEAD is now at 0e78add HIVE-15625 : escape1 test fails on Mac (Sergey Shelukhin, reviewed by Pengcheng Xiong)
+ git clean -f -d
Removing common/src/java/org/apache/hadoop/hive/conf/HiveConf.java.orig
Removing ql/src/java/org/apache/hadoop/hive/ql/exec/DynamicValueRegistry.java
Removing ql/src/java/org/apache/hadoop/hive/ql/exec/ExprNodeDynamicValueEvaluator.java
Removing ql/src/java/org/apache/hadoop/hive/ql/exec/tez/DynamicValueRegistryTez.java
Removing ql/src/java/org/apache/hadoop/hive/ql/parse/RuntimeValuesInfo.java
Removing ql/src/java/org/apache/hadoop/hive/ql/plan/DynamicValue.java
Removing ql/src/java/org/apache/hadoop/hive/ql/plan/ExprNodeDynamicValueDesc.java
Removing ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDAFBloomFilter.java
Removing ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFInBloomFilter.java
Removing ql/src/test/queries/clientpositive/dynamic_semijoin_reduction.q
Removing ql/src/test/results/clientpositive/llap/dynamic_semijoin_reduction.q.out
Removing storage-api/src/java/org/apache/hadoop/hive/ql/io/sarg/LiteralDelegate.java
+ git checkout master
Already on 'master'
Your branch is behind 'origin/master' by 1 commit, and can be fast-forwarded.
  (use "git pull" to update your local branch)
+ git reset --hard origin/master
HEAD is now at d9343f6 HIVE-15544 : Support scalar subqueries (Vineet Garg via Ashutosh Chauhan)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2017-01-21 04:33:48.383
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
Going to apply patch with: patch -p0
patching file accumulo-handler/src/test/results/positive/accumulo_queries.q.out
patching file accumulo-handler/src/test/results/positive/accumulo_single_sourced_multi_insert.q.out
patching file druid-handler/pom.xml
patching file druid-handler/src/java/org/apache/hadoop/hive/druid/DruidStorageHandler.java
patching file druid-handler/src/test/org/apache/hadoop/hive/druid/DruidStorageHandlerTest.java
patching file druid-handler/src/test/org/apache/hadoop/hive/druid/TestDerbyConnector.java
patching file druid-handler/src/test/org/apache/hadoop/hive/ql/io/DruidRecordWriterTest.java
patching file hbase-handler/src/test/results/positive/hbase_queries.q.out
patching file hbase-handler/src/test/results/positive/hbase_single_sourced_multi_insert.q.out
patching file hbase-handler/src/test/results/positive/hbasestats.q.out
patching file metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaHookV2.java
patching file metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java
{noformat}
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15832523#comment-15832523 ]

Hive QA commented on HIVE-15439:
---
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12848449/HIVE-15439.3.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/3077/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/3077/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-3077/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output:
+ date '+%Y-%m-%d %T.%3N'
2017-01-20 22:34:44.797
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-3077/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2017-01-20 22:34:44.800
+ cd apache-github-source-source
+ git fetch origin
From https://github.com/apache/hive
   811b3e3..ab7f6f3  master     -> origin/master
   40dab03..1695f68  storage-branch-2.2 -> origin/storage-branch-2.2
+ git reset --hard HEAD
HEAD is now at 811b3e3 HIVE-15580: Eliminate unbounded memory usage for orderBy and groupBy in Hive on Spark (reviewed by Chao Sun)
+ git clean -f -d
Removing pom.xml.orig
+ git checkout master
Already on 'master'
Your branch is behind 'origin/master' by 2 commits, and can be fast-forwarded.
  (use "git pull" to update your local branch)
+ git reset --hard origin/master
HEAD is now at ab7f6f3 HIVE-15390 : Orc reader unnecessarily reading stripe footers with hive.optimize.index.filter set to true (Abhishek Somani, reviewed by Sergey Shelukhin and Prasanth Jayachandran)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2017-01-20 22:34:46.489
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
error: patch failed: accumulo-handler/src/test/results/positive/accumulo_single_sourced_multi_insert.q.out:1
error: accumulo-handler/src/test/results/positive/accumulo_single_sourced_multi_insert.q.out: patch does not apply
The patch does not appear to apply with p0, p1, or p2
+ exit 1
'
{noformat}

This message is automatically generated.
ATTACHMENT ID: 12848449 - PreCommit-HIVE-Build
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15829307#comment-15829307 ]

Hive QA commented on HIVE-15439:
---
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12848075/HIVE-15439.patch

{color:green}SUCCESS:{color} +1 due to 3 test(s) being added or modified.
{color:red}ERROR:{color} -1 due to 351 failed/errored test(s), 10959 tests executed

*Failed tests:*
{noformat}
TestDerbyConnector - did not produce a TEST-*.xml file (likely timed out) (batchId=233)
org.apache.hadoop.hive.cli.TestAccumuloCliDriver.testCliDriver[accumulo_joins] (batchId=217)
org.apache.hadoop.hive.cli.TestAccumuloCliDriver.testCliDriver[accumulo_predicate_pushdown] (batchId=217)
org.apache.hadoop.hive.cli.TestAccumuloCliDriver.testCliDriver[accumulo_queries] (batchId=217)
org.apache.hadoop.hive.cli.TestAccumuloCliDriver.testCliDriver[accumulo_single_sourced_multi_insert] (batchId=217)
org.apache.hadoop.hive.cli.TestBlobstoreCliDriver.testCliDriver[ctas] (batchId=229)
org.apache.hadoop.hive.cli.TestBlobstoreCliDriver.testCliDriver[insert_into_dynamic_partitions] (batchId=229)
org.apache.hadoop.hive.cli.TestBlobstoreCliDriver.testCliDriver[insert_into_table] (batchId=229)
org.apache.hadoop.hive.cli.TestBlobstoreCliDriver.testCliDriver[insert_overwrite_directory] (batchId=229)
org.apache.hadoop.hive.cli.TestBlobstoreCliDriver.testCliDriver[insert_overwrite_dynamic_partitions] (batchId=229)
org.apache.hadoop.hive.cli.TestBlobstoreCliDriver.testCliDriver[insert_overwrite_table] (batchId=229)
org.apache.hadoop.hive.cli.TestBlobstoreCliDriver.testCliDriver[write_final_output_blobstore] (batchId=229)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[acid_subquery] (batchId=36)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[acid_table_stats] (batchId=48)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[analyze_tbl_part] (batchId=44)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[annotate_stats_join_pkfk] (batchId=13)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[avrocountemptytbl] (batchId=74)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[cbo_rp_udf_percentile2] (batchId=18)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[cbo_rp_udf_percentile] (batchId=39)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[create_or_replace_view] (batchId=36)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[decimal_udf] (batchId=8)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[input19] (batchId=79)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[insert_overwrite_directory] (batchId=25)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[join46] (batchId=1)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[join_emit_interval] (batchId=10)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[mapjoin46] (batchId=53)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[multi_insert_gby4] (batchId=43)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[nested_column_pruning] (batchId=31)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample5] (batchId=52)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[serde_opencsv] (batchId=68)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[smb_mapjoin_46] (batchId=38)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[specialChar] (batchId=22)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[subquery_exists] (batchId=38)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[subquery_notexists] (batchId=81)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[subquery_notin_having] (batchId=45)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[subquery_unqualcolumnrefs] (batchId=17)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[udf_array_contains] (batchId=12)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[udf_conv] (batchId=21)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[udf_date_add] (batchId=44)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[udf_date_sub] (batchId=2)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[udf_hex] (batchId=22)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[udf_java_method] (batchId=63)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[udf_map_keys] (batchId=62)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[udf_map_values] (batchId=46)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[udf_months_between] (batchId=48)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[udf_negative] (batchId=1)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[udf_not] (batchId=51)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[udf_percentile] (batchId=59)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[udf_positive] (batchId=39)
{noformat}
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15820242#comment-15820242 ]

Hive QA commented on HIVE-15439:
---
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12847092/HIVE-15439.patch

{color:green}SUCCESS:{color} +1 due to 3 test(s) being added or modified.
{color:red}ERROR:{color} -1 due to 8 failed/errored test(s), 10941 tests executed

*Failed tests:*
{noformat}
TestDerbyConnector - did not produce a TEST-*.xml file (likely timed out) (batchId=234)
org.apache.hadoop.hive.cli.TestAccumuloCliDriver.testCliDriver[accumulo_queries] (batchId=217)
org.apache.hadoop.hive.cli.TestAccumuloCliDriver.testCliDriver[accumulo_single_sourced_multi_insert] (batchId=217)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[offset_limit_ppd_optimizer] (batchId=150)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[schema_evol_text_vec_part] (batchId=148)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[subquery_notin] (batchId=150)
org.apache.hadoop.hive.ql.security.authorization.plugin.TestHiveAuthorizerCheckInvocation.org.apache.hadoop.hive.ql.security.authorization.plugin.TestHiveAuthorizerCheckInvocation (batchId=208)
org.apache.hadoop.hive.ql.security.authorization.plugin.TestHiveAuthorizerShowFilters.org.apache.hadoop.hive.ql.security.authorization.plugin.TestHiveAuthorizerShowFilters (batchId=208)
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/2899/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/2899/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-2899/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 8 tests failed
{noformat}

This message is automatically generated.
ATTACHMENT ID: 12847092 - PreCommit-HIVE-Build
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15819371#comment-15819371 ]

Hive QA commented on HIVE-15439:
---
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12847066/HIVE-15439.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/2892/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/2892/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-2892/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output:
+ date '+%Y-%m-%d %T.%3N'
2017-01-11 22:25:17.697
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-2892/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2017-01-11 22:25:17.703
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at 58247c5 HIVE-15143 : add logging for HIVE-15024 ADDENDUM (Sergey Shelukhin, reviewed by Siddharth Seth)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at 58247c5 HIVE-15143 : add logging for HIVE-15024 ADDENDUM (Sergey Shelukhin, reviewed by Siddharth Seth)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2017-01-11 22:25:22.260
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
error: a/druid-handler/pom.xml: No such file or directory
error: a/druid-handler/src/java/org/apache/hadoop/hive/druid/DruidStorageHandler.java: No such file or directory
error: a/druid-handler/src/test/org/apache/hadoop/hive/druid/DruidStorageHandlerTest.java: No such file or directory
error: a/druid-handler/src/test/org/apache/hadoop/hive/druid/TestDerbyConnector.java: No such file or directory
error: a/druid-handler/src/test/org/apache/hadoop/hive/ql/io/DruidRecordWriterTest.java: No such file or directory
error: a/hbase-handler/src/test/results/positive/hbase_queries.q.out: No such file or directory
error: a/hbase-handler/src/test/results/positive/hbase_single_sourced_multi_insert.q.out: No such file or directory
error: a/hbase-handler/src/test/results/positive/hbasestats.q.out: No such file or directory
error: a/metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java: No such file or directory
error: a/metastore/src/java/org/apache/hadoop/hive/metastore/IMetaStoreClient.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/SortedDynPartitionTimeGranularityOptimizer.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/DDLWork.java: No such file or directory
error: a/ql/src/test/results/clientpositive/case_sensitivity.q.out: No such file or directory
error: a/ql/src/test/results/clientpositive/input_testxpath.q.out: No such file or directory
error: a/ql/src/test/results/clientpositive/udf_coalesce.q.out: No such file or directory
The patch does not appear to apply with p0, p1, or p2
+ exit 1
'
{noformat}

This message is automatically generated.
ATTACHMENT ID: 12847066 - PreCommit-HIVE-Build
[jira] [Commented] (HIVE-15439) Support INSERT OVERWRITE for internal druid datasources.
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15817780#comment-15817780 ] Jesus Camacho Rodriguez commented on HIVE-15439:

[~bslim], can you double-check the test failures? It seems you might need to regenerate some q files (e.g. those related to the HBase and Accumulo storage handlers).

> Support INSERT OVERWRITE for internal druid datasources.
>
> Key: HIVE-15439
> URL: https://issues.apache.org/jira/browse/HIVE-15439
> Project: Hive
> Issue Type: Sub-task
> Components: Druid integration
> Affects Versions: 2.2.0
> Reporter: slim bouguerra
> Assignee: slim bouguerra
> Attachments: HIVE-15439.patch, HIVE-15439.patch
>
> Add support for the SQL statement INSERT OVERWRITE TABLE druid_internal_table.
> In order to add this support, we will need to add a new post-insert hook to update
> the druid metadata. Creation of the segment will be the same as CTAS.

-- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Commented] (HIVE-15439) Support INSERT OVERWRITE for internal druid datasources.
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15816303#comment-15816303 ] Hive QA commented on HIVE-15439:

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12846628/HIVE-15439.patch

{color:green}SUCCESS:{color} +1 due to 3 test(s) being added or modified.
{color:red}ERROR:{color} -1 due to 13 failed/errored test(s), 10949 tests executed

*Failed tests:*
{noformat}
TestDerbyConnector - did not produce a TEST-*.xml file (likely timed out) (batchId=233)
org.apache.hadoop.hive.cli.TestAccumuloCliDriver.testCliDriver[accumulo_queries] (batchId=217)
org.apache.hadoop.hive.cli.TestAccumuloCliDriver.testCliDriver[accumulo_single_sourced_multi_insert] (batchId=217)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[case_sensitivity] (batchId=61)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[input_testxpath] (batchId=28)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[udf_coalesce] (batchId=75)
org.apache.hadoop.hive.cli.TestHBaseCliDriver.testCliDriver[hbase_queries] (batchId=89)
org.apache.hadoop.hive.cli.TestHBaseCliDriver.testCliDriver[hbase_single_sourced_multi_insert] (batchId=90)
org.apache.hadoop.hive.cli.TestHBaseCliDriver.testCliDriver[hbasestats] (batchId=88)
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[orc_ppd_basic] (batchId=134)
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[orc_ppd_schema_evol_3a] (batchId=135)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[schema_evol_text_vec_part] (batchId=148)
org.apache.hive.hcatalog.api.TestHCatClientNotification.createTable (batchId=219)
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/2865/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/2865/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-2865/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 13 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12846628 - PreCommit-HIVE-Build
[jira] [Commented] (HIVE-15439) Support INSERT OVERWRITE for internal druid datasources.
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15815933#comment-15815933 ] Jesus Camacho Rodriguez commented on HIVE-15439:

+1 (pending QA)
[jira] [Commented] (HIVE-15439) Support INSERT OVERWRITE for internal druid datasources.
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15815906#comment-15815906 ] slim bouguerra commented on HIVE-15439:

Uploaded a new patch after addressing the comments from [~jcamachorodriguez]. Thanks!
[jira] [Commented] (HIVE-15439) Support INSERT OVERWRITE for internal druid datasources.
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15755356#comment-15755356 ] Hive QA commented on HIVE-15439:

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12843622/HIVE-15439.patch

{color:green}SUCCESS:{color} +1 due to 3 test(s) being added or modified.
{color:red}ERROR:{color} -1 due to 19 failed/errored test(s), 10821 tests executed

*Failed tests:*
{noformat}
TestDerbyConnector - did not produce a TEST-*.xml file (likely timed out) (batchId=233)
TestVectorizedColumnReaderBase - did not produce a TEST-*.xml file (likely timed out) (batchId=250)
org.apache.hadoop.hive.cli.TestAccumuloCliDriver.testCliDriver[accumulo_queries] (batchId=217)
org.apache.hadoop.hive.cli.TestAccumuloCliDriver.testCliDriver[accumulo_single_sourced_multi_insert] (batchId=217)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample2] (batchId=5)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample4] (batchId=15)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample6] (batchId=61)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample7] (batchId=60)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample9] (batchId=39)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[udf_sort_array] (batchId=59)
org.apache.hadoop.hive.cli.TestHBaseCliDriver.testCliDriver[hbase_queries] (batchId=89)
org.apache.hadoop.hive.cli.TestHBaseCliDriver.testCliDriver[hbase_single_sourced_multi_insert] (batchId=90)
org.apache.hadoop.hive.cli.TestHBaseCliDriver.testCliDriver[hbasestats] (batchId=88)
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[orc_ppd_schema_evol_3a] (batchId=135)
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[transform_ppr2] (batchId=135)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[metadataonly1] (batchId=150)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[subquery_notin] (batchId=150)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[subquery_nested_subquery] (batchId=84)
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[subquery_shared_alias] (batchId=84)
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/2614/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/2614/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-2614/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 19 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12843622 - PreCommit-HIVE-Build
[jira] [Commented] (HIVE-15439) Support INSERT OVERWRITE for internal druid datasources.
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15755078#comment-15755078 ] slim bouguerra commented on HIVE-15439:

[~ashutoshc] and [~jcamachorodriguez], can you please take a look here?
[jira] [Commented] (HIVE-15439) Support INSERT OVERWRITE for internal druid datasources.
[ https://issues.apache.org/jira/browse/HIVE-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15755072#comment-15755072 ] ASF GitHub Bot commented on HIVE-15439:

GitHub user b-slim opened a pull request:

https://github.com/apache/hive/pull/124

[HIVE-15439] adding support for insert overwrite

Add support for the SQL statement INSERT OVERWRITE TABLE druid_internal_table. In order to add this support, we will need to add a new post-insert hook to update the druid metadata. Creation of the segment will be the same as CTAS.

https://issues.apache.org/jira/browse/HIVE-15439

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/b-slim/hive HIVE-15439-adding-insert

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/hive/pull/124.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #124

commit 26b2667d490c6de71d691aa38322b0c1c7e6778e
Author: Slim Bouguerra
Date: 2016-12-15T23:22:11Z

    adding commit insert function def to MetaHook

commit 02f6a8935ab58eb663b6ccf34a931e0e68d4af14
Author: Slim Bouguerra
Date: 2016-12-16T00:44:44Z

    adding commit insert to the FileSinkPlan

commit 0e9b1f6e5eca815be6a9724188275f32e65fd40c
Author: Slim Bouguerra
Date: 2016-12-16T17:47:29Z

    add joda dependency for UTs and add copyrights
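For context, the CTAS-then-overwrite flow described in this thread can be sketched in HiveQL. This is an illustrative example only: the table names, column names, and datasource name are hypothetical, and the exact DDL should be checked against the Druid Integration wiki for the Hive version in use.

```sql
-- Hypothetical example: create a Druid-backed datasource from Hive via CTAS.
-- Druid-backed tables are partitioned on a `__time` timestamp column.
CREATE TABLE druid_internal_table
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES ("druid.datasource" = "my_datasource")
AS
SELECT CAST(event_time AS timestamp) AS `__time`, page, added
FROM source_table;

-- With HIVE-15439, the datasource can later be rebuilt in place:
-- segments are written as in CTAS, and a post-insert hook updates
-- the Druid metadata store to swap the new segments in.
INSERT OVERWRITE TABLE druid_internal_table
SELECT CAST(event_time AS timestamp) AS `__time`, page, added
FROM source_table_updated;
```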