Strings truncated to length 334 when hive data is accessed through hive ODBC driver.
Hi all, I'm trying to fetch data from a Hive server using the Hive ODBC driver. I observed that strings being fetched on the application side are truncated to length 334. I looked further into the hiveclient code and found two constants that are hardcoded to 334:

./src/odbc/src/cpp/thriftserverconstants.h:static const int MAX_DISPLAY_SIZE = 334;
./src/odbc/src/cpp/thriftserverconstants.h:static const int MAX_BYTE_LENGTH = 334;

I tried finding usages of these constants, and it looks like MAX_BYTE_LENGTH is what causes the string truncation:

abhi@vm-abhihive:~/hhh/ts/hive-0.8.1$ find . -type f -name *.h | xargs grep MAX_BYTE_LENGTH
./src/odbc/src/cpp/HiveRowSet.h:  /// Forces all data retrieved to be no more than MAX_BYTE_LENGTH
./src/odbc/src/cpp/HiveRowSet.h:  char m_field_buffer[MAX_BYTE_LENGTH + 1];

Does anyone know why 334 is hardcoded here, and is there any specific reason for the number 334? Thanks, Abhijeet. -- Rgds, Abhijeet J. Apsunde
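To illustrate the behavior described above, here is a minimal sketch (not code from the Hive source; the helper name is hypothetical) of how a fixed `char[MAX_BYTE_LENGTH + 1]` field buffer like `m_field_buffer` silently cuts every cell down to 334 bytes:

```cpp
#include <cstring>
#include <string>

// Mirrors the limit hardcoded in thriftserverconstants.h.
static const int MAX_BYTE_LENGTH = 334;

// Hypothetical helper: copies a fetched cell through a fixed
// char[MAX_BYTE_LENGTH + 1] buffer the way HiveRowSet's m_field_buffer
// would, so anything longer than 334 bytes is silently truncated.
std::string truncateField(const std::string& cell) {
    char fieldBuffer[MAX_BYTE_LENGTH + 1];
    std::strncpy(fieldBuffer, cell.c_str(), MAX_BYTE_LENGTH);
    fieldBuffer[MAX_BYTE_LENGTH] = '\0';  // strncpy does not terminate when the source is longer
    return std::string(fieldBuffer);
}
```

Any client reading through such a buffer would see exactly the 334-character cutoff reported above, regardless of the actual column width on the server.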
[ANNOUNCE] New Hive Committer - Navis Ryu
The Apache Hive PMC has passed a vote to make Navis Ryu a new committer on the project. JIRA is currently down, so I can't send out a link with his contribution list at the moment, but if you have an account at reviews.facebook.net, you can see his activity here: https://reviews.facebook.net/p/navis/ Navis, please submit your CLA to the Apache Software Foundation as described here: http://www.apache.org/licenses/#clas Congratulations! JVS
Hive-trunk-h0.21 - Build # 1600 - Still Failing
Changes for Build #1598
[cws] HIVE-2789. query_properties.q contains non-deterministic queries (Zhenxiao Luo via cws)

Changes for Build #1599

Changes for Build #1600
[kevinwilfong] HIVE-3293. Load file into a table does not update table statistics. (njain via kevinwilfong)

3 tests failed.

REGRESSION: org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_udf_ucase
Error Message: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
    at net.sf.antcontrib.logic.ForTask.doSequentialIteration(ForTask.java:259)
    at net.sf.antcontrib.logic.ForTask.doToken(ForTask.java:268)
    at net.sf.antcontrib.logic.ForTask.doTheTasks(ForTask.java:324)
    at net.sf.antcontrib.logic.ForTask.execute(ForTask.java:244)

REGRESSION: org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_stats_aggregator_error_1
Error Message: Forked Java VM exited abnormally. Please note the time in the report does not reflect the time until the VM exit.
Stack Trace:
junit.framework.AssertionFailedError: Forked Java VM exited abnormally. Please note the time in the report does not reflect the time until the VM exit.
    at net.sf.antcontrib.logic.ForTask.doSequentialIteration(ForTask.java:259)
    at net.sf.antcontrib.logic.ForTask.doToken(ForTask.java:268)
    at net.sf.antcontrib.logic.ForTask.doTheTasks(ForTask.java:324)
    at net.sf.antcontrib.logic.ForTask.execute(ForTask.java:244)

REGRESSION: org.apache.hadoop.hive.service.TestHiveServerSessions.testSessionVars
Error Message: java.net.ConnectException: Connection refused
Stack Trace:
org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused
    at org.apache.thrift.transport.TSocket.open(TSocket.java:183)
    at org.apache.hadoop.hive.service.TestHiveServerSessions.setUp(TestHiveServerSessions.java:59)
    at junit.framework.TestCase.runBare(TestCase.java:132)
    at junit.framework.TestResult$1.protect(TestResult.java:110)
    at junit.framework.TestResult.runProtected(TestResult.java:128)
    at junit.framework.TestResult.run(TestResult.java:113)
    at junit.framework.TestCase.run(TestCase.java:124)
    at junit.framework.TestSuite.runTest(TestSuite.java:243)
    at junit.framework.TestSuite.run(TestSuite.java:238)
    at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:518)
    at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:1052)
    at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:906)
Caused by: java.net.ConnectException: Connection refused
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
    at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
    at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
    at java.net.Socket.connect(Socket.java:529)
    at org.apache.thrift.transport.TSocket.open(TSocket.java:178)
    ... 11 more

The Apache Jenkins build system has built Hive-trunk-h0.21 (build #1600)
Status: Still Failing
Check console output at https://builds.apache.org/job/Hive-trunk-h0.21/1600/ to view the results.
Re: [ANNOUNCE] New Hive Committer - Navis Ryu
Keep up the good work Navis! Congratulations. Come hang out on #hive irc. Edward On Fri, Aug 10, 2012 at 5:58 AM, John Sichi jsi...@gmail.com wrote: The Apache Hive PMC has passed a vote to make Navis Ryu a new committer on the project. JIRA is currently down, so I can't send out a link with his contribution list at the moment, but if you have an account at reviews.facebook.net, you can see his activity here: https://reviews.facebook.net/p/navis/ Navis, please submit your CLA to the Apache Software Foundation as described here: http://www.apache.org/licenses/#clas Congratulations! JVS
Re: [ANNOUNCE] New Hive Committer - Navis Ryu
Congrats Navis.. :) Regards Bejoy KS Sent from handheld, please excuse typos. -Original Message- From: alo alt wget.n...@gmail.com Date: Fri, 10 Aug 2012 17:08:07 To: u...@hive.apache.org Reply-To: u...@hive.apache.org Cc: dev@hive.apache.org; navis@nexr.com Subject: Re: [ANNOUNCE] New Hive Committer - Navis Ryu Congratulations! Well done :) cheers, ALex On Aug 10, 2012, at 11:58 AM, John Sichi jsi...@gmail.com wrote: The Apache Hive PMC has passed a vote to make Navis Ryu a new committer on the project. JIRA is currently down, so I can't send out a link with his contribution list at the moment, but if you have an account at reviews.facebook.net, you can see his activity here: https://reviews.facebook.net/p/navis/ Navis, please submit your CLA to the Apache Software Foundation as described here: http://www.apache.org/licenses/#clas Congratulations! JVS -- Alexander Alten-Lorenz http://mapredit.blogspot.com German Hadoop LinkedIn Group: http://goo.gl/N8pCF
Re: [ANNOUNCE] New Hive Committer - Navis Ryu
Congratulations! Well done :) cheers, ALex On Aug 10, 2012, at 11:58 AM, John Sichi jsi...@gmail.com wrote: The Apache Hive PMC has passed a vote to make Navis Ryu a new committer on the project. JIRA is currently down, so I can't send out a link with his contribution list at the moment, but if you have an account at reviews.facebook.net, you can see his activity here: https://reviews.facebook.net/p/navis/ Navis, please submit your CLA to the Apache Software Foundation as described here: http://www.apache.org/licenses/#clas Congratulations! JVS -- Alexander Alten-Lorenz http://mapredit.blogspot.com German Hadoop LinkedIn Group: http://goo.gl/N8pCF
Build failed in Jenkins: Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false #102
See https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/102/
--
[...truncated 10115 lines...]
     [echo] Project: odbc
     [copy] Warning: https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/odbc/src/conf does not exist.

ivy-resolve-test:
     [echo] Project: odbc

ivy-retrieve-test:
     [echo] Project: odbc

compile-test:
     [echo] Project: odbc

create-dirs:
     [echo] Project: serde
     [copy] Warning: https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/serde/src/test/resources does not exist.

init:
     [echo] Project: serde

ivy-init-settings:
     [echo] Project: serde

ivy-resolve:
     [echo] Project: serde
[ivy:resolve] :: loading settings :: file = https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/ivy/ivysettings.xml
[ivy:report] Processing https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/102/artifact/hive/build/ivy/resolution-cache/org.apache.hive-hive-serde-default.xml to https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/102/artifact/hive/build/ivy/report/org.apache.hive-hive-serde-default.html

ivy-retrieve:
     [echo] Project: serde

dynamic-serde:

compile:
     [echo] Project: serde

ivy-resolve-test:
     [echo] Project: serde

ivy-retrieve-test:
     [echo] Project: serde

compile-test:
     [echo] Project: serde
    [javac] Compiling 26 source files to https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/102/artifact/hive/build/serde/test/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

create-dirs:
     [echo] Project: service
     [copy] Warning: https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/service/src/test/resources does not exist.

init:
     [echo] Project: service

ivy-init-settings:
     [echo] Project: service

ivy-resolve:
     [echo] Project: service
[ivy:resolve] :: loading settings :: file = https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/ivy/ivysettings.xml
[ivy:report] Processing https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/102/artifact/hive/build/ivy/resolution-cache/org.apache.hive-hive-service-default.xml to https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/102/artifact/hive/build/ivy/report/org.apache.hive-hive-service-default.html

ivy-retrieve:
     [echo] Project: service

compile:
     [echo] Project: service

ivy-resolve-test:
     [echo] Project: service

ivy-retrieve-test:
     [echo] Project: service

compile-test:
     [echo] Project: service
    [javac] Compiling 2 source files to https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/102/artifact/hive/build/service/test/classes

test:
     [echo] Project: hive

test-shims:
     [echo] Project: hive

test-conditions:
     [echo] Project: shims

gen-test:
     [echo] Project: shims

create-dirs:
     [echo] Project: shims
     [copy] Warning: https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/shims/src/test/resources does not exist.

init:
     [echo] Project: shims

ivy-init-settings:
     [echo] Project: shims

ivy-resolve:
     [echo] Project: shims
[ivy:resolve] :: loading settings :: file = https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/ivy/ivysettings.xml
[ivy:report] Processing https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/102/artifact/hive/build/ivy/resolution-cache/org.apache.hive-hive-shims-default.xml to https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/102/artifact/hive/build/ivy/report/org.apache.hive-hive-shims-default.html

ivy-retrieve:
     [echo] Project: shims

compile:
     [echo] Project: shims
     [echo] Building shims 0.20

build_shims:
     [echo] Project: shims
     [echo] Compiling https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/shims/src/common/java;/home/jenkins/jenkins-slave/workspace/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/hive/shims/src/0.20/java against hadoop 0.20.2 (https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/102/artifact/hive/build/hadoopcore/hadoop-0.20.2)

ivy-init-settings:
     [echo] Project: shims

ivy-resolve-hadoop-shim:
     [echo] Project: shims
[ivy:resolve] :: loading settings :: file = https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/ivy/ivysettings.xml

ivy-retrieve-hadoop-shim:
     [echo] Project: shims
     [echo] Building shims 0.20S

build_shims:
     [echo] Project: shims
     [echo] Compiling
JDBC Issue (File Handle leak): Unable to drop the table after reading partial data for it.
Hi Gurus, While I was fixing the JDBC unit test issues on Windows, I observed that HiveServer keeps the file handle open until the client either reads the data completely or closes the connection. Because of this open file handle, on Windows the client can't drop the same table within the same connection context after reading partial data for a "select * from table" query. If, on the other hand, the output of the query is a temp file generated by MR jobs, then the file can be deleted, but the count of open file handles still grows (a file handle leak). I am thinking of closing the existing CommandProcessor before executing a new command, so that the leaked file handles get closed. Any suggestions? Thanks, Kanna
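As a toy model of the proposal above (hypothetical names throughout; the real CommandProcessor lives in HiveServer's Java code, and this is only a C++ sketch of the close-before-execute idea): each command execution opens a handle, and closing the previous processor before starting the next one keeps the server's handle count from growing even when results are only partially read.

```cpp
#include <memory>

// Toy model of the leak: a counter standing in for the server's open handles.
struct FileTable {
    int openHandles = 0;
};

// Hypothetical processor: running a command opens a handle; close() releases it.
class CommandProcessor {
  public:
    explicit CommandProcessor(FileTable* files) : files_(files) {}
    void run()   { if (!open_) { open_ = true;  ++files_->openHandles; } }
    void close() { if (open_)  { open_ = false; --files_->openHandles; } }
  private:
    FileTable* files_;
    bool open_ = false;
};

// Sketch of the proposed fix: close the previous processor before executing
// a new command, so a partially read result cannot pin its handle forever.
class Session {
  public:
    explicit Session(FileTable* files) : files_(files) {}
    void execute() {
        if (current_) current_->close();  // release the stale handle, if any
        current_ = std::make_unique<CommandProcessor>(files_);
        current_->run();
    }
  private:
    FileTable* files_;
    std::unique_ptr<CommandProcessor> current_;
};
```

With this pattern, running several commands in a row leaves at most one handle open per session instead of one per command.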
Re: [ANNOUNCE] New Hive Committer - Navis Ryu
Congratulations Navis! This is very well deserved. Looking forward to many more patches from you. On Fri, Aug 10, 2012 at 8:10 AM, Bejoy KS bejoy...@yahoo.com wrote: Congrats Navis.. :) Regards Bejoy KS Sent from handheld, please excuse typos. -Original Message- From: alo alt wget.n...@gmail.com Date: Fri, 10 Aug 2012 17:08:07 To: u...@hive.apache.org Reply-To: u...@hive.apache.org Cc: dev@hive.apache.org; navis@nexr.com Subject: Re: [ANNOUNCE] New Hive Committer - Navis Ryu Congratulations! Well done :) cheers, ALex On Aug 10, 2012, at 11:58 AM, John Sichi jsi...@gmail.com wrote: The Apache Hive PMC has passed a vote to make Navis Ryu a new committer on the project. JIRA is currently down, so I can't send out a link with his contribution list at the moment, but if you have an account at reviews.facebook.net, you can see his activity here: https://reviews.facebook.net/p/navis/ Navis, please submit your CLA to the Apache Software Foundation as described here: http://www.apache.org/licenses/#clas Congratulations! JVS -- Alexander Alten-Lorenz http://mapredit.blogspot.com German Hadoop LinkedIn Group: http://goo.gl/N8pCF
Re: Review Request: HIVE-3213: ODBC API enhancements
---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/5685/#review10147
---

Looks good overall. Most of the things I commented on have to do with formatting issues and missing ASF license headers. It may be worth running all of this code through an automated formatter like astyle.

odbc/src/cpp/HiveResultSet.cpp
https://reviews.apache.org/r/5685/#comment21496
    I think we should try to keep the formatting consistent:
    * Always use braces for blocks (if, while, etc.)
    * Opening brace always on the same line
    * } else {

odbc/src/cpp/HiveResultSet.cpp
https://reviews.apache.org/r/5685/#comment21498
    The brace should go after the closing paren.

odbc/src/cpp/hiveclient.def
https://reviews.apache.org/r/5685/#comment21495
    Missing ASF license header.

odbc/src/cpp/hiveclienthelper.h
https://reviews.apache.org/r/5685/#comment21494
    Formatting: please fix the line break alignment.

odbc/src/cpp/if/fb303.thrift
https://reviews.apache.org/r/5685/#comment21491
    This file shouldn't be necessary.

odbc/src/cpp/if/hive_metastore.thrift
https://reviews.apache.org/r/5685/#comment21492
    Please remove and use metastore/if/hive_metastore.thrift instead.

odbc/src/cpp/if/hive_service.thrift
https://reviews.apache.org/r/5685/#comment21490
    Please remove this file and use the copy here instead: service/if/hive_service.thrift

odbc/src/cpp/if/queryplan.thrift
https://reviews.apache.org/r/5685/#comment21489
    This file can be removed. We should instead reference the copy in ql/if/queryplan.thrift

odbc/src/driver/hiveodbc.h
https://reviews.apache.org/r/5685/#comment21487
    Please remove the CDH reference in the comment and bump the version number to 0.10.0

odbc/src/driver/hiveodbc.h
https://reviews.apache.org/r/5685/#comment21488
    Formatting: It would be nice if the comments for these structs were left-justified.

odbc/src/driver/hiveodbc.c
https://reviews.apache.org/r/5685/#comment21501
    Is it possible to reformat this so we get closer to the 100 character line limit?

odbc/src/driver/hiveodbc.c
https://reviews.apache.org/r/5685/#comment21502
    Formatting.

odbc/src/driver/hiveodbc.c
https://reviews.apache.org/r/5685/#comment21503
    Remove.

odbc/src/driver/hiveodbc_win32_rc.rc
https://reviews.apache.org/r/5685/#comment21505
    Missing ASF header.

odbc/src/driver/libhiveodbc.def
https://reviews.apache.org/r/5685/#comment21506
    Missing ASF header.

- Carl Steinbach

On June 30, 2012, 4:38 a.m., Prasad Mujumdar wrote:

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/5685/
---
(Updated June 30, 2012, 4:38 a.m.)

Review request for hive and Carl Steinbach.

Description
-------
Enhanced ODBC driver with limited ODBC3 compliance. This ticket, HIVE-3213, covers the source code; the build changes are tracked by HIVE-3212.

This addresses bug HIVE-3213.
https://issues.apache.org/jira/browse/HIVE-3213

Diffs
-----
  odbc/src/cpp/HiveConnection.h 3b2e2b1
  odbc/src/cpp/HiveResultSet.h 25eabc4
  odbc/src/cpp/HiveResultSet.cpp d3d375e
  odbc/src/cpp/HiveRowSet.h ca6e6af
  odbc/src/cpp/HiveRowSet.cpp 3de6124
  odbc/src/cpp/Makefile.am PRE-CREATION
  odbc/src/cpp/Makefile.in PRE-CREATION
  odbc/src/cpp/hiveclient.h f1af670
  odbc/src/cpp/hiveclient.cpp 450eb0b
  odbc/src/cpp/hiveclient.def PRE-CREATION
  odbc/src/cpp/hiveclienthelper.h 5814a03
  odbc/src/cpp/hiveconstants.h 72f1049
  odbc/src/cpp/if/fb303.thrift PRE-CREATION
  odbc/src/cpp/if/hive_metastore.thrift PRE-CREATION
  odbc/src/cpp/if/hive_service.thrift PRE-CREATION
  odbc/src/cpp/if/queryplan.thrift PRE-CREATION
  odbc/src/cpp/thriftserverconstants.h fe4bac4
  odbc/src/driver/Makefile.am PRE-CREATION
  odbc/src/driver/Makefile.in PRE-CREATION
  odbc/src/driver/hiveodbc.h PRE-CREATION
  odbc/src/driver/hiveodbc.c PRE-CREATION
  odbc/src/driver/hiveodbc_logo.ico PRE-CREATION
  odbc/src/driver/hiveodbc_win32_rc.h PRE-CREATION
  odbc/src/driver/hiveodbc_win32_rc.rc PRE-CREATION
  odbc/src/driver/libhiveodbc.def PRE-CREATION
  odbc/src/driver/libtool-version PRE-CREATION
  odbc/src/test/Makefile.am PRE-CREATION
  odbc/src/test/Makefile.in PRE-CREATION
  odbc/src/test/hiveclienttest.c fbb4e24
  odbc/src/test/hiveodbctest.c PRE-CREATION
  odbc/src/test/hivetest.h PRE-CREATION

Diff: https://reviews.apache.org/r/5685/diff/

Testing
-------

Thanks,
Prasad Mujumdar
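The brace conventions requested in the review above (braces on every block, opening brace on the same line, cuddled "} else {") look roughly like this. A style illustration only, with a made-up function, not code from the patch:

```cpp
// Style illustration for the review comments: always brace blocks,
// open the brace on the same line as the statement, and write "} else {".
int clampToLimit(int value, int limit) {
    if (value > limit) {
        value = limit;
    } else if (value < 0) {
        value = 0;
    } else {
        // already in range; braces are kept even for trivial branches
    }
    return value;
}
```

Running the tree through astyle with a matching profile would enforce these rules mechanically rather than by review comment.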
Build failed in Jenkins: Hive-0.9.1-SNAPSHOT-h0.21 #102
See https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/102/
--
[...truncated 36554 lines...]
    [junit] POSTHOOK: query: select count(1) as cnt from testhivedrivertable
    [junit] POSTHOOK: type: DROPTABLE
    [junit] POSTHOOK: Input: default@testhivedrivertable
    [junit] POSTHOOK: Output: file:/tmp/jenkins/hive_2012-08-10_13-52-17_856_8278669935931733801/-mr-1
    [junit] OK
    [junit] PREHOOK: query: drop table testhivedrivertable
    [junit] PREHOOK: type: DROPTABLE
    [junit] PREHOOK: Input: default@testhivedrivertable
    [junit] PREHOOK: Output: default@testhivedrivertable
    [junit] POSTHOOK: query: drop table testhivedrivertable
    [junit] POSTHOOK: type: DROPTABLE
    [junit] POSTHOOK: Input: default@testhivedrivertable
    [junit] POSTHOOK: Output: default@testhivedrivertable
    [junit] OK
    [junit] Hive history file=https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/102/artifact/hive/build/service/tmp/hive_job_log_jenkins_201208101352_1540241924.txt
    [junit] PREHOOK: query: drop table testhivedrivertable
    [junit] PREHOOK: type: DROPTABLE
    [junit] POSTHOOK: query: drop table testhivedrivertable
    [junit] POSTHOOK: type: DROPTABLE
    [junit] OK
    [junit] PREHOOK: query: create table testhivedrivertable (num int)
    [junit] PREHOOK: type: DROPTABLE
    [junit] Copying file: https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/ws/hive/data/files/kv1.txt
    [junit] POSTHOOK: query: create table testhivedrivertable (num int)
    [junit] POSTHOOK: type: DROPTABLE
    [junit] POSTHOOK: Output: default@testhivedrivertable
    [junit] OK
    [junit] PREHOOK: query: load data local inpath 'https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/ws/hive/data/files/kv1.txt' into table testhivedrivertable
    [junit] PREHOOK: type: DROPTABLE
    [junit] PREHOOK: Output: default@testhivedrivertable
    [junit] Copying data from https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/ws/hive/data/files/kv1.txt
    [junit] Loading data to table default.testhivedrivertable
    [junit] POSTHOOK: query: load data local inpath 'https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/ws/hive/data/files/kv1.txt' into table testhivedrivertable
    [junit] POSTHOOK: type: DROPTABLE
    [junit] POSTHOOK: Output: default@testhivedrivertable
    [junit] OK
    [junit] PREHOOK: query: select * from testhivedrivertable limit 10
    [junit] PREHOOK: type: DROPTABLE
    [junit] PREHOOK: Input: default@testhivedrivertable
    [junit] PREHOOK: Output: file:/tmp/jenkins/hive_2012-08-10_13-52-22_287_5067621451772504825/-mr-1
    [junit] POSTHOOK: query: select * from testhivedrivertable limit 10
    [junit] POSTHOOK: type: DROPTABLE
    [junit] POSTHOOK: Input: default@testhivedrivertable
    [junit] POSTHOOK: Output: file:/tmp/jenkins/hive_2012-08-10_13-52-22_287_5067621451772504825/-mr-1
    [junit] OK
    [junit] PREHOOK: query: drop table testhivedrivertable
    [junit] PREHOOK: type: DROPTABLE
    [junit] PREHOOK: Input: default@testhivedrivertable
    [junit] PREHOOK: Output: default@testhivedrivertable
    [junit] POSTHOOK: query: drop table testhivedrivertable
    [junit] POSTHOOK: type: DROPTABLE
    [junit] POSTHOOK: Input: default@testhivedrivertable
    [junit] POSTHOOK: Output: default@testhivedrivertable
    [junit] OK
    [junit] Hive history file=https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/102/artifact/hive/build/service/tmp/hive_job_log_jenkins_201208101352_2002674054.txt
    [junit] PREHOOK: query: drop table testhivedrivertable
    [junit] PREHOOK: type: DROPTABLE
    [junit] POSTHOOK: query: drop table testhivedrivertable
    [junit] POSTHOOK: type: DROPTABLE
    [junit] OK
    [junit] PREHOOK: query: create table testhivedrivertable (num int)
    [junit] PREHOOK: type: DROPTABLE
    [junit] POSTHOOK: query: create table testhivedrivertable (num int)
    [junit] POSTHOOK: type: DROPTABLE
    [junit] POSTHOOK: Output: default@testhivedrivertable
    [junit] OK
    [junit] PREHOOK: query: drop table testhivedrivertable
    [junit] PREHOOK: type: DROPTABLE
    [junit] PREHOOK: Input: default@testhivedrivertable
    [junit] PREHOOK: Output: default@testhivedrivertable
    [junit] POSTHOOK: query: drop table testhivedrivertable
    [junit] POSTHOOK: type: DROPTABLE
    [junit] POSTHOOK: Input: default@testhivedrivertable
    [junit] POSTHOOK: Output: default@testhivedrivertable
    [junit] OK
    [junit] Hive history file=https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/102/artifact/hive/build/service/tmp/hive_job_log_jenkins_201208101352_1661498330.txt
    [junit] Hive history file=https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/102/artifact/hive/build/service/tmp/hive_job_log_jenkins_201208101352_1665931629.txt
    [junit] PREHOOK: query: drop table testhivedrivertable
    [junit] PREHOOK: type: DROPTABLE
    [junit] POSTHOOK: query: drop table testhivedrivertable
    [junit]
Hive-trunk-h0.21 - Build # 1601 - Still Failing
Changes for Build #1598
[cws] HIVE-2789. query_properties.q contains non-deterministic queries (Zhenxiao Luo via cws)

Changes for Build #1599

Changes for Build #1600
[kevinwilfong] HIVE-3293. Load file into a table does not update table statistics. (njain via kevinwilfong)

Changes for Build #1601

1 tests failed.

FAILED: org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_stats_aggregator_error_1
Error Message: Forked Java VM exited abnormally. Please note the time in the report does not reflect the time until the VM exit.
Stack Trace:
junit.framework.AssertionFailedError: Forked Java VM exited abnormally. Please note the time in the report does not reflect the time until the VM exit.
    at net.sf.antcontrib.logic.ForTask.doSequentialIteration(ForTask.java:259)
    at net.sf.antcontrib.logic.ForTask.doToken(ForTask.java:268)
    at net.sf.antcontrib.logic.ForTask.doTheTasks(ForTask.java:324)
    at net.sf.antcontrib.logic.ForTask.execute(ForTask.java:244)

The Apache Jenkins build system has built Hive-trunk-h0.21 (build #1601)
Status: Still Failing
Check console output at https://builds.apache.org/job/Hive-trunk-h0.21/1601/ to view the results.
Re: [ANNOUNCE] New Hive Committer - Navis Ryu
Congrats, Navis! Well deserved. Welcome, aboard! Ashutosh On Fri, Aug 10, 2012 at 11:10 AM, Carl Steinbach c...@cloudera.com wrote: Congratulations Navis! This is very well deserved. Looking forward to many more patches from you. On Fri, Aug 10, 2012 at 8:10 AM, Bejoy KS bejoy...@yahoo.com wrote: Congrats Navis.. :) Regards Bejoy KS Sent from handheld, please excuse typos. -Original Message- From: alo alt wget.n...@gmail.com Date: Fri, 10 Aug 2012 17:08:07 To: u...@hive.apache.org Reply-To: u...@hive.apache.org Cc: dev@hive.apache.org; navis@nexr.com Subject: Re: [ANNOUNCE] New Hive Committer - Navis Ryu Congratulations! Well done :) cheers, ALex On Aug 10, 2012, at 11:58 AM, John Sichi jsi...@gmail.com wrote: The Apache Hive PMC has passed a vote to make Navis Ryu a new committer on the project. JIRA is currently down, so I can't send out a link with his contribution list at the moment, but if you have an account at reviews.facebook.net, you can see his activity here: https://reviews.facebook.net/p/navis/ Navis, please submit your CLA to the Apache Software Foundation as described here: http://www.apache.org/licenses/#clas Congratulations! JVS -- Alexander Alten-Lorenz http://mapredit.blogspot.com German Hadoop LinkedIn Group: http://goo.gl/N8pCF
Review Request: HIVE-3056: Ability to bulk update/delete Hive's metastore
---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/6532/
---

Review request for hive and Carl Steinbach.

Description
-------
This patch creates a HiveMetaTool to fire arbitrary JDOQL against the Hive metastore. SELECTs work, but UPDATEs don't. This patch also upgrades Hive's DN to 2.2.4.

This addresses bug HIVE-3056.
https://issues.apache.org/jira/browse/HIVE-3056

Diffs
-----
  ivy/libraries.properties f0b1918
  metastore/ivy.xml 3011d2f
  metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaTool.java PRE-CREATION
  ql/src/java/org/apache/hadoop/hive/ql/exec/Utilities.java ccb0d7f

Diff: https://reviews.apache.org/r/6532/diff/

Testing
-------

Thanks,
Shreepadma Venugopalan