[jira] [Commented] (HIVE-3488) Issue trying to use the thick client (embedded) from windows.
[ https://issues.apache.org/jira/browse/HIVE-3488?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13482190#comment-13482190 ]

Rémy DUBOIS commented on HIVE-3488:
-----------------------------------

Any news?

Issue trying to use the thick client (embedded) from windows.
-------------------------------------------------------------

Key: HIVE-3488
URL: https://issues.apache.org/jira/browse/HIVE-3488
Project: Hive
Issue Type: Bug
Components: Windows
Affects Versions: 0.8.1
Reporter: Rémy DUBOIS
Priority: Critical

I'm trying to execute a very simple SELECT query against my remote Hive server. A SELECT * FROM table works fine, but a SELECT name FROM table fails with this error:

{code:java}
Job Submission failed with exception 'java.io.IOException(cannot find dir = /user/hive/warehouse/test/city=paris/out.csv in pathToPartitionInfo: [hdfs://cdh-four:8020/user/hive/warehouse/test/city=paris])'
12/09/19 17:18:44 ERROR exec.Task: Job Submission failed with exception 'java.io.IOException(cannot find dir = /user/hive/warehouse/test/city=paris/out.csv in pathToPartitionInfo: [hdfs://cdh-four:8020/user/hive/warehouse/test/city=paris])'
java.io.IOException: cannot find dir = /user/hive/warehouse/test/city=paris/out.csv in pathToPartitionInfo: [hdfs://cdh-four:8020/user/hive/warehouse/test/city=paris]
    at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getPartitionDescFromPathRecursively(HiveFileFormatUtils.java:290)
    at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getPartitionDescFromPathRecursively(HiveFileFormatUtils.java:257)
    at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat$CombineHiveInputSplit.init(CombineHiveInputFormat.java:104)
    at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:407)
    at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:989)
    at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:981)
    at org.apache.hadoop.mapred.JobClient.access$500(JobClient.java:170)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:891)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:844)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Unknown Source)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:844)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:818)
    at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:452)
    at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:136)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:133)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1332)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1123)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:931)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:191)
    at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:187)
{code}

Indeed, this dir (/user/hive/warehouse/test/city=paris/out.csv) can't be found: it is my data file, not a directory. Could you please help me?
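For readers trying to reproduce this: the stack trace shows the query arriving through org.apache.hadoop.hive.jdbc.HiveStatement, so a minimal repro is a plain JDBC client against the Hive server. The sketch below is illustrative only; the host, port, and table layout are assumptions inferred from the paths in the report.

{code:java}
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class Hive3488Repro {
    public static void main(String[] args) throws Exception {
        // Pre-HiveServer2 JDBC driver, as used by Hive 0.8.x clients.
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
        // Host and port are assumptions based on the cdh-four paths above.
        Connection con = DriverManager.getConnection(
                "jdbc:hive://cdh-four:10000/default", "", "");
        Statement stmt = con.createStatement();

        // A full scan is answered by a simple fetch and works.
        ResultSet ok = stmt.executeQuery("SELECT * FROM test");
        while (ok.next()) {
            System.out.println(ok.getString(1));
        }

        // Projecting a column forces a MapReduce job; split computation in
        // CombineHiveInputFormat is where the IOException above is raised.
        ResultSet fails = stmt.executeQuery("SELECT name FROM test");
        while (fails.next()) {
            System.out.println(fails.getString(1));
        }
        con.close();
    }
}
{code}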
Build failed in Jenkins: Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false #176
See https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/176/

--
[...truncated 10128 lines...]
     [echo] Project: odbc
     [copy] Warning: https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/odbc/src/conf does not exist.

ivy-resolve-test:
     [echo] Project: odbc

ivy-retrieve-test:
     [echo] Project: odbc

compile-test:
     [echo] Project: odbc

create-dirs:
     [echo] Project: serde
     [copy] Warning: https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/serde/src/test/resources does not exist.

init:
     [echo] Project: serde

ivy-init-settings:
     [echo] Project: serde

ivy-resolve:
     [echo] Project: serde
[ivy:resolve] :: loading settings :: file = https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/ivy/ivysettings.xml
[ivy:report] Processing https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/176/artifact/hive/build/ivy/resolution-cache/org.apache.hive-hive-serde-default.xml to https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/176/artifact/hive/build/ivy/report/org.apache.hive-hive-serde-default.html

ivy-retrieve:
     [echo] Project: serde

dynamic-serde:

compile:
     [echo] Project: serde

ivy-resolve-test:
     [echo] Project: serde

ivy-retrieve-test:
     [echo] Project: serde

compile-test:
     [echo] Project: serde
    [javac] Compiling 26 source files to https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/176/artifact/hive/build/serde/test/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

create-dirs:
     [echo] Project: service
     [copy] Warning: https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/service/src/test/resources does not exist.

init:
     [echo] Project: service

ivy-init-settings:
     [echo] Project: service

ivy-resolve:
     [echo] Project: service
[ivy:resolve] :: loading settings :: file = https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/ivy/ivysettings.xml
[ivy:report] Processing https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/176/artifact/hive/build/ivy/resolution-cache/org.apache.hive-hive-service-default.xml to https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/176/artifact/hive/build/ivy/report/org.apache.hive-hive-service-default.html

ivy-retrieve:
     [echo] Project: service

compile:
     [echo] Project: service

ivy-resolve-test:
     [echo] Project: service

ivy-retrieve-test:
     [echo] Project: service

compile-test:
     [echo] Project: service
    [javac] Compiling 2 source files to https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/176/artifact/hive/build/service/test/classes

test:
     [echo] Project: hive

test-shims:
     [echo] Project: hive

test-conditions:
     [echo] Project: shims

gen-test:
     [echo] Project: shims

create-dirs:
     [echo] Project: shims
     [copy] Warning: https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/shims/src/test/resources does not exist.

init:
     [echo] Project: shims

ivy-init-settings:
     [echo] Project: shims

ivy-resolve:
     [echo] Project: shims
[ivy:resolve] :: loading settings :: file = https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/ivy/ivysettings.xml
[ivy:report] Processing https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/176/artifact/hive/build/ivy/resolution-cache/org.apache.hive-hive-shims-default.xml to https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/176/artifact/hive/build/ivy/report/org.apache.hive-hive-shims-default.html

ivy-retrieve:
     [echo] Project: shims

compile:
     [echo] Project: shims
     [echo] Building shims 0.20

build_shims:
     [echo] Project: shims
     [echo] Compiling https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/shims/src/common/java;/home/jenkins/jenkins-slave/workspace/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/hive/shims/src/0.20/java against hadoop 0.20.2 (https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/176/artifact/hive/build/hadoopcore/hadoop-0.20.2)

ivy-init-settings:
     [echo] Project: shims

ivy-resolve-hadoop-shim:
     [echo] Project: shims
[ivy:resolve] :: loading settings :: file = https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21-keepgoing=false/ws/hive/ivy/ivysettings.xml

ivy-retrieve-hadoop-shim:
     [echo] Project: shims
     [echo] Building shims 0.20S

build_shims:
     [echo] Project: shims
     [echo] Compiling
[jira] [Created] (HIVE-3610) Add a command Explain dependency ...
Sambavi Muthukrishnan created HIVE-3610:
-------------------------------------------

Summary: Add a command Explain dependency ...
Key: HIVE-3610
URL: https://issues.apache.org/jira/browse/HIVE-3610
Project: Hive
Issue Type: New Feature
Components: Query Processor
Affects Versions: 0.9.0
Reporter: Sambavi Muthukrishnan
Assignee: Sambavi Muthukrishnan
Priority: Minor

Add a new command EXPLAIN DEPENDENCY. Any query can be passed to EXPLAIN DEPENDENCY, as with EXPLAIN (FORMATTED/EXTENDED). The output of this command is JSON giving the list of tables and partitions that the query depends on.

One possible use case is to determine the set of tables/views used by a view, and the set of partitions used by a given query on that view. This allows a view to be replicated from one Hive instance to another, since we can determine the set of objects that need to be replicated for replication of the view to succeed.

Example output:
{input_tables:[{tablename: default@test_sambavi_v2, tabletype: EXTERNAL_TABLE}, {tablename: default@test_sambavi_v1, tabletype: TABLE}], input partitions:[default@srcpart@ds=2008-04-08/hr=11, default@srcpart@ds=2008-04-08/hr=12]}
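A hedged sketch of how a client might drive the proposed command, assuming it is surfaced through the same JDBC path as the existing EXPLAIN variants; the view name and predicate below are invented for illustration:

{code:java}
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ExplainDependencyDemo {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
        Connection con = DriverManager.getConnection(
                "jdbc:hive://localhost:10000/default", "", "");
        Statement stmt = con.createStatement();

        // As proposed, EXPLAIN DEPENDENCY wraps an arbitrary query, like
        // EXPLAIN FORMATTED/EXTENDED; view and predicate are made up.
        ResultSet rs = stmt.executeQuery(
                "EXPLAIN DEPENDENCY SELECT * FROM some_view WHERE ds = '2008-04-08'");

        // The result is expected to be a JSON document listing the input
        // tables and partitions, as in the example output above.
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }
        con.close();
    }
}
{code}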
[jira] [Assigned] (HIVE-3433) Implement CUBE and ROLLUP operators in Hive
[ https://issues.apache.org/jira/browse/HIVE-3433?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ivan Gorbachev reassigned HIVE-3433:
------------------------------------

Assignee: Ivan Gorbachev (was: Namit Jain)

Implement CUBE and ROLLUP operators in Hive
-------------------------------------------

Key: HIVE-3433
URL: https://issues.apache.org/jira/browse/HIVE-3433
Project: Hive
Issue Type: New Feature
Components: Query Processor
Reporter: Sambavi Muthukrishnan
Assignee: Ivan Gorbachev
Attachments: hive.3433.1.patch, hive.3433.2.patch, hive.3433.3.patch
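For context on the feature being reassigned here: CUBE and ROLLUP extend GROUP BY so that a single query computes aggregates at several grouping granularities. A hedged sketch of the kind of queries these patches target; the exact grammar was still under review at this point, and the table and columns are invented:

{code:java}
public class CubeRollupExamples {
    // ROLLUP aggregates at (year, month), (year), and the grand total ().
    static final String ROLLUP_QUERY =
            "SELECT year, month, SUM(sales) FROM sales_fact "
            + "GROUP BY year, month WITH ROLLUP";

    // CUBE additionally covers (month), i.e. every subset of the group keys.
    static final String CUBE_QUERY =
            "SELECT year, month, SUM(sales) FROM sales_fact "
            + "GROUP BY year, month WITH CUBE";

    public static void main(String[] args) {
        System.out.println(ROLLUP_QUERY);
        System.out.println(CUBE_QUERY);
    }
}
{code}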
[jira] [Commented] (HIVE-3524) Storing certain Exception objects thrown in HiveMetaStore.java in MetaStoreEndFunctionContext
[ https://issues.apache.org/jira/browse/HIVE-3524?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13482655#comment-13482655 ]

Kevin Wilfong commented on HIVE-3524:
-------------------------------------

https://reviews.facebook.net/D5937

Storing certain Exception objects thrown in HiveMetaStore.java in MetaStoreEndFunctionContext
----------------------------------------------------------------------------------------------

Key: HIVE-3524
URL: https://issues.apache.org/jira/browse/HIVE-3524
Project: Hive
Issue Type: Improvement
Components: Metastore
Affects Versions: 0.10.0
Reporter: Maheshwaran Srinivasan
Assignee: Maheshwaran Srinivasan
Priority: Minor
Fix For: 0.10.0
Original Estimate: 168h
Remaining Estimate: 168h

The idea is to store exception objects thrown in the scope of the public functions of HiveMetaStore.java. These exceptions will be stored in MetaStoreEndFunctionContext and can then optionally be processed to gather interesting statistics.
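As a rough illustration of the improvement (not the actual patch; the field and accessor names here are hypothetical), the end-function context would carry the exception alongside a success flag so that listeners can bucket failures by exception type:

{code:java}
// Illustrative sketch only: the real class is
// org.apache.hadoop.hive.metastore.MetaStoreEndFunctionContext, but the
// exception field and accessor shown here are invented for illustration.
public class EndFunctionContextSketch {
    private final boolean success;
    private final Exception exception; // null when the metastore call succeeded

    public EndFunctionContextSketch(boolean success, Exception exception) {
        this.success = success;
        this.exception = exception;
    }

    public boolean isSuccess() {
        return success;
    }

    // A listener invoked at end-function time could inspect this to gather
    // per-exception-type statistics, as the description suggests.
    public Exception getException() {
        return exception;
    }
}
{code}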
[jira] [Updated] (HIVE-3524) Storing certain Exception objects thrown in HiveMetaStore.java in MetaStoreEndFunctionContext
[ https://issues.apache.org/jira/browse/HIVE-3524?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kevin Wilfong updated HIVE-3524:
--------------------------------

Status: Patch Available (was: Open)

Storing certain Exception objects thrown in HiveMetaStore.java in MetaStoreEndFunctionContext
----------------------------------------------------------------------------------------------

Key: HIVE-3524
URL: https://issues.apache.org/jira/browse/HIVE-3524
Project: Hive
Issue Type: Improvement
Components: Metastore
Affects Versions: 0.10.0
Reporter: Maheshwaran Srinivasan
Assignee: Maheshwaran Srinivasan
Priority: Minor
Fix For: 0.10.0
Original Estimate: 168h
Remaining Estimate: 168h

The idea is to store exception objects thrown in the scope of the public functions of HiveMetaStore.java. These exceptions will be stored in MetaStoreEndFunctionContext and can then optionally be processed to gather interesting statistics.
[jira] [Assigned] (HIVE-3433) Implement CUBE and ROLLUP operators in Hive
[ https://issues.apache.org/jira/browse/HIVE-3433?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ivan Gorbachev reassigned HIVE-3433:
------------------------------------

Assignee: Namit Jain (was: Ivan Gorbachev)

Implement CUBE and ROLLUP operators in Hive
-------------------------------------------

Key: HIVE-3433
URL: https://issues.apache.org/jira/browse/HIVE-3433
Project: Hive
Issue Type: New Feature
Components: Query Processor
Reporter: Sambavi Muthukrishnan
Assignee: Namit Jain
Attachments: hive.3433.1.patch, hive.3433.2.patch, hive.3433.3.patch, hive-3433.4.patch
[jira] [Commented] (HIVE-3524) Storing certain Exception objects thrown in HiveMetaStore.java in MetaStoreEndFunctionContext
[ https://issues.apache.org/jira/browse/HIVE-3524?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13482654#comment-13482654 ]

Kevin Wilfong commented on HIVE-3524:
-------------------------------------

Maheshwaran, in the future, could you post a link to the Phabricator diff here and mark the JIRA "Patch Available" once you have a diff ready for review? Could you also attach a file containing the patch to the JIRA? See https://cwiki.apache.org/confluence/display/Hive/HowToContribute#HowToContribute, particularly the sections on creating a patch and contributing your work.

Storing certain Exception objects thrown in HiveMetaStore.java in MetaStoreEndFunctionContext
----------------------------------------------------------------------------------------------

Key: HIVE-3524
URL: https://issues.apache.org/jira/browse/HIVE-3524
Project: Hive
Issue Type: Improvement
Components: Metastore
Affects Versions: 0.10.0
Reporter: Maheshwaran Srinivasan
Assignee: Maheshwaran Srinivasan
Priority: Minor
Fix For: 0.10.0
Original Estimate: 168h
Remaining Estimate: 168h

The idea is to store exception objects thrown in the scope of the public functions of HiveMetaStore.java. These exceptions will be stored in MetaStoreEndFunctionContext and can then optionally be processed to gather interesting statistics.
[jira] [Updated] (HIVE-3433) Implement CUBE and ROLLUP operators in Hive
[ https://issues.apache.org/jira/browse/HIVE-3433?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ivan Gorbachev updated HIVE-3433:
---------------------------------

Attachment: hive-3433.4.patch

Implement CUBE and ROLLUP operators in Hive
-------------------------------------------

Key: HIVE-3433
URL: https://issues.apache.org/jira/browse/HIVE-3433
Project: Hive
Issue Type: New Feature
Components: Query Processor
Reporter: Sambavi Muthukrishnan
Assignee: Namit Jain
Attachments: hive.3433.1.patch, hive.3433.2.patch, hive.3433.3.patch, hive-3433.4.patch
Build failed in Jenkins: Hive-0.9.1-SNAPSHOT-h0.21 #176
See https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/176/

--
[...truncated 36580 lines...]
    [junit] POSTHOOK: query: select count(1) as cnt from testhivedrivertable
    [junit] POSTHOOK: type: DROPTABLE
    [junit] POSTHOOK: Input: default@testhivedrivertable
    [junit] POSTHOOK: Output: file:/tmp/hudson/hive_2012-10-23_15-12-40_195_6941679871777973082/-mr-1
    [junit] OK
    [junit] PREHOOK: query: drop table testhivedrivertable
    [junit] PREHOOK: type: DROPTABLE
    [junit] PREHOOK: Input: default@testhivedrivertable
    [junit] PREHOOK: Output: default@testhivedrivertable
    [junit] POSTHOOK: query: drop table testhivedrivertable
    [junit] POSTHOOK: type: DROPTABLE
    [junit] POSTHOOK: Input: default@testhivedrivertable
    [junit] POSTHOOK: Output: default@testhivedrivertable
    [junit] OK
    [junit] Hive history file=https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/176/artifact/hive/build/service/tmp/hive_job_log_hudson_201210231512_1636851391.txt
    [junit] PREHOOK: query: drop table testhivedrivertable
    [junit] PREHOOK: type: DROPTABLE
    [junit] POSTHOOK: query: drop table testhivedrivertable
    [junit] POSTHOOK: type: DROPTABLE
    [junit] OK
    [junit] PREHOOK: query: create table testhivedrivertable (num int)
    [junit] PREHOOK: type: DROPTABLE
    [junit] POSTHOOK: query: create table testhivedrivertable (num int)
    [junit] POSTHOOK: type: DROPTABLE
    [junit] POSTHOOK: Output: default@testhivedrivertable
    [junit] OK
    [junit] Copying file: https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/ws/hive/data/files/kv1.txt
    [junit] PREHOOK: query: load data local inpath 'https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/ws/hive/data/files/kv1.txt' into table testhivedrivertable
    [junit] PREHOOK: type: DROPTABLE
    [junit] PREHOOK: Output: default@testhivedrivertable
    [junit] Copying data from https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/ws/hive/data/files/kv1.txt
    [junit] Loading data to table default.testhivedrivertable
    [junit] POSTHOOK: query: load data local inpath 'https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/ws/hive/data/files/kv1.txt' into table testhivedrivertable
    [junit] POSTHOOK: type: DROPTABLE
    [junit] POSTHOOK: Output: default@testhivedrivertable
    [junit] OK
    [junit] PREHOOK: query: select * from testhivedrivertable limit 10
    [junit] PREHOOK: type: DROPTABLE
    [junit] PREHOOK: Input: default@testhivedrivertable
    [junit] PREHOOK: Output: file:/tmp/hudson/hive_2012-10-23_15-12-45_293_7521132247125647168/-mr-1
    [junit] POSTHOOK: query: select * from testhivedrivertable limit 10
    [junit] POSTHOOK: type: DROPTABLE
    [junit] POSTHOOK: Input: default@testhivedrivertable
    [junit] POSTHOOK: Output: file:/tmp/hudson/hive_2012-10-23_15-12-45_293_7521132247125647168/-mr-1
    [junit] OK
    [junit] PREHOOK: query: drop table testhivedrivertable
    [junit] PREHOOK: type: DROPTABLE
    [junit] PREHOOK: Input: default@testhivedrivertable
    [junit] PREHOOK: Output: default@testhivedrivertable
    [junit] POSTHOOK: query: drop table testhivedrivertable
    [junit] POSTHOOK: type: DROPTABLE
    [junit] POSTHOOK: Input: default@testhivedrivertable
    [junit] POSTHOOK: Output: default@testhivedrivertable
    [junit] OK
    [junit] Hive history file=https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/176/artifact/hive/build/service/tmp/hive_job_log_hudson_201210231512_1842180651.txt
    [junit] PREHOOK: query: drop table testhivedrivertable
    [junit] PREHOOK: type: DROPTABLE
    [junit] POSTHOOK: query: drop table testhivedrivertable
    [junit] POSTHOOK: type: DROPTABLE
    [junit] OK
    [junit] PREHOOK: query: create table testhivedrivertable (num int)
    [junit] PREHOOK: type: DROPTABLE
    [junit] POSTHOOK: query: create table testhivedrivertable (num int)
    [junit] POSTHOOK: type: DROPTABLE
    [junit] POSTHOOK: Output: default@testhivedrivertable
    [junit] OK
    [junit] PREHOOK: query: drop table testhivedrivertable
    [junit] PREHOOK: type: DROPTABLE
    [junit] PREHOOK: Input: default@testhivedrivertable
    [junit] PREHOOK: Output: default@testhivedrivertable
    [junit] POSTHOOK: query: drop table testhivedrivertable
    [junit] POSTHOOK: type: DROPTABLE
    [junit] POSTHOOK: Input: default@testhivedrivertable
    [junit] POSTHOOK: Output: default@testhivedrivertable
    [junit] OK
    [junit] Hive history file=https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/176/artifact/hive/build/service/tmp/hive_job_log_hudson_201210231512_520239753.txt
    [junit] Hive history file=https://builds.apache.org/job/Hive-0.9.1-SNAPSHOT-h0.21/176/artifact/hive/build/service/tmp/hive_job_log_hudson_201210231512_1211097336.txt
    [junit] PREHOOK: query: drop table testhivedrivertable
    [junit] PREHOOK: type: DROPTABLE
    [junit] POSTHOOK: query: drop table testhivedrivertable
    [junit] POSTHOOK:
[jira] [Updated] (HIVE-3433) Implement CUBE and ROLLUP operators in Hive
[ https://issues.apache.org/jira/browse/HIVE-3433?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ivan Gorbachev updated HIVE-3433:
---------------------------------

Attachment: hive-3433.5.patch

Implement CUBE and ROLLUP operators in Hive
-------------------------------------------

Key: HIVE-3433
URL: https://issues.apache.org/jira/browse/HIVE-3433
Project: Hive
Issue Type: New Feature
Components: Query Processor
Reporter: Sambavi Muthukrishnan
Assignee: Namit Jain
Attachments: hive.3433.1.patch, hive.3433.2.patch, hive.3433.3.patch, hive-3433.4.patch, hive-3433.5.patch
[jira] [Commented] (HIVE-3610) Add a command Explain dependency ...
[ https://issues.apache.org/jira/browse/HIVE-3610?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13482841#comment-13482841 ]

Sambavi Muthukrishnan commented on HIVE-3610:
---------------------------------------------

Diff is at https://reviews.facebook.net/D6159

Add a command Explain dependency ...
------------------------------------

Key: HIVE-3610
URL: https://issues.apache.org/jira/browse/HIVE-3610
Project: Hive
Issue Type: New Feature
Components: Query Processor
Affects Versions: 0.9.0
Reporter: Sambavi Muthukrishnan
Assignee: Sambavi Muthukrishnan
Priority: Minor

Add a new command EXPLAIN DEPENDENCY. Any query can be passed to EXPLAIN DEPENDENCY, as with EXPLAIN (FORMATTED/EXTENDED). The output of this command is JSON giving the list of tables and partitions that the query depends on.

One possible use case is to determine the set of tables/views used by a view, and the set of partitions used by a given query on that view. This allows a view to be replicated from one Hive instance to another, since we can determine the set of objects that need to be replicated for replication of the view to succeed.

Example output:
{input_tables:[{tablename: default@test_sambavi_v2, tabletype: EXTERNAL_TABLE}, {tablename: default@test_sambavi_v1, tabletype: TABLE}], input partitions:[default@srcpart@ds=2008-04-08/hr=11, default@srcpart@ds=2008-04-08/hr=12]}
[jira] [Updated] (HIVE-3610) Add a command Explain dependency ...
[ https://issues.apache.org/jira/browse/HIVE-3610?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sambavi Muthukrishnan updated HIVE-3610:
----------------------------------------

Attachment: explain_dependency.1.patch

Add a command Explain dependency ...
------------------------------------

Key: HIVE-3610
URL: https://issues.apache.org/jira/browse/HIVE-3610
Project: Hive
Issue Type: New Feature
Components: Query Processor
Affects Versions: 0.9.0
Reporter: Sambavi Muthukrishnan
Assignee: Sambavi Muthukrishnan
Priority: Minor
Attachments: explain_dependency.1.patch

Add a new command EXPLAIN DEPENDENCY. Any query can be passed to EXPLAIN DEPENDENCY, as with EXPLAIN (FORMATTED/EXTENDED). The output of this command is JSON giving the list of tables and partitions that the query depends on.

One possible use case is to determine the set of tables/views used by a view, and the set of partitions used by a given query on that view. This allows a view to be replicated from one Hive instance to another, since we can determine the set of objects that need to be replicated for replication of the view to succeed.

Example output:
{input_tables:[{tablename: default@test_sambavi_v2, tabletype: EXTERNAL_TABLE}, {tablename: default@test_sambavi_v1, tabletype: TABLE}], input partitions:[default@srcpart@ds=2008-04-08/hr=11, default@srcpart@ds=2008-04-08/hr=12]}
[jira] [Updated] (HIVE-3610) Add a command Explain dependency ...
[ https://issues.apache.org/jira/browse/HIVE-3610?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sambavi Muthukrishnan updated HIVE-3610:
----------------------------------------

Status: Patch Available (was: Open)

Add a command Explain dependency ...
------------------------------------

Key: HIVE-3610
URL: https://issues.apache.org/jira/browse/HIVE-3610
Project: Hive
Issue Type: New Feature
Components: Query Processor
Affects Versions: 0.9.0
Reporter: Sambavi Muthukrishnan
Assignee: Sambavi Muthukrishnan
Priority: Minor
Attachments: explain_dependency.1.patch

Add a new command EXPLAIN DEPENDENCY. Any query can be passed to EXPLAIN DEPENDENCY, as with EXPLAIN (FORMATTED/EXTENDED). The output of this command is JSON giving the list of tables and partitions that the query depends on.

One possible use case is to determine the set of tables/views used by a view, and the set of partitions used by a given query on that view. This allows a view to be replicated from one Hive instance to another, since we can determine the set of objects that need to be replicated for replication of the view to succeed.

Example output:
{input_tables:[{tablename: default@test_sambavi_v2, tabletype: EXTERNAL_TABLE}, {tablename: default@test_sambavi_v1, tabletype: TABLE}], input partitions:[default@srcpart@ds=2008-04-08/hr=11, default@srcpart@ds=2008-04-08/hr=12]}
Hive-trunk-h0.21 - Build # 1753 - Still Failing
Changes for Build #1747

Changes for Build #1748
[namit] HIVE-3544 union involving double column with a map join subquery will fail or give wrong results (Kevin Wilfong via namit)
[cws] HIVE-3590. TCP KeepAlive and connection timeout for the HiveServer (Esteban Gutierrez via cws)

Changes for Build #1749

Changes for Build #1750
[ecapriolo] HIVE-3599 missing return of compression codec to pool (Owen O'Malley via egc)

Changes for Build #1751

Changes for Build #1752

Changes for Build #1753

7 tests failed.

REGRESSION: org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_script_broken_pipe1

Error Message:
Unexpected exception
See build/ql/tmp/hive.log, or try ant test ... -Dtest.silent=false to get more logs.

Stack Trace:
junit.framework.AssertionFailedError: Unexpected exception
See build/ql/tmp/hive.log, or try ant test ... -Dtest.silent=false to get more logs.
    at junit.framework.Assert.fail(Assert.java:47)
    at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_script_broken_pipe1(TestNegativeCliDriver.java:11319)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at junit.framework.TestCase.runTest(TestCase.java:168)
    at junit.framework.TestCase.runBare(TestCase.java:134)
    at junit.framework.TestResult$1.protect(TestResult.java:110)
    at junit.framework.TestResult.runProtected(TestResult.java:128)
    at junit.framework.TestResult.run(TestResult.java:113)
    at junit.framework.TestCase.run(TestCase.java:124)
    at junit.framework.TestSuite.runTest(TestSuite.java:232)
    at junit.framework.TestSuite.run(TestSuite.java:227)
    at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:422)
    at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:931)
    at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:785)

REGRESSION: org.apache.hadoop.hive.ql.exec.TestStatsPublisherEnhanced.testStatsPublisherOneStat

Error Message:
null

Stack Trace:
junit.framework.AssertionFailedError: null
    at junit.framework.Assert.fail(Assert.java:47)
    at junit.framework.Assert.assertTrue(Assert.java:20)
    at junit.framework.Assert.assertTrue(Assert.java:27)
    at org.apache.hadoop.hive.ql.exec.TestStatsPublisherEnhanced.testStatsPublisherOneStat(TestStatsPublisherEnhanced.java:81)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at junit.framework.TestCase.runTest(TestCase.java:168)
    at junit.framework.TestCase.runBare(TestCase.java:134)
    at junit.framework.TestResult$1.protect(TestResult.java:110)
    at junit.framework.TestResult.runProtected(TestResult.java:128)
    at junit.framework.TestResult.run(TestResult.java:113)
    at junit.framework.TestCase.run(TestCase.java:124)
    at junit.framework.TestSuite.runTest(TestSuite.java:232)
    at junit.framework.TestSuite.run(TestSuite.java:227)
    at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:79)
    at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:422)
    at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:931)
    at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:785)

REGRESSION: org.apache.hadoop.hive.ql.exec.TestStatsPublisherEnhanced.testStatsPublisher

Error Message:
null

Stack Trace:
junit.framework.AssertionFailedError: null
    at junit.framework.Assert.fail(Assert.java:47)
    at junit.framework.Assert.assertTrue(Assert.java:20)
    at junit.framework.Assert.assertTrue(Assert.java:27)
    at org.apache.hadoop.hive.ql.exec.TestStatsPublisherEnhanced.testStatsPublisher(TestStatsPublisherEnhanced.java:129)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at junit.framework.TestCase.runTest(TestCase.java:168)
    at junit.framework.TestCase.runBare(TestCase.java:134)
    at