Problem loading new UDTF in local hive copy

2013-08-05 Thread nikolaus . stahl

Hi,

I'm trying to compile Hive with a new UDTF and have been following the  
wiki instructions  
(https://cwiki.apache.org/confluence/display/Hive/GenericUDAFCaseStudy).


I've added my new function to the function registry and have  
successfully updated show_functions.q.out. However, when I recompile  
and start my local copy of Hive with build/dist/bin/hive, the "show  
functions;" command still does not list my new function. Any thoughts  
on what I'm missing? Sorry if this is a naive question.


Thanks for your help,
Niko



Re: TestCliDriver Failed Test

2013-07-31 Thread nikolaus . stahl
Thanks Noland. I tried that, but now I'm getting more errors (see  
below). It seems that the Java compiler isn't recognizing the packages  
for this test. Here's the relevant output after running the same test  
as before with the very-clean target (i.e.: ant very-clean test  
-Dtestcase=TestCliDriver -Dqfile=show_functions.q -Doverwrite=true):


set-test-classpath:

compile-test:
     [echo] Project: ql
    [javac] Compiling 105 source files to /Users/niko/Repos/hive-trunk/build/ql/test/classes
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:21: package org.apache.hadoop.hive.metastore does not exist
    [javac] import static org.apache.hadoop.hive.metastore.MetaStoreUtils.DEFAULT_DATABASE_NAME;
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:21: static import only from classes and interfaces
    [javac] import static org.apache.hadoop.hive.metastore.MetaStoreUtils.DEFAULT_DATABASE_NAME;
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:55: package org.apache.hadoop.hive.cli does not exist
    [javac] import org.apache.hadoop.hive.cli.CliDriver;
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:56: package org.apache.hadoop.hive.cli does not exist
    [javac] import org.apache.hadoop.hive.cli.CliSessionState;
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:57: package org.apache.hadoop.hive.common.io does not exist
    [javac] import org.apache.hadoop.hive.common.io.CachingPrintStream;
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:58: package org.apache.hadoop.hive.conf does not exist
    [javac] import org.apache.hadoop.hive.conf.HiveConf;
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:59: package org.apache.hadoop.hive.metastore does not exist
    [javac] import org.apache.hadoop.hive.metastore.MetaStoreUtils;
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:60: package org.apache.hadoop.hive.metastore.api does not exist
    [javac] import org.apache.hadoop.hive.metastore.api.Index;
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:76: package org.apache.hadoop.hive.serde does not exist
    [javac] import org.apache.hadoop.hive.serde.serdeConstants;
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:77: package org.apache.hadoop.hive.serde2.thrift does not exist
    [javac] import org.apache.hadoop.hive.serde2.thrift.ThriftDeserializer;
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:78: package org.apache.hadoop.hive.serde2.thrift.test does not exist
    [javac] import org.apache.hadoop.hive.serde2.thrift.test.Complex;
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:79: package org.apache.hadoop.hive.shims does not exist
    [javac] import org.apache.hadoop.hive.shims.HadoopShims;
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:80: package org.apache.hadoop.hive.shims does not exist
    [javac] import org.apache.hadoop.hive.shims.ShimLoader;
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:112: cannot find symbol
    [javac] symbol  : class HiveConf
    [javac] location: class org.apache.hadoop.hive.ql.QTestUtil
    [javac]   protected HiveConf conf;
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:117: cannot find symbol
    [javac] symbol  : class CliDriver
    [javac] location: class org.apache.hadoop.hive.ql.QTestUtil
    [javac]   private CliDriver cliDriver;
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:118: package HadoopShims does not exist
    [javac]   private HadoopShims.MiniMrShim mr = null;
    [javac] /Users/niko/Repos/hive-trunk/ql/src/test/org/apache/hadoop/hive/ql/QTe

TestCliDriver Failed Test

2013-07-31 Thread nikolaus . stahl

Hi,

When running the following command:

ant test -Dtestcase=TestCliDriver -Dqfile=show_functions.q -Doverwrite=true

on a clean hive-trunk checkout, I get the following failed test:

test:
     [echo] Project: ql
    [junit] WARNING: multiple versions of ant detected in path for junit
    [junit]          jar:file:/usr/share/ant/lib/ant.jar!/org/apache/tools/ant/Project.class
    [junit]      and jar:file:/Users/niko/Repos/hive-trunk/build/ivy/lib/hadoop0.20S.shim/ant-1.6.5.jar!/org/apache/tools/ant/Project.class
    [junit] Hive history file=/Users/niko/Repos/hive-trunk/build/ql/tmp/hive_job_log_604cbdc7-f546-4a74-bba2-43f7c2885811_1343059998.txt
    [junit] 2013-07-31 07:19:49.366 java[15847:1203] Unable to load realm info from SCDynamicStore
    [junit] Exception: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
    [junit] Running org.apache.hadoop.hive.cli.TestCliDriver
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:875)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:851)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:513)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.(TestCliDriver.java:48)
    [junit] 	at java.lang.Class.forName0(Native Method)
    [junit] 	at java.lang.Class.forName(Class.java:171)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:373)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:1052)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:906)
    [junit] Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
    [junit] 	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1212)
    [junit] 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.(RetryingMetaStoreClient.java:51)
    [junit] 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:61)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2357)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2368)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:869)
    [junit] 	... 8 more
    [junit] Caused by: java.lang.reflect.InvocationTargetException
    [junit] 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    [junit] 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    [junit] 	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    [junit] 	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1210)
    [junit] 	... 13 more
    [junit] Caused by: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
    [junit] NestedThrowables:
    [junit] java.lang.reflect.InvocationTargetException
    [junit] 	at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1193)
    [junit] 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    [junit] 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    [junit] 	at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:266)
    [junit] 	at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:295)
    [junit] 	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:228)
    [junit] 	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:203)
    [junit] 	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
    [junit] 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    [junit] 	at org.apache.hadoop.hive.metastore.RetryingRawStore.(RetryingRawStore.java:62)
    [junit] 	at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:414)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:402)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:440)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:326)
 

Extending Explode

2013-07-26 Thread nikolaus . stahl

Hi,

I'd like to make a patch that extends the functionality of "explode"  
to include an output column with the position of each item in the  
original array.
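
To illustrate the intended behavior, here is a minimal sketch in plain Python of the row-level semantics (the function name, output shape, and column order are hypothetical and up for discussion, not Hive's actual API):

```python
# Sketch of the proposed explode-with-position semantics.
# Hypothetical: the real UDTF's name, signature, and output column
# order would be decided in the patch discussion.
def explode_with_pos(array):
    """Yield one (position, item) row per element of the input array."""
    for pos, item in enumerate(array):
        yield (pos, item)

# Each array element becomes one output row, paired with its 0-based index:
rows = list(explode_with_pos(["a", "b", "c"]))
# → [(0, "a"), (1, "b"), (2, "c")]
```

The difference from plain explode is only the extra position column; an empty array still produces no rows.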


I imagine this could be useful to the greater community and am  
wondering whether I should extend the current explode function or  
write a completely new function. Any thoughts on which would be more  
useful and more likely to be accepted into hive-trunk would be  
greatly appreciated.


Thanks,
Niko