[
https://issues.apache.org/jira/browse/HIVE-1157?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12915619#action_12915619
]
Namit Jain commented on HIVE-1157:
----------------------------------
The changes looked good, but I got the following error:
[junit] Begin query: alter1.q
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I
lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I
Location -I transient_lastDdlTime -I last_modified_ -I
java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused
by: -I [.][.][.] [0-9]* more
/data/users/njain/hive_commit2/hive_commit2/build/ql/test/logs/clientpositive/alter1.q.out
/data/users/njain/hive_commit2/hive_commit2/ql/src/test/results/clientpositive/alter1.q.out
[junit] 778d777
[junit] < Resource ../data/files/TestSerDe.jar already added.
Philip, can you take care of that?
> UDFs can't be loaded via "add jar" when jar is on HDFS
> ------------------------------------------------------
>
> Key: HIVE-1157
> URL: https://issues.apache.org/jira/browse/HIVE-1157
> Project: Hadoop Hive
> Issue Type: Improvement
> Components: Query Processor
> Reporter: Philip Zeyliger
> Priority: Minor
> Attachments: hive-1157.patch.txt, HIVE-1157.patch.v3.txt,
> HIVE-1157.patch.v4.txt, HIVE-1157.patch.v5.txt, HIVE-1157.v2.patch.txt,
> output.txt
>
>
> As discussed on the mailing list, it would be nice to be able to use UDFs
> whose jars live on HDFS. The proposed implementation is for "add jar" to
> recognize that the target file is on HDFS, copy it locally, and load it into
> the classpath.
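> A minimal sketch of that flow (hypothetical helper names, not the code in the
> attached patches): check the URI scheme, and if the jar is not already local,
> copy it down with the Hadoop FileSystem API before handing the local path to
> the existing classpath logic.
> {code:java}
> import java.io.File;
> import java.net.URI;
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
>
> // Hypothetical helper: if the jar URI is remote (e.g. hdfs://), copy it to a
> // local temp file and return the local path; local paths pass through as-is.
> public class AddJarHelper {
>   public static String downloadIfRemote(String jarUri, Configuration conf)
>       throws Exception {
>     URI uri = URI.create(jarUri);
>     String scheme = uri.getScheme();
>     if (scheme == null || "file".equals(scheme)) {
>       return jarUri;                  // already local, nothing to do
>     }
>     FileSystem fs = FileSystem.get(uri, conf);
>     File local = File.createTempFile("hive-add-jar-", ".jar");
>     local.delete();                   // let copyToLocalFile create it fresh
>     fs.copyToLocalFile(new Path(uri), new Path(local.getAbsolutePath()));
>     return local.getAbsolutePath();
>   }
> }
> {code}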
> {quote}
> Hi folks,
> I have a quick question about UDF support in Hive. I'm on the 0.5 branch.
> Can you use a UDF when the jar that contains the function is on HDFS rather
> than on the local filesystem? Specifically, the following does not seem to
> work:
> # This is Hive 0.5, from svn
> $bin/hive
> Hive history file=/tmp/philip/hive_job_log_philip_201002081541_370227273.txt
> hive> add jar hdfs://localhost/FooTest.jar;
>
> Added hdfs://localhost/FooTest.jar to class path
> hive> create temporary function cube as 'com.cloudera.FooTestUDF';
>
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.FunctionTask
> Does this work for other people? I could probably fix it by changing "add
> jar" to download remote jars locally, when necessary, so they can be loaded
> into the classpath (a sketch of the classpath step follows this quote), or by
> updating URLClassLoader (or whatever is underneath there) to read directly
> from HDFS, which seems a bit more fragile. But I wanted to make sure that my
> interpretation of what's going on is right before I have at it.
> Thanks,
> -- Philip
> {quote}
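> For the classpath step mentioned above, a rough sketch (again illustrative,
> not the attached patch) of pushing an already-local jar into the session's
> classloader:
> {code:java}
> import java.io.File;
> import java.net.URL;
> import java.net.URLClassLoader;
>
> // Illustrative sketch: wrap the current context classloader in a
> // URLClassLoader that also sees the downloaded jar, so later Class.forName
> // lookups (e.g. the UDF class named in CREATE TEMPORARY FUNCTION) resolve it.
> public class JarClassPathLoader {
>   public static void addToClassPath(File localJar) throws Exception {
>     URL jarUrl = localJar.toURI().toURL();
>     ClassLoader parent = Thread.currentThread().getContextClassLoader();
>     URLClassLoader loader = new URLClassLoader(new URL[] { jarUrl }, parent);
>     Thread.currentThread().setContextClassLoader(loader);
>   }
> }
> {code}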
> {quote}
> Yes that's correct. I prefer to download the jars in "add jar".
> Zheng
> {quote}