Neelesh,

Will it be possible for you to share your code?
Looks like in your UDF you are not handling NULL input values (see the sketch at the end of this thread).

Thanks,
Nitin

On Thu, Feb 14, 2013 at 10:22 PM, neelesh gadhia <ngad...@yahoo.com> wrote:

> Hi Dean,
>
> Thanks for your response. I reviewed the stack trace. As you mentioned, the error shows up at
> org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:137)
>
> But this is probably a Java class that comes with Hadoop 1.1.1 and is untouched. Do you think there is a bug in this Java class for Hadoop 1.1.1?
>
> Or could the way I created the UDF, using the file downloaded from
> https://issues.apache.org/jira/browse/HIVE-2361, be causing the issue?
>
> I read a few forums that indicate the class may not be in the classpath for Hadoop, although I confirmed that is not the case.
>
> Any further pointers or advice are appreciated.
>
> Thanks,
> Neelesh
>
> ------------------------------
> *From:* Dean Wampler <dean.wamp...@thinkbiganalytics.com>
> *To:* user@hive.apache.org; neelesh gadhia <ngad...@yahoo.com>
> *Sent:* Thursday, February 14, 2013 6:41 AM
> *Subject:* Re:
>
> According to your stack trace, you have a NullPointerException on line 137 of your UDF.
>
> On Thu, Feb 14, 2013 at 2:28 AM, neelesh gadhia <ngad...@yahoo.com> wrote:
>
> Hello,
>
> I am a newbie to using UDFs on Hive, but I implemented these GenericUDFs
> (https://issues.apache.org/jira/browse/HIVE-2361) on Hive 0.9.0 and Hadoop 1.1.1 and was able to add the jar to Hive:
>
> hive> select * from emp;
> OK
> 1 10 1000
> 2 10 1200
> 3 12 1500
> 4 12 300
> 5 12 1800
> 6 20 5000
> 7 20 7000
> 8 20 10000
> Time taken: 0.191 seconds
>
> hive> add jar /usr/local/Cellar/hive/0.9.0/libexec/lib/GenUDF.jar;
> Added /usr/local/Cellar/hive/0.9.0/libexec/lib/GenUDF.jar to class path
> Added resource: /usr/local/Cellar/hive/0.9.0/libexec/lib/GenUDF.jar
>
> hive> create temporary function nexr_sum as 'com.nexr.platform.analysis.udf.GenericUDFSum';
> OK
> Time taken: 0.012 seconds
>
> and ran the sample SQL shown below:
>
> SELECT t.empno, t.deptno, t.sal, nexr_sum(hash(t.deptno), t.sal) as sal_sum
> FROM (
>   select a.empno, a.deptno, a.sal from emp a
>   distribute by hash(a.deptno)
>   sort by a.deptno, a.empno
> ) t;
>
> The SQL failed with the errors below. Any pointers or advice towards resolving this are much appreciated.
> 2013-02-13 23:30:18,925 INFO org.apache.hadoop.mapred.JobTracker: Adding task (REDUCE) 'attempt_201302132324_0002_r_000000_3' to tip task_201302132324_0002_r_000000, for tracker 'tracker_192.168.0.151:localhost/127.0.0.1:50099'
> 2013-02-13 23:30:18,925 INFO org.apache.hadoop.mapred.JobTracker: Removing task 'attempt_201302132324_0002_r_000000_2'
> 2013-02-13 23:30:26,484 INFO org.apache.hadoop.mapred.TaskInProgress: Error from attempt_201302132324_0002_r_000000_3:
> java.lang.RuntimeException: Error in configuring object
>     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>     at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:486)
>     at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:421)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1136)
>     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
>     ... 9 more
> Caused by: java.lang.RuntimeException: Reduce operator initialization failed
>     at org.apache.hadoop.hive.ql.exec.ExecReducer.configure(ExecReducer.java:157)
>     ... 14 more
> Caused by: java.lang.NullPointerException
>     at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:137)
>     at org.apache.hadoop.hive.ql.exec.Operator.initEvaluators(Operator.java:896)
>     at org.apache.hadoop.hive.ql.exec.Operator.initEvaluatorsAndReturnStruct(Operator.java:922)
>     at org.apache.hadoop.hive.ql.exec.SelectOperator.initializeOp(SelectOperator.java:60)
>     at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:357)
>     at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:433)
>     at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:389)
>     at org.apache.hadoop.hive.ql.exec.ExtractOperator.initializeOp(ExtractOperator.java:40)
>     at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:357)
>     at org.apache.hadoop.hive.ql.exec.ExecReducer.configure(ExecReducer.java:150)
>     ... 14 more
> 2013-02-13 23:30:29,819 INFO org.apache.hadoop.mapred.TaskInProgress: TaskInProgress task_201302132324_0002_r_000000 has failed 4 times.
> 2013-02-13 23:30:29,820 INFO org.apache.hadoop.mapred.JobInProgress: TaskTracker at '192.168.0.151' turned 'flaky'
> .... 12 more lines..
>
> Tried a different function, "GenericUDFMax"; same error.
>
> Any pointers/advice on what could be wrong?
>
> --
> *Dean Wampler, Ph.D.*
> thinkbiganalytics.com
> +1-312-339-1330

--
Nitin Pawar
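For reference, here is a minimal sketch of the kind of NULL handling Nitin suggests, written against the Hive 0.9 GenericUDF API. The package, class name (NullSafeRunningSum), function name, and jar path are placeholders, and this is not the HIVE-2361 GenericUDFSum: unlike that UDF it keeps a single running total instead of resetting per hash(deptno) group. One hedged note on the stack trace: an NPE during expression-evaluator initialization can occur when a UDF's initialize() returns null or mishandles its ObjectInspector arguments, so the sketch always returns a non-null ObjectInspector and rejects bad arguments with a UDFArgumentException; the actual root cause in this job may of course be different.

package com.example.hive.udf;

import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.serde2.io.DoubleWritable;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.DoubleObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

// Hypothetical illustration only: a NULL-tolerant running sum over the rows a
// reducer sees (relies on the distribute by / sort by pattern from the query above).
public class NullSafeRunningSum extends GenericUDF {

  private DoubleObjectInspector valueOI;
  private final DoubleWritable result = new DoubleWritable(0);
  private double total = 0;

  @Override
  public ObjectInspector initialize(ObjectInspector[] arguments)
      throws UDFArgumentException {
    // Fail fast with a clear message instead of letting a missing or null
    // inspector surface later as a NullPointerException.
    if (arguments == null || arguments.length != 1 || arguments[0] == null) {
      throw new UDFArgumentLengthException(
          "nullsafe_running_sum expects exactly one DOUBLE argument");
    }
    if (!(arguments[0] instanceof DoubleObjectInspector)) {
      throw new UDFArgumentException(
          "nullsafe_running_sum expects a DOUBLE argument, got "
          + arguments[0].getTypeName());
    }
    valueOI = (DoubleObjectInspector) arguments[0];
    // Always return a non-null ObjectInspector.
    return PrimitiveObjectInspectorFactory.writableDoubleObjectInspector;
  }

  @Override
  public Object evaluate(DeferredObject[] arguments) throws HiveException {
    // NULL input rows are treated as 0 rather than dereferenced.
    if (arguments[0] != null && arguments[0].get() != null) {
      total += valueOI.get(arguments[0].get());
    }
    result.set(total);
    return result;
  }

  @Override
  public String getDisplayString(String[] children) {
    return "nullsafe_running_sum(" + (children.length > 0 ? children[0] : "") + ")";
  }
}

Packaged into a jar, it would be registered and called the same way as in the thread (paths and names are again placeholders, and sal is cast because the sketch only accepts DOUBLE):

hive> add jar /path/to/nullsafe-udf.jar;
hive> create temporary function nullsafe_running_sum as 'com.example.hive.udf.NullSafeRunningSum';
hive> SELECT t.empno, t.deptno, t.sal, nullsafe_running_sum(cast(t.sal as double)) as sal_sum
      FROM (select a.empno, a.deptno, a.sal from emp a
            distribute by hash(a.deptno)
            sort by a.deptno, a.empno) t;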