I have followed all of the instructions here (https://drill.apache.org/docs/tutorial-develop-a-simple-function/) and also here (https://drill.apache.org/docs/manually-adding-custom-functions-to-drill/) as closely as possible, but unfortunately Drill is still not finding my custom UDF.

I have checked that:

- My source and binary jars are present in jars/3rdparty.

- Both jars have "drill-module.conf" in their root, and that file's contents are:

    drill.classpath.scanning.packages += "path.to.my.package"

  (but with my real package, which holds my Drill functions).

- I have removed the drill.exec.udf section from my drill-override.conf file.

- I have configured my pom to build the sources jar using 'jar-no-fork', as in your example.

- My function implements DrillSimpleFunc and is annotated with @FunctionTemplate. Its scope is SIMPLE, and it uses NULL_IF_NULL.

- My function has a NullableVarCharHolder input parameter, a NullableVarCharHolder output parameter, and also accepts an @Inject DrillBuf parameter. It is expected to be called with a single string argument.
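For reference, here is a minimal sketch of roughly what my function looks like (package, class, and function names below are placeholders, not my real ones; the eval body shown is just an identity transform):

```java
package path.to.my.package; // placeholder; matches the package in drill-module.conf

import javax.inject.Inject;

import org.apache.drill.exec.expr.DrillSimpleFunc;
import org.apache.drill.exec.expr.annotations.FunctionTemplate;
import org.apache.drill.exec.expr.annotations.Output;
import org.apache.drill.exec.expr.annotations.Param;
import org.apache.drill.exec.expr.holders.NullableVarCharHolder;

import io.netty.buffer.DrillBuf;

@FunctionTemplate(
    name = "my_func", // placeholder name
    scope = FunctionTemplate.FunctionScope.SIMPLE,
    nulls = FunctionTemplate.NullHandling.NULL_IF_NULL)
public class MyFunc implements DrillSimpleFunc {

  @Param  NullableVarCharHolder input;
  @Output NullableVarCharHolder out;
  @Inject DrillBuf buffer;

  public void setup() { }

  public void eval() {
    // Copy the input string into the injected buffer and return it unchanged.
    byte[] bytes = new byte[input.end - input.start];
    input.buffer.getBytes(input.start, bytes);
    buffer = buffer.reallocIfNeeded(bytes.length);
    buffer.setBytes(0, bytes);
    out.buffer = buffer;
    out.start = 0;
    out.end = bytes.length;
    out.isSet = 1;
  }
}
```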



From the Drill UI, I keep getting this error:

VALIDATION ERROR: From line 1, column 8 to line 1, column 29: No match found for function signature …()

SQL Query null

I have tried issuing the query with my function name in all caps and also in lower case. In the logs I see that my 3rdparty jar is first in the list of scanned jars, and that the appropriate package is listed in the scanning packages. The logs indicate that 433 functions were loaded at startup. For some reason the logs mention loading functions from the Hive UDF jars, but not mine.



Other details:

- I am running ZooKeeper separately from Drill, but on the same node. I start Drill with drillbit.sh, so it is effectively a cluster of one.

- This is on AWS.

- I did have a drill.exec.udf section defined previously, but it is not in drill-override.conf now. I wonder if ZooKeeper persisted those values from a previous run and they are still being used.

- I am not running Hadoop, so there is no HDFS that I can add dynamic UDF jars to.

- I am using Drill 1.10.



I have also tried setting "exec.udf.enable_dynamic_support" to false and restarting, but that did not resolve the issue.
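For completeness, I disabled that option like this from sqlline (syntax as I understand it from the Drill docs on system options):

```sql
ALTER SYSTEM SET `exec.udf.enable_dynamic_support` = false;
```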



I have noticed one possibly unrelated problem: the paths used for UDFs on the file system do not match what I set in drill-override.conf. I think Drill is prepending them with a temp directory even though I provided an absolute path.

Questions:

1.  Does anybody know what I am doing wrong?

2.  Can I use dynamic UDFs without HDFS?

3.  Are there more troubleshooting techniques I can use here?  How can I list 
all of the known UDFs and their jars?



Michael Knapp
