Thanks for your advice, Charles.

First, I did have drill-module.conf in src/main/resources, which is the reason 
that it’s in the root of my jar file.  I thought it would be better to check 
the jar file itself instead of my project.

I did back out all my UDF code and reached a point where it was recognized.  I 
even got it to work where it just appends one string to the input.  What I am 
trying to do now is to encrypt the input with a cipher.  Drill is p***ing me 
off with how it never works and never delivers a useful error message in the 
logs when this fails.  I believe it was a poor choice to use runtime 
compilation and off-heap memory for this.

At first I was depending on an external jar to do this, but in my 
troubleshooting, I decided to copy in the core logic.
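For reference, a self-contained sketch of what that copied-in core logic might 
look like (the AES/ECB transformation, the Base64 wrapping, and the key handling 
here are assumptions made only to keep the sketch runnable outside Drill; the 
real cipher may differ).  One Drill-specific caveat worth keeping in mind: 
because Drill compiles UDF sources at runtime, class references inside setup() 
and eval() generally need to be fully qualified rather than imported, so logic 
like this usually gets inlined with fully-qualified names when moved into a UDF 
body.

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class CipherSketch {
    // Encrypt a string with a 16-byte AES key and return Base64 text,
    // so the result is safe to hand back as a VarChar.
    static String encrypt(String input, byte[] key16) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key16, "AES"));
        byte[] encrypted = cipher.doFinal(input.getBytes(StandardCharsets.UTF_8));
        return Base64.getEncoder().encodeToString(encrypted);
    }

    // Inverse of encrypt(), used here only to verify the round trip.
    static String decrypt(String base64, byte[] key16) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
        cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(key16, "AES"));
        byte[] decrypted = cipher.doFinal(Base64.getDecoder().decode(base64));
        return new String(decrypted, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        byte[] key = "0123456789abcdef".getBytes(StandardCharsets.UTF_8);
        String roundTrip = decrypt(encrypt("hello drill", key), key);
        System.out.println(roundTrip); // prints "hello drill"
    }
}
```

Proving the cipher works standalone like this, before wiring it into eval(), 
separates crypto bugs from Drill's runtime-compilation failures.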

I attached the latest code I have tried that is failing.

Michael Knapp

On 5/9/17, 12:13 AM, "Charles Givre" <[email protected]> wrote:

    Hi Michael, 
    I’ve encountered this issue when developing Drill UDFs and sometimes it can 
mean that there is an error in the UDF itself.  What is particularly insidious 
about these kinds of errors is that the UDF will compile and build just fine, 
but when you try to use it in a query, Drill can’t find the function. 
    
    I would recommend first testing the UDF on Drill in embedded mode so that 
you can minimize the number of things that can go wrong. Next, I would comment 
out the entirety of the setup() and eval() functions, build the UDF and see if 
Drill recognizes the function.  If it does, then slowly start uncommenting 
lines to see what is breaking it.  
    
    One other thing: I believe the drill-module.conf is supposed to be in the 
resources folder in your project.  Mine are always in 
<project>/src/main/resources.
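    For reference, a minimal drill-module.conf of the kind described above 
(the package name is a placeholder; use the package that holds your function 
classes):

```
drill.classpath.scanning.packages += "org.example.drill.udfs"
```

    Placing it under src/main/resources is what lands it at the root of the 
built jar, which is where Drill's classpath scanner looks for it.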
    
    Can you share any of your code?
    — C
     
    
    > On May 8, 2017, at 17:10, Knapp, Michael <[email protected]> 
wrote:
    > 
    > I have followed all of the instructions 
here<https://drill.apache.org/docs/tutorial-develop-a-simple-function/> and 
also 
here<https://drill.apache.org/docs/manually-adding-custom-functions-to-drill/> 
as closely as possible, but unfortunately Drill is still not finding my custom 
UDF.
    > 
    > I have checked that:
    > 
    > ·         My source and binary jars are present in jars/3rdparty
    > 
    > ·         My jars both have “drill-module.conf” in their root, and that 
file’s contents are:
    > 
    > o    drill.classpath.scanning.packages += "path.to.my.package"
    > 
    > o    but with my real package, which holds drill functions.
    > 
    > ·          I have removed the drill.exec.udf section from my 
drill-override.conf file.
    > 
    > ·          I have configured my pom to build using ‘jar-no-fork’ like in 
your example.
    > 
    > ·          My function implements DrillSimpleFunc and is annotated with 
FunctionTemplate.  Its scope is simple, and it uses “NULL_IF_NULL”
    > 
    > ·          My function has a NullableVarCharHolder input parameter, a 
NullableVarCharHolder output parameter, and also accepts an @Inject DrillBuf 
parameter.  It is expected to be called with a single string argument.
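    A common pitfall with the holder signature described above is reading the 
whole backing buffer instead of just the slice the holder points at.  The 
runnable stand-in below illustrates the slice logic only; a plain byte[] is 
used in place of DrillBuf (an assumption made so the sketch runs without Drill 
on the classpath), and readSlice is a hypothetical helper, not a Drill API:

```java
import java.nio.charset.StandardCharsets;

public class VarCharSliceSketch {
    // Drill hands eval() a buffer plus start/end offsets via the holder;
    // only the bytes in [start, end) belong to this value.
    static String readSlice(byte[] buffer, int start, int end) {
        byte[] bytes = new byte[end - start];
        System.arraycopy(buffer, start, bytes, 0, end - start);
        return new String(bytes, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // The backing buffer may contain neighboring values' bytes too.
        byte[] backing = "xxhelloyy".getBytes(StandardCharsets.UTF_8);
        System.out.println(readSlice(backing, 2, 7)); // prints "hello"
    }
}
```

    In the real UDF the same idea applies with in.buffer, in.start and in.end, 
and the output string's bytes are written back through the injected DrillBuf.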
    > 
    > 
    > 
    > From the Drill UI, I keep getting this error:
    > 
    > 
    > 
    > VALIDATION ERROR: From line 1, column 8 to line 1, column 29: No match 
found for function signature …()
    > 
    > 
    > 
    > SQL Query null
    > 
    > 
    > 
    > I have tried issuing the query with my function in all caps and also 
lower-case.  In the logs I see that my 3rdparty jar is the first in the list of 
scanning jars, and the appropriate package is listed in the scanning packages.  
The logs indicate that 433 functions were loaded upon startup.  For some reason 
the logs mention loading functions from the Hive UDF jars, but not mine.
    > 
    > 
    > 
    > Other details:
    > 
    > ·          I am running zookeeper separately from Drill, but on the same 
node.  I use drillbit.sh to run, so it’s like a cluster of one.
    > 
    > ·          This is on AWS.
    > 
    > ·          I did have a drill.exec.udf section defined previously, but it 
is not defined in drill-override now.  I wonder if ZK persisted those values 
from a previous run and they are still getting used.
    > 
    > ·          I am not running Hadoop, so there is no HDFS that I can add 
dynamic UDF jars to.
    > 
    > ·          I am using Drill 1.10.
    > 
    > 
    > 
    > I have also tried setting “exec.udf.enable_dynamic_support” to false and 
restarting, but that did not resolve the issue.
    > 
    > 
    > 
    > I have noticed one unrelated problem: the paths used for UDFs on the file 
system do not match what I set in drill-override.conf.  I think Drill is 
prepending them with a temp directory even though I provided an absolute path.
    > 
    > Questions:
    > 
    > 1.  Does anybody know what I am doing wrong?
    > 
    > 2.  Can I use dynamic UDFs without HDFS?
    > 
    > 3.  Are there more troubleshooting techniques I can use here?  How can I 
list all of the known UDFs and their jars?
    > 
    > 
    > 
    > Michael Knapp
    > ________________________________________________________
    > 
    > The information contained in this e-mail is confidential and/or 
proprietary to Capital One and/or its affiliates and may only be used solely in 
performance of work or services for Capital One. The information transmitted 
herewith is intended only for use by the individual or entity to which it is 
addressed. If the reader of this message is not the intended recipient, you are 
hereby notified that any review, retransmission, dissemination, distribution, 
copying or other use of, or taking of any action in reliance upon this 
information is strictly prohibited. If you have received this communication in 
error, please contact the sender and delete the material from your computer.
    
    

