[ https://issues.apache.org/jira/browse/SPARK-4811?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Cheng Lian resolved SPARK-4811.
-------------------------------
    Resolution: Duplicate

Although this ticket was opened earlier, I am marking it as a duplicate of 
SPARK-6708, because SPARK-6708 gives clear reproduction steps.

> Custom UDTFs not working in Spark SQL
> -------------------------------------
>
>                 Key: SPARK-4811
>                 URL: https://issues.apache.org/jira/browse/SPARK-4811
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.1.0, 1.1.1, 1.2.1, 1.3.0
>            Reporter: Saurabh Santhosh
>            Priority: Critical
>
> I am using the Thrift server interface to Spark SQL and using Beeline to 
> connect to it.
> I tried Spark SQL versions 1.1.0 and 1.1.1, and both throw the following 
> exception when using any custom UDTF.
> These are the steps I took:
> *Created a UDTF 'com.x.y.xxx'.*
> Registered the UDTF using the following query: 
> *create temporary function xxx as 'com.x.y.xxx'*
> The registration went through without any errors, but when I tried executing 
> the UDTF I got the following error:
> *java.lang.ClassNotFoundException: xxx*
> The odd thing is that it is trying to load the function *name* instead of the 
> function *class*. The exception is at *line 81 in hiveUdfs.scala*.
> I have been at it for quite a long time.
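For reference, the reproduction described above reduces to the two statements below; the class name 'com.x.y.xxx' is the placeholder used in the report, and the table and column names are illustrative only:

```sql
-- Registration succeeds: the class name string is accepted as-is.
CREATE TEMPORARY FUNCTION xxx AS 'com.x.y.xxx';

-- Invocation then fails with java.lang.ClassNotFoundException: xxx,
-- i.e. Spark SQL tries to load the function *name* as a class
-- instead of the registered class 'com.x.y.xxx'.
SELECT xxx(some_column) FROM some_table;
```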



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org