Hi Thanuja,

add-to-spark-classpath.xml is not related to UDFs. It is a configuration
file that can be used to add jars to the Spark classpath. However, when you
put a jar into repository/components/lib, it is added to the Spark
classpath by default, so there is no additional configuration you need to
do.
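For reference, registering the UDF class in spark-udf-config.xml should be enough on its own. In the packs I have seen, that file looks roughly like the fragment below (the class name here is just a placeholder for your own UDF class):

```xml
<!-- DAS_HOME/repository/conf/analytics/spark/spark-udf-config.xml (sketch) -->
<udf-configuration>
    <custom-udf-classes>
        <!-- fully qualified name of the UDF class packaged in the jar
             you dropped into repository/components/lib -->
        <class-name>org.example.StringConcatUDF</class-name>
    </custom-udf-classes>
</udf-configuration>
```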

What errors/exceptions are thrown when you call the UDFs?
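Also, just to rule out a problem in the UDF itself: as far as I know, a DAS Spark UDF is a plain Java class whose public methods get registered as Spark SQL functions under the method name. A minimal sketch (class and method names are illustrative, not from your code) would be:

```java
// Hypothetical minimal DAS Spark UDF class (a plain POJO).
// Each public method is registered as a Spark SQL function
// named after the method, e.g. SELECT concatStrings(a, b) FROM ...
public class StringConcatUDF {

    // Concatenates two strings; null inputs are treated as empty strings.
    public String concatStrings(String first, String second) {
        String a = (first == null) ? "" : first;
        String b = (second == null) ? "" : second;
        return a + b;
    }
}
```

If your class follows this shape and is listed in spark-udf-config.xml, the full stack trace from the Spark console would help us narrow the issue down.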


On Mon, Aug 24, 2015 at 9:59 AM, Thanuja Uruththirakodeeswaran <
[email protected]> wrote:

> Hi Devs,
>
> I've used a UDF in a Spark query in DAS by following the steps below:
>
> 1. Create a jar file for the Spark UDF implementation and add it to
> DAS_HOME/repository/components/lib.
> 2. Add the UDF class to *spark-udf-config.xml* which is in
> DAS_HOME/repository/conf.
>
> After that I tried the UDF in the Spark console and it worked fine. But
> now I got the latest changes and built a new pack. There the same UDF
> doesn't work. I can see a new file called *add-to-spark-classpath.xml.*
> Do I need to do any extra configuration for the latest pack?
>
> Thanks.
>
> --
> Thanuja Uruththirakodeeswaran
> Software Engineer
> WSO2 Inc.;http://wso2.com
> lean.enterprise.middleware
>
> mobile: +94 774363167
>



-- 
*Niranda Perera*
Software Engineer, WSO2 Inc.
Mobile: +94-71-554-8430
Twitter: @n1r44 <https://twitter.com/N1R44>
https://pythagoreanscript.wordpress.com/
_______________________________________________
Dev mailing list
[email protected]
http://wso2.org/cgi-bin/mailman/listinfo/dev