Hi Devs,

I've used a UDF in a Spark query in DAS by following the steps below:

1. Create a jar file for the Spark UDF implementation and add it to DAS_HOME/repository/components/lib.
2. Add the UDF class to *spark-udf-config.xml*, which is in DAS_HOME/repository/conf.

After that I tried the UDF in the Spark console and it worked fine. But now I have pulled the latest changes and built a new pack, and the same UDF doesn't work there. I can see a new file called *add-to-spark-classpath.xml*. Do I need any extra configuration for the latest pack?

Thanks.

--
Thanuja Uruththirakodeeswaran
Software Engineer
WSO2 Inc.; http://wso2.com
lean.enterprise.middleware
mobile: +94 774363167
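For reference, the registration I added to *spark-udf-config.xml* follows the format below. The class name shown is a placeholder standing in for my actual UDF implementation class, and the element structure is a sketch of how I understand the default DAS config file to be laid out:

```xml
<udf-configuration>
    <custom-udf-classes>
        <!-- Placeholder: fully qualified class name of the custom Spark UDF implementation -->
        <class-name>org.example.analytics.udf.MyCustomUDF</class-name>
    </custom-udf-classes>
</udf-configuration>
```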
_______________________________________________
Dev mailing list
[email protected]
http://wso2.org/cgi-bin/mailman/listinfo/dev
