Hi all,

In DAS, we can define Spark UDFs [1] (user-defined functions) which can be
used in Spark scripts. With DAS 3.0.0 or DAS 3.0.1, we can define UDFs in
our own Java classes, deploy those jar files into
/repository/components/lib, and then use the methods of those classes as
Spark UDFs.
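As a rough sketch of that first approach (the class and method names below are illustrative, not from DAS itself), a UDF class is just a plain Java class whose public methods become callable from Spark SQL:

```java
// Hypothetical custom UDF class. Once the jar containing it is deployed
// and the class is listed in the DAS UDF configuration, each public
// method becomes a Spark UDF, callable by its method name.
public class StringConcatUDF {

    // Usable in a Spark script as: SELECT concatStrings(col1, col2) FROM t
    public String concatStrings(String first, String second) {
        return first + second;
    }
}
```

The method's argument and return types map to the corresponding Spark SQL types.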


In addition, users can now create Spark UDFs as OSGi components by
registering their custom UDF classes against the following interface from
carbon-analytics/analytics-processors:

*org.wso2.carbon.analytics.spark.core.udf.CarbonUDF*

This interface does not contain any methods; it is used solely to mark the
registered classes as UDFs. With this approach, UDFs can be installed into
DAS as features (including via a p2 repository) without editing the UDF
configuration files in DAS.
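To sketch the OSGi approach (class and method names here are hypothetical; the marker interface is stubbed locally so the snippet is self-contained, whereas a real bundle would import org.wso2.carbon.analytics.spark.core.udf.CarbonUDF from analytics-processors):

```java
// Stand-in for the real CarbonUDF marker interface, which is empty.
interface CarbonUDF { }

// Hypothetical UDF class. Implementing the marker interface is what lets
// DAS identify this class as a UDF once it is registered as an OSGi service.
public class UpperCaseUDF implements CarbonUDF {

    // Usable in a Spark script as: SELECT toUpper(name) FROM t
    public String toUpper(String value) {
        return value == null ? null : value.toUpperCase();
    }
}
```

In the bundle activator of such a feature, the class would then be published with the standard OSGi service registry call, e.g. `bundleContext.registerService(CarbonUDF.class, new UpperCaseUDF(), null);`.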

[1]
https://docs.wso2.com/display/DAS301/Creating+Spark+User+Defined+Functions

-- 
Gimantha Bandara
Software Engineer
WSO2. Inc : http://wso2.com
Mobile : +94714961919
_______________________________________________
Architecture mailing list
[email protected]
https://mail.wso2.org/cgi-bin/mailman/listinfo/architecture