Oh, sorry, my bad. Currently Spark SQL doesn't provide a user interface for 
UDAFs, but it can work seamlessly with Hive UDAFs (via HiveContext).
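For reference, registering an existing Hive UDAF through HiveContext can be sketched as below (Spark 1.x API; the class name `com.example.MyAverageUDAF`, the function name `my_avg`, and the `employees` table are hypothetical placeholders, not part of Spark or Hive):

```scala
// Sketch: using a Hive UDAF from Spark SQL via HiveContext (Spark 1.x).
// "com.example.MyAverageUDAF" stands in for your own class implementing
// the Hive UDAF contract, already on the classpath.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext(new SparkConf().setAppName("udaf-demo"))
val hiveContext = new HiveContext(sc)

// Bind the Hive UDAF class to a SQL function name using HiveQL.
hiveContext.sql(
  "CREATE TEMPORARY FUNCTION my_avg AS 'com.example.MyAverageUDAF'")

// The function can now be used like any built-in aggregate.
hiveContext.sql("SELECT dept, my_avg(salary) FROM employees GROUP BY dept")
```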

I am also working on refactoring the UDAF interface; after that, we can provide 
a custom interface for extension.

https://github.com/apache/spark/pull/3247


From: shahab [mailto:shahab.mok...@gmail.com]
Sent: Wednesday, March 11, 2015 1:44 AM
To: Cheng, Hao
Cc: user@spark.apache.org
Subject: Re: Registering custom UDAFs with HiveContext in SparkSQL, how?

Thanks Hao,
But my question concerns UDAFs (user-defined aggregation functions), not UDTFs 
(user-defined table-generating functions).
I would appreciate it if you could point me to a starting point for UDAF 
development in Spark.

Thanks
Shahab

On Tuesday, March 10, 2015, Cheng, Hao <hao.ch...@intel.com> wrote:
Currently, Spark SQL doesn't provide an interface for developing custom UDTFs, 
but it can work seamlessly with Hive UDTFs.

I am working on the UDTF refactoring for Spark SQL; hopefully it will provide a 
Hive-independent UDTF interface soon after that.

From: shahab [mailto:shahab.mok...@gmail.com]
Sent: Tuesday, March 10, 2015 5:44 PM
To: user@spark.apache.org
Subject: Registering custom UDAFs with HiveContext in SparkSQL, how?

Hi,

I need to develop a couple of UDAFs and use them in Spark SQL. While UDFs can 
be registered as functions in HiveContext, I could not find any documentation 
on how UDAFs can be registered in HiveContext. So far, what I have found is to 
build a JAR file out of the developed UDAF class and then deploy that JAR file 
to Spark SQL.

But is there any way to avoid deploying the JAR file and instead register it 
programmatically?
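One possible sketch, for what it's worth: the JAR can be shipped and the function registered at runtime through HiveContext itself, rather than being baked into the deployment. The path `/path/to/my-udafs.jar` and the class `com.example.MyUDAF` below are hypothetical placeholders:

```scala
// Sketch: adding the UDAF JAR and registering the function at runtime,
// without redeploying the application. Path and class name are hypothetical.
import org.apache.spark.sql.hive.HiveContext

def registerUdaf(hiveContext: HiveContext): Unit = {
  // Make the JAR containing the UDAF class visible to Hive's class loader.
  hiveContext.sql("ADD JAR /path/to/my-udafs.jar")
  // Bind the class to a SQL function name for this session.
  hiveContext.sql(
    "CREATE TEMPORARY FUNCTION my_udaf AS 'com.example.MyUDAF'")
}
```

Alternatively, passing the JAR with `--jars` to spark-submit or spark-shell puts it on the classpath up front, leaving only the CREATE TEMPORARY FUNCTION step to run programmatically.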


best,
/Shahab
