I am also working on the UDAF interface refactoring; after that we can
provide a custom interface for extension.

https://github.com/apache/spark/pull/3247
> ...oping the custom UDTF, but it can work seamlessly with Hive UDTF.
>
> I am working on the UDTF refactoring for Spark SQL; hopefully I will
> provide a Hive-independent UDTF soon after that.
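
As an aside, the "works seamlessly with Hive UDTF" point can be illustrated with a built-in Hive UDTF such as explode, called through HiveQL on a HiveContext (Spark 1.x). This is only a sketch; the orders table and its items array column are hypothetical:

```scala
// Sketch: invoking a Hive UDTF (explode) from Spark SQL via HiveContext.
// Assumes Spark 1.x and a Hive table `orders` with an array column `items`
// (both hypothetical, for illustration only).
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveUdtfExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("udtf-demo"))
    val hiveContext = new HiveContext(sc)

    // Hive UDTFs work unchanged through HiveQL, including LATERAL VIEW:
    hiveContext
      .sql("SELECT order_id, item FROM orders LATERAL VIEW explode(items) t AS item")
      .collect()
      .foreach(println)
  }
}
```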
From: shahab [mailto:shahab.mok...@gmail.com]
Sent: Wednesday, March 11, 2015 1:44 AM
To: Cheng, Hao
Cc: user@spark.apache.org
Subject: Re: Registering custom UDAFs with HiveContext in SparkSQL, how?

Thanks Hao,
But my question concerns UDAF (user defined aggregation function), not UDTF.
From: shahab [mailto:shahab.mok...@gmail.com]
Sent: Tuesday, March 10, 2015 5:44 PM
To: user@spark.apache.org
Subject: Registering custom UDAFs with HiveContext in SparkSQL, how?
Hi,
I need to develop a couple of UDAFs and use them in Spark SQL. While UDFs
can be registered as functions in HiveContext, I could not find any
documentation on how UDAFs can be registered in HiveContext. So far, what
I have found is to make a JAR file out of the developed UDAF class, and
then
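
[Editor's note: on Spark 1.x, the JAR route mentioned above typically ends with a HiveQL registration through HiveContext. A minimal sketch, assuming a Hive-style UDAF class com.example.MyMaxUDAF packaged in my-udafs.jar and an employees table; all of these names are hypothetical:]

```scala
// Sketch: registering a Hive UDAF with HiveContext in Spark 1.x.
// Assumes com.example.MyMaxUDAF extends org.apache.hadoop.hive.ql.exec.UDAF
// and is packaged in my-udafs.jar (hypothetical names for illustration).
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object RegisterUdafExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("udaf-demo"))
    val hiveContext = new HiveContext(sc)

    // Ship the jar to the cluster and register the class as a
    // temporary Hive function for this session.
    hiveContext.sql("ADD JAR /path/to/my-udafs.jar")
    hiveContext.sql("CREATE TEMPORARY FUNCTION my_max AS 'com.example.MyMaxUDAF'")

    // The UDAF is then usable in SQL like any built-in aggregate.
    hiveContext
      .sql("SELECT dept, my_max(salary) FROM employees GROUP BY dept")
      .collect()
      .foreach(println)
  }
}
```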