Hi,
For an aggregating UDF, use spark.udf.registerJavaUDAF(name, className).
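A minimal PySpark sketch of that call — `com.example.MyAverage` is a hypothetical class name standing in for your compiled aggregate class, and its JAR must be on the classpath (e.g. via `--jars`):

```python
from pyspark.sql import SparkSession

# Assumes the JAR with the JVM-side aggregate class is already on the
# classpath; com.example.MyAverage is a placeholder, not a real class.
spark = SparkSession.builder.appName("udaf-example").getOrCreate()

# Register the JVM-side aggregate function under a SQL-callable name.
spark.udf.registerJavaUDAF("myAverage", "com.example.MyAverage")

df = spark.createDataFrame([("a", 1.0), ("a", 3.0), ("b", 2.0)], ["k", "v"])
df.createOrReplaceTempView("t")
spark.sql("SELECT k, myAverage(v) FROM t GROUP BY k").show()
```

One caveat: registerJavaUDAF targets classes extending UserDefinedAggregateFunction; a typed Aggregator subclass may first need to be wrapped on the JVM side with org.apache.spark.sql.functions.udaf and registered there via spark.udf.register.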
Enrico
On 23.04.23 at 23:42, Thomas Wang wrote:
Hi Spark Community,
I have implemented a custom Spark Aggregator (a subclass of
org.apache.spark.sql.expressions.Aggregator). Now I'm trying to use it in a
PySpark application, but for some reason I'm not able to trigger the
function. Here is what I'm doing; could someone help me take a look? Thanks.