Hello. I'm trying to write my own Hive UDF in Scala that takes two
parameters of type array<double> and returns a double. I used the
following prototype:
public DoubleWritable evaluate(ArrayWritable x, ArrayWritable y);
I've successfully registered all the necessary JARs and created a
temporary function for my UDF.
And then:
hive> describe test;
OK
id int
oksa array<double>
hive> select myudf(oksa, oksa) from test;
FAILED: Unknown exception : Cannot get UDF for myudf [array<double>,
array<double>]
So which parameter type do I have to use for array<double>?
Here is the full source:
package com.buerak.test

import org.apache.hadoop.hive.ql.exec.UDF
import scala.collection.JavaConversions._
import org.apache.hadoop.io.DoubleWritable
import org.apache.hadoop.io.ArrayWritable

class AlwaysThree extends UDF {
  def evaluate(x: ArrayWritable, y: ArrayWritable): DoubleWritable = {
    new DoubleWritable(3)
  }
}
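For what it's worth, here is a sketch of the signature I suspect Hive actually expects. This is an assumption on my part: that Hive's reflection-based UDF resolver maps array<double> to java.util.List (of DoubleWritable) rather than to ArrayWritable, so the failing lookup would be a parameter-type mismatch rather than a registration problem.

```scala
package com.buerak.test

import org.apache.hadoop.hive.ql.exec.UDF
import org.apache.hadoop.io.DoubleWritable
import java.util.{List => JList}

// Sketch, not verified: assumes Hive resolves array<double> arguments
// to java.util.List[DoubleWritable] when matching evaluate() signatures.
class AlwaysThree extends UDF {
  def evaluate(x: JList[DoubleWritable], y: JList[DoubleWritable]): DoubleWritable =
    new DoubleWritable(3)
}
```

If that mapping is right, the same CREATE TEMPORARY FUNCTION / SELECT myudf(oksa, oksa) steps should then resolve the UDF.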