>> cannot cast it to the case class because obviously the data does not
>> contain the case class inside.
>>
>> How would rewriting collect as a Spark UDAF help there?
>>
>> Thanks for your quick response!
>>
>> Fran
I have found that this does not even work with a struct as an input
parameter:
def testUDF(expectedExposures: (Float, Float)) = {
  expectedExposures._1 * expectedExposures._2 / expectedExposures._1
}
sqlContext.udf.register("testUDF", testUDF _)
sqlContext.sql("select testUDF(struct(noofmonth
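For what it's worth, a `struct(...)` argument reaches a Scala UDF as an `org.apache.spark.sql.Row`, not as a Scala tuple, which is why the `(Float, Float)` version cannot be matched against the incoming data. A sketch of a Row-based variant (it assumes both struct fields are `FloatType` and that the thread's `sqlContext` is in scope; the field semantics are guesses from the names in the query):

```scala
import org.apache.spark.sql.Row

// Pure arithmetic, kept separate so it can be exercised without Spark.
def effective(months: Float, exposure: Float): Float =
  months * exposure / months

// A struct(...) argument arrives in a Scala UDF as a Row, not a tuple,
// so accept Row and read the fields by position.
def testUDF(expectedExposures: Row): Float =
  effective(expectedExposures.getFloat(0), expectedExposures.getFloat(1))

sqlContext.udf.register("testUDF", testUDF _)
```

Registering a `Row`-typed function this way relies on Spark silently skipping input-schema inference for unsupported input types; the return type (`Float`) still has to be a supported primitive.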
Hi
I am trying to define a UDF that takes an array of tuples as input:
def effectiveExpectedExposure(expectedExposures: Seq[(Float, Float)]) =
  expectedExposures.map(x => x._1 * x._2).sum / expectedExposures.map(x =>
x._1).sum
sqlContext.udf.register("expectedPositiveExposure",
expectedPo
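Along the same lines, an `array<struct<...>>` column arrives in a Scala UDF as `Seq[Row]` rather than `Seq[(Float, Float)]`. A sketch of how the registration above could be made to work under that assumption (again assuming `FloatType` fields and the thread's `sqlContext`):

```scala
import org.apache.spark.sql.Row

// Pure part: time-weighted average of exposures.
def effectiveExpectedExposure(pairs: Seq[(Float, Float)]): Float =
  pairs.map { case (t, e) => t * e }.sum / pairs.map(_._1).sum

// An array<struct<...>> column arrives as Seq[Row]; unpack each Row
// by position before delegating to the pure function.
def expectedPositiveExposure(rows: Seq[Row]): Float =
  effectiveExpectedExposure(rows.map(r => (r.getFloat(0), r.getFloat(1))))

sqlContext.udf.register("expectedPositiveExposure", expectedPositiveExposure _)
```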