Hi

I wanted to know how to go about registering Scala functions as UDFs using the Spark SQL CREATE TEMPORARY FUNCTION statement.

Currently I do the following in my Scala code:

/* convert prices to holding period returns */
object VaR extends Serializable {

  // pair each price with the price `horizon` observations later and
  // compute the relative change over that horizon
  def returns(prices: Seq[Double], horizon: Int): Seq[Double] = {
    (prices zip prices.drop(horizon)).map(x => (x._2 - x._1) / x._2)
  }
}

// register the function with Spark SQL under the name "returns"
sqlContext.udf.register("returns", VaR.returns _)

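The registered UDF can then be called by name from SQL. A minimal sketch, assuming a hypothetical temporary table `quotes` with an array<double> column named `prices`:

// `quotes` and its `prices` column are made up for illustration
val hpr = sqlContext.sql("SELECT returns(prices, 1) AS hpr FROM quotes")
hpr.show()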

Regards
Deenar



*Think Reactive Ltd*
deenar.toras...@thinkreactive.co.uk
07714140812
