Divya:

https://databricks.com/blog/2015/09/16/spark-1-5-dataframe-api-highlights-datetimestring-handling-time-intervals-and-udafs.html

The link gives a complete example of registering a UDAF (user-defined aggregate 
function), and it should also give you a good idea of how to register a plain UDF. 
If you still need a hand, let us know.
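
For reference, here is a minimal sketch of the registration pattern that blog post 
walks through, assuming the Spark 1.x UserDefinedAggregateFunction API (the class 
name and column names below are made up for illustration):

  import org.apache.spark.sql.Row
  import org.apache.spark.sql.expressions.{MutableAggregationBuffer, UserDefinedAggregateFunction}
  import org.apache.spark.sql.types._

  // Illustrative UDAF that sums a column of doubles.
  class SumOfValues extends UserDefinedAggregateFunction {
    def inputSchema: StructType = StructType(StructField("value", DoubleType) :: Nil)
    def bufferSchema: StructType = StructType(StructField("sum", DoubleType) :: Nil)
    def dataType: DataType = DoubleType
    def deterministic: Boolean = true
    def initialize(buffer: MutableAggregationBuffer): Unit = { buffer(0) = 0.0 }
    def update(buffer: MutableAggregationBuffer, input: Row): Unit = {
      if (!input.isNullAt(0)) buffer(0) = buffer.getDouble(0) + input.getDouble(0)
    }
    def merge(buffer1: MutableAggregationBuffer, buffer2: Row): Unit = {
      buffer1(0) = buffer1.getDouble(0) + buffer2.getDouble(0)
    }
    def evaluate(buffer: Row): Any = buffer.getDouble(0)
  }

  // Register it so it can be called from SQL, e.g. SELECT sumOfValues(amount) FROM t
  sqlContext.udf.register("sumOfValues", new SumOfValues)

The register call above uses sqlContext; in Spark 2.x you would register through 
the SparkSession instead (see Jacek's note below).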

Thanks
Kabeer.

Sent from Nylas N1, the extensible, open source mail client.

On Jul 21 2016, at 8:13 am, Jacek Laskowski <ja...@japila.pl> wrote:

On Thu, Jul 21, 2016 at 5:53 AM, Mich Talebzadeh
<mich.talebza...@gmail.com> wrote:
> something similar

Is this going to be in Scala?

> def ChangeToDate (word : String) : Date = {
>   // return TO_DATE(FROM_UNIXTIME(UNIX_TIMESTAMP(word,"dd/MM/yyyy"),"yyyy-MM-dd"))
>   val d1 = Date.valueOf(ReverseDate(word))
>   return d1
> }
> sqlContext.udf.register("ChangeToDate", ChangeToDate(_:String))

then... please use lowercase method names and *no* return ;-)

BTW, no sqlContext as of Spark 2.0. Sorry.../me smiling nicely

Jacek
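
Putting both of those suggestions together, a minimal sketch of the rewritten UDF 
might look like the following (this assumes Spark 2.x where spark is a SparkSession, 
and uses java.time to parse "dd/MM/yyyy" in place of the ReverseDate helper, which 
isn't shown in the thread):

  import java.sql.Date
  import java.time.LocalDate
  import java.time.format.DateTimeFormatter

  // Lowercase method name, no explicit return; parses "dd/MM/yyyy" into a java.sql.Date.
  def changeToDate(word: String): Date = {
    val formatter = DateTimeFormatter.ofPattern("dd/MM/yyyy")
    Date.valueOf(LocalDate.parse(word, formatter))
  }

  // Spark 2.0: register through the SparkSession rather than sqlContext.
  spark.udf.register("changeToDate", changeToDate _)

Once registered it can be called from SQL, e.g. SELECT changeToDate(payment_date) 
FROM payments (table and column names here are hypothetical).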

