Spark 1.6 ignoreNulls in first/last aggregate functions

2016-01-21 Thread emlyn
As I understand it, Spark 1.6 changes the behaviour of the first and last aggregate functions to take nulls into account (they were ignored in 1.5). From SQL you can use "IGNORE NULLS" to get the old behaviour back. How do I ignore nulls when using the DataFrame API?
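To make the behaviour change concrete, here is a plain-Python sketch (not Spark code) of the two semantics being discussed; the function name and rows are illustrative only:

```python
# Sketch of the semantic difference between Spark 1.5 and 1.6:
# 1.5's last skipped nulls, while 1.6's last returns the literal final
# value, null or not, unless ignore-nulls behaviour is requested.

def last_value(values, ignore_nulls=False):
    """Return the last element, optionally skipping None (null) entries."""
    candidates = [v for v in values if v is not None] if ignore_nulls else values
    return candidates[-1] if candidates else None

rows = [1, 2, None]
print(last_value(rows))                     # Spark 1.6 default -> None
print(last_value(rows, ignore_nulls=True))  # Spark 1.5 behaviour -> 2
```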

Re: Spark 1.6 ignoreNulls in first/last aggregate functions

2016-01-21 Thread emlyn
It turns out I can't use a user-defined aggregate function, as they are not supported in window operations. Surely there must be some way to do a last_value with ignoreNulls enabled in Spark 1.6? Any ideas for workarounds?
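One avenue worth trying (untested here) is the SQL route: Hive's first_value/last_value accept an optional second boolean argument that skips nulls, so a raw SQL query through HiveContext along the lines of `last_value(col, true) OVER (...)` may give the old behaviour. Failing that, the ignore-nulls logic can be emulated outside the window machinery; a minimal plain-Python sketch of "last non-null value per partition key" (function name and sample rows are hypothetical):

```python
# Sketch of last_value-with-ignoreNulls semantics per partition key:
# walk the rows in order and keep the most recent non-null value per key.

def last_ignoring_nulls_per_key(rows):
    """rows: iterable of (key, value) in window order.
    Returns {key: last non-null value, or None if all values were null}."""
    result = {}
    for key, value in rows:
        result.setdefault(key, None)
        if value is not None:
            result[key] = value  # nulls never overwrite an earlier value
    return result

rows = [("a", 1), ("a", None), ("b", None), ("b", 7), ("a", 3)]
print(last_ignoring_nulls_per_key(rows))  # {'a': 3, 'b': 7}
```

In Spark terms this corresponds to a groupBy/fold rather than a window function, which may or may not fit the original query shape.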