The ultimate aim of my program is to wrap an arbitrary Scala function (mostly statistics and custom rolling-window metrics) in a UDF and evaluate it on DataFrames using the window functionality.
So my main question is: how do I express that a UDF takes a frame of rows from a DataFrame as its argument, rather than a single row? And what argument types would the arbitrary Scala function need in order to handle the raw data from that frame?

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-evaluate-custom-UDF-over-window-tp24419.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
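For what it's worth, one workable pattern is to sidestep the limitation that a plain UDF only ever sees a single row: first materialize the frame's values into an array with the built-in aggregate `collect_list` evaluated over a `Window`, then hand that array to a UDF whose Scala function takes a `Seq`. A minimal sketch, assuming a DataFrame `df` with hypothetical columns `id`, `ts`, and `value`, and a 3-row trailing frame:

```scala
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.{col, collect_list, udf}

// Arbitrary Scala function over the frame's values: here a rolling median.
// The key point is the signature: it receives a Seq, not a Row.
val rollingMedian: Seq[Double] => Double = xs => {
  val s = xs.sorted
  if (s.isEmpty) Double.NaN
  else if (s.size % 2 == 1) s(s.size / 2)
  else (s(s.size / 2 - 1) + s(s.size / 2)) / 2.0
}
val rollingMedianUdf = udf(rollingMedian)

// Trailing frame: current row and the two preceding rows, per id, ordered by ts.
val w = Window.partitionBy("id").orderBy("ts").rowsBetween(-2, 0)

val result = df
  .withColumn("frame_vals", collect_list(col("value")).over(w)) // array per frame
  .withColumn("rolling_median", rollingMedianUdf(col("frame_vals")))
```

This is a sketch, not a definitive answer: `collect_list` copies the whole frame into memory per row, so it is fine for small windows but costly for wide ones. If you need multiple columns per frame row, you can `collect_list(struct(...))` and have the function take a `Seq[Row]` instead.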