1. Custom aggregators that do map-side combine. This is something I'd been
hoping to add in Spark 1.5.
2. UDFs with more than 22 arguments, which is not supported by ScalaUdf,
and to avoid wrapping a Java function interface in one of 22 different
Scala function interfaces depending on the number of arguments.
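For context on item 2: Scala only defines function traits up to arity 22 (Function1 through Function22), so wrapping a Java UDF arity-by-arity requires one wrapper per argument count and cannot go past 22. A single variadic interface sidesteps both problems. The interface and names below are a hypothetical sketch to illustrate the idea, not Spark's actual API:

```java
import java.util.Arrays;

public class GenericUdfSketch {
    // Hypothetical variadic UDF interface: one method regardless of arity,
    // so no per-arity Scala FunctionN wrapper is needed, and nothing caps
    // the argument count at 22.
    interface GenericUDF {
        Object call(Object... args);
    }

    public static void main(String[] args) {
        // A "UDF" over arbitrarily many arguments; 23 would be impossible
        // to express as a single Scala FunctionN.
        GenericUDF sum = xs -> Arrays.stream(xs)
                                     .mapToInt(x -> (Integer) x)
                                     .sum();

        Object[] twentyThree = new Object[23];
        Arrays.fill(twentyThree, 1); // 23 arguments, each the integer 1
        System.out.println(sum.call(twentyThree)); // prints 23
    }
}
```

The trade-off is that a variadic signature gives up the per-argument static typing that the arity-specific interfaces provide.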
On Thu, Jun 11, 2015 at 8:08 PM, zsampson <zsamp...@palantir.com> wrote:
I'm hoping for some clarity about when to expect String vs UTF8String when
using the Java DataFrames API.
In upgrading to Spark 1.4, I'm dealing with a lot of errors where what was
once a String is now a UTF8String. The comments in the file and the related
commit message indicate that maybe