Re: When to expect UTF8String?

2015-06-12 Thread Michael Armbrust
1. Custom aggregators that do map-side combine. This is something I'd been hoping to add in Spark 1.5. 2. UDFs with more than 22 arguments, which is not supported by ScalaUdf, and to avoid wrapping a Java function interface in one of 22 different Scala function interfaces depending on the number
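The first use case above, map-side combine, can be sketched without any Spark dependency. This is a minimal illustration of the idea only, not Spark's actual Aggregator API: partial results are built per partition before anything crosses the network, so the shuffle moves one record per key per partition instead of one per row. All class and method names here are hypothetical.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal sketch of map-side combine (hypothetical names, not
// Spark's Aggregator API): each partition builds partial per-key
// counts locally, and the reduce side merges the partial maps.
public class MapSideCombine {
    // Combine one partition's rows into partial per-key counts.
    static Map<String, Long> combineLocally(List<String> partition) {
        Map<String, Long> partial = new HashMap<>();
        for (String key : partition) {
            partial.merge(key, 1L, Long::sum);
        }
        return partial;
    }

    // Merge partial maps from all partitions (the reduce side).
    static Map<String, Long> mergePartials(List<Map<String, Long>> partials) {
        Map<String, Long> merged = new HashMap<>();
        for (Map<String, Long> partial : partials) {
            partial.forEach((k, v) -> merged.merge(k, v, Long::sum));
        }
        return merged;
    }

    public static void main(String[] args) {
        Map<String, Long> p1 = combineLocally(List.of("a", "b", "a"));
        Map<String, Long> p2 = combineLocally(List.of("b", "b"));
        Map<String, Long> total = mergePartials(List.of(p1, p2));
        System.out.println(total.get("a") + "," + total.get("b"));
    }
}
```

A custom aggregator that cannot express this split pays for it at shuffle time, which is why first-class support matters.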

RE: When to expect UTF8String?

2015-06-12 Thread Zack Sampson
a UDF instead? On Thu, Jun 11, 2015 at 8:08 PM, zsampson zsamp...@palantir.com wrote: I'm hoping for some clarity about when to expect String vs UTF8String when using the Java DataFrames API. In upgrading to Spark 1.4, I'm dealing with a lot of errors where what was once

Re: When to expect UTF8String?

2015-06-11 Thread Michael Armbrust
instead? On Thu, Jun 11, 2015 at 8:08 PM, zsampson zsamp...@palantir.com wrote: I'm hoping for some clarity about when to expect String vs UTF8String when using the Java DataFrames API. In upgrading to Spark 1.4, I'm dealing with a lot of errors where what was once a String is now a UTF8String

When to expect UTF8String?

2015-06-11 Thread zsampson
I'm hoping for some clarity about when to expect String vs UTF8String when using the Java DataFrames API. In upgrading to Spark 1.4, I'm dealing with a lot of errors where what was once a String is now a UTF8String. The comments in the file and the related commit message indicate that maybe