I would love to see something like this. The closest related ticket is
probably https://issues.apache.org/jira/browse/SPARK-7768 (though there may
be enough people using UDTs in their current form that we should just open
a new ticket).

A few thoughts:
 - Even if you can do implicit search, we probably also want a registry for
Java users.
 - What is the output of the serializer going to be? One challenge here is
that encoders write directly into the Tungsten binary format, which is not
a stable public API. Maybe this would be clearer if I understood
MappedColumnType better.
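For context on the MappedColumnType question: in Slick, a MappedColumnType is essentially a bidirectional conversion between a custom type and a column type the library already supports. A minimal Scala sketch of that shape, mapping java.time.Instant to epoch milliseconds (note: TypeMapping, toCatalyst/fromCatalyst, and roundTrip are hypothetical names for illustration, not Spark or Slick API):

```scala
import java.time.Instant

object Demo {
  // Sketch of a MappedColumnType-style mapping: a custom type A is
  // serialized as an already-supported type B, and read back from it.
  // (Hypothetical names, not real Spark API.)
  final case class TypeMapping[A, B](toCatalyst: A => B, fromCatalyst: B => A)

  // Example mapping: store java.time.Instant as epoch milliseconds (Long),
  // a primitive the encoder machinery already knows how to handle.
  val instantAsLong: TypeMapping[Instant, Long] =
    TypeMapping(_.toEpochMilli, Instant.ofEpochMilli)

  // What a serializer/deserializer pair would do with such a mapping.
  def roundTrip[A, B](a: A, m: TypeMapping[A, B]): A =
    m.fromCatalyst(m.toCatalyst(a))

  def main(args: Array[String]): Unit = {
    val t = Instant.parse("2016-12-02T16:03:00Z")
    println(roundTrip(t, instantAsLong) == t) // prints: true
  }
}
```

The serializer's output question above is exactly about what sits on the B side of such a mapping: Slick targets a stable column type, whereas Spark encoders target the internal Tungsten representation.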

Either way, I'm happy to give further advice if you come up with a more
concrete proposal and put it on JIRA.

On Fri, Dec 2, 2016 at 4:03 PM, Erik LaBianca <erik.labia...@gmail.com>
wrote:

> Hi All,
>
> Apologies in advance for any confusing terminology, I’m still pretty new
> to Spark.
>
> I’ve got a bunch of Scala case class “domain objects” from an existing
> application. Many of them contain simple but unsupported-by-Spark types,
> such as case class Foo(timestamp: java.time.Instant). I’d like to be able
> to use these case classes directly in a Dataset, but can’t, since there’s
> no encoder available for java.time.Instant. I’d like to resolve that.
>
> I asked around on the gitter channel, and was pointed to the
> ScalaReflection class, which handles creating Encoder[T] for a variety of
> things, including case classes and their members. Barring a better
> solution, what I’d like is to be able to add some additional case
> statements to the serializerFor and deserializerFor methods, dispatching
> to something along the lines of Slick’s MappedColumnType[1]. In an ideal
> scenario, I could provide these mappings via implicit search, but I’d be
> happy to settle for a registry of some sort too.
>
> Does this idea make sense, in general? I’m interested in taking a stab at
> the implementation, but Jakob recommended I surface it here first to see if
> there were any plans around this sort of functionality already.
>
> Thanks!
>
> —erik
>
> 1. http://slick.lightbend.com/doc/3.0.0/userdefined.html#using-custom-scalar-types-in-queries
>
