[ https://issues.apache.org/jira/browse/SPARK-18491?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15673719#comment-15673719 ]

Damian Momot commented on SPARK-18491:
--------------------------------------

You don't have to break compatibility. java.sql.Timestamp can remain the 
internal/default type. At the same time, it could be possible to define a type as:

{code}
case class Test(id: String, timestamp: org.joda.time.Instant)
{code}

And Dataset[Test] would still infer the schema of the timestamp field as TimestampType.
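
For comparison, this is the mapping that already works today with java.sql.Timestamp 
(a minimal sketch assuming a local SparkSession; the Event/SchemaDemo names are made up):

{code}
import org.apache.spark.sql.SparkSession

// Current behaviour: fields typed as java.sql.Timestamp map to TimestampType.
case class Event(id: String, ts: java.sql.Timestamp)

object SchemaDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").appName("schema-demo").getOrCreate()
    import spark.implicits._
    // The schema is derived from the case class by the implicit product encoder.
    spark.emptyDataset[Event].printSchema()
    // root
    //  |-- id: string (nullable = true)
    //  |-- ts: timestamp (nullable = true)
    spark.stop()
  }
}
{code}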

As you said, if it were possible to write custom encoders this could be extended 
easily, but from my brief investigation there doesn't seem to be a way to do that yet?
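
The closest workaround I can see today is to fall back to the binary Kryo encoder 
for such classes, at the cost of losing the columnar TimestampType representation, 
so it is not equivalent to a real custom encoder (a sketch, assuming Joda-Time is 
on the classpath; KryoFallback is a made-up name):

{code}
import org.apache.spark.sql.{Encoder, Encoders, SparkSession}

case class Test(id: String, timestamp: org.joda.time.Instant)

object KryoFallback {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").appName("kryo-fallback").getOrCreate()
    // Kryo serializes the whole case class into one binary column, so the
    // Dataset schema degrades to `value: binary` instead of string + timestamp.
    implicit val testEncoder: Encoder[Test] = Encoders.kryo[Test]
    val ds = spark.createDataset(Seq(Test("a", org.joda.time.Instant.now())))
    ds.printSchema() // root |-- value: binary (nullable = true)
    spark.stop()
  }
}
{code}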

> Spark uses mutable classes for date/time types mapping
> ------------------------------------------------------
>
>                 Key: SPARK-18491
>                 URL: https://issues.apache.org/jira/browse/SPARK-18491
>             Project: Spark
>          Issue Type: Improvement
>            Reporter: Damian Momot
>            Priority: Minor
>
> TimestampType is mapped to java.sql.Timestamp
> DateType is mapped to java.sql.Date
> Both of these Java types are mutable, so their use is strongly discouraged, 
> especially in distributed computing, which relies on a lazy, functional approach
> Mapping to the immutable Joda-Time types should be enough for now (until Scala 2.12 + 
> JDK 8's java.time is available for Spark)
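
To make the mutability concern above concrete, a minimal REPL illustration of a 
java.sql.Timestamp being changed in place through a shared reference:

{code}
val ts = new java.sql.Timestamp(0L)   // 1970-01-01T00:00:00Z (the epoch)
val alias = ts
alias.setTime(86400000L)              // mutates the one shared instance
println(ts)                           // prints the mutated time, not the epoch
{code}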


