Yes, SPARK-3853 was merged 11 days ago, so it should be fixed in 1.2.0. As for 
the first approach, it will work once SPARK-4003 is merged.

-----Original Message-----
From: tridib [mailto:tridib.sama...@live.com] 
Sent: Tuesday, October 21, 2014 11:09 AM
To: u...@spark.incubator.apache.org
Subject: RE: spark sql: timestamp in json - fails

Spark 1.1.0



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/spark-sql-timestamp-in-json-fails-tp16864p16888.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
