lvyanquan created SPARK-44696:
---------------------------------

             Summary: Support handling millisecond and microsecond values in the `from_json` function
                 Key: SPARK-44696
                 URL: https://issues.apache.org/jira/browse/SPARK-44696
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 3.4.1
            Reporter: lvyanquan


Currently, the `from_json` function treats every numeric timestamp input as
second precision.
However, if the input is in milliseconds or microseconds, the function returns
a wrong result without throwing an exception, which is misleading.

Example:
{code:java}
spark-sql> SELECT from_json('{"a":1, "b":1691241070}', 'a INT, b TIMESTAMP');
{"a":1,"b":2023-08-05 21:11:10}

spark-sql> SELECT from_json('{"a":1, "b":1691241070000}', 'a INT, b TIMESTAMP');
{"a":1,"b":+55563-04-19 18:06:40}

spark-sql> SELECT from_json('{"a":1, "b":1691241070000000}', 'a INT, b TIMESTAMP');
{"a":1,"b":-183707-06-27 20:24:24.251328}
{code}
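One possible way to disambiguate the three precisions is a magnitude-based heuristic: epoch values large enough to fall far outside any plausible date range when read as seconds are reinterpreted as milliseconds or microseconds. The sketch below (plain Python, not Spark code; `epoch_to_datetime` is a hypothetical helper, and the cutoffs are illustrative assumptions, not Spark's actual behavior) shows the idea:

```python
from datetime import datetime, timezone

def epoch_to_datetime(value: int) -> datetime:
    """Heuristically interpret an integer epoch value as seconds,
    milliseconds, or microseconds based on its magnitude.
    Illustration only; the thresholds are assumptions, not Spark's rules."""
    if abs(value) >= 10**14:
        # ~16-digit values: microseconds since the epoch
        return datetime.fromtimestamp(value / 10**6, tz=timezone.utc)
    if abs(value) >= 10**11:
        # ~13-digit values: milliseconds since the epoch
        return datetime.fromtimestamp(value / 10**3, tz=timezone.utc)
    # otherwise treat as plain seconds
    return datetime.fromtimestamp(value, tz=timezone.utc)

# All three inputs from the examples above map to the same instant:
print(epoch_to_datetime(1691241070))         # seconds
print(epoch_to_datetime(1691241070000))      # milliseconds
print(epoch_to_datetime(1691241070000000))   # microseconds
```

A drawback of any magnitude heuristic is that it silently caps the representable date range, which is why an explicit per-column precision option may be a safer design.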
 



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
