This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a commit to branch branch-3.2
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.2 by this push:
     new be19270  [SPARK-36429][SQL] JacksonParser should throw exception when data type unsupported
be19270 is described below

commit be192708809de363a04895e62bc1ca1216658395
Author: gengjiaan <gengji...@360.cn>
AuthorDate: Fri Aug 6 12:53:04 2021 +0800

    [SPARK-36429][SQL] JacksonParser should throw exception when data type unsupported
    
    ### What changes were proposed in this pull request?
    Currently, when `spark.sql.timestampType=TIMESTAMP_NTZ` is set, the behavior differs between `from_json` and `from_csv`.
    ```
    -- !query
    select from_json('{"t":"26/October/2015"}', 't Timestamp', map('timestampFormat', 'dd/MMMMM/yyyy'))
    -- !query schema
    struct<from_json({"t":"26/October/2015"}):struct<t:timestamp_ntz>>
    -- !query output
    {"t":null}
    ```
    
    ```
    -- !query
    select from_csv('26/October/2015', 't Timestamp', map('timestampFormat', 'dd/MMMMM/yyyy'))
    -- !query schema
    struct<>
    -- !query output
    java.lang.Exception
    Unsupported type: timestamp_ntz
    ```
    
    We should make `from_json` throw an exception too.
    This PR addresses the discussion below:
    https://github.com/apache/spark/pull/33640#discussion_r682862523
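
    The gist of the change can be sketched as follows. This is a hedged, self-contained illustration, not the actual Spark code: `DataType`, `makeConverter`, and the error message are simplified stand-ins for Spark's Catalyst internals.

    ```scala
    // Simplified stand-ins for Catalyst data types (illustrative only).
    sealed trait DataType
    case object StringType extends DataType
    case object TimestampNTZType extends DataType

    object JacksonParserSketch {
      // Before this patch, the fallback case built a converter from an empty
      // PartialFunction, so an unsupported type silently produced null for
      // every non-null value. After this patch, converter creation fails fast.
      def makeConverter(dataType: DataType): String => Any = dataType match {
        case StringType => (s: String) => s
        case other =>
          throw new UnsupportedOperationException(s"Unsupported type: $other")
      }
    }
    ```

    With this shape, `from_json` surfaces an "Unsupported type" error when the converter is built, matching `from_csv`, instead of quietly returning null rows.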
    
    ### Why are the changes needed?
    Make the behavior of `from_json` more reasonable.
    
    ### Does this PR introduce _any_ user-facing change?
    'Yes'.
    `from_json` now throws an exception when spark.sql.timestampType=TIMESTAMP_NTZ is set.
    
    ### How was this patch tested?
    Tests updated.
    
    Closes #33654 from beliefer/SPARK-36429.
    
    Authored-by: gengjiaan <gengji...@360.cn>
    Signed-off-by: Wenchen Fan <wenc...@databricks.com>
---
 .../scala/org/apache/spark/sql/catalyst/json/JacksonParser.scala  | 8 ++------
 .../resources/sql-tests/results/timestampNTZ/timestamp.sql.out    | 5 +++--
 2 files changed, 5 insertions(+), 8 deletions(-)

diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/JacksonParser.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/JacksonParser.scala
index 04a0f1a..2761c52 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/JacksonParser.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/JacksonParser.scala
@@ -330,12 +330,8 @@ class JacksonParser(
     case udt: UserDefinedType[_] =>
       makeConverter(udt.sqlType)
 
-    case _ =>
-      (parser: JsonParser) =>
-        // Here, we pass empty `PartialFunction` so that this case can be
-        // handled as a failed conversion. It will throw an exception as
-        // long as the value is not null.
-        parseJsonToken[AnyRef](parser, dataType)(PartialFunction.empty[JsonToken, AnyRef])
+    // We don't actually hit this exception though, we keep it for understandability
+    case _ => throw QueryExecutionErrors.unsupportedTypeError(dataType)
   }
 
   /**
diff --git a/sql/core/src/test/resources/sql-tests/results/timestampNTZ/timestamp.sql.out b/sql/core/src/test/resources/sql-tests/results/timestampNTZ/timestamp.sql.out
index b8a6800..c6de535 100644
--- a/sql/core/src/test/resources/sql-tests/results/timestampNTZ/timestamp.sql.out
+++ b/sql/core/src/test/resources/sql-tests/results/timestampNTZ/timestamp.sql.out
@@ -642,9 +642,10 @@ You may get a different result due to the upgrading of Spark 3.0: Fail to recogn
 -- !query
 select from_json('{"t":"26/October/2015"}', 't Timestamp', map('timestampFormat', 'dd/MMMMM/yyyy'))
 -- !query schema
-struct<from_json({"t":"26/October/2015"}):struct<t:timestamp_ntz>>
+struct<>
 -- !query output
-{"t":null}
+java.lang.Exception
+Unsupported type: timestamp_ntz
 
 
 -- !query
