HyukjinKwon opened a new pull request #33608:
URL: https://github.com/apache/spark/pull/33608


   ### What changes were proposed in this pull request?
   
   This PR proposes to fail properly so that the JSON parser can proceed and parse the input with permissive mode.
   Previously, we passed `null`s through as-is, so the root `InternalRow`s became `null` and the query failed even with permissive mode on.
   Now, we fail explicitly when the input array contains `null`, so the failure is handled according to the configured parse mode.
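
   As a rough illustration (a minimal sketch, not the actual patch; `convertArray`, the exception type, and the message are assumptions), the idea is to reject a `null` root row the same way malformed text is rejected, so that `FailureSafeParser` can apply the configured parse mode:

   ```scala
   import org.apache.spark.sql.catalyst.InternalRow

   // Hypothetical sketch of the null check; names are illustrative only.
   // In permissive mode the surrounding FailureSafeParser turns this failure
   // into a null row; in failfast mode it is rethrown as a SparkException.
   def convertArray(rows: Seq[InternalRow]): Seq[InternalRow] = {
     if (rows.contains(null)) {
       // Fail explicitly instead of letting a null InternalRow escape, which
       // previously surfaced as a NullPointerException in generated code.
       throw new RuntimeException("Cannot parse a null element in a JSON array")
     }
     rows
   }
   ```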
   
   Note that this is consistent with non-array JSON input:
   
   **Permissive mode:**
   
   ```scala
   spark.read.json(Seq("""{"a": "str"}""", """null""").toDS).collect()
   ```
   ```
   res0: Array[org.apache.spark.sql.Row] = Array([str], [null])
   ```
   
   **Failfast mode:**
   
   ```scala
   spark.read.option("mode", "failfast").json(Seq("""{"a": "str"}""", """null""").toDS).collect()
   ```
   ```
   org.apache.spark.SparkException: Malformed records are detected in record parsing. Parse Mode: FAILFAST. To process malformed records as null result, try setting the option 'mode' as 'PERMISSIVE'.
        at org.apache.spark.sql.catalyst.util.FailureSafeParser.parse(FailureSafeParser.scala:70)
        at org.apache.spark.sql.DataFrameReader.$anonfun$json$7(DataFrameReader.scala:540)
        at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
   ```
   
   ### Why are the changes needed?
   
   To make permissive mode proceed and parse the input without throwing an exception.
   
   ### Does this PR introduce _any_ user-facing change?
   
   **Permissive mode:**
   
   ```scala
   spark.read.json(Seq("""[{"a": "str"}, null]""").toDS).collect()
   ```
   
   Before:
   
   ```
   java.lang.NullPointerException
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
        at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
        at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:759)
   ```
   
   After:
   
   ```
   res0: Array[org.apache.spark.sql.Row] = Array([null])
   ```
   
   Note that this behaviour is consistent with the case where the JSON object itself is malformed:
   
   ```scala
   spark.read.schema("a int").json(Seq("""[{"a": 123}, {123123}, {"a": 123}]""").toDS).collect()
   ```
   
   ```
   res0: Array[org.apache.spark.sql.Row] = Array([null])
   ```
   
   Since we're parsing _one_ JSON array, the related records all fail together (see the illustration below).
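
   For completeness, permissive mode can also surface the whole malformed array through the corrupt record column. This is standard reader behaviour, shown here only as an illustration of how the failed array is reported:

   ```scala
   // With an explicit corrupt-record column, permissive mode keeps the raw
   // malformed input in `_corrupt_record` instead of only yielding nulls.
   spark.read
     .schema("a INT, _corrupt_record STRING")
     .json(Seq("""[{"a": 123}, {123123}, {"a": 123}]""").toDS)
     .show(truncate = false)
   ```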
   
   **Failfast mode:**
   
   
   ```scala
   spark.read.option("mode", "failfast").json(Seq("""[{"a": "str"}, null]""").toDS).collect()
   ```
   
   Before:
   
   ```
   java.lang.NullPointerException
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
        at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
        at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:759)
   ```
   
   After:
   
   ```
   org.apache.spark.SparkException: Malformed records are detected in record parsing. Parse Mode: FAILFAST. To process malformed records as null result, try setting the option 'mode' as 'PERMISSIVE'.
        at org.apache.spark.sql.catalyst.util.FailureSafeParser.parse(FailureSafeParser.scala:70)
        at org.apache.spark.sql.DataFrameReader.$anonfun$json$7(DataFrameReader.scala:540)
        at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
   ```
   
   
   ### How was this patch tested?
   
   Manually tested, and a unit test was added.
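
   A sketch of what such a test can look like (the test name and assertions are illustrative assumptions, not necessarily those in the patch; it assumes a `QueryTest` suite with `SharedSparkSession`, where `checkAnswer` and the implicits for `.toDS()` are available):

   ```scala
   import org.apache.spark.SparkException
   import org.apache.spark.sql.Row

   // Hypothetical test sketch; `checkAnswer` comes from QueryTest.
   test("null element at the root of a JSON array") {
     val input = Seq("""[{"a": "str"}, null]""").toDS()
     // Permissive (default) mode: the whole array becomes one null row.
     checkAnswer(spark.read.json(input), Row(null) :: Nil)
     // Failfast mode: parsing fails with a SparkException.
     val e = intercept[SparkException] {
       spark.read.option("mode", "failfast").json(input).collect()
     }
     assert(e.getMessage.contains("Malformed records are detected"))
   }
   ```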

