[
https://issues.apache.org/jira/browse/SPARK-20314?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15975692#comment-15975692
]
Eric Wasserman commented on SPARK-20314:
----------------------------------------
{code:title=JsonParseError.scala|borderStyle=solid}
// Demonstrates a case in which `json_tuple`, when parsing a valid String
// (that contains invalid JSON), throws an exception rather than returning null
import spark.implicits._
import org.apache.spark.sql.functions._
val badJson = "\u0000\u0000\u0000A\u0001AAA"
val badDF = List(badJson).toDF
badDF.select(json_tuple('value, "whatever")).show
{code}
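For context on why this particular string trips Jackson: per the RFC 4627 encoding auto-detection that Jackson performs on raw byte input, a stream beginning 00 00 00 xx is taken to be UTF-32BE, and the second 4-byte code unit of this string (01 41 41 41) is above U+10FFFF, hence the CharConversionException. Reproducing the actual failure needs a Spark session, so the following is a plain-Java sketch of that detection heuristic only (the method names here are illustrative, not Jackson's internals):

```java
import java.nio.charset.StandardCharsets;

public class Utf32Detect {
    // RFC 4627-style heuristic: a JSON byte stream whose first four bytes
    // match the pattern 00 00 00 xx "looks like" UTF-32BE.
    static boolean looksLikeUtf32be(byte[] b) {
        return b.length >= 4 && b[0] == 0 && b[1] == 0 && b[2] == 0 && b[3] != 0;
    }

    // Reads the i-th 4-byte big-endian code unit from the stream.
    static int utf32beCodePoint(byte[] b, int i) {
        int off = i * 4;
        return ((b[off] & 0xFF) << 24) | ((b[off + 1] & 0xFF) << 16)
             | ((b[off + 2] & 0xFF) << 8) | (b[off + 3] & 0xFF);
    }

    public static void main(String[] args) {
        // The reported input, as raw bytes: 00 00 00 41 01 41 41 41
        byte[] bytes = "\0\0\0A\u0001AAA".getBytes(StandardCharsets.ISO_8859_1);
        System.out.println(looksLikeUtf32be(bytes));       // true: treated as UTF-32BE
        System.out.println(utf32beCodePoint(bytes, 0));    // 65: 'A', a valid code point
        int second = utf32beCodePoint(bytes, 1);           // 0x01414141
        System.out.println(second > 0x10FFFF);             // true: invalid code point,
                                                           // hence CharConversionException
    }
}
```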
> Inconsistent error handling in JSON parsing SQL functions
> ---------------------------------------------------------
>
> Key: SPARK-20314
> URL: https://issues.apache.org/jira/browse/SPARK-20314
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.1.0
> Reporter: Eric Wasserman
>
> Most parse errors in the JSON parsing SQL functions (e.g. json_tuple,
> get_json_object) return null(s) if the JSON is badly formed. However,
> if Jackson determines that the string includes invalid characters, it
> throws an exception (java.io.CharConversionException: Invalid UTF-32
> character) that Spark does not catch. This creates a robustness problem:
> these functions cannot be used at all on potentially dirty data, because
> a single bad record will kill the job.
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)