planga82 commented on a change in pull request #33672:
URL: https://github.com/apache/spark/pull/33672#discussion_r685615448



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/DataFrameReader.scala
##########
@@ -402,7 +402,11 @@ class DataFrameReader private[sql](sparkSession: SparkSession) extends Logging {
    * @since 2.0.0
    */
   @scala.annotation.varargs
-  def json(paths: String*): DataFrame = format("json").load(paths : _*)
+  def json(paths: String*): DataFrame = {
+    userSpecifiedSchema.foreach(
+      ExprUtils.checkJsonSchema(_).foreach(e => throw new AnalysisException(e)))

Review comment:
       One possible solution, until we have a more generic solution for `TypeCheckFailure`, is to modify `checkJsonSchema` so it returns the exception instead of the message (`def checkJsonSchema(schema: DataType): Option[AnalysisException]`): for `TypeCheckFailure`, build the exception from the failure message; in other cases, the caller just throws the returned exception.
   I'm looking into how to properly add a new exception to `QueryCompilationErrors`, so I can push the code with this possible solution.
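   To make the suggested shape concrete, here is a self-contained sketch. The Spark types (`TypeCheckResult`, `AnalysisException`) are stubbed locally so the flow is visible without Spark on the classpath; this is an illustration of the proposed signature, not the actual `ExprUtils` code:

```scala
// Stubs standing in for Spark's TypeCheckResult and AnalysisException.
sealed trait TypeCheckResult
case object TypeCheckSuccess extends TypeCheckResult
case class TypeCheckFailure(message: String) extends TypeCheckResult

case class AnalysisException(message: String) extends Exception(message)

// Suggested signature: return the exception instead of the message.
def checkJsonSchema(check: TypeCheckResult): Option[AnalysisException] =
  check match {
    case TypeCheckSuccess          => None
    // For TypeCheckFailure, wrap the failure message in the exception
    // so the caller can simply throw whatever comes back.
    case TypeCheckFailure(message) => Some(AnalysisException(message))
  }

// Caller side, mirroring the diff above:
//   userSpecifiedSchema.foreach(
//     ExprUtils.checkJsonSchema(_).foreach(e => throw e))
```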




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


