[GitHub] [spark] MaxGekk commented on a diff in pull request #37969: [SPARK-40530][SQL] Add error-related developer APIs
MaxGekk commented on code in PR #37969:
URL: https://github.com/apache/spark/pull/37969#discussion_r978508477

## core/src/test/scala/org/apache/spark/SparkThrowableSuite.scala:

@@ -321,4 +323,22 @@ class SparkThrowableSuite extends SparkFunSuite {
         |  }
         |}""".stripMargin)
   }
+
+  test("overwrite error classes") {
+    withTempDir { dir =>
+      val json = new File(dir, "errors.json")
+      FileUtils.writeStringToFile(json,
+        """
+          |{
+          |  "DIVIDE_BY_ZERO" : {
+          |    "message" : [
+          |      "abc"
+          |    ]
+          |  }
+          |}
+          |""".stripMargin)
+      val reader = new ErrorClassesJsonReader(Seq(errorJsonFilePath.toUri.toURL, json.toURL))

Review Comment:
Do you use the existing JSON file? How can you guarantee that this JSON overrides the standard one? Maybe you could add a small one for testing to check that it overrides error-classes.json?

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
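The override behavior the reviewer is asking about can be illustrated with a minimal sketch. This is not Spark's actual implementation: it only assumes that if the reader folds its input files left-to-right into one map, entries from files later in the Seq win. The `mergeErrorClasses` helper name and the `Map[String, String]` shape are illustrative assumptions.

```scala
// Sketch of "later file wins" semantics: fold each parsed file's map into an
// accumulator, so a key present in a later map replaces the earlier entry.
def mergeErrorClasses(files: Seq[Map[String, String]]): Map[String, String] =
  files.foldLeft(Map.empty[String, String])(_ ++ _)

// Hypothetical data standing in for error-classes.json and the test's errors.json.
val standard = Map("DIVIDE_BY_ZERO" -> "Division by zero.", "OTHER_ERROR" -> "Other.")
val custom   = Map("DIVIDE_BY_ZERO" -> "abc")
val merged   = mergeErrorClasses(Seq(standard, custom))
// merged("DIVIDE_BY_ZERO") is "abc": the custom file overrides the standard one,
// while "OTHER_ERROR" keeps its standard message.
```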
[GitHub] [spark] MaxGekk commented on a diff in pull request #37969: [SPARK-40530][SQL] Add error-related developer APIs
MaxGekk commented on code in PR #37969:
URL: https://github.com/apache/spark/pull/37969#discussion_r977867254

## core/src/main/scala/org/apache/spark/SparkThrowableHelper.scala:

@@ -178,9 +90,7 @@ private[spark] object SparkThrowableHelper {
     val errorSubClass = e.getErrorSubClass
     if (errorSubClass != null) g.writeStringField("errorSubClass", errorSubClass)
     if (format == STANDARD) {
-      val errorInfo = errorClassToInfoMap.getOrElse(errorClass,
-        throw SparkException.internalError(s"Cannot find the error class '$errorClass'"))
-      g.writeStringField("message", errorInfo.messageFormat)
+      g.writeStringField("message", e.getMessage)

Review Comment:
> The STANDARD mode is a combination of PRETTY and MINIMAL mode

Where does this come from? In the current implementation, the info is orthogonal. If you need to, you can substitute the params yourself as you want.
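The distinction under discussion can be sketched as follows. This is a hedged illustration, not Spark's implementation: before the change, STANDARD wrote the raw message format (a template with placeholders) and callers could substitute parameters themselves; after, it writes the already-substituted `e.getMessage`. The `substitute` helper and the `<param>` placeholder syntax are assumptions for illustration.

```scala
// Substitute each named parameter into a <param>-style message template.
def substitute(format: String, params: Map[String, String]): String =
  params.foldLeft(format) { case (msg, (k, v)) => msg.replace(s"<$k>", v) }

// Hypothetical template and parameters.
val messageFormat = "Cannot parse <value> as <type>."
val message = substitute(messageFormat, Map("value" -> "'abc'", "type" -> "INT"))
// message == "Cannot parse 'abc' as INT."
```

Writing the template keeps the output parameterized; writing `e.getMessage` hands the consumer the final, substituted string.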