MaxGekk commented on code in PR #49807:
URL: https://github.com/apache/spark/pull/49807#discussion_r1942525322


##########
sql/api/src/main/scala/org/apache/spark/sql/errors/DataTypeErrors.scala:
##########
@@ -124,10 +124,9 @@ private[sql] object DataTypeErrors extends DataTypeErrorsBase {
 
   def negativeScaleNotAllowedError(scale: Int): Throwable = {
     val sqlConf = QuotingUtils.toSQLConf("spark.sql.legacy.allowNegativeScaleOfDecimal")
-    SparkException.internalError(
-      s"Negative scale is not allowed: ${scale.toString}." +
-        s" Set the config ${sqlConf}" +
-        " to \"true\" to allow it.")
+    new AnalysisException(
+      errorClass = "NEGATIVE_SCALE_DISALLOWED",
+      messageParameters = Map("scale" -> scale.toString, "sqlConf" -> sqlConf))

Review Comment:
   It would be better to format `scale` as a SQL value. How about quoting it with `toSQLValue`?
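   For illustration only, a minimal sketch of what that might look like, assuming `DataTypeErrorsBase` (which `DataTypeErrors` already mixes in) provides a `toSQLValue` overload that accepts an `Int`:
   ```scala
   // Sketch of the method inside DataTypeErrors; imports and the enclosing
   // object are as in the existing file. Not the final patch.
   def negativeScaleNotAllowedError(scale: Int): Throwable = {
     val sqlConf = QuotingUtils.toSQLConf("spark.sql.legacy.allowNegativeScaleOfDecimal")
     new AnalysisException(
       errorClass = "NEGATIVE_SCALE_DISALLOWED",
       messageParameters = Map(
         // Format the scale as a SQL value instead of the bare Int string
         // (assumes a toSQLValue(Int) overload is in scope here).
         "scale" -> toSQLValue(scale),
         "sqlConf" -> sqlConf))
   }
   ```
   Whether the single quotes around `<scale>` in the message template should stay would then depend on how `toSQLValue` renders an integer.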



##########
common/utils/src/main/resources/error/error-conditions.json:
##########
@@ -3923,6 +3923,13 @@
     ],
     "sqlState" : "0A000"
   },
+  "NEGATIVE_SCALE_DISALLOWED": {
+    "message": [
+      "Negative scale is not allowed: '<scale>'.",
+      "Set the config <sqlConf> to 'true' to allow it."

Review Comment:
   Let's follow the existing convention for SQL conf values in `error-conditions.json`:
   ```suggestion
         "Negative scale is not allowed: '<scale>'. Set the config <sqlConf> to \"true\" to allow it."
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

