[GitHub] [spark] itholic commented on a diff in pull request #39282: [SPARK-41581][SQL] Assign name to _LEGACY_ERROR_TEMP_1230

2023-01-04 Thread GitBox


itholic commented on code in PR #39282:
URL: https://github.com/apache/spark/pull/39282#discussion_r1062069986


##
sql/core/src/test/scala/org/apache/spark/sql/errors/QueryCompilationErrorsSuite.scala:
##
@@ -680,6 +681,18 @@ class QueryCompilationErrorsSuite
   context = ExpectedContext("", "", 7, 13, "CAST(1)")
 )
   }
+
+  test("NEGATIVE_SCALE_NOT_ALLOWED: negative scale for Decimal is not 
allowed") {

Review Comment:
   Thanks! Moved it and migrated the existing test to `checkError`.
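   
   For reference, the `checkError` helper asserts on the structured error class and its message parameters rather than on a raw exception message. A minimal sketch of the pattern (the error class and parameter map below are placeholders, not the values used in this PR):
   
   ```scala
   // Hypothetical sketch of the checkError pattern used in QueryCompilationErrorsSuite;
   // "SOME_ERROR_CLASS" and the parameter map are illustrative placeholders,
   // not values taken from this PR.
   checkError(
     exception = intercept[AnalysisException] {
       sql("SELECT * FROM some_missing_table")  // placeholder query expected to fail analysis
     },
     errorClass = "SOME_ERROR_CLASS",               // placeholder error class
     parameters = Map("someParam" -> "someValue"))  // placeholder parameters
   ```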





[GitHub] [spark] itholic commented on a diff in pull request #39282: [SPARK-41581][SQL] Assign name to _LEGACY_ERROR_TEMP_1230

2023-01-02 Thread GitBox


itholic commented on code in PR #39282:
URL: https://github.com/apache/spark/pull/39282#discussion_r1060280460


##
sql/core/src/test/scala/org/apache/spark/sql/errors/QueryCompilationErrorsSuite.scala:
##
@@ -680,6 +680,18 @@ class QueryCompilationErrorsSuite
   context = ExpectedContext("", "", 7, 13, "CAST(1)")
 )
   }
+
+  test("NEGATIVE_SCALE_NOT_ALLOWED: negative scale for Decimal is not 
allowed") {
+withSQLConf(SQLConf.LEGACY_ALLOW_NEGATIVE_SCALE_OF_DECIMAL_ENABLED.key -> 
"false") {
+  checkError(
+exception = intercept[AnalysisException] (Decimal(BigDecimal("98765"), 
5, -3)),

Review Comment:
   Just switched it to `INTERNAL_ERROR`. Thanks!
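   
   A hedged sketch of what the assertion might look like after that switch; the thrown exception type and the message text checked here are assumptions, not quotes from the PR diff:
   
   ```scala
   // Hypothetical sketch only: with negative scale disallowed, constructing the
   // Decimal is assumed to fail with a SparkException carrying INTERNAL_ERROR.
   import org.apache.spark.SparkException
   import org.apache.spark.sql.internal.SQLConf
   import org.apache.spark.sql.types.Decimal
   
   withSQLConf(SQLConf.LEGACY_ALLOW_NEGATIVE_SCALE_OF_DECIMAL_ENABLED.key -> "false") {
     val e = intercept[SparkException](Decimal(BigDecimal("98765"), 5, -3))
     assert(e.getErrorClass === "INTERNAL_ERROR")      // error class on the SparkThrowable
     assert(e.getMessage.contains("Negative scale"))   // exact message text is an assumption
   }
   ```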





[GitHub] [spark] itholic commented on a diff in pull request #39282: [SPARK-41581][SQL] Assign name to _LEGACY_ERROR_TEMP_1230

2023-01-01 Thread GitBox


itholic commented on code in PR #39282:
URL: https://github.com/apache/spark/pull/39282#discussion_r1059856462


##
sql/core/src/test/scala/org/apache/spark/sql/errors/QueryCompilationErrorsSuite.scala:
##
@@ -680,6 +680,18 @@ class QueryCompilationErrorsSuite
   context = ExpectedContext("", "", 7, 13, "CAST(1)")
 )
   }
+
+  test("NEGATIVE_SCALE_NOT_ALLOWED: negative scale for Decimal is not 
allowed") {
+withSQLConf(SQLConf.LEGACY_ALLOW_NEGATIVE_SCALE_OF_DECIMAL_ENABLED.key -> 
"false") {
+  checkError(
+exception = intercept[AnalysisException] (Decimal(BigDecimal("98765"), 
5, -3)),

Review Comment:
   I tried testing it with SQL, using `sql("SELECT cast(98765 as decimal(5, -3))")`.
   But I couldn't make it work, since it raises "PARSE_SYNTAX_ERROR" instead of "NEGATIVE_SCALE_NOT_ALLOWED", as shown below:
   
   ```scala
   scala> sql("SELECT cast(98765 as decimal(5, -3))")
   org.apache.spark.sql.catalyst.parser.ParseException:
   [PARSE_SYNTAX_ERROR] Syntax error at or near '-': extra input '-'(line 1, pos 32)
   
   == SQL ==
   SELECT cast(98765 as decimal(5, -3))
   ^^^
   
     at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:306)
     at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:144)
     at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:52)
     at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:89)
     at org.apache.spark.sql.SparkSession.$anonfun$sql$2(SparkSession.scala:627)
     at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
     at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:624)
     at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:809)
     at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:622)
     at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:653)
     ... 49 elided
   ```
   
   It seems the query is blocked in the parsing phase, before it ever reaches the point that would raise the NEGATIVE_SCALE_NOT_ALLOWED error?
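   
   If so, SQL cannot express a negative decimal scale at all, and the only way to exercise the check is through the `Decimal` API directly, as in the test above. A small sketch of how the legacy flag gates that path (behaviour hedged; the config key and constructor call come from the diff above):
   
   ```scala
   import org.apache.spark.sql.internal.SQLConf
   import org.apache.spark.sql.types.Decimal
   
   // With the legacy flag enabled, constructing a Decimal with negative scale is
   // expected to succeed ...
   withSQLConf(SQLConf.LEGACY_ALLOW_NEGATIVE_SCALE_OF_DECIMAL_ENABLED.key -> "true") {
     Decimal(BigDecimal("98765"), 5, -3)
   }
   // ... while with it disabled (the default) the same call is expected to throw.
   withSQLConf(SQLConf.LEGACY_ALLOW_NEGATIVE_SCALE_OF_DECIMAL_ENABLED.key -> "false") {
     intercept[Exception](Decimal(BigDecimal("98765"), 5, -3))
   }
   ```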





