davidm-db commented on code in PR #47462:
URL: https://github.com/apache/spark/pull/47462#discussion_r1714418202


##########
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/SqlScriptingParserSuite.scala:
##########
@@ -294,7 +294,21 @@ class SqlScriptingParserSuite extends SparkFunSuite with SQLHelper {
         parameters = Map("varName" -> "`testVariable`", "lineNumber" -> "4"))
   }
 
-  // TODO Add test for INVALID_VARIABLE_DECLARATION.NOT_ALLOWED_IN_SCOPE exception
+  test("declare in wrong scope") {
+    val sqlScriptText =
+      """
+        |BEGIN
+        | IF 1=1 THEN
+        |   DECLARE testVariable INTEGER;
+        | END IF;
+        |END""".stripMargin
+    checkError(
+      exception = intercept[SqlScriptingException] {
+        parseScript(sqlScriptText)
+      },
+      errorClass = "INVALID_VARIABLE_DECLARATION.NOT_ALLOWED_IN_SCOPE",
+      parameters = Map("varName" -> "`testVariable`", "lineNumber" -> "4"))

Review Comment:
   As far as my understanding goes, the query context is the context of the
   expression/query plan, whereas the SQL Scripting interpreter uses a new,
   separate node hierarchy (for both logical and execution nodes). For that
   reason we created a new `SqlScriptingException` class that relies on
   `Origin` to produce proper error messages, i.e. exceptions in the
   "{LINE} [ERROR_CLASS] exception message" format.


