miland-db commented on code in PR #49427:
URL: https://github.com/apache/spark/pull/49427#discussion_r1927576695


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/AstBuilder.scala:
##########
@@ -159,44 +159,113 @@ class AstBuilder extends DataTypeAstBuilder
     script
   }
 
-  private def visitCompoundBodyImpl(
-      ctx: CompoundBodyContext,
-      label: Option[String],
-      allowVarDeclare: Boolean,
-      labelCtx: SqlScriptingLabelContext,
-      isScope: Boolean): CompoundBody = {
-    val buff = ListBuffer[CompoundPlanStatement]()
-    ctx.compoundStatements.forEach(
-      compoundStatement => buff += visitCompoundStatementImpl(compoundStatement, labelCtx))
+  private def assertSqlState(sqlState: String): Unit = {
+    val sqlStateRegex = "^[A-Za-z0-9]{5}$".r
+    if (sqlStateRegex.findFirstIn(sqlState).isEmpty
+      || sqlState.startsWith("00")
+      || sqlState.startsWith("01")
+      || sqlState.startsWith("XX")) {
+      throw SqlScriptingErrors.invalidSqlStateValue(CurrentOrigin.get, sqlState)
+    }
+  }
 
-    val compoundStatements = buff.toList
+  override def visitConditionValue(ctx: ConditionValueContext): String = {

Review Comment:
   I think I addressed this and the previous comments regarding conditions and sqlStates, though I implemented it in a different way than the proposed `trait ScriptingErrorConditionValue`. The idea is to have separate sets/maps for:
   - conditions
   - sqlStates
   - SQLEXCEPTION
   - NOT FOUND
   
   This makes it easier to search for the most appropriate handler. The logic for choosing the most appropriate handler is explained in the ref spec and in some of the comments in this PR:
   `CONDITIONS >= SQLSTATES >= SQLEXCEPTION/NOT FOUND`
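   
   To make the lookup order concrete, here is a minimal sketch of the precedence described above. The names (`HandlerLookupSketch`, `HandlerRegistry`, `findHandler`) and the assumption that NOT FOUND covers SQLSTATE class `02` are illustrative only and do not reflect the actual classes introduced in this PR:
   
   ```scala
   // Hypothetical sketch only: HandlerRegistry/findHandler are illustrative
   // names, not the classes introduced in this PR.
   object HandlerLookupSketch {
   
     // One lookup structure per trigger kind, as described above.
     case class HandlerRegistry(
         conditionHandlers: Map[String, String],  // condition name -> handler id
         sqlStateHandlers: Map[String, String],   // SQLSTATE -> handler id
         sqlExceptionHandler: Option[String],     // generic SQLEXCEPTION handler
         notFoundHandler: Option[String])         // generic NOT FOUND handler
   
     // Resolution order: CONDITIONS >= SQLSTATES >= SQLEXCEPTION/NOT FOUND.
     // Assumes NOT FOUND corresponds to SQLSTATE class '02'.
     def findHandler(
         registry: HandlerRegistry,
         errorCondition: String,
         sqlState: String): Option[String] = {
       registry.conditionHandlers.get(errorCondition)
         .orElse(registry.sqlStateHandlers.get(sqlState))
         .orElse(
           if (sqlState.startsWith("02")) registry.notFoundHandler
           else registry.sqlExceptionHandler)
     }
   
     def main(args: Array[String]): Unit = {
       val registry = HandlerRegistry(
         conditionHandlers = Map("DIVIDE_BY_ZERO" -> "handlerA"),
         sqlStateHandlers = Map("22012" -> "handlerB"),
         sqlExceptionHandler = Some("handlerC"),
         notFoundHandler = None)
   
       // Condition name wins over the SQLSTATE entry for the same error.
       println(findHandler(registry, "DIVIDE_BY_ZERO", "22012"))   // Some(handlerA)
       // No condition match -> fall back to the SQLSTATE map.
       println(findHandler(registry, "SOME_OTHER_ERROR", "22012")) // Some(handlerB)
       // No condition or SQLSTATE match -> generic SQLEXCEPTION handler.
       println(findHandler(registry, "SOME_OTHER_ERROR", "42000")) // Some(handlerC)
     }
   }
   ```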
   
   Please take a look at the last commit to see the proposed change. I am looking for advice on the names of the 2 new classes that are introduced to hold the maps (handler -> ...) and, during execution, (sqlStates, conditions, ... -> handlers).
   
   Regarding the corner cases mentioned in the comment above:
   1. `DECLARE EXIT HANDLER FOR SQLSTATE 'KP042', KP042, ...` should work, and there is no need to force condition `KP042` to have a different SQL state, because this handler will be used anyway and the behavior is as expected.
   2. `DECLARE EXIT HANDLER FOR SQLSTATE 'KP042', abc, ...` will still work normally even if `abc` has `SQLSTATE 'KP042'`, because this handler will be used anyway and the behavior is as expected.
   3. `DECLARE EXIT HANDLER FOR SQLEXCEPTION, `SQLEXCEPTION`, ...` is allowed and will work as expected.
   
   I am open to forbidding some of these things once we have `CONDITION` resolution like the one for `LOCAL VARIABLES` and `SIGNAL/RESIGNAL` statements.
   
   We covered the most important question regarding correctness (https://github.com/apache/spark/pull/49427#discussion_r1924854023). With this implementation, that problem is fixed: for `DECLARE KP042 CONDITION FOR SQLSTATE '31000'`, `KP042` is the name of the error condition, and we **will NOT hit** this condition if the actual error SQL state is `KP042`.
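   
   In terms of the hypothetical sketch above (not the actual PR classes), the name `KP042` lives in the condition map, so an error whose actual SQLSTATE happens to be `KP042` but whose condition name is something else does not match it:
   
   ```scala
   // Continuing the hypothetical HandlerLookupSketch from above.
   val registry = HandlerLookupSketch.HandlerRegistry(
     conditionHandlers = Map("KP042" -> "exitHandler"),  // declared condition name
     sqlStateHandlers = Map.empty,
     sqlExceptionHandler = None,
     notFoundHandler = None)
   
   // The error's SQLSTATE is 'KP042', but its condition name is different,
   // so the handler registered under the condition name KP042 is not chosen.
   println(HandlerLookupSketch.findHandler(registry, "SOME_OTHER_CONDITION", "KP042")) // None
   ```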


