srielau commented on code in PR #44093:
URL: https://github.com/apache/spark/pull/44093#discussion_r1419132503


##########
common/utils/src/main/resources/error/error-classes.json:
##########
@@ -1005,6 +1005,12 @@
     ],
     "sqlState" : "42702"
   },
+  "EXEC_IMMEDIATE_DUPLICATE_ARGUMENT_ALIASES" : {
+    "message" : [
+      "Using statement contains multiple arguments with same alias (<aliases>)."
+    ],
+    "sqlState" : "42702"

Review Comment:
   ```suggestion
       "sqlState" : "42701"
   ```
   
   It's really about duplicate assignments I think.
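   For reference, a hypothetical statement that would hit this condition (syntax per this PR; names made up):
   
   ```sql
   -- Both arguments are assigned the alias `p`, making the binding ambiguous
   EXECUTE IMMEDIATE 'SELECT :p' USING 1 AS p, 2 AS p;
   ```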



##########
common/utils/src/main/resources/error/error-classes.json:
##########
@@ -2085,6 +2091,12 @@
     },
     "sqlState" : "22023"
   },
+  "INVALID_PARAMETRIZED_QUERY" : {
+    "message" : [
+      "Parametrize query must either use positional, or named parameters, but not both."

Review Comment:
   ```suggestion
      "Parameterized query must either use positional, or named parameters, but not both."
   ```



##########
common/utils/src/main/resources/error/error-classes.json:
##########
@@ -2085,6 +2091,12 @@
     },
     "sqlState" : "22023"
   },
+  "INVALID_PARAMETRIZED_QUERY" : {
+    "message" : [
+      "Parametrize query must either use positional, or named parameters, but not both."
+    ],
+    "sqlState" : "42609"

Review Comment:
   42609 is for situations like ? = ? where Db2 can't do late binding because it can't derive the type from context.
   42613 seems most appropriate: "clauses are mutually exclusive" (although parameter markers really aren't clauses...)
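   For example (hypothetical):
   
   ```sql
   -- Mixes a positional marker (?) with a named marker (:name); only one style should be allowed
   EXECUTE IMMEDIATE 'SELECT ?, :name' USING 1, 2 AS name;
   ```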



##########
common/utils/src/main/resources/error/error-classes.json:
##########
@@ -2338,6 +2356,12 @@
     ],
     "sqlState" : "42000"
   },
+  "INVALID_VARIABLE_TYPE_FOR_QUERY_EXECUTE_IMMEDIATE" : {
+    "message" : [
+      "Variable type must be \"STRING\" but got <varType>"

Review Comment:
   ```suggestion
         "Variable type must be string type but got <varType>"
   ```
   
   Thinking ahead at CHAR and VARCHAR...
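   A sketch of the case the broader wording would cover (assuming VARCHAR session variables become declarable; syntax hypothetical):
   
   ```sql
   DECLARE stmt VARCHAR(100) DEFAULT 'SELECT 1';
   -- Should pass the type check once VARCHAR counts as a string type
   EXECUTE IMMEDIATE stmt;
   ```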



##########
common/utils/src/main/resources/error/error-classes.json:
##########
@@ -2085,6 +2091,12 @@
     },
     "sqlState" : "22023"
   },
+  "INVALID_PARAMETRIZED_QUERY" : {

Review Comment:
   Isn't this a situation we already have from spark.sql("...", args)?



##########
common/utils/src/main/resources/error/error-classes.json:
##########
@@ -2338,6 +2356,12 @@
     ],
     "sqlState" : "42000"
   },
+  "INVALID_VARIABLE_TYPE_FOR_QUERY_EXECUTE_IMMEDIATE" : {
+    "message" : [
+      "Variable type must be \"STRING\" but got <varType>"

Review Comment:
   Isn't this just a generic DATATYPE_MISMATCH.NON_STRING_TYPE 42K09?
   Specifically, if later we want to allow generic expressions, it need not be a variable.



##########
sql/api/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBaseParser.g4:
##########
@@ -230,6 +231,23 @@ statement
    | unsupportedHiveNativeCommands .*?                                #failNativeCommand
     ;
 
+executeImmediate
+    : EXECUTE IMMEDIATE queryParam=executeImmediateQueryParam (INTO LEFT_PAREN targetVariable=multipartIdentifierList RIGHT_PAREN)? (USING params=executeImmediateArgumentSeq)?

Review Comment:
   I just double checked that while Snowflake requires USING ( ), Google has no ( ).
   Also PREPARE/EXECUTE does NOT use ( ).
   Can we make the parens optional for USING and remove them for INTO (or also make them optional)?
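   A sketch of the relaxed syntax (variable names hypothetical):
   
   ```sql
   -- Current grammar:  EXECUTE IMMEDIATE '...' INTO (v1, v2) USING 1 AS a, 2 AS b
   -- Suggested:        parens optional on USING, dropped (or optional) on INTO
   EXECUTE IMMEDIATE 'SELECT :a, :b' INTO v1, v2 USING 1 AS a, 2 AS b;
   ```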



##########
sql/api/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBaseLexer.g4:
##########
@@ -217,6 +217,7 @@ HOURS: 'HOURS';
 IDENTIFIER_KW: 'IDENTIFIER';
 IF: 'IF';
 IGNORE: 'IGNORE';
+IMMEDIATE: 'IMMEDIATE';

Review Comment:
   It should be a reserved keyword when we are in strict ANSI mode, but generally it should be non-reserved.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

