AngersZhuuuu opened a new pull request #34721:
URL: https://github.com/apache/spark/pull/34721


   ### What changes were proposed in this pull request?
   In the current spark-sql CLI, `-e` and `-f` do not support nested bracketed comments such as
    ```
   /* SELECT /*+ BROADCAST(b) */ 4;
   */
   SELECT  1
   ;
   ```
   When running `spark-sql -f` with `--verbose`, we get the error below:
   ```
   Spark master: yarn, Application Id: application_1632999510150_6968442
   /* sielect /* BROADCAST(b) */ 4
   Error in query:
   mismatched input '4' expecting {'(', 'ADD', 'ALTER', 'ANALYZE', 'CACHE', 
'CLEAR', 'COMMENT', 'COMMIT', 'CREATE', 'DELETE', 'DESC', 'DESCRIBE', 'DFS', 
'DROP', 'EXPLAIN', 'EXPORT', 'FROM', 'GRANT', 'IMPORT', 'INSERT', 'LIST', 
'LOAD', 'LOCK', 'MAP', 'MERGE', 'MSCK', 'REDUCE', 'REFRESH', 'REPLACE', 
'RESET', 'REVOKE', 'ROLLBACK', 'SELECT', 'SET', 'SHOW', 'START', 'TABLE', 
'TRUNCATE', 'UNCACHE', 'UNLOCK', 'UPDATE', 'USE', 'VALUES', 'WITH'}(line 1, pos 
30)
   
   == SQL ==
   /* sielect /* BROADCAST(b) */ 4
   ------------------------------^^^
   ```
   
   
   
   ### Why are the changes needed?
   Spark already supports nested bracketed comments in SQL (see #37389); the
spark-sql CLI should support them too.
   
   ### Does this PR introduce _any_ user-facing change?
   Yes. Users can now use nested bracketed comments in spark-sql.
   
   
   ### How was this patch tested?
   
   The spark-sql console mode has special logic for handling `;`:
   ```
        // Read the input line by line; lines starting with "--" are skipped.
        while (line != null) {
          if (!line.startsWith("--")) {
            if (prefix.nonEmpty) {
              prefix += '\n'
            }

            // A trailing ";" that is not escaped as "\;" terminates the buffered
            // statement and sends it to the CLI driver.
            if (line.trim().endsWith(";") && !line.trim().endsWith("\\;")) {
              line = prefix + line
              ret = cli.processLine(line, true)
              prefix = ""
              currentPrompt = promptWithCurrentDB
            } else {
              // Otherwise keep buffering the statement and show the continuation prompt.
              prefix = prefix + line
              currentPrompt = continuedPromptWithDBSpaces
            }
          }
          line = reader.readLine(currentPrompt + "> ")
        }
   ```
   
    If we write the SQL as below in console mode
   ```
   /* SELECT /*+ BROADCAST(b) */ 4\\;
   */
   SELECT  1
   ;
   ```
    the trailing `\\;` is treated as an escaped semicolon, so the statement is not
split there.
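
    For `-e`/`-f`, the fix needs the statement splitting to recognize when a `;`
falls inside a (possibly nested) bracketed comment. Below is a minimal,
self-contained sketch of that idea; it is not the actual change in this PR, the
helper name `insideBracketedComment` is made up, and string literals and `--`
line comments are ignored for brevity:
    ```
    // Hypothetical helper (illustration only, not this PR's code): after scanning
    // `sql`, are we still inside an unclosed, possibly nested /* ... */ comment?
    def insideBracketedComment(sql: String): Boolean = {
      var depth = 0
      var i = 0
      while (i < sql.length) {
        if (i + 1 < sql.length && sql.charAt(i) == '/' && sql.charAt(i + 1) == '*') {
          depth += 1
          i += 2
        } else if (i + 1 < sql.length && sql.charAt(i) == '*' && sql.charAt(i + 1) == '/' && depth > 0) {
          depth -= 1
          i += 2
        } else {
          i += 1
        }
      }
      depth > 0
    }

    // Usage sketch: only treat a trailing ';' as a statement terminator when the
    // buffered text is not inside a bracketed comment, e.g.
    //   if (line.trim().endsWith(";") && !insideBracketedComment(prefix + line)) { ... }
    ```
    With such a check, the first line of the example above
(`/* SELECT /*+ BROADCAST(b) */ 4;`) leaves the scanner at comment depth 1, so its
trailing `;` does not terminate the statement, while the final `SELECT 1;` does.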
   
   
    Manual test with `spark-sql -f`:
   ```
   (spark.submit.pyFiles,)
   (spark.submit.deployMode,client)
   (spark.master,local[*])
   Classpath elements:
   
   
   
   Using Spark's default log4j profile: 
org/apache/spark/log4j-defaults.properties
   Setting default log level to "WARN".
   To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
setLogLevel(newLevel).
   21/11/26 16:32:08 WARN NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
   21/11/26 16:32:10 WARN HiveConf: HiveConf of name hive.stats.jdbc.timeout 
does not exist
   21/11/26 16:32:10 WARN HiveConf: HiveConf of name hive.stats.retries.wait 
does not exist
   21/11/26 16:32:13 WARN ObjectStore: Version information not found in 
metastore. hive.metastore.schema.verification is not enabled so recording the 
schema version 2.3.0
   21/11/26 16:32:13 WARN ObjectStore: setMetaStoreSchemaVersion called but 
recording version is disabled: version = 2.3.0, comment = Set by MetaStore 
yi.zhu@10.12.189.175
   Spark master: local[*], Application Id: local-1637915529831
   /* select /* BROADCAST(b) */ 4;
   */
   select  1
   
   1
   Time taken: 3.851 seconds, Fetched 1 row(s)
   C02D45VVMD6T:spark yi.zhu$
   ```

