Yuming Wang created SPARK-33128:
-----------------------------------

             Summary: mismatched input since Spark 3.0
                 Key: SPARK-33128
                 URL: https://issues.apache.org/jira/browse/SPARK-33128
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 3.0.1, 3.0.0, 3.1.0
            Reporter: Yuming Wang
Spark 2.4:
{noformat}
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.4
      /_/

Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_221)
Type in expressions to have them evaluated.
Type :help for more information.

scala> spark.sql("SELECT 1 UNION SELECT 1 UNION ALL SELECT 1").show
+---+
|  1|
+---+
|  1|
|  1|
+---+
{noformat}

Spark 3.x:
{noformat}
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.1.0-SNAPSHOT
      /_/

Using Scala version 2.12.10 (Java HotSpot(TM) 64-Bit Server VM, Java 14.0.1)
Type in expressions to have them evaluated.
Type :help for more information.

scala> spark.sql("SELECT 1 UNION SELECT 1 UNION ALL SELECT 1").show
org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input 'SELECT' expecting {<EOF>, ';'}(line 1, pos 15)

== SQL ==
SELECT 1 UNION SELECT 1 UNION ALL SELECT 1
---------------^^^

  at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:263)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:130)
  at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:51)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:81)
  at org.apache.spark.sql.SparkSession.$anonfun$sql$2(SparkSession.scala:610)
  at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
  at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:610)
  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:769)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:607)
  ... 47 elided
{noformat}

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
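For comparison, other SQL engines parse mixed {{UNION}} / {{UNION ALL}} chains left-associatively and accept this query. A minimal sketch using Python's built-in sqlite3 (an assumption for illustration; SQLite is not Spark, but it follows the same standard left-associative grouping) reproduces the result Spark 2.4 returned:

{noformat}
import sqlite3

# In-memory database; no schema needed since the query uses only literals.
conn = sqlite3.connect(":memory:")

# Parsed as (SELECT 1 UNION SELECT 1) UNION ALL SELECT 1:
# the UNION deduplicates to a single row, then UNION ALL appends one more.
rows = conn.execute("SELECT 1 UNION SELECT 1 UNION ALL SELECT 1").fetchall()
print(rows)  # [(1,), (1,)] -- two rows, matching Spark 2.4's output
{noformat}

So the query is valid standard SQL, and the Spark 3.x {{ParseException}} at the second {{SELECT}} looks like a grammar regression rather than an intentional restriction.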