kanika dhuria created SPARK-17753:
-------------------------------------

             Summary: Simple CASE expression in Spark SQL throws ParseException
                 Key: SPARK-17753
                 URL: https://issues.apache.org/jira/browse/SPARK-17753
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.0.0
            Reporter: kanika dhuria


A simple CASE expression in Spark SQL throws a ParseException in Spark 2.0.
The following query fails:

spark.sql("""
  SELECT alias.p_double as a0, alias.p_text as a1, NULL as a2
  FROM hadoop_tbl_all alias
  WHERE (1 = (CASE ('aaaaabbbbb' = alias.p_text) OR (8 <= LENGTH(alias.p_text))
              WHEN TRUE THEN 1 WHEN FALSE THEN 0 ELSE CAST(NULL AS INT) END))
""")

org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input 'EQ' expecting {<EOF>, '.', '[', 'GROUP', 'ORDER', 'HAVING', 'LIMIT', 'OR', 'AND', 'IN', NOT, 'BETWEEN', 'LIKE', RLIKE, 'IS', 'WINDOW', 'UNION', 'EXCEPT', 'INTERSECT', EQ, '<=>', '<>', '!=', '<', LTE, '>', GTE, '+', '-', '*', '/', '%', 'DIV', '&', '|', '^', 'SORT', 'CLUSTER', 'DISTRIBUTE'}(line 1, pos 111)

== SQL ==
SELECT alias.p_double as a0, alias.p_text as a1, NULL as a2 FROM hadoop_tbl_all alias WHERE  CASE 'aaaaabbbbb' EQ alias.p_text  WHEN TRUE=TRUE THEN 1  WHEN TRUE=FALSE THEN 0 ELSE CAST(NULL AS INT) END
---------------------------------------------------------------------------------------------------------------^^^

  at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:197)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:99)
  at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:46)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:53)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:582)
  ... 48 elided
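
Note that the query echoed under == SQL == does not match the string passed to spark.sql(): '=' shows up as 'EQ' and the WHEN TRUE / WHEN FALSE branches as TRUE=TRUE / TRUE=FALSE, and the mismatched-input error points at that 'EQ' token, which is not an operator the Spark parser recognizes, so something between the caller and the parser appears to be rewriting the statement.

Until that is sorted out, one possible workaround (a sketch, assuming the intent is just to map the boolean predicate to 1, 0, or NULL) is to rewrite the simple CASE over a boolean operand as an equivalent searched CASE, which avoids the failing form:

spark.sql("""
  SELECT alias.p_double as a0, alias.p_text as a1, NULL as a2
  FROM hadoop_tbl_all alias
  WHERE (1 = (CASE WHEN ('aaaaabbbbb' = alias.p_text) OR (8 <= LENGTH(alias.p_text)) THEN 1
                   WHEN NOT (('aaaaabbbbb' = alias.p_text) OR (8 <= LENGTH(alias.p_text))) THEN 0
                   ELSE CAST(NULL AS INT) END))
""")

Since the surrounding WHERE keeps only rows where the CASE yields 1, and rows yielding 0 or NULL are filtered out either way, the wrapper also collapses to WHERE ('aaaaabbbbb' = alias.p_text) OR (8 <= LENGTH(alias.p_text)) when the query can be hand-edited.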

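If the SQL is produced by a tool and cannot be edited, the same result can be obtained through the DataFrame API, which never goes through the SQL parser. A minimal sketch, assuming hadoop_tbl_all is visible via spark.table and a SparkSession named spark is in scope:

import org.apache.spark.sql.functions._

// Boolean predicate matching the CASE operand in the original query.
val pred = (col("p_text") === "aaaaabbbbb") || (length(col("p_text")) >= 8)

spark.table("hadoop_tbl_all")
  // when/when/otherwise mirrors WHEN TRUE / WHEN FALSE / ELSE CAST(NULL AS INT).
  .where(when(pred, 1).when(!pred, 0).otherwise(lit(null).cast("int")) === 1)
  .select(col("p_double").as("a0"), col("p_text").as("a1"), lit(null).as("a2"))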