Takeshi Yamamuro created SPARK-22553:
----------------------------------------

             Summary: Drop FROM in nonReserved
                 Key: SPARK-22553
                 URL: https://issues.apache.org/jira/browse/SPARK-22553
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 2.2.0
            Reporter: Takeshi Yamamuro
            Priority: Trivial


A simple query below throws a misleading error because nonReserved has `FROM` 
in SqlBase.g4:
{code}
scala> Seq((1, 2)).toDF("a", "b").createTempView("t")
scala> sql("select a, count(1), from t group by 1").show
org.apache.spark.sql.AnalysisException: cannot resolve '`a`' given input 
columns: []; line 1 pos 7;
'Aggregate [unresolvedordinal(1)], ['a, count(1) AS count(1)#13L, 'from AS t#11]
+- OneRowRelation$

  at 
org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
  at 
org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:88)
  at 
org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:85)
  at 
org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:289)
{code}
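The mis-parse can be modeled with a toy sketch (assumption: a simplified identifier check, not Spark's actual ANTLR grammar): because FROM appears in nonReserved, the lexeme `from` is accepted wherever an identifier is expected, so the trailing comma makes the parser read `from t` as a select-list item `'from AS t` instead of failing at the syntax level:
{code}
// Toy model of keyword handling (NOT Spark's real parser): a token is a
// valid identifier if it is not a keyword, or if it is a keyword that the
// grammar lists in nonReserved.
val keywords    = Set("select", "from", "group", "by")
val nonReserved = Set("select", "from", "group", "by") // FROM included, like SqlBase.g4

def isIdentifier(tok: String): Boolean = {
  val t = tok.toLowerCase
  t.matches("[a-z_][a-z0-9_]*") && (!keywords(t) || nonReserved(t))
}

// With FROM non-reserved, "from" is a legal column name, so
// "..., from t group by 1" parses as the aliased column 'from AS t.
println(isIdentifier("from"))
{code}
If FROM were removed from the nonReserved set in this sketch, `isIdentifier("from")` would return false and the query would fail with a syntax error at `from`, like the PostgreSQL behavior shown below.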

I know nonReserved currently has `FROM` for historical reasons 
(https://github.com/apache/spark/pull/18079#discussion_r118842186). But, since 
IMHO this is a common kind of mistake (this message annoyed me a few days ago 
in large SQL queries...), it might be worth dropping FROM from nonReserved and 
making it reserved.

FYI: PostgreSQL throws an explicit error in this case:
{code}
postgres=# select a, count(1), from test group by b;
ERROR:  syntax error at or near "from" at character 21
STATEMENT:  select a, count(1), from test group by b;
ERROR:  syntax error at or near "from"
LINE 1: select a, count(1), from test group by b;
{code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
