[
https://issues.apache.org/jira/browse/SPARK-2339?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14952443#comment-14952443
]
Huaxin Gao commented on SPARK-2339:
-----------------------------------
Hi Yin,
I am looking at SPARK-10754 (https://issues.apache.org/jira/browse/SPARK-10754),
which reports that table names and column names are case sensitive.
Have you already added a note to the RegisterXXXTable docs saying that table
names are case sensitive? If not, I will probably add one. Thanks a lot!
> SQL parser in sql-core is case sensitive, but a table alias is converted to
> lower case when we create Subquery
> --------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-2339
> URL: https://issues.apache.org/jira/browse/SPARK-2339
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.0.0
> Reporter: Yin Huai
> Assignee: Yin Huai
> Fix For: 1.0.2, 1.1.0
>
>
> Reported by
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-Join-throws-exception-td8599.html
> After we get the table from the catalog, because the table has an alias, we
> temporarily insert a Subquery. Then, we convert the table alias to lower
> case regardless of whether the parser is case sensitive or not.
> To see the issue ...
> {code}
> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
> import sqlContext.createSchemaRDD
> case class Person(name: String, age: Int)
> val people = sc.textFile("examples/src/main/resources/people.txt").map(_.split(",")).map(p => Person(p(0), p(1).trim.toInt))
> people.registerAsTable("people")
> sqlContext.sql("select PEOPLE.name from people PEOPLE")
> {code}
> The plan is ...
> {code}
> == Query Plan ==
> Project ['PEOPLE.name]
> ExistingRdd [name#0,age#1], MapPartitionsRDD[4] at mapPartitions at basicOperators.scala:176
> {code}
> You can see that "PEOPLE.name" is not resolved.
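The effect of lowercasing the alias can be sketched outside Spark. This is a minimal illustration, not Spark's actual analyzer code: the object, method, and map names below are hypothetical, but the logic mirrors the bug described above, where the stored alias is lowercased while the attribute reference keeps its original case.

```scala
// Hypothetical sketch (not Spark code) of case-sensitive attribute
// resolution against an alias that was lowercased on registration.
object AliasCaseDemo {
  // attrs maps a column name to its resolved attribute id, e.g. "name" -> "name#0"
  def resolve(alias: String, attrs: Map[String, String], ref: String): Option[String] = {
    // Bug being illustrated: the alias is lowercased even when the
    // parser is case sensitive.
    val storedAlias = alias.toLowerCase
    val Array(qualifier, column) = ref.split('.')
    // A case-sensitive comparison of the reference's qualifier against
    // the (lowercased) stored alias.
    if (qualifier == storedAlias) attrs.get(column) else None
  }

  def main(args: Array[String]): Unit = {
    val attrs = Map("name" -> "name#0", "age" -> "age#1")
    // "select PEOPLE.name from people PEOPLE": alias PEOPLE was stored
    // as "people", so the reference "PEOPLE.name" no longer matches.
    println(resolve("PEOPLE", attrs, "PEOPLE.name")) // None: unresolved
    println(resolve("PEOPLE", attrs, "people.name")) // Some(name#0)
  }
}
```

With the lowercasing line removed, both the original-case and lowercase references would resolve only when they match the alias exactly, which is the behavior a case-sensitive parser should have.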
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)