GitHub user dongjoon-hyun opened a pull request: https://github.com/apache/spark/pull/14970
[SPARK-11301][SQL] Fix case sensitivity for filter on partitioned columns

## What changes were proposed in this pull request?

The issue was originally reported in SPARK-11301 against 1.6.0, but Spark 1.6.2 still always handles **partitioned column names** in a case-sensitive way. This is incorrect: in the following example, the filter on `B` is silently dropped and rows from both partitions are returned.

```scala
scala> sql("CREATE TABLE t(a int) PARTITIONED BY (b string) STORED AS PARQUET")
scala> sql("INSERT INTO TABLE t PARTITION(b='P') SELECT * FROM (SELECT 1) t")
scala> sql("INSERT INTO TABLE t PARTITION(b='Q') SELECT * FROM (SELECT 2) t")
scala> sql("SELECT * FROM t WHERE B='P'").show
+---+---+
|  a|  b|
+---+---+
|  1|  P|
|  2|  Q|
+---+---+
```

The result is the same even with `set spark.sql.caseSensitive=false`. Here is the result in [Databricks CE](https://databricks-prod-cloudfront.cloud.databricks.com/public/4027ec902e239c93eaaa8714f173bcfc/6660119172909095/3421754458488607/5162191866050912/latest.html). (A minimal sketch of the name resolution involved appears at the end of this message.)

## How was this patch tested?

Pass the Jenkins tests with a modified test.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/dongjoon-hyun/spark SPARK-11301

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/14970.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #14970

----

commit a86de9b6958ba82d2b9bfcd314c28340f6d18467
Author: Dongjoon Hyun <dongj...@apache.org>
Date:   2016-09-06T06:56:31Z

    [SPARK-11301][SQL] Fix case sensitivity for filter on partitioned columns
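----

For reference, a minimal, self-contained sketch of the kind of name resolution at issue. This is an illustration, not the actual patch: the names `Resolver`, `resolvePartitionColumn`, and the standalone setup are assumptions, modeled on the resolver-style name matching that `spark.sql.caseSensitive` controls in Spark's analyzer.

```scala
// Hypothetical sketch (no Spark dependency): a resolver is just a
// name-equality predicate whose behavior depends on case sensitivity.
object ResolverSketch {
  type Resolver = (String, String) => Boolean

  val caseSensitiveResolution: Resolver = (a, b) => a == b
  val caseInsensitiveResolution: Resolver = (a, b) => a.equalsIgnoreCase(b)

  // Resolve a column referenced in a filter (e.g. `B`) against the table's
  // partition column names (e.g. `b`) using the active resolver.
  def resolvePartitionColumn(
      partitionColumns: Seq[String],
      name: String,
      resolver: Resolver): Option[String] =
    partitionColumns.find(col => resolver(col, name))

  def main(args: Array[String]): Unit = {
    val partitionColumns = Seq("b")
    // Case-sensitive resolution: `B` does not match partition column `b`,
    // so the filter cannot be bound to the partition column.
    println(resolvePartitionColumn(partitionColumns, "B", caseSensitiveResolution))   // None
    // Case-insensitive resolution (spark.sql.caseSensitive=false): `B`
    // resolves to `b`, so the partition filter can be applied.
    println(resolvePartitionColumn(partitionColumns, "B", caseInsensitiveResolution)) // Some(b)
  }
}
```

The bug described above corresponds to partition-column lookup always behaving like the first call, regardless of the `spark.sql.caseSensitive` setting.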