[ 
https://issues.apache.org/jira/browse/SPARK-17680?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Xiao Li updated SPARK-17680:
----------------------------
    Description: 
Spark SQL supports Unicode characters for column names when they are specified 
within backticks (`). When Hive support is enabled, the version of the Hive 
metastore must be higher than 0.12; see 
https://issues.apache.org/jira/browse/HIVE-6013, which added Unicode column-name 
support to the Hive metastore in 0.13.

In Spark SQL, table comments and view comments always allow Unicode characters 
without backticks.
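
For illustration, a minimal sketch of the behavior described above, assuming a 
spark-shell session with Hive support enabled and a 0.13+ Hive metastore (the 
table, view, and column names below are hypothetical):

  // Unicode column names must be quoted with backticks;
  // table and view comments accept Unicode without backticks.
  spark.sql("CREATE TABLE unicode_demo (`列名` INT, `值` STRING) COMMENT '支持 Unicode 的表注释'")
  spark.sql("CREATE VIEW unicode_view COMMENT '视图注释' AS SELECT `列名` FROM unicode_demo")
  spark.sql("SELECT `列名` FROM unicode_demo").show()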


  was:
When the version of the Hive metastore is higher than 0.12, Spark SQL supports 
Unicode characters for column names when they are specified within backticks (`). 
See https://issues.apache.org/jira/browse/HIVE-6013: the Hive metastore supports 
Unicode characters for column names since 0.13.

In Spark SQL, table comments and view comments always allow Unicode characters 
without backticks.



> Unicode Character Support for Column Names and Comments
> -------------------------------------------------------
>
>                 Key: SPARK-17680
>                 URL: https://issues.apache.org/jira/browse/SPARK-17680
>             Project: Spark
>          Issue Type: Test
>          Components: SQL
>    Affects Versions: 2.1.0
>            Reporter: Xiao Li
>
> Spark SQL supports Unicode characters for column names when they are specified 
> within backticks (`). When Hive support is enabled, the version of the Hive 
> metastore must be higher than 0.12; see 
> https://issues.apache.org/jira/browse/HIVE-6013, which added Unicode 
> column-name support to the Hive metastore in 0.13.
> In Spark SQL, table comments and view comments always allow Unicode 
> characters without backticks.



