[ 
https://issues.apache.org/jira/browse/SPARK-27017?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16786789#comment-16786789
 ] 

Chakravarthi commented on SPARK-27017:
--------------------------------------

[~uNxe] could you provide the queries you used? Hive also does not 
allow special characters in column names.

> Creating orc table with special symbols in column name via spark.sql
> --------------------------------------------------------------------
>
>                 Key: SPARK-27017
>                 URL: https://issues.apache.org/jira/browse/SPARK-27017
>             Project: Spark
>          Issue Type: Question
>          Components: Spark Shell
>    Affects Versions: 2.3.0
>            Reporter: Henryk Cesnolovic
>            Priority: Major
>
> The issue is creating an ORC table with special symbols in column names in 
> Spark with Hive support. Example:
> _spark.sql("Create table abc_orc (`Column with speci@l symbo|s` string) stored 
> as orc")_ 
> throws org.apache.spark.sql.AnalysisException: Column name "Column with 
> speci@l symbo|s" contains invalid character(s). Please use alias to rename it.
> It's interesting, because in Hive we can create such a table, and afterwards in 
> Spark we can select data from that table and it resolves the schema correctly. 
> My question is: is this the correct behaviour of Spark, and if so, what is the 
> reason for that behaviour?
>   
>  
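Since the AnalysisException suggests using an alias to rename the offending column, a common workaround is to sanitize column names before building the DDL. The sketch below is a minimal illustration, not part of Spark itself; the exact character set Spark's ORC writer rejects is version-dependent, so this conservatively keeps only word characters, and the `spark.sql` call is shown as a comment assuming an existing SparkSession named `spark`:

```python
import re

def sanitize_column_name(name: str, replacement: str = "_") -> str:
    """Replace characters that the ORC writer may reject.

    Assumption: only word characters ([A-Za-z0-9_]) are safe; the real
    set of rejected characters depends on the Spark/ORC version.
    """
    return re.sub(r"\W", replacement, name)

# Hypothetical usage before issuing the CREATE TABLE statement:
original = "Column with speci@l symbo|s"
safe = sanitize_column_name(original)
# spark.sql(f"CREATE TABLE abc_orc (`{safe}` STRING) STORED AS ORC")
```

Alternatively, when the table already exists in Hive, the same renaming can be applied on read with a `SELECT ... AS safe_name` projection before writing a new ORC table from Spark.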



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
