[ https://issues.apache.org/jira/browse/SPARK-17198?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15434191#comment-15434191 ]

Dongjoon Hyun commented on SPARK-17198:
---------------------------------------

Hi, [~tuming].
The error scenario reported in HIVE-11312 seems to work without problems in
Spark 2.0, as shown below.

{code}
scala> sql("create table orc_test( col1 string, col2 char(10)) stored as orc 
tblproperties ('orc.compress'='NONE')")
scala> sql("insert into orc_test values ('val1', '1')")
scala> sql("select * from orc_test where col2='1'").show
+----+----+
|col1|col2|
+----+----+
|val1|   1|
+----+----+
scala> spark.version
res3: String = 2.0.0
{code}
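
A sketch closer to the reported case, using a char(6) column and the literal from the report (the table name orc_char_test is made up for illustration):

{code}
scala> sql("create table orc_char_test(col1 string, col2 char(6)) stored as orc")
scala> sql("insert into orc_char_test values ('val1', '5LZS')")
scala> sql("select * from orc_char_test where col2='5LZS'").show
{code}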

Could you give us a reproducible example?

> ORC fixed char literal filter does not work
> -------------------------------------------
>
>                 Key: SPARK-17198
>                 URL: https://issues.apache.org/jira/browse/SPARK-17198
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.1
>            Reporter: tuming
>
> I got a wrong result when I ran the following query in Spark SQL:
> select * from orc_table where char_col = '5LZS';
> Table orc_table is an ORC-format table.
> Column char_col is defined as char(6).
> The Hive record reader returns a char(6) value, padded with trailing spaces,
> to Spark, but Spark has no fixed-length char type: all fixed char attributes
> are converted to String by default, while the constant literal is parsed as
> a string Literal. So the equality comparison never returns true, for
> instance: '5LZS' == '5LZS  '.
> But I get the correct result in Hive using the same data and SQL string,
> because Hive pads the constant literal with spaces. Please refer to:
> https://issues.apache.org/jira/browse/HIVE-11312
> There is no such patch for Spark.
>  
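
For reference, the equality mismatch described above can be seen directly on plain Scala strings, assuming the ORC reader returns the char(6) value with its trailing pad spaces:

{code}
scala> "5LZS" == "5LZS  "
res0: Boolean = false
{code}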


