[ https://issues.apache.org/jira/browse/SPARK-10503?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Josh Rosen updated SPARK-10503:
-------------------------------
    Description:

Query an ORC table in Hive using the following SQL statement via the Spark SQL thrift-server. The row where rnum=0 has a NULL cint value. Because a NULL compared via IN evaluates to UNKNOWN, the WHERE clause should filter that row out; the result set returned by Spark nevertheless includes a row where rnum=0 and c1=0, which is incorrect.

{code}
select tint.rnum, tint.rnum from tint where tint.cint in ( tint.cint )
{code}

Table in Hive:

{code}
create table if not exists TINT ( RNUM int , CINT smallint )
row format delimited fields terminated by '|' lines terminated by '\n'
stored as orc ;
{code}

Data loaded into the ORC table:

{code}
0|\N
1|-1
2|0
3|1
4|10
{code}


> incorrect predicate evaluation involving NULL value
> ---------------------------------------------------
>
>                 Key: SPARK-10503
>                 URL: https://issues.apache.org/jira/browse/SPARK-10503
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: N Campbell
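For reference, the expected three-valued-logic behavior can be sketched outside Spark, e.g. with SQLite via Python's standard library (an assumption for illustration only; the table layout mirrors the report, and SQLite is used here simply as an engine that follows standard NULL semantics):

```python
import sqlite3

# Hypothetical reproduction of the reported data in an in-memory SQLite
# database; rnum=0 carries a NULL cint, matching the "0|\N" input row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tint (rnum INTEGER, cint INTEGER)")
conn.executemany(
    "INSERT INTO tint VALUES (?, ?)",
    [(0, None), (1, -1), (2, 0), (3, 1), (4, 10)],
)

# NULL IN (NULL) evaluates to UNKNOWN, and WHERE keeps only TRUE rows,
# so the rnum=0 row must not appear in the result.
rows = conn.execute(
    "SELECT tint.rnum, tint.rnum FROM tint WHERE tint.cint IN (tint.cint)"
).fetchall()
print(rows)  # -> [(1, 1), (2, 2), (3, 3), (4, 4)]; rnum=0 is absent
```

Under this semantics the correct result has four rows; returning a fifth row with rnum=0 (as Spark reportedly does) treats the NULL as 0.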
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org