[
https://issues.apache.org/jira/browse/SPARK-12270?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-12270:
------------------------------------
Assignee: Apache Spark
> JDBC Where clause comparison doesn't work for DB2 char(n)
> ----------------------------------------------------------
>
> Key: SPARK-12270
> URL: https://issues.apache.org/jira/browse/SPARK-12270
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.6.0
> Reporter: Huaxin Gao
> Assignee: Apache Spark
> Priority: Minor
>
> I am doing some Spark jdbc test against DB2. My test is like this:
> {code}
> conn.prepareStatement(
>   "create table people (name char(32))").executeUpdate()
> conn.prepareStatement("insert into people values ('fred')").executeUpdate()
> sql(
>   s"""
>     |CREATE TEMPORARY TABLE foobar
>     |USING org.apache.spark.sql.jdbc
>     |OPTIONS (url '$url', dbtable 'PEOPLE', user 'testuser', password 'testpassword')
>   """.stripMargin.replaceAll("\n", " "))
> val df = sqlContext.sql("SELECT * FROM foobar WHERE NAME = 'fred'")
> {code}
> I expect df to contain one row with the value 'fred'. However, no row is
> returned. If I change the data type to varchar(32) in the create table DDL,
> I get the row back correctly. The cause of the problem is that DB2 defines
> char(n) as a fixed-length character string, so with char(32), "SELECT * FROM
> foobar WHERE NAME = 'fred'" makes DB2 return 'fred' padded with 28 trailing
> spaces. Spark treats 'fred' padded with spaces as not equal to 'fred', so df
> has no rows. With varchar(32), DB2 returns just 'fred' for the select
> statement and df has the right row. To make DB2 char(n) work with Spark, I
> suggest changing the Spark code to trim the trailing spaces after fetching
> the data from the database.
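> For illustration, a workaround that confirms the padding is the cause (rtrim
> is the standard Spark SQL built-in; the table and value names are the ones
> from the reproduction above):
> {code}
> // Trimming the CHAR(32) padding on the Spark side restores the match:
> // rtrim('fred' padded with 28 spaces) == 'fred', so one row comes back.
> val workaround = sqlContext.sql(
>   "SELECT * FROM foobar WHERE rtrim(NAME) = 'fred'")
> workaround.count() // expected: 1
> {code}
> The suggested fix would apply the same trim where Spark materializes the
> value from the JDBC ResultSet, along the lines of this hypothetical helper
> (illustration only, not actual Spark code):
> {code}
> // Strip only the trailing spaces DB2 adds to pad CHAR(n) values;
> // leading and embedded spaces are significant and must be kept.
> def trimCharPadding(value: String): String =
>   if (value == null) null else value.replaceAll(" +$", "")
>
> trimCharPadding("fred" + " " * 28) // returns "fred"
> {code}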