[
https://issues.apache.org/jira/browse/SPARK-11474?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Reynold Xin resolved SPARK-11474.
---------------------------------
Resolution: Fixed
Assignee: Huaxin Gao
Fix Version/s: 1.5.3, 1.6.0
> Options to jdbc load are lower cased
> ------------------------------------
>
> Key: SPARK-11474
> URL: https://issues.apache.org/jira/browse/SPARK-11474
> Project: Spark
> Issue Type: Bug
> Components: Input/Output
> Affects Versions: 1.5.1
> Environment: Linux & Mac
> Reporter: Stephen Samuel
> Assignee: Huaxin Gao
> Priority: Minor
> Fix For: 1.5.3, 1.6.0
>
>
> We recently upgraded from Spark 1.3.0 to 1.5.1, and one of the features we
> wanted to take advantage of was the fetchSize option added to the JDBC
> DataFrame reader.
> In 1.5.1 there appears to be a bug or regression whereby an options map has
> its keys lower-cased. This means the properties that existed prior to 1.4,
> such as dbtable, url and driver, are fine, but the newer fetchSize gets
> converted to fetchsize.
> To reproduce:
> val conf = new SparkConf(true).setMaster("local").setAppName("fetchtest")
> val sc = new SparkContext(conf)
> val sql = new SQLContext(sc)
> val options = Map("url" -> ...., "driver" -> ...., "fetchSize" -> ....)
> val df = sql.load("jdbc", options)
> Set a breakpoint at line 371 in JDBCRDD and you'll see the options are all
> lower-cased, so:
> val fetchSize = properties.getProperty("fetchSize", "0").toInt
> results in 0.
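> A minimal, self-contained sketch of why that lookup misses; the property
> value below is made up for illustration, not taken from JDBCRDD itself:
> import java.util.Properties
>
> // The options map arrives with its keys lower-cased, so a case-sensitive
> // Properties lookup for "fetchSize" falls through to the default.
> val properties = new Properties()
> properties.setProperty("fetchsize", "1000") // key was lower-cased upstream
> val fetchSize = properties.getProperty("fetchSize", "0").toInt
> println(fetchSize) // prints 0, not 1000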
> Now I know sql.load is deprecated, but this might be occurring on other
> methods too. The workaround is to use the java.util.Properties overload,
> which keeps the case-sensitive keys (see the sketch below).
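>
> A hedged sketch of that Properties-based workaround; the driver class,
> JDBC URL and table name below are placeholders for illustration only:
> import java.util.Properties
> import org.apache.spark.{SparkConf, SparkContext}
> import org.apache.spark.sql.SQLContext
>
> val conf = new SparkConf(true).setMaster("local").setAppName("fetchtest")
> val sc = new SparkContext(conf)
> val sqlContext = new SQLContext(sc)
>
> // Keys set on java.util.Properties keep their original casing, so
> // "fetchSize" reaches JDBCRDD as-is instead of being lower-cased.
> val props = new Properties()
> props.setProperty("driver", "org.postgresql.Driver") // hypothetical driver
> props.setProperty("fetchSize", "1000")
>
> // DataFrameReader.jdbc(url, table, properties) uses the Properties
> // overload rather than the case-insensitive options map.
> val df = sqlContext.read.jdbc(
>   "jdbc:postgresql://localhost/testdb", // hypothetical URL
>   "my_table",                           // hypothetical table
>   props)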