[ https://issues.apache.org/jira/browse/SPARK-18419?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-18419:
----------------------------------
    Affects Version/s: 2.0.2

> `JDBCRelation.insert` should not remove Spark options
> -----------------------------------------------------
>
>                 Key: SPARK-18419
>                 URL: https://issues.apache.org/jira/browse/SPARK-18419
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.2
>            Reporter: Dongjoon Hyun
>
> Currently, `JDBCRelation.insert` removes Spark options too early by mistakenly using `asConnectionProperties`. Spark options like `numPartitions` should be passed into `DataFrameWriter.jdbc` correctly. This bug has been hidden because `JDBCOptions.asConnectionProperties` fails to filter out mixed-case options. This issue aims to fix both problems.
>
> *JDBCRelation.insert*
> {code}
> override def insert(data: DataFrame, overwrite: Boolean): Unit = {
>   val url = jdbcOptions.url
>   val table = jdbcOptions.table
> - val properties = jdbcOptions.asConnectionProperties
> + val properties = jdbcOptions.asProperties
>   data.write
>     .mode(if (overwrite) SaveMode.Overwrite else SaveMode.Append)
>     .jdbc(url, table, properties)
> }
> {code}
>
> *JDBCOptions.asConnectionProperties*
> {code}
> scala> import org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions
> import org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions
>
> scala> import org.apache.spark.sql.catalyst.util.CaseInsensitiveMap
> import org.apache.spark.sql.catalyst.util.CaseInsensitiveMap
>
> scala> new JDBCOptions(Map("url" -> "jdbc:mysql://localhost:3306/temp", "dbtable" -> "t1", "numPartitions" -> "10")).asConnectionProperties
> res0: java.util.Properties = {numpartitions=10}
>
> scala> new JDBCOptions(new CaseInsensitiveMap(Map("url" -> "jdbc:mysql://localhost:3306/temp", "dbtable" -> "t1", "numPartitions" -> "10"))).asConnectionProperties
> res1: java.util.Properties = {numpartitions=10}
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
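The mixed-case leak can be reproduced without Spark. The sketch below is a hypothetical, Spark-free reconstruction of the filtering logic (the names `jdbcOptionNames` and `asConnectionProperties` mirror `JDBCOptions` but are assumptions, not the actual source): option keys arrive lower-cased (e.g. through a case-insensitive map), while the exclusion set holds the original camelCase names, so `numpartitions` is never matched. A case-insensitive comparison closes the gap.

{code}
// Minimal sketch of the case-sensitivity bug; not the actual Spark source.
object FilterSketch {
  // Spark-internal option names, registered in their original camelCase.
  val jdbcOptionNames: Set[String] = Set("url", "dbtable", "numPartitions")

  // BUG: keys arrive lower-cased (e.g. "numpartitions"), so the camelCase
  // set does not match them and they leak into the JDBC driver properties.
  def asConnectionProperties(options: Map[String, String]): Map[String, String] =
    options.filterKeys(k => !jdbcOptionNames.contains(k)).toMap

  // Fixed variant: compare both sides lower-cased.
  def asConnectionPropertiesFixed(options: Map[String, String]): Map[String, String] = {
    val lowerNames = jdbcOptionNames.map(_.toLowerCase)
    options.filterKeys(k => !lowerNames.contains(k.toLowerCase)).toMap
  }

  def main(args: Array[String]): Unit = {
    // Keys as a case-insensitive map would store them.
    val opts = Map(
      "url" -> "jdbc:mysql://localhost:3306/temp",
      "dbtable" -> "t1",
      "numpartitions" -> "10")
    println(asConnectionProperties(opts))      // "numpartitions" leaks through
    println(asConnectionPropertiesFixed(opts)) // all Spark options filtered out
  }
}
{code}

Running the sketch, the buggy filter returns `Map(numpartitions -> 10)`, matching the `{numpartitions=10}` output in the spark-shell session above, while the fixed variant returns an empty map.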