[ 
https://issues.apache.org/jira/browse/SPARK-11989?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Reynold Xin updated SPARK-11989:
--------------------------------
    Assignee: Christian Kurz

> Spark JDBC write only works on technologies with transaction support
> -------------------------------------------------------------------
>
>                 Key: SPARK-11989
>                 URL: https://issues.apache.org/jira/browse/SPARK-11989
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.2
>            Reporter: Christian Kurz
>            Assignee: Christian Kurz
>   Original Estimate: 4h
>  Remaining Estimate: 4h
>
> Writing DataFrames out to a JDBC destination currently requires the JDBC 
> driver/database to support transactions. This is because 
> spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala
>  *always* calls commit()/rollback().
> Some technologies do not support transactions, and their drivers throw an 
> exception if commit()/rollback() is called. For such technologies (like the 
> Progress JDBC Driver for Cassandra) this is a blocking problem.
> Before using transaction support, JdbcUtils.scala needs to check whether the 
> driver actually supports transactions. The check can be done via 
> conn.getMetaData().supportsDataManipulationTransactionsOnly() ||
> conn.getMetaData().supportsDataDefinitionAndDataManipulationTransactions()
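> A minimal sketch of how the write path could guard its transaction calls, assuming a 
> plain java.sql.Connection. The helper and method names below are illustrative only, not 
> the actual JdbcUtils code:
> {code:scala}
> import java.sql.Connection
>
> // Returns true only if the driver reports transaction support via its metadata.
> def supportsTransactions(conn: Connection): Boolean = {
>   val md = conn.getMetaData
>   md.supportsDataManipulationTransactionsOnly() ||
>     md.supportsDataDefinitionAndDataManipulationTransactions()
> }
>
> // Illustrative write loop: commit()/rollback() are only called when supported.
> def writeRows(conn: Connection, insertSql: String, rows: Iterator[Seq[AnyRef]]): Unit = {
>   val transactional = supportsTransactions(conn)
>   if (transactional) conn.setAutoCommit(false)
>   val stmt = conn.prepareStatement(insertSql)
>   try {
>     rows.foreach { row =>
>       row.zipWithIndex.foreach { case (v, i) => stmt.setObject(i + 1, v) }
>       stmt.executeUpdate()
>     }
>     if (transactional) conn.commit()      // skipped for non-transactional drivers
>   } catch {
>     case e: Exception =>
>       if (transactional) conn.rollback()  // likewise skipped
>       throw e
>   } finally {
>     stmt.close()
>   }
> }
> {code}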
> The working code change can be seen 
> [here|https://github.com/CK50/spark/blob/branch-1.6_non-transactional/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala].
> See also the [pull request|https://github.com/apache/spark/pull/9973].



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
