GitHub user fsauer65 opened a pull request:

    https://github.com/apache/spark/pull/22560

    [SPARK-25547][Spark Core] Pluggable JDBC connection factory

    ## What changes were proposed in this pull request?
    
    Allow for pluggable connection factories in the Spark JDBC package.
    
    * new option in JDBCOptions called JDBC_CONNECTION_FACTORY_PROVIDER
    * changes to JdbcUtils.createConnectionFactory to use the above
    * when unspecified the existing DefaultConnectionFactoryProvider is used
    * provided an example use in PluggableConnectionFactoryExample (a rough sketch of a custom provider is shown below)
    
    Without these changes we had to copy most of the Spark JDBC package into
    our own codebase to allow us to create our own connection factory in order
    to load-balance queries against an AWS Aurora PostgreSQL cluster.
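    For illustration, here is a minimal sketch (in Scala) of what a custom
    provider could look like under this change. The trait name, the method
    signature, and the option key used to select the provider are assumptions
    based on the description above (ConnectionFactoryProvider,
    createConnectionFactory, JDBC_CONNECTION_FACTORY_PROVIDER); the class name
    and endpoints are purely hypothetical.
    
        import java.sql.{Connection, DriverManager}
        import java.util.concurrent.atomic.AtomicLong
    
        // Hypothetical provider contract, assumed from the PR description:
        // given the resolved JDBC options, return a factory producing connections.
        trait ConnectionFactoryProvider {
          def createConnectionFactory(options: Map[String, String]): () => Connection
        }
    
        // Example provider that round-robins connections across several reader
        // endpoints, e.g. the readers of an Aurora PostgreSQL cluster.
        class RoundRobinConnectionFactoryProvider extends ConnectionFactoryProvider {
          private val readerUrls = Seq(
            "jdbc:postgresql://reader-1.example.com:5432/mydb",
            "jdbc:postgresql://reader-2.example.com:5432/mydb")
          private val counter = new AtomicLong(0)
    
          override def createConnectionFactory(
              options: Map[String, String]): () => Connection = () => {
            // Pick the next reader endpoint and open a plain JDBC connection to it.
            val url = readerUrls((counter.getAndIncrement() % readerUrls.size).toInt)
            DriverManager.getConnection(
              url,
              options.getOrElse("user", ""),
              options.getOrElse("password", ""))
          }
        }
    
        // Hypothetical usage: point the new option at the provider class by name.
        // The option key "connectionFactoryProvider" is assumed from
        // JDBC_CONNECTION_FACTORY_PROVIDER and may differ in the actual patch.
        val df = spark.read
          .format("jdbc")
          .option("url", "jdbc:postgresql://cluster.example.com:5432/mydb")
          .option("dbtable", "public.events")
          .option("connectionFactoryProvider",
            "com.example.RoundRobinConnectionFactoryProvider")
          .load()
    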
    
    ## How was this patch tested?
    
    We use this at Kabbage to load-balance queries against an AWS Aurora
    PostgreSQL cluster using the code in the provided example. That code is
    commented out so as not to introduce a dependency on the AWS API library.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/KabbageInc/spark patch.pluggable-connection-factory

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/22560.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #22560
    
----
commit 6895e497e90897245347c2661a2d380aa42cf305
Author: Frank Sauer <fsauer@...>
Date:   2018-09-26T20:23:35Z

    added support for a pluggable connection factory for jdbc

----


---
