gaborgsomogyi commented on a change in pull request #33370:
URL: https://github.com/apache/spark/pull/33370#discussion_r671028756
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/jdbc/README.md
##########
@@ -46,6 +46,12 @@ so they can be turned off and can be replaced with custom implementation. All CP
 which must be unique. One can set the following configuration entry in `SparkConf` to turn off CPs:
 `spark.sql.sources.disabledJdbcConnProviderList=name1,name2`.
+## How to enforce a specific JDBC connection provider?
+
+When more than one JDBC connection provider can handle a specific driver and options, it is possible to
+disambiguate and enforce a particular CP for the JDBC data source. One can set the DataFrameReader
Review comment:
+1 on this question.
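
For readers following along, here is a minimal sketch of how the two mechanisms in the quoted README snippet could be used together. The `SparkConf` entry comes straight from the diff above; the DataFrameReader option key `connectionProvider` and the provider name `basic` are assumptions for illustration only, since the quoted diff is cut off before it names the actual option.

```scala
import org.apache.spark.sql.SparkSession

// Sketch: disable specific JDBC connection providers globally via SparkConf.
// "name1" and "name2" are the placeholder CP names used in the README example.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("jdbc-cp-selection")
  .config("spark.sql.sources.disabledJdbcConnProviderList", "name1,name2")
  .getOrCreate()

// Enforcing a single provider per read is what the new README section describes.
// The option key "connectionProvider" and the value "basic" are assumptions here,
// because the diff above is truncated before it names the option.
val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://localhost:5432/testdb") // example URL
  .option("dbtable", "my_table")                             // example table
  .option("connectionProvider", "basic")                     // assumed option key
  .load()
```

As the README text suggests, the `SparkConf` entry removes providers from consideration globally, while the per-read option (whatever its final name ends up being) only disambiguates when several enabled providers can handle the same driver and options.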