sadikovi commented on a change in pull request #33370:
URL: https://github.com/apache/spark/pull/33370#discussion_r671064424



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/jdbc/README.md
##########
@@ -46,6 +46,12 @@ so they can be turned off and can be replaced with custom implementation. All CP
 which must be unique. One can set the following configuration entry in `SparkConf` to turn off CPs:
 `spark.sql.sources.disabledJdbcConnProviderList=name1,name2`.
 
+## How to enforce a specific JDBC connection provider?
+
+When more than one JDBC connection provider can handle a specific driver and options, it is possible to
+disambiguate and enforce a particular CP for the JDBC data source. One can set the DataFrameReader
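
For the disable list mentioned in the context lines of the quoted hunk, here is a minimal sketch of setting it through `SparkConf`. The provider names `db2` and `mssql` are assumed examples of registered CP names, not taken from the hunk itself.

```scala
// Minimal sketch (not part of the PR): turning off selected JDBC connection
// providers via the SparkConf entry quoted above. The provider names
// "db2" and "mssql" are assumed examples of registered CP names.
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object DisableJdbcConnectionProviders {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      // Comma-separated list of CP names to disable for this application.
      .set("spark.sql.sources.disabledJdbcConnProviderList", "db2,mssql")

    val spark = SparkSession.builder()
      .config(conf)
      .appName("disabled-jdbc-cp-example")
      .getOrCreate()

    // JDBC reads and writes issued from this session will skip the disabled CPs.
    spark.stop()
  }
}
```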

Review comment:
       It is both. I meant a DataFrameReader option here.
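
A minimal sketch of what setting that option on a read could look like. The option name `connectionProvider` and the CP name `basic` are assumptions (the quoted hunk is cut off before the README names the option), as are the connection URL and table.

```scala
// Minimal sketch (assumptions noted above): pinning a single JDBC connection
// provider for one read via a DataFrameReader option.
import org.apache.spark.sql.SparkSession

object EnforceJdbcConnectionProvider {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("enforce-jdbc-cp-example")
      .getOrCreate()

    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://localhost:5432/testdb") // assumed example URL
      .option("dbtable", "public.users")                        // assumed example table
      // Assumed option name: selects the CP by its unique name so that only this
      // provider is used even when several CPs could handle the driver and options.
      .option("connectionProvider", "basic")
      .load()

    df.show()
    spark.stop()
  }
}
```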




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


