Github user gatorsmile commented on a diff in the pull request:
https://github.com/apache/spark/pull/15292#discussion_r81685853
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCRDD.scala ---
@@ -46,17 +45,18 @@ object JDBCRDD extends Logging {
* Takes a (schema, table) specification and returns the table's Catalyst
* schema.
*
- * @param url - The JDBC url to fetch information from.
- * @param table - The table name of the desired table. This may also be a
- * SQL query wrapped in parentheses.
+ * @param options - JDBC options that contains url, table and other information.
*
* @return A StructType giving the table's Catalyst schema.
* @throws SQLException if the table specification is garbage.
* @throws SQLException if the table contains an unsupported type.
*/
- def resolveTable(url: String, table: String, properties: Properties): StructType = {
+ def resolveTable(options: JDBCOptions): StructType = {
+ val url = options.url
+ val table = options.table
+ val properties = options.asProperties
--- End diff ---
`url`/`dbtable` are Spark's reserved option keys. To keep the external behavior consistent, we should not change them.
In addition, we should not pass them to the underlying JDBC drivers; they should be consumed only by Spark. A consequence is that if an underlying JDBC driver defines a property with the same key, users will not be able to set it.
Let me know if you have any concerns about this.
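For illustration, here is a minimal sketch of that idea (the class name `SketchJDBCOptions` and the method `asConnectionProperties` are assumptions for this example, not the actual Spark API): the reserved keys are read by Spark itself, and only the remaining options are copied into the `Properties` object handed to the JDBC driver.

```scala
import java.util.Properties

// A minimal sketch, not the real org.apache.spark.sql JDBCOptions implementation.
class SketchJDBCOptions(parameters: Map[String, String]) {

  // Reserved keys are consumed by Spark itself and never forwarded to the driver.
  private val reservedKeys = Set("url", "dbtable")

  val url: String =
    parameters.getOrElse("url", throw new IllegalArgumentException("Option 'url' is required."))
  val table: String =
    parameters.getOrElse("dbtable", throw new IllegalArgumentException("Option 'dbtable' is required."))

  // Properties passed to the underlying JDBC driver: everything except the reserved keys.
  def asConnectionProperties: Properties = {
    val props = new Properties()
    parameters
      .filter { case (k, _) => !reservedKeys.contains(k.toLowerCase) }
      .foreach { case (k, v) => props.setProperty(k, v) }
    props
  }
}
```

Under this sketch, an option map such as `Map("url" -> ..., "dbtable" -> ..., "fetchsize" -> "1000")` would yield connection properties containing only `fetchsize`, while `url` and `dbtable` stay Spark-only.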