Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21590#discussion_r197620329
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCOptions.scala ---
    @@ -65,13 +65,38 @@ class JDBCOptions(
       // Required parameters
       // ------------------------------------------------------------
       require(parameters.isDefinedAt(JDBC_URL), s"Option '$JDBC_URL' is required.")
    -  require(parameters.isDefinedAt(JDBC_TABLE_NAME), s"Option '$JDBC_TABLE_NAME' is required.")
    +
       // a JDBC URL
       val url = parameters(JDBC_URL)
    -  // name of table
    -  val table = parameters(JDBC_TABLE_NAME)
    +  val tableName = parameters.get(JDBC_TABLE_NAME)
    +  val query = parameters.get(JDBC_QUERY_STRING)
    --- End diff ---
    
    Another option is to follow what we are doing in another PR: 
https://github.com/apache/spark/pull/21247. We are facing the same issue 
there: the options are shared by both the read and write paths, but the 
limitations differ between the two. 
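To make the trade-off concrete, here is a minimal sketch (not the actual Spark code; class and option names are illustrative) of what the diff above is moving toward: `dbtable` becomes optional, a `query` option is added, and a constructor-time check enforces that exactly one of the two is supplied. A write path built on top of this could additionally require `dbtable`, which is where the shared-options-but-different-limitations problem shows up.

```scala
// Hypothetical sketch of mutually exclusive JDBC options, assuming the
// option keys "dbtable" and "query" as in the PR under discussion.
class JdbcOptionsSketch(parameters: Map[String, String]) {
  private val JDBC_TABLE_NAME = "dbtable"
  private val JDBC_QUERY_STRING = "query"

  // Both options become optional individually ...
  val tableName: Option[String] = parameters.get(JDBC_TABLE_NAME)
  val query: Option[String] = parameters.get(JDBC_QUERY_STRING)

  // ... but exactly one of them must be present. A write-path subclass
  // could tighten this further and demand `dbtable` only, which is why
  // sharing one options class across read and write is awkward.
  require(tableName.isDefined != query.isDefined,
    s"Exactly one of '$JDBC_TABLE_NAME' or '$JDBC_QUERY_STRING' must be specified.")
}
```

Under this sketch, `new JdbcOptionsSketch(Map("dbtable" -> "people"))` and `new JdbcOptionsSketch(Map("query" -> "SELECT 1"))` both construct fine, while passing neither (or both) fails the `require` with an `IllegalArgumentException`.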


---
