Github user dilipbiswal commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21590#discussion_r197355161
  
    --- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCOptions.scala
 ---
    @@ -65,13 +65,38 @@ class JDBCOptions(
       // Required parameters
       // ------------------------------------------------------------
       require(parameters.isDefinedAt(JDBC_URL), s"Option '$JDBC_URL' is 
required.")
    -  require(parameters.isDefinedAt(JDBC_TABLE_NAME), s"Option 
'$JDBC_TABLE_NAME' is required.")
    +
       // a JDBC URL
       val url = parameters(JDBC_URL)
    -  // name of table
    -  val table = parameters(JDBC_TABLE_NAME)
    +  val tableName = parameters.get(JDBC_TABLE_NAME)
    +  val query = parameters.get(JDBC_QUERY_STRING)
    --- End diff --
    
    @maropu Thank you for taking the time to think this through thoroughly. A 
couple of questions/comments.
    1) It looks like on the read path we give precedence to `dbtable` over 
`query`. I feel it would be better to explicitly disallow this combination with 
a clear message, rather than resolve the ambiguity silently.
    2) The usage of `lazy` here (especially to trigger errors) makes me a 
little nervous. For example, if we wanted to introduce a debug statement that 
prints the variables inside the QueryOptions class, things would no longer 
work, right? That's the reason I had opted to check for the "invalid query 
option in write path" in the write function itself (i.e. when I am sure of the 
calling context). Perhaps that's how it's used everywhere, in which case it may 
be okay to follow the same approach here.
    
    I am okay with this. Let's get some opinions from @gatorsmile. Once I have 
the final set of comments, I will make the changes. Thanks again.
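    To illustrate point 1), here is a minimal sketch (not the actual Spark 
implementation; the object and method names are made up for this example) of 
what failing fast on an ambiguous `dbtable`/`query` combination could look 
like, instead of silently preferring one option over the other:

```scala
// Hypothetical standalone sketch of the mutual-exclusion check discussed
// above. Option keys mirror the PR; everything else is illustrative.
object JDBCOptionCheck {
  val JDBC_TABLE_NAME = "dbtable"
  val JDBC_QUERY_STRING = "query"

  // Resolves the relation to read from, requiring exactly one of
  // "dbtable" and "query" and raising a clear error otherwise.
  def resolveTableOrQuery(parameters: Map[String, String]): String = {
    val table = parameters.get(JDBC_TABLE_NAME)
    val query = parameters.get(JDBC_QUERY_STRING)
    (table, query) match {
      case (Some(_), Some(_)) =>
        throw new IllegalArgumentException(
          s"Both '$JDBC_TABLE_NAME' and '$JDBC_QUERY_STRING' are specified; " +
            "please specify only one of them.")
      case (None, None) =>
        throw new IllegalArgumentException(
          s"One of '$JDBC_TABLE_NAME' or '$JDBC_QUERY_STRING' is required.")
      case (Some(t), None) => t
      // Wrap the query as a derived table; the alias name is illustrative.
      case (None, Some(q)) => s"($q) spark_gen_subquery"
    }
  }
}
```

    Because the check runs eagerly at option-resolution time, it does not 
depend on a `lazy` val being forced, so adding debug printing of the fields 
would not change when the error surfaces.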


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
