Github user gengliangwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21590#discussion_r196657947
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCOptions.scala ---
    @@ -65,13 +65,38 @@ class JDBCOptions(
       // Required parameters
       // ------------------------------------------------------------
       require(parameters.isDefinedAt(JDBC_URL), s"Option '$JDBC_URL' is required.")
    -  require(parameters.isDefinedAt(JDBC_TABLE_NAME), s"Option '$JDBC_TABLE_NAME' is required.")
    +
       // a JDBC URL
       val url = parameters(JDBC_URL)
    -  // name of table
    -  val table = parameters(JDBC_TABLE_NAME)
    +  val tableName = parameters.get(JDBC_TABLE_NAME)
    +  val query = parameters.get(JDBC_QUERY_STRING)
    +  // Following two conditions make sure that :
    +  // 1. One of the option (dbtable or query) must be specified.
    +  // 2. Both of them can not be specified at the same time as they are conflicting in nature.
    +  require(
    +    tableName.isDefined || query.isDefined,
    +    s"Option '$JDBC_TABLE_NAME' or '${JDBC_QUERY_STRING}' is required."
    +  )
    +
    +  require(
    +    !(tableName.isDefined && query.isDefined),
    +    s"Both '$JDBC_TABLE_NAME' and '$JDBC_QUERY_STRING' can not be 
specified."
    +  )
    +
    +  // table name or a table expression.
    +  val tableExpression = tableName.map(_.trim).getOrElse {
    +    // We have ensured in the code above that either dbtable or query is specified.
    +    query.get match {
    +      case subq if subq.nonEmpty => s"(${subq}) spark_gen_${curId.getAndIncrement()}"
    +      case subq => subq
    +    }
    +  }
    +
    +  require(tableExpression.nonEmpty,
    --- End diff --
    
    The error check and error message here are confusing: the message reads as if the two options could both be specified, which the earlier `require` already rules out.
    Maybe we should check only whichever option is actually defined and improve the error message accordingly, as in the sketch below.
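    
    For example, a rough sketch of that approach (the error wording here is just a suggestion; `tableName` and `query` are the options parsed in the diff above):
    
        (tableName, query) match {
          // The two requires above guarantee exactly one option is defined,
          // so report an empty value only against the option the user set.
          case (Some(name), _) =>
            require(name.trim.nonEmpty, s"Option '$JDBC_TABLE_NAME' can not be empty.")
          case (_, Some(subquery)) =>
            require(subquery.trim.nonEmpty, s"Option '$JDBC_QUERY_STRING' can not be empty.")
          case _ => // unreachable: one of the two options must be defined
        }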
    