Github user CK50 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/10380#discussion_r48049616
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala ---
    @@ -62,15 +62,12 @@ object JdbcUtils extends Logging {
       /**
        * Returns a PreparedStatement that inserts a row into table via conn.
        */
    -  def insertStatement(conn: Connection, table: String, rddSchema: StructType): PreparedStatement = {
    -    val sql = new StringBuilder(s"INSERT INTO $table VALUES (")
    -    var fieldsLeft = rddSchema.fields.length
    -    while (fieldsLeft > 0) {
    -      sql.append("?")
    -      if (fieldsLeft > 1) sql.append(", ") else sql.append(")")
    -      fieldsLeft = fieldsLeft - 1
    -    }
    -    conn.prepareStatement(sql.toString())
    +  def insertStatement(conn: Connection,
    --- End diff ---
    
    @hvanhovell 
    It works fine on Oracle and on Cassandra (using the Progress JDBC driver for Cassandra).
    Regarding other RDBMS: I was surprised that the column-free INSERT syntax is supported by so many databases. In my experience with various RDBMS, the syntax with explicit column names is far more common, but I have not tested on anything other than Oracle and Cassandra.
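    
    For illustration, here is a minimal sketch of what a column-qualified variant could look like. The name insertStatementWithColumns is hypothetical, it assumes the field names in rddSchema match the target table's columns, and it is not necessarily the exact code in this pull request:
    
        import java.sql.{Connection, PreparedStatement}
        import org.apache.spark.sql.types.StructType
    
        // Hypothetical sketch: build the INSERT with explicit column names
        // taken from rddSchema, assuming they match the target table's columns.
        def insertStatementWithColumns(
            conn: Connection,
            table: String,
            rddSchema: StructType): PreparedStatement = {
          val columns = rddSchema.fields.map(_.name).mkString(", ")
          val placeholders = rddSchema.fields.map(_ => "?").mkString(", ")
          conn.prepareStatement(s"INSERT INTO $table ($columns) VALUES ($placeholders)")
        }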

