Paul Wu created SPARK-21740:
-------------------------------

             Summary: DataFrame.write does not work with Phoenix JDBC Driver
                 Key: SPARK-21740
                 URL: https://issues.apache.org/jira/browse/SPARK-21740
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.2.0, 2.0.0
            Reporter: Paul Wu


The reason is that the Phoenix JDBC driver does not support "INSERT"; Phoenix only accepts "UPSERT". Running the program below fails with the following exception:
{code}
17/08/15 12:18:53 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 1)
org.apache.phoenix.exception.PhoenixParserException: ERROR 601 (42P00): Syntax error. Encountered "INSERT" at line 1, column 1.
        at org.apache.phoenix.exception.PhoenixParserException.newException(PhoenixParserException.java:33)
{code}

{code:java}
import java.util.Properties;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class HbaseJDBCSpark {

    private static final SparkSession sparkSession
            = SparkSession.builder()
                    .config("spark.sql.warehouse.dir", "file:///temp")
                    .config("spark.driver.memory", "5g")
                    .master("local[*]").appName("Spark2JdbcDs").getOrCreate();

    static final String JDBC_URL
            = "jdbc:phoenix:somehost:2181:/hbase-unsecure";

    public static void main(String[] args) {
        final Properties connectionProperties = new Properties();

        // Reading through the JDBC data source works as expected.
        Dataset<Row> jdbcDF
                = sparkSession.read()
                        .jdbc(JDBC_URL, "javatest", connectionProperties);

        jdbcDF.show();

        Properties p = new Properties();
        p.put("driver", "org.apache.phoenix.jdbc.PhoenixDriver");
        //p.put("batchsize", "100000");

        // The write fails here: Spark generates INSERT statements,
        // which the Phoenix parser rejects.
        jdbcDF.write().mode(SaveMode.Append).jdbc(JDBC_URL, "javatest", p);
        sparkSession.close();
    }
}
{code}
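One possible workaround (a sketch, not part of the original report) is to write through the phoenix-spark connector instead of the generic JDBC data source; it goes through Phoenix's own API and issues UPSERTs natively. Note that the connector only accepts SaveMode.Overwrite, although rows are still upserted under Phoenix semantics:

{code:java}
// Workaround sketch, assuming the phoenix-spark artifact is on the classpath.
// The table name and ZooKeeper quorum mirror the example above.
jdbcDF.write()
        .format("org.apache.phoenix.spark")
        .mode(SaveMode.Overwrite)  // phoenix-spark accepts only Overwrite
        .option("table", "javatest")
        .option("zkUrl", "somehost:2181:/hbase-unsecure")
        .save();
{code}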


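Alternatively, one can bypass Spark's JDBC writer and issue the UPSERT statements by hand from each partition. A minimal sketch follows; the (ID, NAME) column list is a hypothetical schema for illustration:

{code:java}
// Manual UPSERT sketch. Requires java.sql.Connection, java.sql.DriverManager,
// java.sql.PreparedStatement, and
// org.apache.spark.api.java.function.ForeachPartitionFunction.
jdbcDF.foreachPartition((ForeachPartitionFunction<Row>) rows -> {
    try (Connection conn = DriverManager.getConnection(JDBC_URL);
         PreparedStatement ps = conn.prepareStatement(
                 "UPSERT INTO javatest (ID, NAME) VALUES (?, ?)")) {
        while (rows.hasNext()) {
            Row row = rows.next();
            ps.setLong(1, row.getLong(0));
            ps.setString(2, row.getString(1));
            ps.executeUpdate();
        }
        conn.commit(); // Phoenix connections default to autocommit off
    }
});
{code}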

