Hi everyone,

I’m currently using SparkR to read data from a MySQL database, perform some 
calculations, and then write the results back to MySQL. Is it still true that 
Spark does not support UPDATE queries via JDBC? Many posts indicate that 
Spark’s DataFrameWriter does not support UPDATE via 
JDBC<https://issues.apache.org/jira/browse/SPARK-19335>; it can only “append” 
to or “overwrite” existing tables. The best advice I’ve found so far is to 
write the results to a staging table in 
MySQL<https://stackoverflow.com/questions/34643200/spark-dataframes-upsert-to-postgres-table>
 and then run the UPDATE query on the MySQL side.
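For context, here is a rough sketch of that staging-table approach as I understand it. The JDBC URL, credentials, and table/column names are just placeholders, and it assumes the MySQL Connector/J jar is on Spark’s classpath:

  library(SparkR)
  sparkR.session()

  # Placeholder connection details
  url <- "jdbc:mysql://dbhost:3306/mydb"

  # Read the source table and do the calculations (placeholder names)
  df <- read.jdbc(url, "source_table", user = "me", password = "secret")
  # ... transformations on df ...

  # Overwrite a staging table with the results
  write.jdbc(df, url, "results_staging", mode = "overwrite",
             user = "me", password = "secret")

  # Then, on the MySQL side, apply the update from the staging table, e.g.:
  #   UPDATE target t
  #   JOIN results_staging s ON t.id = s.id
  #   SET t.value = s.value;

This works, but it adds an extra table and a second step that has to run outside Spark, which is what I’d like to avoid.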

Ideally, I’d like to handle the update as part of the write operation itself. 
Has anyone else run into this limitation and found a better solution?

Thank you,

Jake
