Re: Incremental Updates and custom SQL via JDBC

2016-08-25 Thread Mich Talebzadeh
As far as I can tell, Spark does not support updates to ORC tables. This is because Spark would need to send heartbeats to the Hive metastore and maintain them throughout the DML transaction (deletes and updates here), and that is not implemented. By the same token, if you have performed DML on an ORC table in
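Since Spark (as of the 1.6/2.0 era discussed here) cannot run UPDATE/DELETE against a transactional ORC table, a common workaround is to rewrite the data instead: read the table, filter or transform away the rows you would have deleted, and write the result to a new table. A minimal sketch, assuming a `sqlContext` is in scope and hypothetical table/column names (`db.orc_table`, `event_date`):

```scala
// Sketch only: emulate DELETE on an ORC table by rewriting it.
// Reading and overwriting the SAME table in one job is unsafe,
// so write to a staging table and swap it in afterwards (e.g. via Hive).
val df = sqlContext.table("db.orc_table")

// Keep only the rows that would survive the DELETE.
val remaining = df.filter("event_date >= '2016-01-01'")

remaining.write
  .mode("overwrite")
  .format("orc")
  .saveAsTable("db.orc_table_staged")
```

Actual transactional DML (ACID deletes/updates) would have to be issued through Hive itself (e.g. beeline against HiveServer2), not through Spark SQL.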

Re: Incremental Updates and custom SQL via JDBC

2016-08-24 Thread Sascha Schmied
Thank you for your answer. I'm using an ORC transactional table right now, but I'm not stuck with that. When I send an SQL statement like the following, where old_5sek_agg and new_5sek_agg are registered temp tables, I get an exception in Spark. Same without the subselect. sqlContext.sql("DELETE
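The exception is expected: Spark SQL of that era has no DELETE statement for registered temp tables. The effect of a DELETE with a subselect can be emulated by selecting the complement and re-registering the view. A minimal sketch, assuming a `sqlContext` and a hypothetical join column `key` (the real column names are not shown in the truncated message):

```scala
// Sketch only: emulate
//   DELETE FROM old_5sek_agg WHERE key IN (SELECT key FROM new_5sek_agg)
// by keeping the rows that would NOT be deleted.
val remaining = sqlContext.sql(
  """SELECT * FROM old_5sek_agg
     WHERE key NOT IN (SELECT key FROM new_5sek_agg)""")

// Re-register under the same name so later queries see the "deleted" state.
remaining.registerTempTable("old_5sek_agg")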

Re: Incremental Updates and custom SQL via JDBC

2016-08-24 Thread Mich Talebzadeh
…incremental updates as described before and maybe also > want to send specific CREATE TABLE syntax for columnar store and time > table. > > Thank you very much in advance. I'm a little stuck on this one. > > Regards > Sascha > > > > -- > View this message in context: http:

Incremental Updates and custom SQL via JDBC

2016-08-24 Thread Oldskoola
/Incremental-Updates-and-custom-SQL-via-JDBC-tp27598.html Sent from the Apache Spark User List mailing list archive at Nabble.com.