Hi list,

With the Spark DataFrame API we can save a DataFrame into a
database table through the insertIntoJDBC() call. However, I could not find
any info about what transactional guarantees it provides. What if my program
gets killed during the processing? Would it end up as a partial load?
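
For reference, this is roughly what my code looks like. It's a minimal
sketch; the JDBC URL, input path, and table name are placeholders:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.DataFrame;
    import org.apache.spark.sql.SQLContext;

    public class JdbcInsertExample {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("JdbcInsertExample");
            JavaSparkContext sc = new JavaSparkContext(conf);
            SQLContext sqlContext = new SQLContext(sc);

            // Hypothetical source: a JSON file whose schema matches the
            // target table.
            DataFrame df = sqlContext.jsonFile("hdfs:///data/events.json");

            // insertIntoJDBC(url, table, overwrite): appends (or overwrites)
            // rows of df into an existing table over a JDBC connection.
            df.insertIntoJDBC(
                "jdbc:mysql://dbhost:3306/mydb?user=me&password=secret",
                "events", false);
        }
    }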

Is it somehow possible to handle these kinds of scenarios? Rollback or
something of that sort?
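
To frame what I mean, here is a sketch of the kind of manual handling I
have in mind: collect the DataFrame to the driver and write it inside a
single JDBC transaction, so a failure rolls everything back. The connection
string, table, and schema are hypothetical, and this is obviously only
viable when the data fits on the driver:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.util.List;

    import org.apache.spark.sql.DataFrame;
    import org.apache.spark.sql.Row;

    public class ManualTxLoad {
        public static void load(DataFrame df) throws SQLException {
            // Pull all rows to the driver (small data only).
            List<Row> rows = df.collectAsList();
            Connection conn = DriverManager.getConnection(
                "jdbc:mysql://dbhost:3306/mydb?user=me&password=secret");
            try {
                conn.setAutoCommit(false);   // open a single transaction
                PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO events VALUES (?, ?)");
                for (Row row : rows) {
                    ps.setString(1, row.getString(0));
                    ps.setLong(2, row.getLong(1));
                    ps.addBatch();
                }
                ps.executeBatch();
                conn.commit();               // all rows become visible atomically
            } catch (SQLException e) {
                conn.rollback();             // undo the partial load on failure
                throw e;
            } finally {
                conn.close();
            }
        }
    }

But that throws away the distributed write, so I'm hoping there is a
better way.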

Many thanks.

P.S.: I am using spark-1.3.1-bin-hadoop2.4 with Java 1.7

Tariq, Mohammad
about.me/mti
