Hi Tariq,
You need to handle the transaction semantics yourself. You could, for
example, save the DataFrame to a staging table and then populate the
final table with a single atomic "INSERT INTO finalTable SELECT * FROM
stagingTable" statement. Remember to clear the staging table first so
you can recover from a previous failed run.
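Something along these lines (an untested sketch; it assumes Spark 1.3's
DataFrame.insertIntoJDBC, a JDBC driver on the classpath, and a database
that executes a single INSERT ... SELECT atomically; jdbcUrl and df are
placeholders):

import java.sql.DriverManager

val jdbcUrl = "jdbc:postgresql://dbhost:5432/mydb"  // placeholder URL

// 1. Clear the staging table so stale rows from a crashed previous
//    run cannot leak into the final table.
val setup = DriverManager.getConnection(jdbcUrl)
try setup.createStatement().executeUpdate("DELETE FROM stagingTable")
finally setup.close()

// 2. Load into staging. This write is not atomic, but a partial
//    result here is harmless: step 1 wipes it on the next attempt.
df.insertIntoJDBC(jdbcUrl, "stagingTable", false)

// 3. Promote staging to final in one statement. A single
//    INSERT ... SELECT runs as one transaction on most databases,
//    so readers of finalTable never see a partial load.
val promote = DriverManager.getConnection(jdbcUrl)
try promote.createStatement().executeUpdate(
  "INSERT INTO finalTable SELECT * FROM stagingTable")
finally promote.close()

If your database supports it, TRUNCATE is usually faster than DELETE
for clearing the staging table.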
Hi list,
With the help of the Spark DataFrame API we can save a DataFrame into a
database table through the insertIntoJDBC() call. However, I could not
find any information about what transactional guarantees it provides.
What if my program gets killed during processing? Would it end up as a
partial load?
Is there any way to make the write atomic?
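For context, the call in question looks like this in Spark 1.3 (the URL
and table name are placeholders):

// Appends the DataFrame's rows to an existing JDBC table; passing
// true instead would clear the table's existing rows before inserting.
df.insertIntoJDBC("jdbc:postgresql://dbhost:5432/mydb", "finalTable", false)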