Hi

I'm running a Spark job that appends new data to a Parquet file.
At the end, I write a log entry to my DynamoDB table recording the number
of records appended, the time, etc. Instead of a single entry in the
database, multiple entries are being written. Is this because the code
executes in parallel on the workers? If so, how can I make sure the log
entry is written only once?
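To make the symptom concrete, here is a plain-Python stand-in (no Spark, and `log_to_dynamodb` is a placeholder, not my real code) contrasting a logging call that runs inside per-partition worker code with one made once from the driver after aggregating the count:

```python
def log_to_dynamodb(record_count, log):
    # Stand-in for the real DynamoDB put; we just record each
    # invocation so the number of writes is visible.
    log.append(record_count)

partitions = [[1, 2], [3, 4], [5]]  # toy stand-in for a partitioned dataset

# Suspected behavior: the call sits in code Spark ships to every
# partition, so it fires once per partition.
worker_log = []
for part in partitions:  # Spark would run this body on each worker
    log_to_dynamodb(len(part), worker_log)

# Desired behavior: aggregate on the driver, then write one entry.
driver_log = []
total = sum(len(part) for part in partitions)
log_to_dynamodb(total, driver_log)

print(len(worker_log))  # one entry per partition
print(len(driver_log))  # a single entry
```

Is moving the DynamoDB call to the driver, after the write action completes, the right way to guarantee a single entry?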

*Thanks!*

*Cheers!*

Harsh Choudhary
