What are the plans to support savepoint state manipulation with batch jobs
natively in core Flink?

I've tried using the bravo tool [1]. It's pretty good at reading
savepoints, but the writing side seems hacky. For example, I wonder what
exactly happens in the following lines:

val newOpState = writer.writeAll()
val newSavepoint = StateMetadataUtils.createNewSavepoint(savepoint, newOpState)
StateMetadataUtils.writeSavepointMetadata(savepointDir, newSavepoint)

Does it actually wait for the batch job to finish all its tasks and only
then write the metadata file? I'm asking because this code didn't execute
at all when I tried to run it in a Kubernetes environment with a
standalone-job.sh setup (i.e. the _metadata file was never created).
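My suspicion is that writeAll() only *defines* the write lazily, and that nothing in my setup ever triggered the batch job, so the metadata step ran against state that was never materialized. In plain Scala terms (no Flink, just an analogy I'm assuming holds; I haven't verified this against bravo's internals):

```scala
// Plain-Scala sketch: a lazily defined pipeline performs no work until
// something forces it. If writeAll() behaves like `pipeline` below, then
// without an explicit trigger no state (and no _metadata) gets written.
object LazySketch {
  def run(): (Int, Int) = {
    var writes = 0
    // Defining the pipeline performs no writes yet
    val pipeline = Iterator(1, 2, 3).map { x => writes += 1; x }
    val writesAfterDefinition = writes        // still 0: nothing forced
    pipeline.toList                           // forcing it runs the "writes"
    (writesAfterDefinition, writes)
  }

  def main(args: Array[String]): Unit = {
    val (before, after) = run()
    println(s"writes after definition: $before, after forcing: $after")
  }
}
```

If that analogy is right, the fix would be making sure the job is actually executed before writeSavepointMetadata runs, but I'd like to confirm whether that's the intended usage.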


[1] https://github.com/king/bravo
