Is it possible to take advantage of the already processed portion of a failed
task, so that speculative execution reassigns only the remaining work to
another node? If so, how can I read the partial result from memory?
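For context, speculative execution in Spark is toggled through a handful of configuration properties. The sketch below shows the relevant keys as they would appear in a `spark-defaults.conf`; the values are illustrative (they happen to match the documented defaults), not a tuning recommendation, and nothing here resumes a task from partial state — a speculative copy re-runs the whole task:

```
# Enable speculative re-execution of slow-running tasks
spark.speculation              true
# How often Spark checks for tasks eligible for speculation
spark.speculation.interval     100ms
# A task must be this many times slower than the median to be speculated
spark.speculation.multiplier   1.5
# Fraction of tasks in a stage that must finish before speculation starts
spark.speculation.quantile     0.75
```

The same keys can also be set programmatically via `SparkConf.set(...)` or with `--conf` on `spark-submit`.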

Sent from the Apache Spark User List mailing list archive.