Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/9610#discussion_r44496929
--- Diff: core/src/main/scala/org/apache/spark/shuffle/IndexShuffleBlockResolver.scala ---
@@ -93,6 +95,10 @@ private[spark] class IndexShuffleBlockResolver(conf: SparkConf) extends ShuffleB
    } {
      out.close()
    }
+    indexFile.deleteOnExit()
+    if (!tmp.renameTo(indexFile)) {
+      throw new IOException(s"fail to rename index file $tmp to $indexFile")
--- End diff ---
this will just kill the task, right? Both tasks are actually fine, and the overall job should continue as long as one of them succeeds. Instead, this exception will get the task retried, potentially failing up to 4 times, even though it has actually finished successfully in another taskset. You could handle this in the scheduler, but that would add some complexity.
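One alternative to handling it in the scheduler would be to soften the failure at the resolver level. Below is a minimal, hypothetical sketch (not the PR's code; `commitIndexFile` is an invented helper) that treats a failed rename as success when a concurrent attempt has already committed an index file of the same size, instead of throwing and killing an otherwise-successful task:

```scala
import java.io.{File, IOException}

// Hypothetical sketch: commit a temp index file into place, tolerating the
// case where another task attempt already committed an equivalent file.
def commitIndexFile(tmp: File, indexFile: File): Unit = {
  if (!tmp.renameTo(indexFile)) {
    // The rename can fail if a concurrent attempt won the race. If the
    // destination already exists with the same length, assume it holds the
    // same index data, discard our temp file, and report success.
    if (indexFile.exists() && indexFile.length() == tmp.length()) {
      tmp.delete()
    } else {
      throw new IOException(s"fail to rename index file $tmp to $indexFile")
    }
  }
}
```

With this shape, the losing attempt's task completes normally and the scheduler never sees a spurious failure; the cost is the assumption that two attempts for the same map output produce identical index files, which would need to be verified before relying on a length check alone.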