Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/7572#issuecomment-123476011
Hi all,
IMO - this patch is far too invasive to be considered for a backport. Of
course, it is always a judgement call, but here is how I think about it:
1. The DAG scheduler is by far the most brittle component in Spark in terms
of fixes having unintended consequences. Creating a new bug in a maintenance
release is the absolute worst thing we can do in terms of eroding confidence
in our release process, so I take a very risk-averse approach to backports.
This is also a very central component; every user workload will be affected
if there is an issue we didn't anticipate.
2. This bug is super annoying/confusing, no argument there, but it's a
long-standing bug, not a regression from earlier versions of Spark.
So I think the cost/benefit here just isn't in favor of doing it.