GitHub user markhamstra commented on the pull request:
https://github.com/apache/spark/pull/5396#issuecomment-90743681
I guess I just don't see things the same way. I don't have a problem with
the DAGScheduler being implemented with scoping idioms that may be unfamiliar
to a large number of coders. The DAGScheduler is about as far from being a
public API in Spark as you can get, and I don't really think we want to
sacrifice the additional safety that nested scopes provide within the
DAGScheduler just to make it easy for more developers to make changes to that
part of Spark's code. If you want an even more contrary point of view: a high
bar of expected familiarity with certain idioms can actually serve us well,
restricting who makes changes to the DAGScheduler and ensuring that they
maintain its existing style and concerns.
Quite simply, I don't find a function declared within the scope of another
function, for example, to be at all difficult to read -- it's just a common
idiom across several programming languages with which I am familiar. On the
other hand, flattening out those carefully nested scopes actually makes it
harder for me to read and reason about where, and under what preconditions, a
previously nested function can and should be used.
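To make that concrete, here is a minimal sketch of the pattern being defended. The names (`Stage`, `submitStage`, `visit`) are hypothetical stand-ins chosen for illustration, not the actual DAGScheduler code: the helper is declared inside the only method that may legally call it, so its precondition (a live `visited` set for the current traversal) cannot be violated from anywhere else.

```scala
object NestedScopeSketch {

  // Hypothetical stand-in for a scheduler stage with parent dependencies.
  final case class Stage(id: Int, parents: Seq[Stage] = Nil)

  def submitStage(stage: Stage): Unit = {
    // Traversal state that is only meaningful for the duration of this call.
    val visited = scala.collection.mutable.Set.empty[Int]

    // Because `visit` is nested here, it closes over `visited` directly,
    // and no other code path can ever call it with stale or missing state.
    def visit(s: Stage): Unit = {
      if (!visited.contains(s.id)) {
        visited += s.id
        s.parents.foreach(visit)   // handle parents before the stage itself
        println(s"submitting stage ${s.id}")
      }
    }

    visit(stage)
  }

  def main(args: Array[String]): Unit = {
    submitStage(Stage(2, Seq(Stage(0), Stage(1))))
  }
}
```

Flattening `visit` into a top-level private method would force `visited` to be passed explicitly or held as shared state, widening the set of call sites that have to get that precondition right.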
I'm not saying that I won't consider any refactoring of the DAGScheduler or
the flattening of any scopes; but for me not to veto them, any such changes
will require a lot more justification than simply that they produce code that
is more readable for a larger audience of developers.