Github user willb commented on the pull request:
https://github.com/apache/spark/pull/717#issuecomment-43000971
@rxin, the latest commit adds such a test.
I'll note that the code I have only finds top-level `return` statements, so
it won't spuriously reject code like that example, but it also won't reject
code like the following, which is an invalid use of `return`:
```scala
nums.map { x =>
  // this return is invalid since it will transfer control outside the closure
  val foo = { y: Int => return 2; 1 }
  foo(x)
}
```
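By contrast, a top-level `return` like the one below is the common case the
check is aimed at (a purely illustrative snippet, not one of the PR's tests):

```scala
// A top-level `return` in the closure body: if any element is negative, it
// tries to transfer control out of the enclosing method from inside the
// closure. This is the shape of closure the check is intended to flag.
def sumOrFail(nums: Seq[Int]): Int = {
  nums.foreach { x =>
    if (x < 0) return -1
  }
  nums.sum
}
```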
I thought identifying top-level `return` statements in closures struck an
acceptable balance between complexity and user-friendliness: it provides
better feedback in the common case without trying to exhaustively flag every
bogus closure, no matter how pathological. If that isn't the right tradeoff,
I could adapt the code to reject a higher percentage of closures with
nonlocal returns.
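
To give a sense of how a top-level-only check can work, here is a simplified,
hypothetical sketch using an ASM class visitor (the names are illustrative and
this is not the exact code in the PR). It leans on the fact that scalac
compiles a `return` inside a closure to a throw of
`scala.runtime.NonLocalReturnControl`, and that nested closures become
separate classes, so scanning only the outer closure's `apply` methods sees
only top-level returns:

```scala
import org.objectweb.asm.{ClassReader, ClassVisitor, MethodVisitor}
import org.objectweb.asm.Opcodes._

// Hypothetical sketch: report whether a closure class's own apply methods
// allocate scala.runtime.NonLocalReturnControl, which scalac emits for a
// `return` inside a closure. Returns in nested closures live in separate
// classes and so are not seen here.
class TopLevelReturnFinder extends ClassVisitor(ASM4) {
  var foundReturn = false

  override def visitMethod(
      access: Int,
      name: String,
      desc: String,
      sig: String,
      exceptions: Array[String]): MethodVisitor = {
    if (name.startsWith("apply")) {
      new MethodVisitor(ASM4) {
        override def visitTypeInsn(op: Int, tpe: String): Unit = {
          if (op == NEW && tpe.contains("NonLocalReturnControl")) {
            foundReturn = true
          }
        }
      }
    } else {
      null // ignore methods other than apply
    }
  }
}

object TopLevelReturnFinder {
  // True if the closure object's class contains a top-level `return`.
  def hasTopLevelReturn(closure: AnyRef): Boolean = {
    val resource = closure.getClass.getName.replace('.', '/') + ".class"
    val in = closure.getClass.getClassLoader.getResourceAsStream(resource)
    val finder = new TopLevelReturnFinder
    try {
      new ClassReader(in).accept(finder, 0)
    } finally {
      in.close()
    }
    finder.foundReturn
  }
}
```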