Github user sryza commented on the pull request:
https://github.com/apache/incubator-spark/pull/553#issuecomment-34830852
It's true that you may need it when running in yarn-client mode, and also
true that you won't when running spark-shell. Since it depends on the case, I
think not requiring it is easiest. We could try to figure out whether
spark-shell is being run and print an error in that case, but that sounds like
overkill to me. What do you think? The doc covers both cases and sets it in
the example that doesn't use spark-shell.