Github user mateiz commented on the pull request:
https://github.com/apache/spark/pull/294#issuecomment-39388854
I see. In that case, I think we can do the following:
* Leave IPYTHON_OPTS in as a way to pass options to IPython. Otherwise the
IPython Notebook won't work, and neither will flags such as `--pylab`.
* Add back the argument-count-equals-zero check you added.
* In later versions of IPython that fix its startup bug, we can remove that
check and let you run a script through IPython too. I suppose you could also do
`IPYTHON_OPTS="myscript.py" bin/pyspark`.
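A minimal sketch of the check described above, as it might look in the launcher script. The function name and structure here are assumptions for illustration, not the actual contents of `bin/pyspark`:

```shell
# Hypothetical helper: decide which frontend to launch.
# Assumed inputs (not actual bin/pyspark code): $1 = whether IPYTHON is
# requested (1 or 0), $2 = number of script arguments passed to pyspark.
decide_frontend() {
  if [ "$1" = "1" ] && [ "$2" -eq 0 ]; then
    # Interactive use with no script arguments: safe to use IPython,
    # passing IPYTHON_OPTS through to it.
    echo "ipython"
  else
    # A script was given (or IPython not requested): fall back to plain
    # Python until the IPython startup bug is fixed.
    echo "python"
  fi
}

decide_frontend 1 0   # prints "ipython"
decide_frontend 1 1   # prints "python"
```

The zero-argument check is what keeps script runs on plain Python while still letting `IPYTHON_OPTS` reach interactive IPython sessions.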