Let me do it now. I appreciate Spark's clear, easy-to-understand
documentation!

The updated command will be:

PYSPARK_DRIVER_PYTHON=ipython PYSPARK_DRIVER_PYTHON_OPTS="notebook"
./bin/pyspark

When the IPython notebook server is launched, you can create a new
"Python 2" notebook from the "Files" tab. Inside the notebook, run the
'%pylab inline' magic as part of your notebook before you start to
try Spark from the IPython notebook.
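To make the steps above concrete, here is a minimal sketch of a first
notebook cell. It assumes the notebook was launched through ./bin/pyspark
as shown, so a SparkContext is already bound to the name `sc`; the fallback
branch (plain Python, no Spark) is only there so the sketch also runs
outside such a notebook.

```python
# Sketch of a first cell in a pyspark-launched IPython notebook.
# Run the magic below in the notebook itself (it replaces "--pylab inline"):
#   %pylab inline      (or the narrower %matplotlib inline)

try:
    # In a notebook started via ./bin/pyspark, `sc` is predefined.
    rdd = sc.parallelize(range(100))
    squares = rdd.map(lambda x: x * x).collect()
except NameError:
    # Fallback when no SparkContext is available (plain Python session).
    squares = [x * x for x in range(100)]

print(squares[:5])  # [0, 1, 4, 9, 16]
```

With `%pylab inline` active, a follow-up call such as `hist(squares)`
renders the plot directly in the notebook output cell.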


Cheers.
Cong

2015-03-20 16:14 GMT-07:00 Matei Zaharia <matei.zaha...@gmail.com>:
> Feel free to send a pull request to fix the doc (or say which versions it's
> needed in).
>
> Matei
>
> On Mar 20, 2015, at 6:49 PM, Krishna Sankar <ksanka...@gmail.com> wrote:
>
> Yep the command-option is gone. No big deal, just add the '%pylab inline'
> command as part of your notebook.
> Cheers
> <k/>
>
> On Fri, Mar 20, 2015 at 3:45 PM, cong yue <yuecong1...@gmail.com> wrote:
>>
>> Hello :
>>
>> I tried the IPython notebook with the following command in my environment.
>>
>> PYSPARK_DRIVER_PYTHON=ipython PYSPARK_DRIVER_PYTHON_OPTS="notebook
>> --pylab inline" ./bin/pyspark
>>
>> But it shows that "--pylab inline" support has been removed in the
>> newest IPython version.
>> The log is:
>> ---
>> $ PYSPARK_DRIVER_PYTHON=ipython PYSPARK_DRIVER_PYTHON_OPTS="notebook
>> --pylab inline" ./bin/pyspark
>> [E 15:29:43.076 NotebookApp] Support for specifying --pylab on the
>> command line has been removed.
>> [E 15:29:43.077 NotebookApp] Please use `%pylab inline` or
>> `%matplotlib inline` in the notebook itself.
>> --
>> I am using IPython 3.0.0, and only IPython works in my environment.
>> --
>> $ PYSPARK_DRIVER_PYTHON=ipython PYSPARK_DRIVER_PYTHON_OPTS="notebook
>> --pylab inline" ./bin/pyspark
>> --
>>
>> Does somebody have the same issue as mine? How do you solve it?
>>
>> Thanks,
>> Cong
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>
>
