Re: "IPython is available, use IPython for PySparkInterpreter"
I think that's a good point - perhaps this shouldn't be a warning.

From: Ruslan Dautkhanov <dautkha...@gmail.com>
Sent: Monday, March 19, 2018 11:10:48 AM
To: users
Subject: "IPython is available, use IPython for PySparkInterpreter"

We're getting the "IPython is available, use IPython for PySparkInterpreter" warning each time we start %pyspark notebooks.

Although there is no difference between %pyspark and %ipyspark afaik. At least we can use all ipython magic commands etc. (maybe because we have zeppelin.pyspark.useIPython=true?)

If that's the case, how can we disable the "IPython is available, use IPython for PySparkInterpreter" warning?

--
Ruslan Dautkhanov
Re: "IPython is available, use IPython for PySparkInterpreter"
I already filed an issue: see https://issues.apache.org/jira/browse/ZEPPELIN-3290
Jeff Zhang wanted to wait for other users' feedback there.

On 2018/03/19 18:10:48, Ruslan Dautkhanov <dautkha...@gmail.com> wrote:
> We're getting the "IPython is available, use IPython for PySparkInterpreter"
> warning each time we start %pyspark notebooks.
>
> Although there is no difference between %pyspark and %ipyspark afaik.
> At least we can use all ipython magic commands etc.
> (maybe because we have zeppelin.pyspark.useIPython=true?)
>
> If that's the case, how can we disable the "IPython is available, use IPython
> for PySparkInterpreter" warning?
>
> --
> Ruslan Dautkhanov
"IPython is available, use IPython for PySparkInterpreter"
We're getting the "IPython is available, use IPython for PySparkInterpreter" warning each time we start %pyspark notebooks.

Although there is no difference between %pyspark and %ipyspark afaik. At least we can use all ipython magic commands etc. (maybe because we have zeppelin.pyspark.useIPython=true?)

If that's the case, how can we disable the "IPython is available, use IPython for PySparkInterpreter" warning?

--
Ruslan Dautkhanov
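For context, the property mentioned above is set in the %spark interpreter settings. A sketch of what the relevant interpreter.json fragment might look like (the exact surrounding structure varies by Zeppelin version, so treat this as illustrative only):

```
{
  "properties": {
    "zeppelin.pyspark.useIPython": "true"
  }
}
```

With this set to true, %pyspark already runs through IPython, which is presumably why the poster sees no difference between %pyspark and %ipyspark.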
Re: IPython is available, use IPython for PySparkInterpreter
Makes sense. Thank you Jeff!

On Sun, Dec 10, 2017 at 11:24 PM Jeff Zhang <zjf...@gmail.com> wrote:
>
> I am afraid there is currently no way to make ipython the default for
> %pyspark, but you can use %ipyspark to use ipython without this warning
> message.
>
> Making ipython the default is on my plan. For now I try to keep backward
> compatibility as much as possible, so ipython is only used when it is
> available; otherwise the old python interpreter implementation is still
> used. I will make ipython the default, with the original python
> implementation as fallback, once the ipython interpreter becomes more
> mature.
>
> Ruslan Dautkhanov <dautkha...@gmail.com> wrote on Mon, Dec 11, 2017 at 1:20 PM:
>
>> Getting the "IPython is available, use IPython for PySparkInterpreter"
>> warning after starting the pyspark interpreter.
>>
>> How do I default %pyspark to ipython?
>>
>> Tried to change
>>     "class": "org.apache.zeppelin.spark.PySparkInterpreter",
>> to
>>     "class": "org.apache.zeppelin.spark.IPySparkInterpreter",
>> in interpreter.json, but this gets overwritten back to PySparkInterpreter.
>>
>> Also tried to change zeppelin.pyspark.python to ipython, with no luck
>> either.
>>
>> Is there a documented way to default the pyspark interpreter to ipython?
>> Glanced over PR-2474 but can't quickly see what I am missing.
>>
>> Thanks.
Re: IPython is available, use IPython for PySparkInterpreter
I am afraid there is currently no way to make ipython the default for %pyspark, but you can use %ipyspark to use ipython without this warning message.

Making ipython the default is on my plan. For now I try to keep backward compatibility as much as possible, so ipython is only used when it is available; otherwise the old python interpreter implementation is still used. I will make ipython the default, with the original python implementation as fallback, once the ipython interpreter becomes more mature.

Ruslan Dautkhanov <dautkha...@gmail.com> wrote on Mon, Dec 11, 2017 at 1:20 PM:
> Getting the "IPython is available, use IPython for PySparkInterpreter"
> warning after starting the pyspark interpreter.
>
> How do I default %pyspark to ipython?
>
> Tried to change
>     "class": "org.apache.zeppelin.spark.PySparkInterpreter",
> to
>     "class": "org.apache.zeppelin.spark.IPySparkInterpreter",
> in interpreter.json, but this gets overwritten back to PySparkInterpreter.
>
> Also tried to change zeppelin.pyspark.python to ipython, with no luck
> either.
>
> Is there a documented way to default the pyspark interpreter to ipython?
> Glanced over PR-2474 but can't quickly see what I am missing.
>
> Thanks.
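The "use ipython when it is available, otherwise fall back" behavior Jeff describes can be sketched roughly as follows. This is a minimal illustration only, not Zeppelin's actual code (Zeppelin is Java); the function name and return strings are hypothetical:

```python
import importlib.util

def choose_pyspark_backend(use_ipython=True):
    """Pick an interpreter backend the way the reply describes:
    prefer IPython when it is importable, otherwise fall back to
    the plain Python interpreter implementation."""
    ipython_available = importlib.util.find_spec("IPython") is not None
    if use_ipython and ipython_available:
        return "IPySparkInterpreter"
    # Fallback preserves backward compatibility when IPython is missing,
    # or when the user opts out (cf. zeppelin.pyspark.useIPython in this thread).
    return "PySparkInterpreter"
```

The warning discussed in this thread fires in the case where IPython is available but the plain backend is chosen anyway, nudging users toward the richer interpreter.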
IPython is available, use IPython for PySparkInterpreter
Getting the "IPython is available, use IPython for PySparkInterpreter" warning after starting the pyspark interpreter.

How do I default %pyspark to ipython?

Tried to change
    "class": "org.apache.zeppelin.spark.PySparkInterpreter",
to
    "class": "org.apache.zeppelin.spark.IPySparkInterpreter",
in interpreter.json, but this gets overwritten back to PySparkInterpreter.

Also tried to change zeppelin.pyspark.python to ipython, with no luck either.

Is there a documented way to default the pyspark interpreter to ipython? Glanced over PR-2474 but can't quickly see what I am missing.

Thanks.
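One note on the second attempt above: zeppelin.pyspark.python is a path to the Python executable that PySpark should launch, not a switch between interpreter classes, which is likely why setting it to "ipython" had no effect. A properties-style sketch of the two settings involved (property roles as understood from this thread and Zeppelin's interpreter settings; verify against your Zeppelin version's docs):

```
# Path to the Python binary used by the pyspark interpreter
zeppelin.pyspark.python = python

# Whether %pyspark should run through IPython when it is available
# (the property discussed later in this thread)
zeppelin.pyspark.useIPython = true
```

Edits made directly to interpreter.json while Zeppelin is running tend to be overwritten, as observed above; changing properties through the interpreter settings UI avoids that.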