The error says it cannot find
"C:\installedsoftwares\spark-2.2.0-bin-hadoop2.7\bin\spark-submit2.cmd".
Please check the PATH environment variable on your Windows system.
Xi Shen
about.me/davidshen
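The PATH check suggested above can be sketched in a few lines of Python. This is an illustrative helper, not part of Zeppelin; `find_spark_submit` and its `spark_home` parameter are hypothetical names, and Zeppelin itself reads `SPARK_HOME` from the environment:

```python
import os
import shutil

def find_spark_submit(spark_home=None):
    """Look for spark-submit2.cmd, first under SPARK_HOME\\bin, then on PATH.

    spark_home is an illustrative override; by default the SPARK_HOME
    environment variable is consulted, mirroring what Zeppelin expects.
    Returns the full path if found, otherwise None.
    """
    script = "spark-submit2.cmd"
    home = spark_home or os.environ.get("SPARK_HOME")
    if home:
        candidate = os.path.join(home, "bin", script)
        if os.path.isfile(candidate):
            return candidate
    # Fall back to searching the directories listed in PATH.
    return shutil.which(script)
```

If this returns None, the interpreter cannot launch Spark, which matches the RuntimeException reported below.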
On Sun, Aug 12, 2018 at 3:21 PM Divya Gehlot wrote:
Sure, we are very happy to have someone help verify zeppelin on windows.
Divya Gehlot wrote on Tue, Aug 14, 2018 at 10:42 AM:
Hi Jeff,
As most people still use Windows machines for local mode,
can I help to test it on Windows?
Thanks,
Divya
On Mon, 13 Aug 2018 at 13:04, Jeff Zhang wrote:
I believe this is before 0.8; we made lots of changes in 0.8, but
unfortunately we don't have the resources to test it on Windows.
Divya Gehlot wrote on Mon, Aug 13, 2018 at 10:01 AM:
Hi,
Thanks Jeff for the prompt response!
When I browsed the internet, I stumbled upon many blog posts where people set up
Zeppelin and Spark with Python support on the Windows platform.
Then how does it work for them?
Thanks,
Divya
On Sun, 12 Aug 2018 at 15:25, Jeff Zhang wrote:
Sorry Divya, Zeppelin doesn't have tests for the Windows platform; there's no
official support for Windows. I would suggest you run Zeppelin on Linux.
Divya Gehlot wrote on Sun, Aug 12, 2018 at 3:21 PM:
Hi,
I am getting the below error when I try to run a simple command:
%spark
sc.version
java.lang.RuntimeException:
'C:\installedsoftwares\spark-2.2.0-bin-hadoop2.7\bin\spark-submit2.cmd"
--class org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer
--jars "" --driver-java-options "' is not