Nassir created ZEPPELIN-2678:
--------------------------------

             Summary: PySpark cell fails to execute, but normal Spark code in 
Scala executes fine
                 Key: ZEPPELIN-2678
                 URL: https://issues.apache.org/jira/browse/ZEPPELIN-2678
             Project: Zeppelin
          Issue Type: Bug
            Reporter: Nassir


Hi,

I have installed Zeppelin on Windows and can now run cells with the default 
Spark interpreter, i.e. Scala code.

However, when I try to execute a pyspark cell e.g.

%pyspark
x = 5

I get an error: 

"failed to start pyspark"

Any ideas on what is going wrong here? I can see %pyspark listed as an 
interpreter under Spark on the Interpreter page.

Do I need to set some environment variables? I have Anaconda installed for 
running Python, but I did not add any environment variables for it.
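For reference, a minimal sketch of the environment variables PySpark commonly 
expects on Windows before Zeppelin can launch it (the paths below are 
hypothetical placeholders for illustration, not taken from this report):

```shell
REM Hypothetical install locations -- adjust to the actual paths on the machine.
set SPARK_HOME=C:\spark
set PYSPARK_PYTHON=C:\Anaconda3\python.exe
REM Make the Spark binaries reachable from the shell that starts Zeppelin.
set PATH=%PATH%;%SPARK_HOME%\bin
```

These would need to be visible to the process that starts Zeppelin (or set in 
conf\zeppelin-env, depending on the setup).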

Thanks



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
