Hi,

I have Jupyter installed on my windows machine.

I have Livy server 0.2 installed on the remote hadoop cluster where spark 
is also running.
I can successfully connect to the cluster via Livy and execute a snippet of 
code on the cluster.

I now want to connect via the notebook.

I start livy on the remote cluster
*./bin/livy-server*

I start the notebook on my windows machine as follows
*jupyter.exe notebook*

This launches the jupyter browser and I select python 2.




Once I select python 2 on the notebook I run the following
*%load_ext sparkmagic.magics*

then I run
*%manage_spark*

Here I enter the endpoint which is the same as the remote server where Livy 
is running
*<remote host>:8998*

this adds the endpoint. I then create a session.

*I give the below properties*
*{"conf": {"spark.master": "yarn-cluster", "spark.submit.deployMode": 
"cluster"}, "proxyUser": "<username>@<proxy>:8080"}*

On clicking Create session 
I get the below message on the notebook
*Starting Spark application*

I can see that the Livy server responds with the creation on the session ID

But on the notebook I get the following error

*Starting Spark application *
*ID**YARN Application ID**Kind**State**Spark UI**Driver log**Current 
session?*
*1* *None* *pyspark* *idle* *✔*


*---------------------------------------------------------------------------
SqlContextNotFoundException               Traceback (most recent call last)
c:\fast\python\2.7.12\lib\site-packages\hdijupyterutils\ipywidgetfactory.pyc in 
submit_clicked(self, button)
     63 
     64     def submit_clicked(self, button):
---> 65         self.parent_widget.run()

c:\fast\python\2.7.12\lib\site-packages\sparkmagic\controllerwidget\createsessionwidget.pyc
 in run(self)
     56 
     57         try:
---> 58             self.spark_controller.add_session(alias, endpoint, skip, 
properties)
     59         except ValueError as e:
     60             self.ipython_display.send_error("""Could not add session 
with

c:\fast\python\2.7.12\lib\site-packages\sparkmagic\livyclientlib\sparkcontroller.pyc
 in add_session(self, name, endpoint, skip_if_exists, properties)
     79         session = self._livy_session(http_client, properties, 
self.ipython_display)
     80         self.session_manager.add_session(name, session)
---> 81         session.start()
     82 
     83     def get_session_id_for_client(self, name):

c:\fast\python\2.7.12\lib\site-packages\sparkmagic\livyclientlib\livysession.pyc
 in start(self)
    154                     self.sql_context_variable_name = "sqlContext"
    155                 else:
--> 156                     raise SqlContextNotFoundException(u"Neither 
SparkSession nor HiveContext/SqlContext is available.")
    157         except Exception as e:
    158             
self._spark_events.emit_session_creation_end_event(self.guid, self.kind, 
self.id, self.status,

SqlContextNotFoundException: Neither SparkSession nor HiveContext/SqlContext is 
available.
*



Any help would be appreciated!

regards
Vivek

-- 
You received this message because you are subscribed to the Google Groups 
"Project Jupyter" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/jupyter/96b5a731-ec91-464a-8319-0556f3ad75e7%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Reply via email to