I would suggest that you file an issue on the jupyter-incubator/sparkmagic
repo (https://github.com/jupyter-incubator/sparkmagic), as there has been
discussion there about Livy and the level of support that has been implemented.
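
It may also help to attach to the issue what Livy itself reports for the
failing session. A minimal sketch using the `requests` library; the host and
session id are placeholders taken from the thread below, not tested values:

    import requests

    # Placeholder endpoint from the thread; substitute the real Livy host.
    LIVY = "http://<remote host>:8998"

    # List every session Livy knows about, with its current state.
    print(requests.get(LIVY + "/sessions").json())

    # Fetch the driver log for the failing session (id 1 in the thread below);
    # GET /sessions/{id}/log is part of Livy's REST API.
    log = requests.get(LIVY + "/sessions/1/log").json()
    print("\n".join(log.get("log", [])))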

  

Carol

**Carol Willing**

Research Software Engineer, Project Jupyter
Cal Poly San Luis Obispo

Director, Python Software Foundation

Strengths: Empathy, Relator, Ideation, Strategic, Learner

Sent from [Nylas N1](https://nylas.com/n1?ref=n1), the extensible, open source
mail client.

On Dec 19 2016, at 9:19 am, Mickaël Gervais <[email protected]> wrote:  

> Hi,
>
> I've just seen your post, and I have the same error. Did you find a
> solution? I can see that the error is:
>
>     <console>:14: error: not found: value sqlContext
>
> Maybe it's a Livy problem.
>
> Thanks.
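
That `not found: value sqlContext` error means the variable simply is not
defined in the remote session. One way to see what the session actually
provides is to run a probe statement through Livy's statements API; a rough
sketch, again with placeholder host and session id:

    import json
    import requests

    LIVY = "http://<remote host>:8998"  # placeholder host
    sid = 1                             # placeholder session id

    # POST /sessions/{id}/statements submits code to the running session.
    requests.post(LIVY + "/sessions/%d/statements" % sid,
                  data=json.dumps({"code": "sqlContext"}),
                  headers={"Content-Type": "application/json"})

    # Poll the statements list until the probe reports a result; if sqlContext
    # is undefined there, the output will carry the same "not found" error.
    print(requests.get(LIVY + "/sessions/%d/statements" % sid).json())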
On Monday, November 28, 2016 at 7:27:24 AM UTC+1, Vivek Suvarna wrote:

>> Hi,
>>
>> I have Jupyter installed on my Windows machine.
>>
>> I have Livy server 0.2 installed on the remote Hadoop cluster where Spark
>> is also running.
>>
>> I can successfully connect to the cluster via Livy and execute a snippet of
>> code on the cluster.
>>
>> I now want to connect via the notebook.
>>
>> I start Livy on the remote cluster:
>>
>> _**./bin/livy-server**_
>>
>> I start the notebook on my Windows machine as follows:
>>
>> _**jupyter.exe notebook**_
>>
>> This launches the Jupyter browser and I select Python 2.
>>
>> ![](https://groups.google.com/group/jupyter/attach/6366da4f8323c/Auto%20Generated%20Inline%20Image%201?part=0.1&authuser=0)

>> Once I select Python 2 in the notebook I run the following:
>>
>> **_%load_ext sparkmagic.magics_**
>>
>> then I run
>>
>> **_%manage_spark_**
>>
>> Here I enter the endpoint, which is the same as the remote server where
>> Livy is running:
>>
>> **_<remote host>:8998_**

>> This adds the endpoint. I then create a session.
>>
>> **_I give the below properties:_**
>>
>> **_{"conf": {"spark.master": "yarn-cluster", "spark.submit.deployMode": "cluster"}, "proxyUser": "<username>@<proxy>:8080"}_**

>> On clicking Create session, I get the below message in the notebook:
>>
>> **_Starting Spark application_**
>>
>> I can see that the Livy server responds with the creation of the session ID.
>>
>> But in the notebook I get the following error:
>>
>> **_Starting Spark application_**
>>
>> | ID | YARN Application ID | Kind | State | Spark UI | Driver log | Current session? |
>> |----|---------------------|------|-------|----------|------------|------------------|
>> | 1 | None | pyspark | idle | | | ✔ |

>>     ---------------------------------------------------------------------------
>>     SqlContextNotFoundException               Traceback (most recent call last)
>>     c:\fast\python\2.7.12\lib\site-packages\hdijupyterutils\ipywidgetfactory.pyc in submit_clicked(self, button)
>>          63
>>          64     def submit_clicked(self, button):
>>     ---> 65         self.parent_widget.run()
>>
>>     c:\fast\python\2.7.12\lib\site-packages\sparkmagic\controllerwidget\createsessionwidget.pyc in run(self)
>>          56
>>          57         try:
>>     ---> 58             self.spark_controller.add_session(alias, endpoint, skip, properties)
>>          59         except ValueError as e:
>>          60             self.ipython_display.send_error("""Could not add session with
>>
>>     c:\fast\python\2.7.12\lib\site-packages\sparkmagic\livyclientlib\sparkcontroller.pyc in add_session(self, name, endpoint, skip_if_exists, properties)
>>          79         session = self._livy_session(http_client, properties, self.ipython_display)
>>          80         self.session_manager.add_session(name, session)
>>     ---> 81         session.start()
>>          82
>>          83     def get_session_id_for_client(self, name):
>>
>>     c:\fast\python\2.7.12\lib\site-packages\sparkmagic\livyclientlib\livysession.pyc in start(self)
>>         154                     self.sql_context_variable_name = "sqlContext"
>>         155                 else:
>>     --> 156                     raise SqlContextNotFoundException(u"Neither SparkSession nor HiveContext/SqlContext is available.")
>>         157         except Exception as e:
>>         158             self._spark_events.emit_session_creation_end_event(self.guid, self.kind, self.id, self.status,
>>
>>     SqlContextNotFoundException: Neither SparkSession nor HiveContext/SqlContext is available.
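>>
>> Reading that traceback, sparkmagic's `livysession.start()` probes the newly
>> created session for a SparkSession, HiveContext, or sqlContext, and raises
>> once none of them turns out to be defined in the remote session. One
>> pattern worth ruling out (an assumption, not a confirmed diagnosis) is that
>> passing `spark.master: yarn-cluster` per-session conflicts with how the
>> Livy server itself was launched; master and deploy mode are normally
>> configured on the Livy side, for example:
>>
>>     # conf/livy.conf on the cluster -- key names assumed here, so check
>>     # the livy.conf.template shipped with your Livy version
>>     livy.spark.master = yarn
>>     livy.spark.deployMode = cluster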

>> Any help would be appreciated!
>>
>> regards
>>
>> Vivek


-- 
You received this message because you are subscribed to the Google Groups 
"Project Jupyter" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/jupyter/39wrymh94nwd2plqomxgpkw34-2147483647%40mailer.nylas.com.
For more options, visit https://groups.google.com/d/optout.
