Hi Carol,

Thanks for your response.
Here is the link: https://github.com/jupyter-incubator/sparkmagic/issues/305
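For anyone else hitting this: Livy's REST API exposes the session log (GET /sessions/{id}/log), which usually shows why no SparkSession or sqlContext was created on the cluster side. A minimal sketch in Python (standard library only; the host name and session id below are placeholders, not values from this thread):

```python
import json
from urllib.request import urlopen

def livy_log_url(host, session_id, size=50, port=8998):
    """Build the URL for Livy's GET /sessions/{id}/log endpoint."""
    return "http://%s:%d/sessions/%d/log?size=%d" % (host, port, session_id, size)

# Uncomment to run against a live Livy server:
# with urlopen(livy_log_url("<remote host>", 1)) as resp:
#     print(json.loads(resp.read())["log"])
```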

Mickaël

On Monday, December 19, 2016 at 7:08:37 PM UTC+1, Carol Willing wrote:
>
> I would suggest that you file an issue under the 
> jupyter-incubator/sparkmagic repo 
> https://github.com/jupyter-incubator/sparkmagic as there has been 
> discussion there about Livy and what level of support has been implemented.
>
> Carol
>
> *Carol Willing*
>
> Research Software Engineer, Project Jupyter
> Cal Poly San Luis Obispo
>
> Director, Python Software Foundation
>
> Strengths: Empathy, Relator, Ideation, Strategic, Learner 
>
> Sent from Nylas N1 <https://nylas.com/n1?ref=n1>, the extensible, open 
> source mail client.
>
> On Dec 19 2016, at 9:19 am, Mickaël Gervais <[email protected]> wrote: 
>
>> Hi,
>>
>> I've just seen your post, and I have the same error.
>> Did you find a solution?
>> I can see that the error is:
>>
>> <console>:14: error: not found: value sqlContext
>>
>> Maybe it's a Livy problem.
>>
>> Thanks.
>>
>>
>> On Monday, November 28, 2016 at 7:27:24 AM UTC+1, Vivek Suvarna wrote:
>>
>> Hi,
>>
>> I have Jupyter installed on my Windows machine.
>>
>> I have Livy server 0.2 installed on the remote hadoop cluster where spark 
>> is also running.
>> I can successfully connect to the cluster via Livy and execute a snippet 
>> of code on the cluster.
>>
>> I now want to connect via the notebook.
>>
>> I start Livy on the remote cluster:
>> *./bin/livy-server*
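A quick way to confirm that the Livy server is reachable from the client machine is its GET /sessions endpoint, which lists active sessions. A minimal sketch (the host is a placeholder):

```python
from urllib.request import urlopen

def livy_sessions_url(host, port=8998):
    """URL for Livy's GET /sessions endpoint (lists active sessions)."""
    return "http://%s:%d/sessions" % (host, port)

# Uncomment against a live server; a healthy Livy answers with JSON like
# {"from": 0, "total": 0, "sessions": []}
# print(urlopen(livy_sessions_url("<remote host>")).read())
```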
>>
>> I start the notebook on my Windows machine as follows:
>> *jupyter.exe notebook*
>>
>> This launches the Jupyter browser, and I select Python 2.
>>
>>
>> Once I select Python 2 in the notebook, I run the following:
>> *%load_ext sparkmagic.magics*
>>
>> Then I run:
>> *%manage_spark*
>>
>> Here I enter the endpoint, which is the remote server where Livy is 
>> running:
>> *<remote host>:8998*
>>
>> This adds the endpoint. I then create a session.
>>
>> *I give the below properties*
>> *{"conf": {"spark.master": "yarn-cluster", "spark.submit.deployMode": 
>> "cluster"}, "proxyUser": "<username>@<proxy>:8080"}*
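A hedged aside on these properties: in Livy's documented POST /sessions body, "proxyUser" is a plain user name rather than user@host:port, and the Spark master is usually configured on the Livy server itself rather than per session, so "spark.master": "yarn-cluster" may be rejected or ignored. A sketch of what the JSON might look like under those assumptions (the user name is a placeholder):

```python
import json

# Hypothetical corrected payload, assuming Livy's documented POST /sessions
# fields: "proxyUser" is a plain user name (no host:port), and the Spark
# master is left to Livy's own configuration.
payload = {
    "kind": "pyspark",
    "proxyUser": "someuser",
    "conf": {"spark.submit.deployMode": "cluster"},
}
body = json.dumps(payload)
print(body)
```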
>>
>> On clicking Create session, 
>> I get the below message in the notebook:
>> *Starting Spark application*
>>
>> I can see that the Livy server responds with the creation of the session 
>> ID.
>>
>> But in the notebook I get the following error:
>>
>> Starting Spark application
>>
>> ID | YARN Application ID | Kind    | State | Spark UI | Driver log | Current session?
>> 1  | None                | pyspark | idle  |          |            | ✔
>>
>>
>> ---------------------------------------------------------------------------
>> SqlContextNotFoundException               Traceback (most recent call last)
>> c:\fast\python\2.7.12\lib\site-packages\hdijupyterutils\ipywidgetfactory.pyc 
>> in submit_clicked(self, button)
>>      63 
>>      64     def submit_clicked(self, button):
>> ---> 65         self.parent_widget.run()
>>
>> c:\fast\python\2.7.12\lib\site-packages\sparkmagic\controllerwidget\createsessionwidget.pyc
>>  in run(self)
>>      56 
>>      57         try:
>> ---> 58             self.spark_controller.add_session(alias, endpoint, skip, 
>> properties)
>>      59         except ValueError as e:
>>      60             self.ipython_display.send_error("""Could not add session 
>> with
>>
>> c:\fast\python\2.7.12\lib\site-packages\sparkmagic\livyclientlib\sparkcontroller.pyc
>>  in add_session(self, name, endpoint, skip_if_exists, properties)
>>      79         session = self._livy_session(http_client, properties, 
>> self.ipython_display)
>>      80         self.session_manager.add_session(name, session)
>> ---> 81         session.start()
>>      82 
>>      83     def get_session_id_for_client(self, name):
>>
>> c:\fast\python\2.7.12\lib\site-packages\sparkmagic\livyclientlib\livysession.pyc
>>  in start(self)
>>     154                     self.sql_context_variable_name = "sqlContext"
>>     155                 else:
>> --> 156                     raise SqlContextNotFoundException(u"Neither 
>> SparkSession nor HiveContext/SqlContext is available.")
>>     157         except Exception as e:
>>     158             
>> self._spark_events.emit_session_creation_end_event(self.guid, self.kind, 
>> self.id, self.status,
>>
>> SqlContextNotFoundException: Neither SparkSession nor HiveContext/SqlContext 
>> is available.
>>
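The exception above is raised when sparkmagic cannot find a SparkSession or SQL context in the remote session after it starts. One way to check by hand is to run the statement sqlContext in the session through Livy's POST /sessions/{id}/statements API; a minimal sketch (standard library only; host and session id are placeholders):

```python
import json
from urllib.request import Request, urlopen

def statement_request(host, session_id, code, port=8998):
    """Build a POST /sessions/{id}/statements request that runs `code`
    in the remote session, to check manually whether sqlContext exists."""
    url = "http://%s:%d/sessions/%d/statements" % (host, port, session_id)
    body = json.dumps({"code": code}).encode("utf-8")
    return Request(url, data=body,
                   headers={"Content-Type": "application/json"})

# Uncomment against a live server:
# with urlopen(statement_request("<remote host>", 1, "sqlContext")) as resp:
#     print(json.loads(resp.read()))
```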
>>
>>
>>
>> Any help would be appreciated!
>>
>> regards
>> Vivek
>>
>> -- 
>> You received this message because you are subscribed to the Google Groups 
>> "Project Jupyter" group.
>> To unsubscribe from this group and stop receiving emails from it, send an 
>> email to [email protected] <javascript:>.
>> To post to this group, send email to [email protected].
>> To view this discussion on the web visit 
>> https://groups.google.com/d/msgid/jupyter/e3973675-ab05-4547-90a1-af49300e747e%40googlegroups.com
>>  
>> <https://groups.google.com/d/msgid/jupyter/e3973675-ab05-4547-90a1-af49300e747e%40googlegroups.com?utm_medium=email&utm_source=footer>
>> .
>> For more options, visit https://groups.google.com/d/optout.
>>
>
