Hi Roland,

Thanks for your response. Your suggestion makes sense, but does that mean a 
custom server extension is the way to go?

I spent yesterday attempting to accomplish this solely through jupyter 
configuration options. For a running kernel with connection information

  {
    "shell_port": 63391,
    "iopub_port": 63394,
    "stdin_port": 63392,
    "control_port": 63393,
    "hb_port": 63396,
    "ip": "",
    "key": "fff108f8-b1bf9b17aa337ce7d3d14565",
    "transport": "tcp",
    "signature_scheme": "hmac-sha256",
    "kernel_name": ""
  }

I set the following variables in jupyter_notebook_config.py.

c.ConnectionFileMixin.connection_file = 
c.ConnectionFileMixin.control_port = 63393
c.ConnectionFileMixin.iopub_port = 63394
c.ConnectionFileMixin.stdin_port = 63392
c.ConnectionFileMixin.hb_port = 63396
c.ConnectionFileMixin.ip = ''
c.ConnectionFileMixin.shell_port = 63391
c.ConnectionFileMixin.transport = 'tcp'

However, when I created a new notebook, I got errors like

[I 12:53:44.981 NotebookApp] KernelRestarter: restarting kernel (1/5)
[D 12:53:44.982 NotebookApp] Starting kernel: ['/Users/ebanner/.anaconda/envs/py36/bin/python', '-m', 'ipykernel_launcher', 
[D 12:53:44.986 NotebookApp] Connecting to: tcp://
Traceback (most recent call last):
  File "/Users/ebanner/.anaconda/envs/py36/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/Users/ebanner/.anaconda/envs/py36/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
, line 16, in <module>
, line 657, in launch_instance
  File "<decorator-gen-123>", line 2, in initialize
, line 87, in catch_config_error
    return method(app, *args, **kwargs)
, line 448, in initialize
, line 238, in init_sockets
    self.shell_port = self._bind_socket(self.shell_socket, self.shell_port)
, line 180, in _bind_socket
    s.bind("tcp://%s:%i" % (self.ip, port))
  File "zmq/backend/cython/socket.pyx", line 495, in zmq.backend.cython.socket.Socket.bind (zmq/backend/cython/socket.c:5653)
  File "zmq/backend/cython/checkrc.pxd", line 25, in zmq.backend.cython.checkrc._check_rc (zmq/backend/cython/socket.c:10014)
zmq.error.ZMQError: Address already in use

until it hits the maximum number of retries. Presumably this is because the 
ipykernel launcher tries to bind *new* sockets to those ports, but since the 
existing kernel already holds them, the bind fails.
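The failure is easy to reproduce outside of Jupyter. Here is a minimal sketch 
using plain TCP sockets rather than zmq (zmq surfaces the same EADDRINUSE 
error from the OS):

```python
import socket

# Stand-in for the already-running kernel: bind a listener to a port.
first = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
first.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
port = first.getsockname()[1]

# Stand-in for the freshly launched ipykernel: try to bind the same port.
second = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    second.bind(("127.0.0.1", port))
    rebind_ok = True
except OSError as err:
    rebind_ok = False                 # EADDRINUSE: "Address already in use"
    print("second bind failed:", err.strerror)

first.close()
second.close()
```

So as long as the external kernel is alive, any freshly launched kernel handed 
the same ports is doomed to fail exactly this way.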

Thinking that I could "trick" jupyter into not launching ipykernel and 
instead reuse the existing sockets, I set

c.KernelManager.kernel_cmd = [
'/Users/ebanner/.anaconda/envs/py36/bin/python', '/Users/ebanner/bar.py']

where bar.py contains

import time

while True:
    time.sleep(1)  # just keep the process alive

After creating another new notebook I get

[D 12:59:37.283 NotebookApp] Opening websocket /api/kernels/9a6c1e7d-12c4-
[D 12:59:37.283 NotebookApp] Connecting to: tcp://
[D 12:59:37.284 NotebookApp] Connecting to: tcp://
[D 12:59:37.284 NotebookApp] Connecting to: tcp://

which gave me hope! But when I tried to execute a cell, it just hung. 
I did notice that http://localhost:8888/api/kernels reports the following 

        id: "9b260abf-072f-48e2-b5f0-213d934418a2",
        name: "python3",
        last_activity: "2018-02-13T21:10:41.460182Z",
        execution_state: "starting",
        connections: 0

The execution_state being "starting" and connections being at 0 makes me 
think that bar.py isn't quite doing enough.

Digging into the code I discovered that the object returned from the call 
to ipykernel is a subprocess.Popen() object, but I couldn't work out the 
full set of interfaces a replacement object would have to support.
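For reference (my own poking around, not anything from the Jupyter docs), the 
basic is-the-kernel-alive check on a subprocess.Popen comes down to poll(), 
which returns None while the child runs and the exit code afterwards. A 
stand-in object for an externally started kernel would at least need to mimic 
that:

```python
import subprocess
import sys

# Launch a trivial short-lived child, the same kind of object the
# kernel manager holds after launching ipykernel.
proc = subprocess.Popen([sys.executable, "-c", "pass"])

proc.wait()        # block until the child exits
print(proc.poll()) # exit code once the child has finished
```

A fake "process" that always reports None from poll() might keep the restarter 
quiet, but that is an assumption I have not verified.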

I also tried setting c.NotebookApp.kernel_manager_class to my own custom 
kernel manager class

from tornado import gen
from jupyter_client import KernelManager
from notebook.services.kernels.kernelmanager import MappingKernelManager

class ExistingMappingKernelManager(MappingKernelManager):
    """A KernelManager that just connects to an existing kernel."""

    @gen.coroutine
    def start_kernel(self, kernel_id=None, path=None, **kwargs):
        # Don't launch anything; hand back a fixed id and build a
        # client for the already-running kernel.
        kernel_id = '1'
        km = KernelManager(kernel_name='python3')
        kc = km.client()
        raise gen.Return(kernel_id)

but when I created a new notebook I got

[E 13:03:58.692 NotebookApp] Unhandled error in API request
    Traceback (most recent call last):
, line 516, in wrapper
        result = yield gen.maybe_future(method(self, *args, **kwargs))
"/Users/ebanner/.anaconda/lib/python3.6/site-packages/tornado/gen.py", line 1055, in run
        value = future.result()
, line 238, in result
      File "<string>", line 4, in raise_exc_info
"/Users/ebanner/.anaconda/lib/python3.6/site-packages/tornado/gen.py", line 1063, in run
        yielded = self.gen.throw(*exc_info)
, line 75, in post
"/Users/ebanner/.anaconda/lib/python3.6/site-packages/tornado/gen.py", line 1055, in run
        value = future.result()
, line 238, in result
      File "<string>", line 4, in raise_exc_info
"/Users/ebanner/.anaconda/lib/python3.6/site-packages/tornado/gen.py", line 1069, in run
        yielded = self.gen.send(value)
, line 81, in create_session
        self.save_session(session_id, path=path, name=name, type=type, 
, line 125, in save_session
        return self.get_session(session_id=session_id)
, line 170, in get_session
        return self.row_to_model(row)
, line 209, in row_to_model
        raise KeyError
[E 13:03:58.708 NotebookApp] {
      "Host": "localhost:8888",
      "Connection": "keep-alive",
      "Content-Length": "93",
      "Accept": "application/json, text/javascript, */*; q=0.01",
      "Origin": "http://localhost:8888",
      "X-Requested-With": "XMLHttpRequest",
      "User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/63.0.3239.132 Safari/537.36",
      "Content-Type": "application/json",
      "Accept-Encoding": "gzip, deflate, br",
      "Accept-Language": "en-US,en;q=0.9",
      "Cookie": "_xsrf=2|e5e3448b|a67102a2af982e0bece9d040d46b1c87|1516914005; 
[E 13:03:58.710 NotebookApp] 500 POST /api/sessions (::1) 289.25ms referer=

Presumably this is because I failed to call super(MappingKernelManager, 
self).start_kernel(**kwargs), which does necessary bookkeeping that I 
skipped; the KeyError comes from the session manager failing to look up 
the new kernel.

At the end of the day I *did* get a very hacky solution working, though, with 
the following code snippet in a jupyter notebook

from jupyter_client import BlockingKernelClient

client = BlockingKernelClient()
client.load_connection_file('kernel-63391.json')  # hypothetical filename of
                                                  # the kernel's connection file
client.start_channels()

Then I can do client.execute_interactive(code) as many times as I want!

One way to go is to make a magic command out of this which wraps cell code 
and passes it to client.execute_interactive() so I don't have to type 
client.execute_interactive() each time. However I still need to insert the 
magic into every single code cell, which I would rather not do.
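To make that concrete, here is a sketch of such a magic; make_remote_magic is 
a hypothetical helper name of mine, and client is assumed to be the 
BlockingKernelClient from above:

```python
def make_remote_magic(client):
    """Build a cell magic that forwards the cell body to `client`.

    Register it inside the notebook with:
        from IPython.core.magic import register_cell_magic
        register_cell_magic(make_remote_magic(client))
    and then start each cell with %%remote.
    """
    def remote(line, cell):
        # `line` holds any arguments after %%remote; the cell body
        # is what we forward to the external kernel.
        client.execute_interactive(cell)
    return remote
```

But as said, this still means typing %%remote at the top of every cell, which 
is exactly the kind of friction I am trying to eliminate.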

So my question is: can I attach a new jupyter notebook to an external 
IPython kernel solely through configurables? Am I close? If not, what would 
be the best way to go? The closest example I can find to study is nb2kg 
<https://github.com/jupyter-incubator/nb2kg>, which allows one to attach a 
jupyter notebook to a remote kernel. However, those kernels are still 
created by a jupyter notebook server, so I am afraid it is not close enough 
to help me solve my use case.

I truly appreciate any help as I think I am running out of ideas to try.

On Sunday, February 11, 2018 at 10:56:03 PM UTC-8, Roland Weber wrote:
> Hello Edward,
>> through either making a custom server extension 
>> <http://jupyter-notebook.readthedocs.io/en/stable/extending/handlers.html> 
>> or using the jupyter kernel gateway 
>> <https://github.com/jupyter/kernel_gateway>.
> The kernel gateway in its websocket personality doesn't work with 
> notebooks at all.
> In its notebook personality, it runs a configured notebook with one kernel 
> to serve HTTP requests.
> Since your declared intention is to associate a notebook with a kernel you 
> started, I don't think that KG is of any use in your scenario.
> cheers,
>   Roland
