Hello community,
here is the log from the commit of package python-nbclient for openSUSE:Factory
checked in at 2020-05-26 17:21:50
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-nbclient (Old)
and /work/SRC/openSUSE:Factory/.python-nbclient.new.2738 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-nbclient"
Tue May 26 17:21:50 2020 rev:2 rq:808511 version:0.2.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-nbclient/python-nbclient.changes 2020-04-27 23:31:31.854721312 +0200
+++ /work/SRC/openSUSE:Factory/.python-nbclient.new.2738/python-nbclient.changes 2020-05-26 17:22:04.536330207 +0200
@@ -1,0 +2,25 @@
+Sun Apr 26 19:33:24 UTC 2020 - Arun Persaud <[email protected]>
+
+- specfile:
+ * only build for python 3, add skip_python2
+ * updated requirement versions, add async_generator
+ * be more specific in %files section
+
+- update to version 0.2.0:
+ * Major Changes
+ + Async support is now available on the client. Methods that support async have an async_ prefix and can be awaited #10 #35 #37 #38
+ + Dropped support for Python 3.5 due to async compatibility issues #34
+ + Notebook documents now include the new kernel timing fields #32
+ * Fixes
+ + Memory and process leaks from nbclient should now be fixed #34
+ + Notebook execution exceptions now include error information in addition to the message #41
+ * Docs
+ + Added binder examples / tests #7
+ + Added changelog to docs #22
+ + Doc typo fixes #27 #30
+
+-------------------------------------------------------------------
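For readers unfamiliar with the new API shape, the `async_` naming convention described in the changelog entry above can be pictured with a small stand-in class. This stub is illustrative only and is not the real `nbclient.NotebookClient`; it just shows the pattern of an awaitable method paired with a blocking wrapper:

```python
import asyncio

# Illustration of the async_/sync pairing described in the changelog.
# RunnableStub is a hypothetical stand-in, NOT nbclient's actual client.
class RunnableStub:
    async def async_execute(self):
        # The real client would start a kernel and run notebook cells here.
        await asyncio.sleep(0)
        return "executed"

    def execute(self):
        # Blocking wrapper: drives the async counterpart to completion.
        return asyncio.run(self.async_execute())

print(RunnableStub().execute())  # same result via either entry point
```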
Old:
----
nbclient-0.1.0.tar.gz
New:
----
nbclient-0.2.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-nbclient.spec ++++++
--- /var/tmp/diff_new_pack.1twp97/_old 2020-05-26 17:22:05.464332202 +0200
+++ /var/tmp/diff_new_pack.1twp97/_new 2020-05-26 17:22:05.468332211 +0200
@@ -19,7 +19,7 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
%define skip_python2 1
Name: python-nbclient
-Version: 0.1.0
+Version: 0.2.0
Release: 0
Summary: A client library for executing notebooks
License: BSD-3-Clause
@@ -29,21 +29,23 @@
BuildRequires: %{python_module setuptools >= 38.6.0}
BuildRequires: fdupes
BuildRequires: python-rpm-macros
-Requires: python-jupyter-client >= 5.3.4
+Requires: python-async_generator
+Requires: python-jupyter-client >= 6.1.0
Requires: python-nbformat >= 5.0
Requires: python-traitlets >= 4.2
BuildArch: noarch
# SECTION test requirements
BuildRequires: %{python_module Pebble}
-BuildRequires: %{python_module ipython}
+BuildRequires: %{python_module async_generator}
BuildRequires: %{python_module ipython_genutils}
+BuildRequires: %{python_module ipython}
BuildRequires: %{python_module ipywidgets}
-BuildRequires: %{python_module jupyter-client >= 5.3.4}
+BuildRequires: %{python_module jupyter-client >= 6.1.0}
BuildRequires: %{python_module nbconvert}
BuildRequires: %{python_module nbformat >= 5.0}
BuildRequires: %{python_module pytest >= 4.1}
-BuildRequires: %{python_module traitlets >= 4.2}
BuildRequires: %{python_module testpath}
+BuildRequires: %{python_module traitlets >= 4.2}
BuildRequires: %{python_module xmltodict}
# /SECTION
%python_subpackages
@@ -70,6 +72,7 @@
%files %{python_files}
%doc CHANGELOG.md README.md
%license LICENSE
-%{python_sitelib}/*
+%{python_sitelib}/nbclient
+%{python_sitelib}/nbclient-%{version}-py*.egg-info/
%changelog
++++++ nbclient-0.1.0.tar.gz -> nbclient-0.2.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/.bumpversion.cfg new/nbclient-0.2.0/.bumpversion.cfg
--- old/nbclient-0.1.0/.bumpversion.cfg 2020-02-11 08:19:17.000000000 +0100
+++ new/nbclient-0.2.0/.bumpversion.cfg 2020-03-31 21:26:58.000000000 +0200
@@ -1,5 +1,5 @@
[bumpversion]
-current_version = 0.1.0
+current_version = 0.2.0
commit = True
tag = True
tag_name = {new_version}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/CHANGELOG.md new/nbclient-0.2.0/CHANGELOG.md
--- old/nbclient-0.1.0/CHANGELOG.md 2020-01-27 00:38:03.000000000 +0100
+++ new/nbclient-0.2.0/CHANGELOG.md 2020-02-26 20:12:03.000000000 +0100
@@ -1,4 +1,3 @@
# Change Log
-## 0.1.0
-- Initial release -- moved out of nbconvert 6.0.0-a0
+See the [nbclient documentation](https://nbclient.readthedocs.io/changelog.html)
\ No newline at end of file
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/MANIFEST.in new/nbclient-0.2.0/MANIFEST.in
--- old/nbclient-0.1.0/MANIFEST.in 2020-01-27 01:42:18.000000000 +0100
+++ new/nbclient-0.2.0/MANIFEST.in 2020-03-27 02:33:41.000000000 +0100
@@ -32,3 +32,8 @@
global-exclude *.pyo
global-exclude .git
global-exclude .ipynb_checkpoints
+
+# Binder files to be excluded
+exclude binder
+recursive-exclude binder *.ipynb
+recursive-exclude binder *.txt
\ No newline at end of file
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/PKG-INFO new/nbclient-0.2.0/PKG-INFO
--- old/nbclient-0.1.0/PKG-INFO 2020-02-11 08:19:37.000000000 +0100
+++ new/nbclient-0.2.0/PKG-INFO 2020-03-31 21:48:17.000000000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: nbclient
-Version: 0.1.0
+Version: 0.2.0
Summary: A client library for executing notebooks. Formally nbconvert's ExecutePreprocessor.
Home-page: https://jupyter.org
Author: Jupyter Development Team
@@ -10,9 +10,9 @@
Project-URL: Funding, https://numfocus.org/
Project-URL: Source, https://github.com/jupyter/nbclient
Project-URL: Tracker, https://github.com/jupyter/nbclient/issues
-Description: [](https://travis-ci.org/jupyter/nbclient)
+Description: [![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/jupyter/nbclient/master?filepath=binder%2Frun_nbclient.ipynb)
+ [](https://travis-ci.org/jupyter/nbclient)
[](https://codecov.io/github/jupyter/nbclient?branch=master)
- [](https://www.python.org/downloads/release/python-350/)
[](https://www.python.org/downloads/release/python-360/)
[](https://www.python.org/downloads/release/python-370/)
[](https://www.python.org/downloads/release/python-380/)
@@ -26,7 +26,7 @@
NBClient lets you:
- **execute** notebooks
+ - **execute** notebooks
Similar in nature to jupyter_client, as the jupyter_client is to the jupyter
protocol nbclient is to notebooks allowing for execution contexts to be run.
@@ -37,7 +37,7 @@
## Python Version Support
- This library currently supports python 3.5+ verisons. As minor python
+ This library currently supports python 3.6+ versions. As minor python
versions are officially sunset by the python org nbclient will similarly
drop support in the future.
@@ -55,11 +55,10 @@
Classifier: License :: OSI Approved :: BSD License
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
-Requires-Python: >=3.5
+Requires-Python: >=3.6
Description-Content-Type: text/markdown
Provides-Extra: test
Provides-Extra: dev
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/README.md new/nbclient-0.2.0/README.md
--- old/nbclient-0.1.0/README.md 2020-02-11 07:29:37.000000000 +0100
+++ new/nbclient-0.2.0/README.md 2020-03-31 21:26:48.000000000 +0200
@@ -1,6 +1,6 @@
+[](https://mybinder.org/v2/gh/jupyter/nbclient/master?filepath=binder%2Frun_nbclient.ipynb)
[](https://travis-ci.org/jupyter/nbclient)
[](https://codecov.io/github/jupyter/nbclient?branch=master)
-[](https://www.python.org/downloads/release/python-350/)
[](https://www.python.org/downloads/release/python-360/)
[](https://www.python.org/downloads/release/python-370/)
[](https://www.python.org/downloads/release/python-380/)
@@ -14,7 +14,7 @@
NBClient lets you:
- **execute** notebooks
+- **execute** notebooks
Similar in nature to jupyter_client, as the jupyter_client is to the jupyter
protocol nbclient is to notebooks allowing for execution contexts to be run.
@@ -25,7 +25,7 @@
## Python Version Support
-This library currently supports python 3.5+ verisons. As minor python
+This library currently supports python 3.6+ versions. As minor python
versions are officially sunset by the python org nbclient will similarly
drop support in the future.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/docs/changelog.md new/nbclient-0.2.0/docs/changelog.md
--- old/nbclient-0.1.0/docs/changelog.md 1970-01-01 01:00:00.000000000 +0100
+++ new/nbclient-0.2.0/docs/changelog.md 2020-03-31 21:47:32.000000000 +0200
@@ -0,0 +1,24 @@
+# Changelog
+
+## 0.2.0
+
+### Major Changes
+
+- Async support is now available on the client. Methods that support async have an `async_` prefix and can be awaited [#10](https://github.com/jupyter/nbclient/pull/10) [#35](https://github.com/jupyter/nbclient/pull/35) [#37](https://github.com/jupyter/nbclient/pull/37) [#38](https://github.com/jupyter/nbclient/pull/38)
+- Dropped support for Python 3.5 due to async compatibility issues [#34](https://github.com/jupyter/nbclient/pull/34)
+- Notebook documents now include the [new kernel timing fields](https://github.com/jupyter/nbformat/pull/144) [#32](https://github.com/jupyter/nbclient/pull/32)
+
+### Fixes
+
+- Memory and process leaks from nbclient should now be fixed [#34](https://github.com/jupyter/nbclient/pull/34)
+- Notebook execution exceptions now include error information in addition to the message [#41](https://github.com/jupyter/nbclient/pull/41)
+
+### Docs
+
+- Added [binder examples](https://mybinder.org/v2/gh/jupyter/nbclient/master?filepath=binder%2Frun_nbclient.ipynb) / tests [#7](https://github.com/jupyter/nbclient/pull/7)
+- Added changelog to docs [#22](https://github.com/jupyter/nbclient/pull/22)
+- Doc typo fixes [#27](https://github.com/jupyter/nbclient/pull/27) [#30](https://github.com/jupyter/nbclient/pull/30)
+
+## 0.1.0
+
+- Initial release -- moved out of nbconvert 6.0.0-a0
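As a concrete picture of the kernel timing fields mentioned in the changelog above: nbclient 0.2.0 stores ISO-8601 UTC timestamps under cell metadata keys such as `iopub.status.busy` and `shell.execute_reply` (the key names appear later in the client.py diff). The sample values below are invented for illustration:

```python
from datetime import datetime

# Hypothetical timing metadata as nbclient 0.2.0 records it on a cell;
# key names match the upstream change, timestamps are made up.
execution = {
    "iopub.status.busy": "2020-03-31T19:00:00.000000Z",
    "shell.execute_reply": "2020-03-31T19:00:01.250000Z",
}

def parse(ts):
    # nbclient's timestamp() emits UTC ISO-8601 with a literal 'Z' suffix
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%fZ")

# Wall-clock time the cell spent between going busy and replying
duration = (parse(execution["shell.execute_reply"])
            - parse(execution["iopub.status.busy"])).total_seconds()
print(duration)  # 1.25
```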
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/docs/conf.py new/nbclient-0.2.0/docs/conf.py
--- old/nbclient-0.1.0/docs/conf.py 2020-02-11 08:04:21.000000000 +0100
+++ new/nbclient-0.2.0/docs/conf.py 2020-02-26 20:12:03.000000000 +0100
@@ -22,9 +22,6 @@
sys.path.insert(0, os.path.abspath('..'))
-import recommonmark
-from recommonmark.parser import CommonMarkParser
-
# -- General configuration ------------------------------------------------
# If your documentation needs a minimal Sphinx version, state it here.
@@ -39,13 +36,12 @@
'sphinx.ext.intersphinx',
'sphinx.ext.mathjax',
'sphinx.ext.napoleon',
+ 'recommonmark'
]
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
-source_parsers = {'.md': CommonMarkParser}
-
# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
#
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/docs/index.rst new/nbclient-0.2.0/docs/index.rst
--- old/nbclient-0.1.0/docs/index.rst 2020-02-11 08:04:21.000000000 +0100
+++ new/nbclient-0.2.0/docs/index.rst 2020-03-31 21:26:48.000000000 +0200
@@ -10,6 +10,11 @@
Similar in nature to jupyter_client, as the jupyter_client is to the jupyter
protocol nbclient is to notebooks allowing for execution contexts to be run.
+To demo **NBClient** interactively, click the Binder link below:
+
+.. image:: https://mybinder.org/badge_logo.svg
+   :target: https://mybinder.org/v2/gh/jupyter/nbclient/master?filepath=binder%2Frun_nbclient.ipynb
+
Origins
-------
@@ -20,7 +25,7 @@
Python Version Support
----------------------
-This library currently supports python 3.5+ verisons. As minor python
+This library currently supports python 3.6+ verisons. As minor python
versions are officially sunset by the python org nbclient will similarly
drop support in the future.
@@ -34,6 +39,7 @@
installation
client
+ changelog
API Reference
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/docs/requirements-doc.txt new/nbclient-0.2.0/docs/requirements-doc.txt
--- old/nbclient-0.1.0/docs/requirements-doc.txt 2020-01-26 23:56:37.000000000 +0100
+++ new/nbclient-0.2.0/docs/requirements-doc.txt 2020-02-26 20:12:03.000000000 +0100
@@ -1,5 +1,5 @@
Sphinx>=1.7
sphinx_rtd_theme
-recommonmark==0.4.0
+recommonmark
mock
moto
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/nbclient/_version.py new/nbclient-0.2.0/nbclient/_version.py
--- old/nbclient-0.1.0/nbclient/_version.py 2020-02-11 08:19:17.000000000 +0100
+++ new/nbclient-0.2.0/nbclient/_version.py 2020-03-31 21:26:58.000000000 +0200
@@ -1 +1 @@
-version = '0.1.0'
+version = '0.2.0'
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/nbclient/client.py new/nbclient-0.2.0/nbclient/client.py
--- old/nbclient-0.1.0/nbclient/client.py 2020-02-11 08:04:21.000000000 +0100
+++ new/nbclient-0.2.0/nbclient/client.py 2020-03-31 21:26:48.000000000 +0200
@@ -1,15 +1,31 @@
+import datetime
import base64
from textwrap import dedent
+
+from async_generator import asynccontextmanager
from contextlib import contextmanager
+
from time import monotonic
from queue import Empty
+import asyncio
from traitlets.config.configurable import LoggingConfigurable
from traitlets import List, Unicode, Bool, Enum, Any, Type, Dict, Integer, default
from nbformat.v4 import output_from_msg
-from .exceptions import CellTimeoutError, DeadKernelError, CellExecutionComplete, CellExecutionError
+from .exceptions import (
+ CellControlSignal,
+ CellTimeoutError,
+ DeadKernelError,
+ CellExecutionComplete,
+ CellExecutionError
+)
+from .util import run_sync, ensure_async
+
+
+def timestamp():
+ return datetime.datetime.utcnow().isoformat() + 'Z'
class NotebookClient(LoggingConfigurable):
@@ -85,6 +101,21 @@
),
).tag(config=True)
+ nest_asyncio = Bool(
+ False,
+ help=dedent(
+ """
+ If False (default), then blocking functions such as `execute`
+ assume that no event loop is already running. These functions
+ run their async counterparts (e.g. `async_execute`) in an event
+ loop with `asyncio.run_until_complete`, which will fail if an
+ event loop is already running. This can be the case if nbclient
+ is used e.g. in a Jupyter Notebook. In that case, `nest_asyncio`
+ should be set to True.
+ """
+ ),
+ ).tag(config=True)
+
force_raise_errors = Bool(
False,
help=dedent(
@@ -139,6 +170,16 @@
),
).tag(config=True)
+ record_timing = Bool(
+ True,
+ help=dedent(
+ """
+ If `True` (default), then the execution timings of each cell will
+ be stored in the metadata of the notebook.
+ """
+ ),
+ ).tag(config=True)
+
iopub_timeout = Integer(
4,
allow_none=False,
@@ -196,9 +237,9 @@
@default('kernel_manager_class')
def _kernel_manager_class_default(self):
"""Use a dynamic default to avoid importing jupyter_client at
startup"""
- from jupyter_client import KernelManager
+ from jupyter_client import AsyncKernelManager
- return KernelManager
+ return AsyncKernelManager
_display_id_map = Dict(
help=dedent(
@@ -285,17 +326,39 @@
self.km = self.kernel_manager_class(config=self.config)
else:
self.km = self.kernel_manager_class(kernel_name=self.kernel_name, config=self.config)
+ self.km.client_class = 'jupyter_client.asynchronous.AsyncKernelClient'
return self.km
- def start_new_kernel_client(self, **kwargs):
+ async def _async_cleanup_kernel(self):
+ try:
+ # Send a polite shutdown request
+ await ensure_async(self.kc.shutdown())
+ try:
+ # Queue the manager to kill the process, sometimes the built-in and above
+ # shutdowns have not been successful or called yet, so give a direct kill
+ # call here and recover gracefully if it's already dead.
+ await ensure_async(self.km.shutdown_kernel(now=True))
+ except RuntimeError as e:
+ # The error isn't specialized, so we have to check the message
+ if 'No kernel is running!' not in str(e):
+ raise
+ finally:
+ # Remove any state left over even if we failed to stop the kernel
+ await ensure_async(self.km.cleanup())
+ await ensure_async(self.kc.stop_channels())
+ self.kc = None
+
+ _cleanup_kernel = run_sync(_async_cleanup_kernel)
+
+ async def async_start_new_kernel_client(self, **kwargs):
"""Creates a new kernel client.
Parameters
----------
kwargs :
Any options for `self.kernel_manager_class.start_kernel()`. Because
- that defaults to KernelManager, this will likely include options
- accepted by `KernelManager.start_kernel()``, which includes `cwd`.
+ that defaults to AsyncKernelManager, this will likely include options
+ accepted by `AsyncKernelManager.start_kernel()``, which includes `cwd`.
Returns
-------
@@ -309,19 +372,20 @@
if self.km.ipykernel and self.ipython_hist_file:
self.extra_arguments += ['--HistoryManager.hist_file={}'.format(self.ipython_hist_file)]
- self.km.start_kernel(extra_arguments=self.extra_arguments, **kwargs)
+ await ensure_async(self.km.start_kernel(extra_arguments=self.extra_arguments, **kwargs))
self.kc = self.km.client()
- self.kc.start_channels()
+ await ensure_async(self.kc.start_channels())
try:
- self.kc.wait_for_ready(timeout=self.startup_timeout)
+ await ensure_async(self.kc.wait_for_ready(timeout=self.startup_timeout))
except RuntimeError:
- self.kc.stop_channels()
- self.km.shutdown_kernel()
+ await self._async_cleanup_kernel()
raise
self.kc.allow_stdin = False
return self.kc
+ start_new_kernel_client = run_sync(async_start_new_kernel_client)
+
@contextmanager
def setup_kernel(self, **kwargs):
"""
@@ -332,6 +396,7 @@
When control returns from the yield it stops the client's zmq channels, and shuts
down the kernel.
"""
+ # Can't use run_until_complete on an asynccontextmanager function :(
if self.km is None:
self.start_kernel_manager()
@@ -340,10 +405,29 @@
try:
yield
finally:
- self.kc.stop_channels()
- self.kc = None
+ self._cleanup_kernel()
- def execute(self, **kwargs):
+ @asynccontextmanager
+ async def async_setup_kernel(self, **kwargs):
+ """
+ Context manager for setting up the kernel to execute a notebook.
+
+ The assigns the Kernel Manager (`self.km`) if missing and Kernel Client(`self.kc`).
+
+ When control returns from the yield it stops the client's zmq channels, and shuts
+ down the kernel.
+ """
+ if self.km is None:
+ self.start_kernel_manager()
+
+ if not self.km.has_kernel:
+ await self.async_start_new_kernel_client(**kwargs)
+ try:
+ yield
+ finally:
+ await self._async_cleanup_kernel()
+
+ async def async_execute(self, **kwargs):
"""
Executes each code cell.
@@ -354,18 +438,23 @@
"""
self.reset_execution_trackers()
- with self.setup_kernel(**kwargs):
+ async with self.async_setup_kernel(**kwargs):
self.log.info("Executing notebook with kernel: %s" %
self.kernel_name)
for index, cell in enumerate(self.nb.cells):
# Ignore `'execution_count' in content` as it's always 1
# when store_history is False
- self.execute_cell(cell, index, execution_count=self.code_cells_executed + 1)
- info_msg = self._wait_for_reply(self.kc.kernel_info())
+ await self.async_execute_cell(
+ cell, index, execution_count=self.code_cells_executed + 1
+ )
+ msg_id = await ensure_async(self.kc.kernel_info())
+ info_msg = await self.async_wait_for_reply(msg_id)
self.nb.metadata['language_info'] = info_msg['content']['language_info']
self.set_widgets_metadata()
return self.nb
+ execute = run_sync(async_execute)
+
def set_widgets_metadata(self):
if self.widget_state:
self.nb.metadata.widgets = {
@@ -408,16 +497,42 @@
outputs[output_idx]['data'] = out['data']
outputs[output_idx]['metadata'] = out['metadata']
- def _poll_for_reply(self, msg_id, cell=None, timeout=None):
- try:
- # check with timeout if kernel is still alive
- msg = self.kc.shell_channel.get_msg(timeout=timeout)
- if msg['parent_header'].get('msg_id') == msg_id:
- return msg
- except Empty:
- # received no message, check if kernel is still alive
- self._check_alive()
- # kernel still alive, wait for a message
+ async def _async_poll_for_reply(self, msg_id, cell, timeout, task_poll_output_msg):
+ if timeout is not None:
+ deadline = monotonic() + timeout
+ while True:
+ try:
+ msg = await ensure_async(self.kc.shell_channel.get_msg(timeout=timeout))
+ if msg['parent_header'].get('msg_id') == msg_id:
+ if self.record_timing:
+ cell['metadata']['execution']['shell.execute_reply'] = timestamp()
+ try:
+ await asyncio.wait_for(task_poll_output_msg, self.iopub_timeout)
+ except (asyncio.TimeoutError, Empty):
+ if self.raise_on_iopub_timeout:
+ raise CellTimeoutError.error_from_timeout_and_cell(
+ "Timeout waiting for IOPub output", self.iopub_timeout, cell
+ )
+ else:
+ self.log.warning("Timeout waiting for IOPub output")
+ return msg
+ else:
+ if timeout is not None:
+ timeout = max(0, deadline - monotonic())
+ except Empty:
+ # received no message, check if kernel is still alive
+ await self._async_check_alive()
+ await self._async_handle_timeout(timeout, cell)
+
+ async def _async_poll_output_msg(self, parent_msg_id, cell, cell_index):
+ while True:
+ msg = await ensure_async(self.kc.iopub_channel.get_msg(timeout=None))
+ if msg['parent_header'].get('msg_id') == parent_msg_id:
+ try:
+ # Will raise CellExecutionComplete when completed
+ self.process_message(msg, cell, cell_index)
+ except CellExecutionComplete:
+ return
def _get_timeout(self, cell):
if self.timeout_func is not None and cell is not None:
@@ -430,39 +545,46 @@
return timeout
- def _handle_timeout(self, timeout, cell=None):
+ async def _async_handle_timeout(self, timeout, cell=None):
self.log.error("Timeout waiting for execute reply (%is)." % timeout)
if self.interrupt_on_timeout:
self.log.error("Interrupting kernel")
- self.km.interrupt_kernel()
+ await ensure_async(self.km.interrupt_kernel())
else:
raise CellTimeoutError.error_from_timeout_and_cell(
"Cell execution timed out", timeout, cell
)
- def _check_alive(self):
- if not self.kc.is_alive():
+ async def _async_check_alive(self):
+ if not await ensure_async(self.kc.is_alive()):
self.log.error("Kernel died while waiting for execute reply.")
raise DeadKernelError("Kernel died")
- def _wait_for_reply(self, msg_id, cell=None):
+ async def async_wait_for_reply(self, msg_id, cell=None):
# wait for finish, with timeout
timeout = self._get_timeout(cell)
cummulative_time = 0
- self.shell_timeout_interval = 5
while True:
try:
- msg = self.kc.shell_channel.get_msg(timeout=self.shell_timeout_interval)
+ msg = await ensure_async(
+ self.kc.shell_channel.get_msg(
+ timeout=self.shell_timeout_interval
+ )
+ )
except Empty:
- self._check_alive()
+ await self._async_check_alive()
cummulative_time += self.shell_timeout_interval
if timeout and cummulative_time > timeout:
- self._handle_timeout(timeout, cell)
+ await self._async_handle_timeout(timeout, cell)
break
else:
if msg['parent_header'].get('msg_id') == msg_id:
return msg
+ wait_for_reply = run_sync(async_wait_for_reply)
+ # Backwards compatability naming for papermill
+ _wait_for_reply = wait_for_reply
+
def _timeout_with_deadline(self, timeout, deadline):
if deadline is not None and deadline - monotonic() < timeout:
timeout = deadline - monotonic()
@@ -486,7 +608,7 @@
if (exec_reply is not None) and exec_reply['content']['status'] == 'error':
raise CellExecutionError.from_cell_and_msg(cell, exec_reply['content'])
- def execute_cell(self, cell, cell_index, execution_count=None, store_history=True):
+ async def async_execute_cell(self, cell, cell_index, execution_count=None, store_history=True):
"""
Executes a single code cell.
@@ -524,77 +646,39 @@
self.log.debug("Skipping non-executing cell %s", cell_index)
return cell
+ if self.record_timing and 'execution' not in cell['metadata']:
+ cell['metadata']['execution'] = {}
+
self.log.debug("Executing cell:\n%s", cell.source)
- parent_msg_id = self.kc.execute(
- cell.source, store_history=store_history, stop_on_error=not self.allow_errors
+ parent_msg_id = await ensure_async(
+ self.kc.execute(
+ cell.source,
+ store_history=store_history,
+ stop_on_error=not self.allow_errors
+ )
)
# We launched a code cell to execute
self.code_cells_executed += 1
exec_timeout = self._get_timeout(cell)
- deadline = None
- if exec_timeout is not None:
- deadline = monotonic() + exec_timeout
cell.outputs = []
self.clear_before_next_output = False
- # This loop resolves nbconvert#659. By polling iopub_channel's and shell_channel's
- # output we avoid dropping output and important signals (like idle) from
- # iopub_channel. Prior to this change, iopub_channel wasn't polled until
- # after exec_reply was obtained from shell_channel, leading to the
- # aforementioned dropped data.
-
- # These two variables are used to track what still needs polling:
- # more_output=true => continue to poll the iopub_channel
- more_output = True
- # polling_exec_reply=true => continue to poll the shell_channel
- polling_exec_reply = True
-
- while more_output or polling_exec_reply:
- if polling_exec_reply:
- if self._passed_deadline(deadline):
- self._handle_timeout(exec_timeout, cell)
- polling_exec_reply = False
- continue
-
- # Avoid exceeding the execution timeout (deadline), but stop
- # after at most 1s so we can poll output from iopub_channel.
- timeout = self._timeout_with_deadline(1, deadline)
- exec_reply = self._poll_for_reply(parent_msg_id, cell, timeout)
- if exec_reply is not None:
- polling_exec_reply = False
-
- if more_output:
- try:
- timeout = self.iopub_timeout
- if polling_exec_reply:
- # Avoid exceeding the execution timeout (deadline) while
- # polling for output.
- timeout = self._timeout_with_deadline(timeout, deadline)
- msg = self.kc.iopub_channel.get_msg(timeout=timeout)
- except Empty:
- if polling_exec_reply:
- # Still waiting for execution to finish so we expect that
- # output may not always be produced yet.
- continue
-
- if self.raise_on_iopub_timeout:
- raise CellTimeoutError.error_from_timeout_and_cell(
- "Timeout waiting for IOPub output", self.iopub_timeout, cell
- )
- else:
- self.log.warning("Timeout waiting for IOPub output")
- more_output = False
- continue
- if msg['parent_header'].get('msg_id') != parent_msg_id:
- # not an output from our execution
- continue
-
+ task_poll_output_msg = asyncio.ensure_future(
+ self._async_poll_output_msg(parent_msg_id, cell, cell_index)
+ )
+ try:
+ exec_reply = await self._async_poll_for_reply(
+ parent_msg_id, cell, exec_timeout, task_poll_output_msg
+ )
+ except Exception as e:
+ # Best effort to cancel request if it hasn't been resolved
try:
- # Will raise CellExecutionComplete when completed
- self.process_message(msg, cell, cell_index)
- except CellExecutionComplete:
- more_output = False
+ # Check if the task_poll_output is doing the raising for us
+ if not isinstance(e, CellControlSignal):
+ task_poll_output_msg.cancel()
+ finally:
+ raise
if execution_count:
cell['execution_count'] = execution_count
@@ -602,6 +686,8 @@
self.nb['cells'][cell_index] = cell
return cell
+ execute_cell = run_sync(async_execute_cell)
+
def process_message(self, msg, cell, cell_index):
"""
Processes a kernel message, updates cell state, and returns the
@@ -642,6 +728,15 @@
if 'execution_count' in content:
cell['execution_count'] = content['execution_count']
+ if self.record_timing:
+ if msg_type == 'status':
+ if content['execution_state'] == 'idle':
+ cell['metadata']['execution']['iopub.status.idle'] = timestamp()
+ elif content['execution_state'] == 'busy':
+ cell['metadata']['execution']['iopub.status.busy'] = timestamp()
+ elif msg_type == 'execute_input':
+ cell['metadata']['execution']['iopub.execute_input'] = timestamp()
+
if msg_type == 'status':
if content['execution_state'] == 'idle':
raise CellExecutionComplete()
@@ -739,7 +834,7 @@
The notebook object to be executed
cwd : str, optional
If supplied, the kernel will run in this directory
- km : KernelManager, optional
+ km : AsyncKernelManager, optional
If supplied, the specified kernel manager will be used for code execution.
kwargs :
Any other options for ExecutePreprocessor, e.g. timeout, kernel_name
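The client.py diff above leans on two helpers imported from `nbclient.util`, `run_sync` and `ensure_async`, whose bodies are not shown in this submission. Below is a rough stdlib-only sketch of what such helpers do; the real implementations may differ in detail (for example, loop reuse and the `nest_asyncio` option described in the diff):

```python
import asyncio
import inspect

# Hedged sketch of the helpers the client.py diff imports from nbclient.util.
async def ensure_async(obj):
    # Await the value if it is awaitable, otherwise pass it through, so one
    # call site works with both sync and async kernel managers/clients.
    if inspect.isawaitable(obj):
        return await obj
    return obj

def run_sync(coro_func):
    # Wrap a coroutine function so it can also be called synchronously,
    # mirroring pairs like async_execute / execute in the diff above.
    def wrapped(*args, **kwargs):
        return asyncio.run(coro_func(*args, **kwargs))
    wrapped.__doc__ = coro_func.__doc__
    return wrapped

class Demo:
    async def async_value(self):
        # A plain (non-awaitable) value is passed straight through
        return await ensure_async(42)
    value = run_sync(async_value)

print(Demo().value())  # 42
```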
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/nbclient/exceptions.py new/nbclient-0.2.0/nbclient/exceptions.py
--- old/nbclient-0.1.0/nbclient/exceptions.py 2020-02-11 08:04:21.000000000 +0100
+++ new/nbclient-0.2.0/nbclient/exceptions.py 2020-03-31 21:26:48.000000000 +0200
@@ -1,4 +1,13 @@
-class CellTimeoutError(TimeoutError):
+class CellControlSignal(Exception):
+ """
+ A custom exception used to indicate that the exception is used for cell
+ control actions (not the best model, but it's needed to cover existing
+ behavior without major refactors).
+ """
+ pass
+
+
+class CellTimeoutError(TimeoutError, CellControlSignal):
"""
A custom exception to capture when a cell has timed out during execution.
"""
@@ -21,7 +30,7 @@
pass
-class CellExecutionComplete(Exception):
+class CellExecutionComplete(CellControlSignal):
"""
Used as a control signal for cell execution across execute_cell and
process_message function calls. Raised when all execution requests
@@ -32,7 +41,7 @@
pass
-class CellExecutionError(Exception):
+class CellExecutionError(CellControlSignal):
"""
Custom exception to propagate exceptions that are raised during
notebook execution to the caller. This is mostly useful when
@@ -40,9 +49,11 @@
failures gracefully.
"""
- def __init__(self, traceback):
+ def __init__(self, traceback, ename, evalue):
super(CellExecutionError, self).__init__(traceback)
self.traceback = traceback
+ self.ename = ename
+ self.evalue = evalue
def __str__(self):
s = self.__unicode__()
@@ -65,7 +76,9 @@
traceback=tb,
ename=msg.get('ename', '<Error>'),
evalue=msg.get('evalue', ''),
- )
+ ),
+ ename=msg.get('ename', '<Error>'),
+ evalue=msg.get('evalue', '')
)
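To see what the new `ename`/`evalue` attributes on `CellExecutionError` buy a caller, here is a minimal self-contained mirror of the classes as shown in the hunk above (the real classes live in `nbclient.exceptions` and also carry `from_cell_and_msg` and unicode formatting):

```python
# Minimal mirror of the updated exception hierarchy from the diff above,
# for illustration only; not the actual nbclient.exceptions module.
class CellControlSignal(Exception):
    """Marker base class for cell-control exceptions."""
    pass

class CellExecutionError(CellControlSignal):
    def __init__(self, traceback, ename, evalue):
        super().__init__(traceback)
        self.traceback = traceback
        self.ename = ename    # new in 0.2.0: exception class name
        self.evalue = evalue  # new in 0.2.0: exception message

try:
    raise CellExecutionError(
        traceback="Traceback (most recent call last): ...",
        ename="ZeroDivisionError",
        evalue="division by zero",
    )
except CellExecutionError as err:
    # Callers can now branch on structured error info, not just the text
    print(err.ename, err.evalue)
```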
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/nbclient/tests/fake_kernelmanager.py new/nbclient-0.2.0/nbclient/tests/fake_kernelmanager.py
--- old/nbclient-0.1.0/nbclient/tests/fake_kernelmanager.py 2020-01-27 00:24:11.000000000 +0100
+++ new/nbclient-0.2.0/nbclient/tests/fake_kernelmanager.py 2020-03-27 02:33:41.000000000 +0100
@@ -1,7 +1,7 @@
-from jupyter_client.manager import KernelManager
+from jupyter_client.manager import AsyncKernelManager
-class FakeCustomKernelManager(KernelManager):
+class FakeCustomKernelManager(AsyncKernelManager):
expected_methods = {'__init__': 0, 'client': 0, 'start_kernel': 0}
def __init__(self, *args, **kwargs):
@@ -9,10 +9,10 @@
self.expected_methods['__init__'] += 1
super(FakeCustomKernelManager, self).__init__(*args, **kwargs)
- def start_kernel(self, *args, **kwargs):
+ async def start_kernel(self, *args, **kwargs):
self.log.info('FakeCustomKernelManager started a kernel')
self.expected_methods['start_kernel'] += 1
- return super(FakeCustomKernelManager, self).start_kernel(*args, **kwargs)
+ return await super(FakeCustomKernelManager, self).start_kernel(*args, **kwargs)
def client(self, *args, **kwargs):
self.log.info('FakeCustomKernelManager created a client')
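The fake_kernelmanager.py port above shows the general pattern for moving a KernelManager subclass to the async API: overrides become coroutines that `await` the parent implementation. A self-contained sketch of that pattern, where `StubAsyncKernelManager` is a hypothetical stand-in for jupyter_client's `AsyncKernelManager`:

```python
import asyncio

# Stub base class standing in for jupyter_client's AsyncKernelManager,
# purely to illustrate the subclassing pattern used by the test fixture.
class StubAsyncKernelManager:
    async def start_kernel(self, **kwargs):
        await asyncio.sleep(0)  # the real class spawns a kernel process here
        return "kernel-started"

class CountingKernelManager(StubAsyncKernelManager):
    def __init__(self):
        self.start_calls = 0

    async def start_kernel(self, **kwargs):
        # Overrides must themselves be coroutines and await the parent
        self.start_calls += 1
        return await super().start_kernel(**kwargs)

km = CountingKernelManager()
print(asyncio.run(km.start_kernel()), km.start_calls)  # kernel-started 1
```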
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/nbclient/tests/files/Sleep1s.ipynb new/nbclient-0.2.0/nbclient/tests/files/Sleep1s.ipynb
--- old/nbclient-0.1.0/nbclient/tests/files/Sleep1s.ipynb 1970-01-01 01:00:00.000000000 +0100
+++ new/nbclient-0.2.0/nbclient/tests/files/Sleep1s.ipynb 2020-03-08 22:18:43.000000000 +0100
@@ -0,0 +1,65 @@
+{
+ "cells": [
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import time\n",
+ "import datetime"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "t0 = datetime.datetime.utcnow()\n",
+ "time.sleep(1)\n",
+ "t1 = datetime.datetime.utcnow()"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "time_format = '%Y-%m-%dT%H:%M:%S.%fZ'\n",
+ "print(t0.strftime(time_format), end='')"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "print(t1.strftime(time_format), end='')"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 3",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.8.1"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/nbclient/tests/test_client.py new/nbclient-0.2.0/nbclient/tests/test_client.py
--- old/nbclient-0.1.0/nbclient/tests/test_client.py 2020-02-11 08:04:21.000000000 +0100
+++ new/nbclient-0.2.0/nbclient/tests/test_client.py 2020-03-31 21:26:48.000000000 +0200
@@ -4,6 +4,8 @@
import os
import re
import threading
+import asyncio
+import datetime
import nbformat
import sys
@@ -25,7 +27,7 @@
from pebble import ProcessPool
from queue import Empty
-from unittest.mock import MagicMock, patch
+from unittest.mock import MagicMock, Mock
addr_pat = re.compile(r'0x[0-9a-f]{7,9}')
@@ -34,6 +36,16 @@
IPY_MAJOR = IPython.version_info[0]
+class AsyncMock(Mock):
+ pass
+
+
+def make_async(mock_value):
+ async def _():
+ return mock_value
+ return _()
+
+
def normalize_base64(b64_text):
# if it's base64, pass it through b64 decode/encode to avoid
# equivalent values from being considered unequal
@@ -68,6 +80,31 @@
return input_nb, output_nb
+async def async_run_notebook(filename, opts, resources=None):
+ """Loads and runs a notebook, returning both the version prior to
+ running it and the version after running it.
+
+ """
+ with io.open(filename) as f:
+ input_nb = nbformat.read(f, 4)
+
+ cleaned_input_nb = copy.deepcopy(input_nb)
+ for cell in cleaned_input_nb.cells:
+ if 'execution_count' in cell:
+ del cell['execution_count']
+ cell['outputs'] = []
+
+ if resources:
+ opts = {'resources': resources, **opts}
+ executor = NotebookClient(cleaned_input_nb, **opts)
+
+ # Override terminal size to standardise traceback format
+ with modified_env({'COLUMNS': '80', 'LINES': '24'}):
+ output_nb = await executor.async_execute()
+
+ return input_nb, output_nb
+
+
def prepare_cell_mocks(*messages, reply_msg=None):
"""
This function prepares a executor object which has a fake kernel client
@@ -83,23 +120,25 @@
def shell_channel_message_mock():
# Return the message generator for
        # self.kc.shell_channel.get_msg => {'parent_header': {'msg_id': parent_id}}
- return MagicMock(
- return_value=NBClientTestsBase.merge_dicts(
+ return AsyncMock(
+ return_value=make_async(NBClientTestsBase.merge_dicts(
{
'parent_header': {'msg_id': parent_id},
'content': {'status': 'ok', 'execution_count': 1},
},
reply_msg or {},
- )
+ ))
)
def iopub_messages_mock():
# Return the message generator for
# self.kc.iopub_channel.get_msg => messages[i]
- return MagicMock(
+ return AsyncMock(
side_effect=[
# Default the parent_header so mocks don't need to include this
-                NBClientTestsBase.merge_dicts({'parent_header': {'msg_id': parent_id}}, msg)
+                make_async(
+                    NBClientTestsBase.merge_dicts({'parent_header': {'msg_id': parent_id}}, msg)
+                )
for msg in messages
]
)
@@ -109,7 +148,7 @@
def test_mock_wrapper(self):
"""
This inner function wrapper populates the executor object with
- the fake kernel client. This client has it's iopub and shell
+ the fake kernel client. This client has its iopub and shell
channels mocked so as to fake the setup handshake and return
        the messages passed into prepare_cell_mocks as the execute_cell loop
        processes them.
@@ -126,6 +165,7 @@
iopub_channel=MagicMock(get_msg=message_mock),
shell_channel=MagicMock(get_msg=shell_channel_message_mock()),
execute=MagicMock(return_value=parent_id),
+ is_alive=MagicMock(return_value=make_async(True))
)
executor.parent_id = parent_id
return func(self, executor, cell_mock, message_mock)
@@ -247,7 +287,7 @@
def test_many_parallel_notebooks(capfd):
"""Ensure that when many IPython kernels are run in parallel, nothing
awful happens.
-    Specifically, many IPython kernels when run simultaneously would enocunter errors
+    Specifically, many IPython kernels when run simultaneously would encounter errors
due to using the same SQLite history database.
"""
opts = dict(kernel_name="python", timeout=5)
@@ -259,11 +299,10 @@
# run once, to trigger creating the original context
run_notebook(input_file, opts, res)
- with ProcessPool(max_workers=4) as pool:
+ with ProcessPool(max_workers=2) as pool:
futures = [
-            # Travis needs a lot more time even though 10s is enough on most dev machines
-            pool.schedule(run_notebook, args=(input_file, opts, res), timeout=30)
- for i in range(0, 8)
+ pool.schedule(run_notebook, args=(input_file, opts, res))
+ for i in range(8)
]
for index, future in enumerate(futures):
future.result()
@@ -272,6 +311,95 @@
assert captured.err == ""
+def test_async_parallel_notebooks(capfd, tmpdir):
+ """Two notebooks should be able to be run simultaneously without problems.
+
+    The two notebooks spawned here use the filesystem to check that the other notebook
+ wrote to the filesystem."""
+
+ opts = dict(kernel_name="python")
+ input_name = "Parallel Execute {label}.ipynb"
+ input_file = os.path.join(current_dir, "files", input_name)
+ res = notebook_resources()
+
+ with modified_env({"NBEXECUTE_TEST_PARALLEL_TMPDIR": str(tmpdir)}):
+ tasks = [
+ async_run_notebook(input_file.format(label=label), opts, res)
+ for label in ("A", "B")
+ ]
+ loop = asyncio.get_event_loop()
+ loop.run_until_complete(asyncio.gather(*tasks))
+
+ captured = capfd.readouterr()
+ assert captured.err == ""
+
+
+def test_many_async_parallel_notebooks(capfd):
+    """Ensure that when many IPython kernels are run in parallel, nothing awful happens.
+
+    Specifically, many IPython kernels when run simultaneously would encounter errors
+ due to using the same SQLite history database.
+ """
+ opts = dict(kernel_name="python", timeout=5)
+ input_name = "HelloWorld.ipynb"
+ input_file = os.path.join(current_dir, "files", input_name)
+ res = NBClientTestsBase().build_resources()
+ res["metadata"]["path"] = os.path.join(current_dir, "files")
+
+ # run once, to trigger creating the original context
+ run_notebook(input_file, opts, res)
+
+ tasks = [
+ async_run_notebook(input_file, opts, res)
+ for i in range(4)
+ ]
+ loop = asyncio.get_event_loop()
+ loop.run_until_complete(asyncio.gather(*tasks))
+
+ captured = capfd.readouterr()
+ assert captured.err == ""
+
+
+def test_execution_timing():
+ """Compare the execution timing information stored in the cell with the
+ actual time it took to run the cell. Also check for the cell timing string
+ format."""
+ opts = dict(kernel_name="python")
+ input_name = "Sleep1s.ipynb"
+ input_file = os.path.join(current_dir, "files", input_name)
+ res = notebook_resources()
+ input_nb, output_nb = run_notebook(input_file, opts, res)
+
+ def get_time_from_str(s):
+ time_format = '%Y-%m-%dT%H:%M:%S.%fZ'
+ return datetime.datetime.strptime(s, time_format)
+
+ execution_timing = output_nb['cells'][1]['metadata']['execution']
+ status_busy = get_time_from_str(execution_timing['iopub.status.busy'])
+ execute_input = get_time_from_str(execution_timing['iopub.execute_input'])
+ execute_reply = get_time_from_str(execution_timing['shell.execute_reply'])
+ status_idle = get_time_from_str(execution_timing['iopub.status.idle'])
+
+ cell_start = get_time_from_str(output_nb['cells'][2]['outputs'][0]['text'])
+ cell_end = get_time_from_str(output_nb['cells'][3]['outputs'][0]['text'])
+
+ delta = datetime.timedelta(milliseconds=100)
+ assert status_busy - cell_start < delta
+ assert execute_input - cell_start < delta
+ assert execute_reply - cell_end < delta
+ assert status_idle - cell_end < delta
+
+
+def test_synchronous_setup_kernel():
+ nb = nbformat.v4.new_notebook()
+ executor = NotebookClient(nb)
+ with executor.setup_kernel():
+ # Prove it initalized client
+ assert executor.kc is not None
+ # Prove it removed the client (and hopefully cleaned up)
+ assert executor.kc is None
+
+
class TestExecute(NBClientTestsBase):
"""Contains test functions for execute.py"""
@@ -377,12 +505,13 @@
output_nb = executor.execute()
km = executor.start_kernel_manager()
- with patch.object(km, "is_alive") as alive_mock:
- alive_mock.return_value = False
- # Will be a RuntimeError or subclass DeadKernelError depending
- # on if jupyter_client or nbconvert catches the dead client first
- with pytest.raises(RuntimeError):
- input_nb, output_nb = executor.execute()
+ async def is_alive():
+ return False
+ km.is_alive = is_alive
+ # Will be a RuntimeError or subclass DeadKernelError depending
+ # on if jupyter_client or nbconvert catches the dead client first
+ with pytest.raises(RuntimeError):
+ input_nb, output_nb = executor.execute()
def test_allow_errors(self):
"""
@@ -559,7 +688,11 @@
)
def test_deadline_exec_reply(self, executor, cell_mock, message_mock):
# exec_reply is never received, so we expect to hit the timeout.
- executor.kc.shell_channel.get_msg = MagicMock(side_effect=Empty())
+ async def get_msg(timeout):
+ await asyncio.sleep(timeout)
+ raise Empty
+
+ executor.kc.shell_channel.get_msg = get_msg
executor.timeout = 1
with pytest.raises(TimeoutError):
@@ -598,16 +731,21 @@
)
def test_eventual_deadline_iopub(self, executor, cell_mock, message_mock):
# Process a few messages before raising a timeout from iopub
- message_mock.side_effect = list(message_mock.side_effect)[:-1] +
[Empty()]
- executor.kc.shell_channel.get_msg = MagicMock(
- return_value={'parent_header': {'msg_id': executor.parent_id}}
+ def message_seq(messages):
+ for message in messages:
+ yield message
+ while True:
+ yield Empty()
+        message_mock.side_effect = message_seq(list(message_mock.side_effect)[:-1])
+ executor.kc.shell_channel.get_msg = Mock(
+            return_value=make_async({'parent_header': {'msg_id': executor.parent_id}})
)
executor.raise_on_iopub_timeout = True
with pytest.raises(TimeoutError):
executor.execute_cell(cell_mock, 0)
- assert message_mock.call_count == 3
+ assert message_mock.call_count >= 3
# Ensure the output was captured
self.assertListEqual(
cell_mock.outputs,
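The new Sleep1s.ipynb fixture and test_execution_timing above agree on a single timestamp layout. As a small self-contained illustration of the strftime/strptime round-trip the test relies on (standard library only; the sample datetime is arbitrary):

```python
import datetime

# Format string shared by the Sleep1s notebook and test_execution_timing.
time_format = '%Y-%m-%dT%H:%M:%S.%fZ'

t = datetime.datetime(2020, 3, 8, 22, 18, 43, 123456)
stamp = t.strftime(time_format)
parsed = datetime.datetime.strptime(stamp, time_format)
assert parsed == t  # lossless round-trip at microsecond precision
```

Because `%f` carries microseconds, the parsed cell timestamps can be compared against the kernel-side timing fields with a tolerance as tight as the 100 ms delta used in the test.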
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/nbclient/util.py new/nbclient-0.2.0/nbclient/util.py
--- old/nbclient-0.1.0/nbclient/util.py 1970-01-01 01:00:00.000000000 +0100
+++ new/nbclient-0.2.0/nbclient/util.py 2020-03-31 21:26:48.000000000 +0200
@@ -0,0 +1,65 @@
+"""General utility methods"""
+
+# Copyright (c) Jupyter Development Team.
+# Distributed under the terms of the Modified BSD License.
+
+import asyncio
+import inspect
+
+
+def run_sync(coro):
+ """Runs a coroutine and blocks until it has executed.
+
+ An event loop is created if no one already exists. If an event loop is
+ already running, this event loop execution is nested into the already
+ running one if `nest_asyncio` is set to True.
+
+ Parameters
+ ----------
+ coro : coroutine
+ The coroutine to be executed.
+
+ Returns
+ -------
+ result :
+ Whatever the coroutine returns.
+ """
+ def wrapped(self, *args, **kwargs):
+ try:
+ loop = asyncio.get_event_loop()
+ except RuntimeError:
+ loop = asyncio.new_event_loop()
+ asyncio.set_event_loop(loop)
+ if self.nest_asyncio:
+ import nest_asyncio
+ nest_asyncio.apply(loop)
+ try:
+ result = loop.run_until_complete(coro(self, *args, **kwargs))
+ except RuntimeError as e:
+ if str(e) == 'This event loop is already running':
+ raise RuntimeError(
+                    'You are trying to run nbclient in an environment where an '
+                    'event loop is already running. Please pass `nest_asyncio=True` in '
+ '`NotebookClient.execute` and such methods.'
+ )
+ raise
+ return result
+ wrapped.__doc__ = coro.__doc__
+ return wrapped
+
+
+async def ensure_async(obj):
+ """Convert a non-awaitable object to a coroutine if needed,
+ and await it if it was not already awaited.
+ """
+ if inspect.isawaitable(obj):
+ try:
+ result = await obj
+ except RuntimeError as e:
+ if str(e) == 'cannot reuse already awaited coroutine':
+ # obj is already the coroutine's result
+ return obj
+ raise
+ return result
+ # obj doesn't need to be awaited
+ return obj
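To show what the new `ensure_async` helper buys call sites, here is a runnable sketch. The helper body is simplified from nbclient/util.py above (without the already-awaited guard), and `async_is_alive` is a hypothetical stand-in for an AsyncKernelManager method:

```python
import asyncio
import inspect

async def ensure_async(obj):
    # Simplified from the diff: await only if the object is awaitable.
    if inspect.isawaitable(obj):
        return await obj
    return obj

async def async_is_alive():
    return True

async def demo():
    # AsyncKernelManager-style call site: hands back a coroutine.
    a = await ensure_async(async_is_alive())
    # Old synchronous call site: hands back a plain value.
    b = await ensure_async(True)
    return a, b

result = asyncio.run(demo())
```

This is what lets nbclient keep one code path while supporting both synchronous and async kernel managers.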
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/nbclient.egg-info/PKG-INFO new/nbclient-0.2.0/nbclient.egg-info/PKG-INFO
--- old/nbclient-0.1.0/nbclient.egg-info/PKG-INFO 2020-02-11 08:19:37.000000000 +0100
+++ new/nbclient-0.2.0/nbclient.egg-info/PKG-INFO 2020-03-31 21:48:17.000000000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: nbclient
-Version: 0.1.0
+Version: 0.2.0
Summary: A client library for executing notebooks. Formally nbconvert's ExecutePreprocessor.
Home-page: https://jupyter.org
Author: Jupyter Development Team
@@ -10,9 +10,9 @@
Project-URL: Funding, https://numfocus.org/
Project-URL: Source, https://github.com/jupyter/nbclient
Project-URL: Tracker, https://github.com/jupyter/nbclient/issues
-Description: [](https://travis-ci.org/jupyter/nbclient)
+Description:
[](https://mybinder.org/v2/gh/jupyter/nbclient/master?filepath=binder%2Frun_nbclient.ipynb)
+ [](https://travis-ci.org/jupyter/nbclient)
[](https://codecov.io/github/jupyter/nbclient?branch=master)
- [](https://www.python.org/downloads/release/python-350/)
[](https://www.python.org/downloads/release/python-360/)
[](https://www.python.org/downloads/release/python-370/)
[](https://www.python.org/downloads/release/python-380/)
@@ -26,7 +26,7 @@
NBClient lets you:
- **execute** notebooks
+ - **execute** notebooks
        Similar in nature to jupyter_client, as the jupyter_client is to the jupyter
        protocol nbclient is to notebooks allowing for execution contexts to be run.
@@ -37,7 +37,7 @@
## Python Version Support
-        This library currently supports python 3.5+ verisons. As minor python
+        This library currently supports python 3.6+ versions. As minor python
        versions are officially sunset by the python org nbclient will similarly
        drop support in the future.
@@ -55,11 +55,10 @@
Classifier: License :: OSI Approved :: BSD License
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
-Requires-Python: >=3.5
+Requires-Python: >=3.6
Description-Content-Type: text/markdown
Provides-Extra: test
Provides-Extra: dev
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/nbclient.egg-info/SOURCES.txt new/nbclient-0.2.0/nbclient.egg-info/SOURCES.txt
--- old/nbclient-0.1.0/nbclient.egg-info/SOURCES.txt 2020-02-11 08:19:37.000000000 +0100
+++ new/nbclient-0.2.0/nbclient.egg-info/SOURCES.txt 2020-03-31 21:48:17.000000000 +0200
@@ -13,6 +13,7 @@
tox.ini
docs/Makefile
docs/UPDATE.md
+docs/changelog.md
docs/client.rst
docs/conf.py
docs/environment.yml
@@ -29,6 +30,7 @@
nbclient/_version.py
nbclient/client.py
nbclient/exceptions.py
+nbclient/util.py
nbclient.egg-info/PKG-INFO
nbclient.egg-info/SOURCES.txt
nbclient.egg-info/dependency_links.txt
@@ -52,6 +54,7 @@
nbclient/tests/files/SVG.ipynb
nbclient/tests/files/Skip Exceptions with Cell Tags.ipynb
nbclient/tests/files/Skip Exceptions.ipynb
+nbclient/tests/files/Sleep1s.ipynb
nbclient/tests/files/Unicode.ipynb
nbclient/tests/files/UnicodePy3.ipynb
nbclient/tests/files/python.png
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/nbclient.egg-info/requires.txt new/nbclient-0.2.0/nbclient.egg-info/requires.txt
--- old/nbclient-0.1.0/nbclient.egg-info/requires.txt 2020-02-11 08:19:37.000000000 +0100
+++ new/nbclient-0.2.0/nbclient.egg-info/requires.txt 2020-03-31 21:48:17.000000000 +0200
@@ -1,6 +1,8 @@
traitlets>=4.2
-jupyter_client>=5.3.4
+jupyter_client>=6.1.0
nbformat>=5.0
+async_generator
+nest_asyncio
[dev]
codecov
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/requirements.txt new/nbclient-0.2.0/requirements.txt
--- old/nbclient-0.1.0/requirements.txt 2020-01-26 23:50:40.000000000 +0100
+++ new/nbclient-0.2.0/requirements.txt 2020-03-27 02:33:41.000000000 +0100
@@ -1,3 +1,5 @@
traitlets>=4.2
-jupyter_client>=5.3.4
+jupyter_client>=6.1.0
nbformat>=5.0
+async_generator
+nest_asyncio
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/setup.py new/nbclient-0.2.0/setup.py
--- old/nbclient-0.1.0/setup.py 2020-02-11 07:29:37.000000000 +0100
+++ new/nbclient-0.2.0/setup.py 2020-03-31 21:26:48.000000000 +0200
@@ -37,7 +37,7 @@
long_description = read(os.path.join(os.path.dirname(__file__), "README.md"))
requirements = read(os.path.join(os.path.dirname(__file__), "requirements.txt"))
-dev_reqs = read_reqs('requirements-dev.txt')
+dev_reqs = read_reqs(os.path.join(os.path.dirname(__file__), 'requirements-dev.txt'))
extras_require = {"test": dev_reqs, "dev": dev_reqs}
setup(
@@ -51,7 +51,7 @@
long_description_content_type='text/markdown',
packages=['nbclient'],
include_package_data=True,
- python_requires=">=3.5",
+ python_requires=">=3.6",
install_requires=requirements,
extras_require=extras_require,
project_urls={
@@ -70,7 +70,6 @@
'License :: OSI Approved :: BSD License',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
- 'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/nbclient-0.1.0/tox.ini new/nbclient-0.2.0/tox.ini
--- old/nbclient-0.1.0/tox.ini 2020-01-28 07:24:54.000000000 +0100
+++ new/nbclient-0.2.0/tox.ini 2020-03-31 21:26:48.000000000 +0200
@@ -1,6 +1,6 @@
[tox]
skipsdist = true
-envlist = py{35,36,37,38}, flake8, dist, manifest, docs
+envlist = py{36,37,38}, flake8, dist, manifest, docs
# Linters
[testenv:flake8]
@@ -40,13 +40,21 @@
PYTHONHASHSEED = 0
passenv = *
basepython =
- py35: python3.5
py36: python3.6
py37: python3.7
py38: python3.8
- flake8: python3.6
- manifest: python3.6
- dist: python3.6
- docs: python3.6
+ flake8: python3.8
+ manifest: python3.8
+ binder: python3.8
+ dist: python3.8
+ docs: python3.8
deps = .[dev]
-commands = pytest -vv --maxfail=2 --cov=nbclient -W always {posargs}
+commands =
+ pytest -vv --maxfail=2 --cov=nbclient -W always {posargs}
+
+# Binder
+[testenv:binder]
+description = ensure /binder/*ipynb are runnable
+deps =
+ -r binder/requirements.txt
+commands = python -c "from glob import glob; from nbclient import execute; import nbformat as nbf; [execute(nbf.read(input, nbf.NO_CONVERT), cwd='./binder') for input in glob('binder/**/*.ipynb')]"