Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-vega_datasets for
openSUSE:Factory checked in at 2021-02-19 23:45:51
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-vega_datasets (Old)
and /work/SRC/openSUSE:Factory/.python-vega_datasets.new.28504 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-vega_datasets"
Fri Feb 19 23:45:51 2021 rev:5 rq:873801 version:0.9.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-vega_datasets/python-vega_datasets.changes	2020-01-31 23:58:19.731683870 +0100
+++ /work/SRC/openSUSE:Factory/.python-vega_datasets.new.28504/python-vega_datasets.changes	2021-02-19 23:46:20.819427020 +0100
@@ -1,0 +2,10 @@
+Fri Feb 19 14:29:49 UTC 2021 - Ben Greiner <[email protected]>
+
+- Update to 0.9.0
+ * Change urls to use jsDelivr (a fast CDN) with a fixed version
+ number, instead of GitHub. This fixes the URLs broken by the
+ vega-datasets 2.0 release.
+- Skip python36: Pandas is no longer available for the TW python36
+ flavor (NEP 29, NumPy 1.20)
+
+-------------------------------------------------------------------
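For reference, a minimal sketch (not part of the submitted diff; it assumes the public vega_datasets API shown in the README hunks further below) of how the URL change surfaces to users:

```python
# Sketch only: user-visible effect of the 0.9.0 URL change.
from vega_datasets import data

# 0.8.0 resolved dataset URLs against GitHub Pages:
#   https://vega.github.io/vega-datasets/data/iris.json
# 0.9.0 pins a jsDelivr CDN URL with a fixed upstream tag:
print(data.iris.url)
# expected: https://cdn.jsdelivr.net/npm/[email protected]/data/iris.json
```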
Old:
----
vega_datasets-0.8.0.tar.gz
New:
----
vega_datasets-0.9.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-vega_datasets.spec ++++++
--- /var/tmp/diff_new_pack.BZDSKE/_old 2021-02-19 23:46:21.563427750 +0100
+++ /var/tmp/diff_new_pack.BZDSKE/_new 2021-02-19 23:46:21.567427753 +0100
@@ -1,7 +1,7 @@
#
# spec file for package python-vega_datasets
#
-# Copyright (c) 2020 SUSE LLC
+# Copyright (c) 2021 SUSE LLC
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -17,8 +17,10 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
+%define skip_python2 1
+%define skip_python36 1
Name: python-vega_datasets
-Version: 0.8.0
+Version: 0.9.0
Release: 0
Summary: A Python package for offline access to Vega datasets
License: MIT
@@ -51,10 +53,11 @@
%python_expand %fdupes %{buildroot}%{$python_sitelib}
%check
-%python_expand BUILDROOT=%{buildroot}%{$python_sitelib} pytest-%{$python_bin_suffix} vega_datasets/tests
+%pytest vega_datasets/tests
%files %{python_files}
%license LICENSE
-%{python_sitelib}/*
+%{python_sitelib}/vega_datasets
+%{python_sitelib}/vega_datasets-%{version}*-info
%changelog
++++++ vega_datasets-0.8.0.tar.gz -> vega_datasets-0.9.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore' old/vega_datasets-0.8.0/CHANGES.md
new/vega_datasets-0.9.0/CHANGES.md
--- old/vega_datasets-0.8.0/CHANGES.md 2019-12-14 16:01:56.000000000 +0100
+++ new/vega_datasets-0.9.0/CHANGES.md 2020-11-26 14:55:05.000000000 +0100
@@ -1,6 +1,11 @@
Change Log
==========
+Release v0.9 (Nov 26, 2020)
+---------------------------
+- Change urls to use jsDelivr (a fast CDN) with a fixed version number, instead of GitHub.
+ This fixes the URLs broken by the vega-datasets 2.0 release.
+
Release v0.8 (Dec 14, 2019)
---------------------------
- Include all data from [vega-datasets v1.29.0](https://github.com/vega/vega-datasets/releases/tag/v1.29.0)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore' old/vega_datasets-0.8.0/PKG-INFO
new/vega_datasets-0.9.0/PKG-INFO
--- old/vega_datasets-0.8.0/PKG-INFO 2019-12-14 16:04:28.000000000 +0100
+++ new/vega_datasets-0.9.0/PKG-INFO 2020-11-26 14:56:26.000000000 +0100
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: vega_datasets
-Version: 0.8.0
+Version: 0.9.0
Summary: A Python package for offline access to Vega datasets
Home-page: http://github.com/altair-viz/vega_datasets
Author: Jake VanderPlas
@@ -14,6 +14,8 @@
Description: # vega_datasets
[](https://travis-ci.org/altair-viz/vega_datasets)
+ [](https://github.com/altair-viz/vega_datasets/actions?query=workflow%3Abuild)
+ [](https://github.com/altair-viz/vega_datasets/actions?query=workflow%3Alint)
[](https://github.com/psf/black)
A Python package for offline access to [vega
datasets](https://github.com/vega/vega-datasets).
@@ -27,7 +29,7 @@
Currently the package bundles a half-dozen datasets, and falls back to
using HTTP requests for the others.
## Installation
-
+ ``vega_datasets`` is compatible with Python 3.5 or newer. Install with:
```
$ pip install vega_datasets
```
@@ -58,7 +60,7 @@
```python
>>> data.iris.url
- 'https://vega.github.io/vega-datasets/data/iris.json'
+ 'https://cdn.jsdelivr.net/npm/[email protected]/data/iris.json'
```
For datasets bundled with the package, you can also find their
location on disk:
@@ -106,8 +108,9 @@
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: MIT License
Classifier: Natural Language :: English
-Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
+Classifier: Programming Language :: Python :: 3.8
+Requires-Python: >=3.5
Description-Content-Type: text/markdown
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore' old/vega_datasets-0.8.0/README.md
new/vega_datasets-0.9.0/README.md
--- old/vega_datasets-0.8.0/README.md 2019-12-04 04:09:44.000000000 +0100
+++ new/vega_datasets-0.9.0/README.md 2020-11-26 14:47:32.000000000 +0100
@@ -1,6 +1,8 @@
# vega_datasets
[](https://travis-ci.org/altair-viz/vega_datasets)
+[](https://github.com/altair-viz/vega_datasets/actions?query=workflow%3Abuild)
+[](https://github.com/altair-viz/vega_datasets/actions?query=workflow%3Alint)
[](https://github.com/psf/black)
A Python package for offline access to [vega
datasets](https://github.com/vega/vega-datasets).
@@ -14,7 +16,7 @@
Currently the package bundles a half-dozen datasets, and falls back to using
HTTP requests for the others.
## Installation
-
+``vega_datasets`` is compatible with Python 3.5 or newer. Install with:
```
$ pip install vega_datasets
```
@@ -45,7 +47,7 @@
```python
>>> data.iris.url
-'https://vega.github.io/vega-datasets/data/iris.json'
+'https://cdn.jsdelivr.net/npm/[email protected]/data/iris.json'
```
For datasets bundled with the package, you can also find their location on
disk:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore' old/vega_datasets-0.8.0/setup.cfg
new/vega_datasets-0.9.0/setup.cfg
--- old/vega_datasets-0.8.0/setup.cfg 2019-12-14 16:04:28.000000000 +0100
+++ new/vega_datasets-0.9.0/setup.cfg 2020-11-26 14:56:26.000000000 +0100
@@ -1,5 +1,4 @@
[flake8]
-exclude = altair_transform/utils/_parser_Parser_parsetab.py
max-line-length = 88
ignore = E203, E266, E501, W503
max-complexity = 18
@@ -9,9 +8,6 @@
description-file = README.md
license_file = LICENSE
-[bdist_wheel]
-universal = 1
-
[egg_info]
tag_build =
tag_date = 0
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore' old/vega_datasets-0.8.0/setup.py
new/vega_datasets-0.9.0/setup.py
--- old/vega_datasets-0.8.0/setup.py 2019-12-04 04:07:43.000000000 +0100
+++ new/vega_datasets-0.9.0/setup.py 2019-12-14 16:39:50.000000000 +0100
@@ -41,6 +41,7 @@
download_url="http://github.com/altair-viz/vega_datasets",
license="MIT",
install_requires=["pandas"],
+ python_requires=">=3.5",
tests_require=["pytest"],
packages=find_packages(exclude=["tools"]),
package_data={
@@ -59,10 +60,10 @@
"Intended Audience :: Science/Research",
"License :: OSI Approved :: MIT License",
"Natural Language :: English",
- "Programming Language :: Python :: 2.7",
"Programming Language :: Python :: 3.5",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7",
+ "Programming Language :: Python :: 3.8",
],
project_urls={
"Bug Reports": "https://github.com/altair-viz/vega_datasets/issues",
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore' old/vega_datasets-0.8.0/vega_datasets/__init__.py
new/vega_datasets-0.9.0/vega_datasets/__init__.py
--- old/vega_datasets-0.8.0/vega_datasets/__init__.py 2019-12-14
16:02:18.000000000 +0100
+++ new/vega_datasets-0.9.0/vega_datasets/__init__.py 2020-11-26
14:55:05.000000000 +0100
@@ -2,8 +2,4 @@
data = DataLoader()
local_data = LocalDataLoader()
-__version__ = "0.8.0"
-
-# This is the tag in http://github.com/vega/vega-datasets from
-# which the datasets in this repository are sourced.
-SOURCE_TAG = "v1.29.0"
+__version__ = "0.9.0"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore' old/vega_datasets-0.8.0/vega_datasets/_compat.py
new/vega_datasets-0.9.0/vega_datasets/_compat.py
--- old/vega_datasets-0.8.0/vega_datasets/_compat.py 2019-12-04
04:07:43.000000000 +0100
+++ new/vega_datasets-0.9.0/vega_datasets/_compat.py 1970-01-01
01:00:00.000000000 +0100
@@ -1,19 +0,0 @@
-# flake8: noqa
-
-try:
- from urllib.error import URLError, HTTPError
- from urllib.request import urlopen, urlretrieve
- from io import BytesIO
-
- def bytes_decode(bytes_, encoding="utf-8"):
- return bytes_.decode(encoding)
-
-
-except ImportError: # noqa: F401
- # Python 2.X
- from urllib2 import URLError, HTTPError, urlopen
- from urllib import urlretrieve
- from StringIO import StringIO as BytesIO
-
- def bytes_decode(bytes_, encoding="utf-8"):
- return bytes_
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore' old/vega_datasets-0.8.0/vega_datasets/core.py
new/vega_datasets-0.9.0/vega_datasets/core.py
--- old/vega_datasets-0.8.0/vega_datasets/core.py 2019-12-04
04:07:43.000000000 +0100
+++ new/vega_datasets-0.9.0/vega_datasets/core.py 2020-11-26
14:47:32.000000000 +0100
@@ -1,14 +1,18 @@
+from io import BytesIO
import os
import json
import pkgutil
import textwrap
-
+from typing import Any, Dict, Iterable, List
+from urllib.request import urlopen
import pandas as pd
-from vega_datasets._compat import urlopen, BytesIO, bytes_decode
+# This is the tag in http://github.com/vega/vega-datasets from
+# which the datasets in this repository are sourced.
+SOURCE_TAG = "v1.29.0"
-def _load_dataset_info():
+def _load_dataset_info() -> Dict[str, Dict[str, Any]]:
"""This loads dataset info from three package files:
vega_datasets/datasets.json
@@ -18,9 +22,11 @@
It returns a dictionary with dataset information.
"""
- def load_json(path):
+ def load_json(path: str) -> Dict[str, Any]:
raw = pkgutil.get_data("vega_datasets", path)
- return json.loads(bytes_decode(raw))
+ if raw is None:
+ raise ValueError("Cannot locate package path
vega_datasets:{}".format(path))
+ return json.loads(raw.decode())
info = load_json("datasets.json")
descriptions = load_json("dataset_info.json")
@@ -88,13 +94,13 @@
_reference_info = """
For information on this dataset, see https://github.com/vega/vega-datasets/
"""
- base_url = "https://vega.github.io/vega-datasets/data/"
+ base_url = "https://cdn.jsdelivr.net/npm/vega-datasets@" + SOURCE_TAG +
"/data/"
_dataset_info = _load_dataset_info()
- _pd_read_kwds = {}
+ _pd_read_kwds = {} # type: Dict[str, Any]
_return_type = pd.DataFrame
@classmethod
- def init(cls, name):
+ def init(cls, name: str) -> "Dataset":
"""Return an instance of this class or an appropriate subclass"""
clsdict = {
subcls.name: subcls
@@ -103,7 +109,7 @@
}
return clsdict.get(name, cls)(name)
- def __init__(self, name):
+ def __init__(self, name: str):
info = self._infodict(name)
self.name = name
self.methodname = name.replace("-", "_")
@@ -116,7 +122,7 @@
self.references = info.get("references", None)
self.__doc__ = self._make_docstring()
- def _make_docstring(self):
+ def _make_docstring(self) -> str:
info = self._infodict(self.name)
# construct, indent, and line-wrap dataset description
@@ -131,15 +137,13 @@
description = "\n".join(wrapper.wrap(description))
# construct, indent, and join references
- references = info.get("references", [])
- references = (
- ".. [{0}] ".format(i + 1) + ref for i, ref in enumerate(references)
- )
+ reflist = info.get("references", []) # type: Iterable[str]
+ reflist = (".. [{0}] ".format(i + 1) + ref for i, ref in
enumerate(reflist))
wrapper = textwrap.TextWrapper(
width=70, initial_indent=4 * " ", subsequent_indent=7 * " "
)
- references = ("\n".join(wrapper.wrap(ref)) for ref in references)
- references = "\n\n".join(references)
+ reflist = ("\n".join(wrapper.wrap(ref)) for ref in reflist)
+ references = "\n\n".join(reflist) # type: str
if references.strip():
references = "References\n ----------\n" + references
@@ -165,18 +169,18 @@
)
@classmethod
- def list_datasets(cls):
+ def list_datasets(cls) -> List[str]:
"""Return a list of names of available datasets"""
return sorted(cls._dataset_info.keys())
@classmethod
- def list_local_datasets(cls):
+ def list_local_datasets(cls) -> List[str]:
return sorted(
name for name, info in cls._dataset_info.items() if info["is_local"]
)
@classmethod
- def _infodict(cls, name):
+ def _infodict(cls, name: str) -> Dict[str, str]:
"""load the info dictionary for the given name"""
info = cls._dataset_info.get(name, None)
if info is None:
@@ -187,7 +191,7 @@
)
return info
- def raw(self, use_local=True):
+ def raw(self, use_local: bool = True) -> bytes:
"""Load the raw dataset from remote URL or local file
Parameters
@@ -198,11 +202,16 @@
data from an external URL.
"""
if use_local and self.is_local:
- return pkgutil.get_data("vega_datasets", self.pkg_filename)
+ out = pkgutil.get_data("vega_datasets", self.pkg_filename)
+ if out is not None:
+ return out
+ raise ValueError(
+ "Cannot locate package path
vega_datasets:{}".format(self.pkg_filename)
+ )
else:
return urlopen(self.url).read()
- def __call__(self, use_local=True, **kwargs):
+ def __call__(self, use_local: bool = True, **kwargs) -> pd.DataFrame:
"""Load and parse the dataset from remote URL or local file
Parameters
@@ -241,7 +250,7 @@
)
@property
- def filepath(self):
+ def filepath(self) -> str:
if not self.is_local:
raise ValueError("filepath is only valid for local datasets")
else:
@@ -337,7 +346,7 @@
def __call__(self, use_local=True, **kwargs):
__doc__ = super(Miserables, self).__call__.__doc__ # noqa:F841
- dct = json.loads(bytes_decode(self.raw(use_local=use_local)), **kwargs)
+ dct = json.loads(self.raw(use_local=use_local).decode(), **kwargs)
nodes = pd.DataFrame.from_records(dct["nodes"], index="index")
links = pd.DataFrame.from_records(dct["links"])
return nodes, links
@@ -379,7 +388,7 @@
def __call__(self, use_local=True, **kwargs):
__doc__ = super(US_10M, self).__call__.__doc__ # noqa:F841
- return json.loads(bytes_decode(self.raw(use_local=use_local)), **kwargs)
+ return json.loads(self.raw(use_local=use_local).decode(), **kwargs)
class World_110M(Dataset):
@@ -393,7 +402,7 @@
def __call__(self, use_local=True, **kwargs):
__doc__ = super(World_110M, self).__call__.__doc__ # noqa:F841
- return json.loads(bytes_decode(self.raw(use_local=use_local)), **kwargs)
+ return json.loads(self.raw(use_local=use_local).decode(), **kwargs)
class ZIPCodes(Dataset):
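As an aside, a condensed sketch of the two main core.py changes above (the pinned jsDelivr base URL and the removal of the Python 2 bytes_decode shim); the names are taken from the hunks, but this is not the complete Dataset class:

```python
# Condensed from the core.py hunks above; not the full implementation.
import json
import pkgutil

# Tag in http://github.com/vega/vega-datasets from which the datasets are sourced.
SOURCE_TAG = "v1.29.0"
# 0.9.0 builds the base URL against jsDelivr with the pinned tag.
base_url = "https://cdn.jsdelivr.net/npm/vega-datasets@" + SOURCE_TAG + "/data/"

def load_json(path: str) -> dict:
    """Load a JSON resource bundled with vega_datasets (Python 3 only)."""
    raw = pkgutil.get_data("vega_datasets", path)
    if raw is None:
        raise ValueError("Cannot locate package path vega_datasets:{}".format(path))
    # plain bytes.decode() replaces bytes_decode() from the removed _compat.py
    return json.loads(raw.decode())

print(base_url + "iris.json")
# https://cdn.jsdelivr.net/npm/[email protected]/data/iris.json
```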
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore' old/vega_datasets-0.8.0/vega_datasets/utils.py
new/vega_datasets-0.9.0/vega_datasets/utils.py
--- old/vega_datasets-0.8.0/vega_datasets/utils.py 2018-01-21
06:26:48.000000000 +0100
+++ new/vega_datasets-0.9.0/vega_datasets/utils.py 2019-12-14
17:15:45.000000000 +0100
@@ -1,8 +1,9 @@
from vega_datasets.core import Dataset
-from vega_datasets._compat import urlopen, HTTPError, URLError
+from urllib.request import urlopen
+from urllib.error import HTTPError, URLError
-def connection_ok():
+def connection_ok() -> bool:
"""Check web connection.
Returns True if web connection is OK, False otherwise.
"""
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore' old/vega_datasets-0.8.0/vega_datasets.egg-info/PKG-INFO
new/vega_datasets-0.9.0/vega_datasets.egg-info/PKG-INFO
--- old/vega_datasets-0.8.0/vega_datasets.egg-info/PKG-INFO 2019-12-14
16:04:28.000000000 +0100
+++ new/vega_datasets-0.9.0/vega_datasets.egg-info/PKG-INFO 2020-11-26
14:56:26.000000000 +0100
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: vega-datasets
-Version: 0.8.0
+Version: 0.9.0
Summary: A Python package for offline access to Vega datasets
Home-page: http://github.com/altair-viz/vega_datasets
Author: Jake VanderPlas
@@ -14,6 +14,8 @@
Description: # vega_datasets
[](https://travis-ci.org/altair-viz/vega_datasets)
+ [](https://github.com/altair-viz/vega_datasets/actions?query=workflow%3Abuild)
+ [](https://github.com/altair-viz/vega_datasets/actions?query=workflow%3Alint)
[](https://github.com/psf/black)
A Python package for offline access to [vega
datasets](https://github.com/vega/vega-datasets).
@@ -27,7 +29,7 @@
Currently the package bundles a half-dozen datasets, and falls back to
using HTTP requests for the others.
## Installation
-
+ ``vega_datasets`` is compatible with Python 3.5 or newer. Install with:
```
$ pip install vega_datasets
```
@@ -58,7 +60,7 @@
```python
>>> data.iris.url
- 'https://vega.github.io/vega-datasets/data/iris.json'
+ 'https://cdn.jsdelivr.net/npm/[email protected]/data/iris.json'
```
For datasets bundled with the package, you can also find their
location on disk:
@@ -106,8 +108,9 @@
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: MIT License
Classifier: Natural Language :: English
-Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
+Classifier: Programming Language :: Python :: 3.8
+Requires-Python: >=3.5
Description-Content-Type: text/markdown
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn'
'--exclude=.svnignore'
old/vega_datasets-0.8.0/vega_datasets.egg-info/SOURCES.txt
new/vega_datasets-0.9.0/vega_datasets.egg-info/SOURCES.txt
--- old/vega_datasets-0.8.0/vega_datasets.egg-info/SOURCES.txt 2019-12-14
16:04:28.000000000 +0100
+++ new/vega_datasets-0.9.0/vega_datasets.egg-info/SOURCES.txt 2020-11-26
14:56:26.000000000 +0100
@@ -7,7 +7,6 @@
setup.cfg
setup.py
vega_datasets/__init__.py
-vega_datasets/_compat.py
vega_datasets/core.py
vega_datasets/dataset_info.json
vega_datasets/datasets.json