Hello community,

here is the log from the commit of package python-dask for openSUSE:Factory 
checked in at 2021-04-06 17:29:45
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-dask (Old)
 and      /work/SRC/openSUSE:Factory/.python-dask.new.2401 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "python-dask"

Tue Apr  6 17:29:45 2021 rev:43 rq:883195 version:2021.4.0

Changes:
--------
--- /work/SRC/openSUSE:Factory/python-dask/python-dask.changes  2021-03-12 13:33:13.694318136 +0100
+++ /work/SRC/openSUSE:Factory/.python-dask.new.2401/python-dask.changes        2021-04-06 17:31:06.419213709 +0200
@@ -1,0 +2,128 @@
+Sun Apr  4 16:38:31 UTC 2021 - Arun Persaud <a...@gmx.de>
+
+- Update to version 2021.4.0:
+  * Adding support for multidimensional histograms with
+    dask.array.histogramdd (:pr:`7387`) Doug Davis
+  * Update docs on number of threads and workers in default
+    LocalCluster (:pr:`7497`) cameron16
+  * Add labels automatically when certain files are touched in a PR
+    (:pr:`7506`) Julia Signell
+  * Extract ignore_order from kwargs (:pr:`7500`) GALI PREM SAGAR
+  * Only provide installation instructions when distributed is missing
+    (:pr:`7498`) Matthew Rocklin
+  * Start adding isort (:pr:`7370`) Julia Signell
+  * Add ignore_order parameter in dd.concat (:pr:`7473`) Daniel
+    Mesejo-León
+  * Use powers-of-two when displaying RAM (:pr:`7484`) Guido Imperiale
+  * Added License Classifier (:pr:`7485`) Tom Augspurger
+  * Replace conda with mamba (:pr:`7227`) Guido Imperiale
+  * Fix typo in array docs (:pr:`7478`) James Lamb
+  * Use concurrent.futures in local scheduler (:pr:`6322`) John A
+    Kirkham
+
+-------------------------------------------------------------------
+Tue Mar 30 21:47:53 UTC 2021 - Ben Greiner <c...@bnavigator.de>
+
+- Update to 2021.3.1
+  * Add a dispatch for is_categorical_dtype to handle non-pandas 
+    objects (GH#7469) brandon-b-miller
+  * Use multiprocessing.Pool in test_read_text (GH#7472) John A 
+    Kirkham
+  * Add missing meta kwarg to gufunc class (GH#7423) Peter Andreas 
+    Entschev
+  * Example for memory-mapped Dask array (GH#7380) Dieter Weber
+  * Fix NumPy upstream failures xfail pandas and fastparquet 
+    failures (GH#7441) Julia Signell
+  * Fix bug in repartition with freq (GH#7357) Ruben van de Geer
+  * Fix __array_function__ dispatching for tril/triu (GH#7457) 
+    Peter Andreas Entschev
+  * Use concurrent.futures.Executors in a few tests (GH#7429) John 
+    A Kirkham
+  * Require NumPy >=1.16 (GH#7383) Guido Imperiale
+  * Minor sort_values housekeeping (GH#7462) Ryan Williams
+  * Ensure natural sort order in parquet part paths (GH#7249) Ryan 
+    Williams
+  * Remove global env mutation upon running test_config.py 
+    (GH#7464) Hristo
+  * Update NumPy intersphinx URL (GH#7460) Gabe Joseph
+  * Add rot90 (GH#7440) Trevor Manz
+  * Update docs for required package for endpoint (GH#7454) Nick 
+    Vazquez
+  * Master -> main in slice_array docstring (GH#7453) Gabe Joseph
+  * Expand dask.utils.is_arraylike docstring (GH#7445) Doug Davis
+  * Simplify BlockwiseIODeps importing (GH#7420) Richard (Rick) 
+    Zamora
+  * Update layer annotation packing method (GH#7430) James Bourbeau
+  * Drop duplicate test in test_describe_empty (GH#7431) John A 
+    Kirkham
+  * Add Series.dot method to dataframe module (GH#7236) Madhu94
+  * Added df kurtosis-method and testing (GH#7273) Jan Borchmann
+  * Avoid quadratic-time performance for HLG culling (GH#7403) 
+    Bruce Merry
+  * Temporarily skip problematic sparse test (GH#7421) James 
+    Bourbeau
+  * Update some CI workflow names (GH#7422) James Bourbeau
+  * Fix HDFS test (GH#7418) Julia Signell
+  * Make changelog subtitles match the hierarchy (GH#7419) Julia 
+    Signell
+  * Add support for normalize in value_counts (GH#7342) Julia 
+    Signell
+  * Avoid unnecessary imports for HLG Layer unpacking and 
+    materialization (GH#7381) Richard (Rick) Zamora
+  * Bincount fix slicing (GH#7391) Genevieve Buckley
+  * Add sliding_window_view (GH#7234) Deepak Cherian
+  * Fix typo in docs/source/develop.rst (GH#7414) Hristo
+  * Switch documentation builds for PRs to readthedocs (GH#7397) 
+    James Bourbeau
+  * Adds sort_values to dask.DataFrame (GH#7286) gerrymanoim
+  * Pin sqlalchemy<1.4.0 in CI (GH#7405) James Bourbeau
+  * Comment fixes (GH#7215) Ryan Williams
+  * Dead code removal / fixes (GH#7388) Ryan Williams
+  * Use single thread for pa.Table.from_pandas calls (GH#7347) 
+    Richard (Rick) Zamora
+  * Replace 'container' with 'image' (GH#7389) James Lamb
+  * DOC hyperlink repartition (GH#7394) Ray Bell
+  * Pass delimiter to fsspec in bag.read_text (GH#7349) Martin 
+    Durant
+  * Update read_hdf default mode to "r" (GH#7039) rs9w33
+  * Embed literals in SubgraphCallable when packing Blockwise 
+    (GH#7353) Mads R. B. Kristensen
+  * Update test_hdf.py to not reuse file handlers (GH#7044) rs9w33
+  * Require additional dependencies: cloudpickle, partd, fsspec, 
+    toolz (GH#7345) Julia Signell
+  * Prepare Blockwise + IO infrastructure (GH#7281) Richard (Rick) 
+    Zamora
+  * Remove duplicated imports from test_slicing.py (GH#7365) Hristo
+  * Add test deps for pip development (GH#7360) Julia Signell
+  * Support int slicing for non-NumPy arrays (GH#7364) Peter 
+    Andreas Entschev
+  * Automatically cancel previous CI builds (GH#7348) James 
+    Bourbeau
+  * dask.array.asarray should handle case where xarray class is in 
+    top-level namespace (GH#7335) Tom White
+  * HighLevelGraph length without materializing layers (GH#7274) 
+    Gabe Joseph
+  * Drop support for Python 3.6 (GH#7006) James Bourbeau
+  * Fix fsspec usage in create_metadata_file (GH#7295) Richard 
+    (Rick) Zamora
+  * Change default branch from master to main (GH#7198) Julia 
+    Signell
+  * Add Xarray to CI software environment (GH#7338) James Bourbeau
+  * Update repartition argument name in error text (GH#7336) Eoin 
+    Shanaghy
+  * Run upstream tests based on commit message (GH#7329) James 
+    Bourbeau
+  * Use pytest.register_assert_rewrite on util modules (GH#7278) 
+    Bruce Merry
+  * Add example on using specific chunk sizes in from_array() 
+    (GH#7330) James Lamb
+  * Move NumPy skip into test (GH#7247) Julia Signell
+- Update package descriptions
+- Add dask-delayed and dask-diagnostics packages 
+- Drop dask-multiprocessing package merged into main
+- Skip python36: upstream dropped support for Python < 3.7
+- Drop dask-pr7247-numpyskip.patch merged upstream
+- Test more optional requirements for better compatibility
+  assurance.
+
+-------------------------------------------------------------------
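The 2021.4.0 entries above add dask.array.histogramdd (PR 7387), which follows NumPy's np.histogramdd interface. A minimal NumPy sketch of that interface (the dask variant accepting chunked arrays and returning lazy results is an assumption based on the PR title):

```python
import numpy as np

# Three-dimensional sample data: 1000 points in [0, 1)^3.
rng = np.random.default_rng(0)
x = rng.random((1000, 3))

# histogramdd bins each dimension independently; dask.array.histogramdd
# mirrors this call signature.
hist, edges = np.histogramdd(x, bins=(5, 5, 5), range=((0, 1),) * 3)

# Every sample lies in [0, 1), so all 1000 land in some bin.
print(int(hist.sum()))  # 1000
```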

Old:
----
  dask-2021.3.0.tar.gz
  dask-pr7247-numpyskip.patch

New:
----
  dask-2021.4.0.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ python-dask.spec ++++++
--- /var/tmp/diff_new_pack.NFczxB/_old  2021-04-06 17:31:07.211214605 +0200
+++ /var/tmp/diff_new_pack.NFczxB/_new  2021-04-06 17:31:07.215214610 +0200
@@ -1,5 +1,5 @@
 #
-# spec file for package python-dask
+# spec file for package python-dask-test
 #
 # Copyright (c) 2021 SUSE LLC
 #
@@ -26,120 +26,133 @@
 %bcond_with test
 %endif
 %define         skip_python2 1
+%define         skip_python36 1
 Name:           python-dask%{psuffix}
-Version:        2021.3.0
+Version:        2021.4.0
 Release:        0
 Summary:        Minimal task scheduling abstraction
 License:        BSD-3-Clause
 URL:            https://dask.org
Source:         https://files.pythonhosted.org/packages/source/d/dask/dask-%{version}.tar.gz
-# PATCH-FIX-UPSTREAM dask-pr7247-numpyskip.patch -- gh#dask/dask#7247
-Patch0:         dask-pr7247-numpyskip.patch
-BuildRequires:  %{python_module PyYAML}
-BuildRequires:  %{python_module base >= 3.6}
+BuildRequires:  %{python_module base >= 3.7}
 BuildRequires:  %{python_module setuptools}
 BuildRequires:  fdupes
 BuildRequires:  python-rpm-macros
 Requires:       python-PyYAML
+Requires:       python-cloudpickle >= 1.1.1
+Requires:       python-fsspec >= 0.6.0
+Requires:       python-partd >= 0.3.10
+Requires:       python-toolz >= 0.8.2
 Recommends:     %{name}-array = %{version}
 Recommends:     %{name}-bag = %{version}
 Recommends:     %{name}-dataframe = %{version}
+Recommends:     %{name}-delayed = %{version}
 Recommends:     %{name}-distributed = %{version}
 Recommends:     %{name}-dot = %{version}
-Recommends:     %{name}-multiprocessing = %{version}
-Recommends:     python-bokeh >= 1.0.0
-Recommends:     python-cloudpickle >= 0.2.2
+Recommends:     python-SQLAlchemy
 Recommends:     python-cityhash
 Recommends:     python-distributed >= %{version}
 Recommends:     python-fastparquet
-Recommends:     python-fsspec >= 0.6.0
 Recommends:     python-gcsfs >= 0.4.0
 Recommends:     python-murmurhash
-Recommends:     python-partd >= 0.3.10
 Recommends:     python-psutil
 Recommends:     python-pyarrow >= 0.14.0
 Recommends:     python-s3fs >= 0.4.0
-Recommends:     python-SQLAlchemy
-Recommends:     python-toolz >= 0.8.2
 Recommends:     python-xxhash
+Suggests:       %{name}-all = %{version}
+Suggests:       %{name}-diagnostics = %{version}
+Provides:       %{name}-multiprocessing = %{version}-%{release}
+Obsoletes:      %{name}-multiprocessing < %{version}-%{release}
 BuildArch:      noarch
 %if %{with test}
+# test that we specified all requirements correctly in the core
+# and subpackages by only requiring dask-all and optional extras
+BuildRequires:  %{python_module dask-all = %{version}}
+BuildRequires:  %{python_module pytest-rerunfailures}
+BuildRequires:  %{python_module pytest-xdist}
+BuildRequires:  %{python_module pytest}
+# SECTION additional optionally tested (importorskip) packages
+BuildRequires:  %{python_module SQLAlchemy}
 BuildRequires:  %{python_module cachey}
-BuildRequires:  %{python_module cloudpickle >= 0.2.2}
-BuildRequires:  %{python_module distributed >= %{version}}
-# optional zarr needs fsspec >= 0.8.4 if present
+BuildRequires:  %{python_module fastparquet}
+# optional zarr increases fsspec minimum to 0.8.4 if present
 BuildRequires:  %{python_module fsspec >= 0.8.4}
-BuildRequires:  %{python_module graphviz}
+BuildRequires:  %{python_module h5py}
 BuildRequires:  %{python_module ipython}
 BuildRequires:  %{python_module jsonschema}
+BuildRequires:  %{python_module matplotlib}
 BuildRequires:  %{python_module mimesis}
 BuildRequires:  %{python_module multipledispatch}
-BuildRequires:  %{python_module partd >= 0.3.10}
-BuildRequires:  %{python_module pytest-rerunfailures}
-BuildRequires:  %{python_module pytest}
-BuildRequires:  %{python_module toolz >= 0.8.2}
-BuildRequires:  graphviz
-BuildRequires:  graphviz-gd
-BuildRequires:  graphviz-gnome
-BuildRequires:  %{python_module numpy >= 1.15.1 if (%python-base without python36-base)}
-BuildRequires:  %{python_module pandas >= 0.25.0 if (%python-base without python36-base)}
-BuildRequires:  %{python_module tables if (%python-base without python36-base)}
-BuildRequires:  %{python_module zarr if (%python-base without python36-base)}
-# pytest-xdist is not a hard requirement for testing, but this avoids a hang of
-# pytest on i586 after successfully passing the test suite
-BuildRequires:  %{python_module pytest-xdist}
+BuildRequires:  %{python_module numba}
+# snappy required for using fastparquet
+BuildRequires:  %{python_module python-snappy}
+BuildRequires:  %{python_module requests}
+BuildRequires:  %{python_module scikit-image}
+BuildRequires:  %{python_module scipy}
+BuildRequires:  %{python_module sparse}
+BuildRequires:  %{python_module tables}
+BuildRequires:  %{python_module xarray}
+BuildRequires:  %{python_module zarr}
+# /SECTION
 %endif
 %python_subpackages
 
 %description
-A minimal task scheduling abstraction and parallel arrays.
-* dask is a specification to describe task dependency graphs.
-* dask.array is a drop-in NumPy replacement (for a subset of NumPy) that
-  encodes blocked algorithms in dask dependency graphs.
-* dask.async is a shared-memory asynchronous scheduler that efficiently
-  executes dask dependency graphs on multiple cores.
+A flexible library for parallel computing in Python.
+
+Dask is composed of two parts:
+- Dynamic task scheduling optimized for computation. This is similar to
+  Airflow, Luigi, Celery, or Make, but optimized for interactive
+  computational workloads.
+- “Big Data” collections like parallel arrays, dataframes, and lists that
+  extend common interfaces like NumPy, Pandas, or Python iterators to
+  larger-than-memory or distributed environments. These parallel collections
+  run on top of dynamic task schedulers.
 
-# This must have a Requires for dask and all the dask subpackages
 %package all
+# This must have a Requires for dask and all the dask subpackages
 Summary:        All dask components
 Requires:       %{name} = %{version}
+Requires:       %{name}-array = %{version}
 Requires:       %{name}-bag = %{version}
+Requires:       %{name}-dataframe = %{version}
+Requires:       %{name}-delayed = %{version}
+Requires:       %{name}-diagnostics = %{version}
 Requires:       %{name}-distributed = %{version}
 Requires:       %{name}-dot = %{version}
-Requires:       %{name}-multiprocessing = %{version}
-%if "%python_flavor" != "python36"
-Requires:       %{name}-array = %{version}
-Requires:       %{name}-dataframe = %{version}
-%endif
 
 %description all
-A minimal task scheduling abstraction and parallel arrays.
-* dask is a specification to describe task dependency graphs.
-* dask.array is a drop-in NumPy replacement (for a subset of NumPy) that
-  encodes blocked algorithms in dask dependency graphs.
-* dask.async is a shared-memory asynchronous scheduler that efficiently
-  executes dask dependency graphs on multiple cores.
-
-%if "%python_flavor" == "python36"
-This package pulls in all the optional dask components, except for dataframe
-and array, because NumPy does not support Python 3.6 anymore.
-%else
+A flexible library for parallel computing in Python.
+
+Dask is composed of two parts:
+- Dynamic task scheduling optimized for computation. This is similar to
+  Airflow, Luigi, Celery, or Make, but optimized for interactive
+  computational workloads.
+- “Big Data” collections like parallel arrays, dataframes, and lists that
+  extend common interfaces like NumPy, Pandas, or Python iterators to
+  larger-than-memory or distributed environments. These parallel collections
+  run on top of dynamic task schedulers.
+
 This package pulls in all the optional dask components.
-%endif
 
 %package array
 Summary:        Numpy-like array data structure for dask
 Requires:       %{name} = %{version}
-Requires:       python-numpy >= 1.15.1
-Requires:       python-toolz >= 0.8.2
+Requires:       %{name}-delayed = %{version}
+Requires:       python-numpy >= 1.16
+Recommends:     python-scipy
 
 %description array
-A minimal task scheduling abstraction and parallel arrays.
-* dask is a specification to describe task dependency graphs.
-* dask.array is a drop-in NumPy replacement (for a subset of NumPy) that
-  encodes blocked algorithms in dask dependency graphs.
-* dask.async is a shared-memory asynchronous scheduler that efficiently
-  executes dask dependency graphs on multiple cores.
+A flexible library for parallel computing in Python.
+
+Dask is composed of two parts:
+- Dynamic task scheduling optimized for computation. This is similar to
+  Airflow, Luigi, Celery, or Make, but optimized for interactive
+  computational workloads.
+- “Big Data” collections like parallel arrays, dataframes, and lists that
+  extend common interfaces like NumPy, Pandas, or Python iterators to
+  larger-than-memory or distributed environments. These parallel collections
+  run on top of dynamic task schedulers.
 
 This package contains the dask array class.
 
@@ -149,19 +162,18 @@
 %package bag
 Summary:        Data structure generic python objects in dask
 Requires:       %{name} = %{version}
-Requires:       %{name}-multiprocessing = %{version}
-Requires:       python-cloudpickle >= 0.2.2
-Requires:       python-fsspec >= 0.6.0
-Requires:       python-partd >= 0.3.10
-Requires:       python-toolz >= 0.8.2
 
 %description bag
-A minimal task scheduling abstraction and parallel arrays.
-* dask is a specification to describe task dependency graphs.
-* dask.array is a drop-in NumPy replacement (for a subset of NumPy) that
-  encodes blocked algorithms in dask dependency graphs.
-* dask.async is a shared-memory asynchronous scheduler that efficiently
-  executes dask dependency graphs on multiple cores.
+A flexible library for parallel computing in Python.
+
+Dask is composed of two parts:
+- Dynamic task scheduling optimized for computation. This is similar to
+  Airflow, Luigi, Celery, or Make, but optimized for interactive
+  computational workloads.
+- “Big Data” collections like parallel arrays, dataframes, and lists that
+  extend common interfaces like NumPy, Pandas, or Python iterators to
+  larger-than-memory or distributed environments. These parallel collections
+  run on top of dynamic task schedulers.
 
 This package contains the dask bag class.
 
@@ -174,21 +186,20 @@
 Summary:        Pandas-like DataFrame data structure for dask
 Requires:       %{name} = %{version}
 Requires:       %{name}-array = %{version}
-Requires:       %{name}-multiprocessing = %{version}
-Requires:       python-fsspec >= 0.6.0
-Requires:       python-numpy >= 1.15.1
+Requires:       python-numpy >= 1.16
 Requires:       python-pandas >= 0.25.0
-Requires:       python-partd >= 0.3.10
-Requires:       python-toolz >= 0.8.2
-Recommends:     %{name}-bag = %{version}
 
 %description dataframe
-A minimal task scheduling abstraction and parallel arrays.
-* dask is a specification to describe task dependency graphs.
-* dask.array is a drop-in NumPy replacement (for a subset of NumPy) that
-  encodes blocked algorithms in dask dependency graphs.
-* dask.async is a shared-memory asynchronous scheduler that efficiently
-  executes dask dependency graphs on multiple cores.
+A flexible library for parallel computing in Python.
+
+Dask is composed of two parts:
+- Dynamic task scheduling optimized for computation. This is similar to
+  Airflow, Luigi, Celery, or Make, but optimized for interactive
+  computational workloads.
+- “Big Data” collections like parallel arrays, dataframes, and lists that
+  extend common interfaces like NumPy, Pandas, or Python iterators to
+  larger-than-memory or distributed environments. These parallel collections
+  run on top of dynamic task schedulers.
 
 This package contains the dask DataFrame class.
 
@@ -200,21 +211,58 @@
 %package distributed
 Summary:        Interface with the distributed task scheduler in dask
 Requires:       %{name} = %{version}
-Requires:       python-distributed >= 2.0
+Requires:       python-distributed >= %{version}
 
 %description distributed
-A minimal task scheduling abstraction and parallel arrays.
-* dask is a specification to describe task dependency graphs.
-* dask.array is a drop-in NumPy replacement (for a subset of NumPy) that
-  encodes blocked algorithms in dask dependency graphs.
-* dask.async is a shared-memory asynchronous scheduler that efficiently
-  executes dask dependency graphs on multiple cores.
-
-This package contains the dask distributed interface.
-
-Dask.distributed is a lightweight library for distributed computing in
-Python. It extends both the concurrent.futures and dask APIs to
-moderate sized clusters.
+A flexible library for parallel computing in Python.
+
+Dask is composed of two parts:
+- Dynamic task scheduling optimized for computation. This is similar to
+  Airflow, Luigi, Celery, or Make, but optimized for interactive
+  computational workloads.
+- “Big Data” collections like parallel arrays, dataframes, and lists that
+  extend common interfaces like NumPy, Pandas, or Python iterators to
+  larger-than-memory or distributed environments. These parallel collections
+  run on top of dynamic task schedulers.
+
+This meta package pulls in the distributed module into the dask namespace.
+
+%package diagnostics
+Summary:        Diagnostics for dask
+Requires:       %{name} = %{version}
+Requires:       python-bokeh >= 1.0.0
+
+%description diagnostics
+A flexible library for parallel computing in Python.
+
+Dask is composed of two parts:
+- Dynamic task scheduling optimized for computation. This is similar to
+  Airflow, Luigi, Celery, or Make, but optimized for interactive
+  computational workloads.
+- “Big Data” collections like parallel arrays, dataframes, and lists that
+  extend common interfaces like NumPy, Pandas, or Python iterators to
+  larger-than-memory or distributed environments. These parallel collections
+  run on top of dynamic task schedulers.
+
+This package contains the dask.diagnostics module
+
+%package delayed
+Summary:        Delayed module for dask
+Requires:       %{name} = %{version}
+
+%description delayed
+A flexible library for parallel computing in Python.
+
+Dask is composed of two parts:
+- Dynamic task scheduling optimized for computation. This is similar to
+  Airflow, Luigi, Celery, or Make, but optimized for interactive
+  computational workloads.
+- “Big Data” collections like parallel arrays, dataframes, and lists that
+  extend common interfaces like NumPy, Pandas, or Python iterators to
+  larger-than-memory or distributed environments. These parallel collections
+  run on top of dynamic task schedulers.
+
+This package contains the dask.delayed module
 
 %package dot
 Summary:        Display dask graphs using graphviz
@@ -225,30 +273,18 @@
 Requires:       python-graphviz
 
 %description dot
-A minimal task scheduling abstraction and parallel arrays.
-* dask is a specification to describe task dependency graphs.
-* dask.array is a drop-in NumPy replacement (for a subset of NumPy) that
-  encodes blocked algorithms in dask dependency graphs.
-* dask.async is a shared-memory asynchronous scheduler that efficiently
-  executes dask dependency graphs on multiple cores.
-
-This package contains the graphviz dot rendering interface.
-
-%package multiprocessing
-Summary:        Display dask graphs using graphviz
-Requires:       %{name} = %{version}
-Requires:       python-cloudpickle >= 0.2.2
-Requires:       python-partd >= 0.3.10
+A flexible library for parallel computing in Python.
 
-%description multiprocessing
-A minimal task scheduling abstraction and parallel arrays.
-* dask is a specification to describe task dependency graphs.
-* dask.array is a drop-in NumPy replacement (for a subset of NumPy) that
-  encodes blocked algorithms in dask dependency graphs.
-* dask.async is a shared-memory asynchronous scheduler that efficiently
-  executes dask dependency graphs on multiple cores.
+Dask is composed of two parts:
+- Dynamic task scheduling optimized for computation. This is similar to
+  Airflow, Luigi, Celery, or Make, but optimized for interactive
+  computational workloads.
+- “Big Data” collections like parallel arrays, dataframes, and lists that
+  extend common interfaces like NumPy, Pandas, or Python iterators to
+  larger-than-memory or distributed environments. These parallel collections
+  run on top of dynamic task schedulers.
 
-This package contains the multiprocessing interface.
+This package contains the graphviz dot rendering interface.
 
 %prep
 %autosetup -p1 -n dask-%{version}
@@ -259,11 +295,28 @@
 %install
 %if !%{with test}
 %python_install
+%{python_expand # give SUSE specific install instructions
+sed -E -i '/Please either conda or pip install/,/python -m pip install/ {
+  s/either conda or pip//;
+  /conda install/ d;
+  s/python -m pip install "dask\[(.*)\]".*pip install/zypper in $python-dask-\1/
+  }' \
+  %{buildroot}%{$python_sitelib}/dask/distributed.py
+sed -E -i '/Please either conda or pip install/,/python -m pip install/ c \
+        "Please file a bug report https://bugzilla.opensuse.org and\\n"\
+        "report the missing requirements."' \
+  %{buildroot}%{$python_sitelib}/dask/array/__init__.py \
+  %{buildroot}%{$python_sitelib}/dask/bag/__init__.py \
+  %{buildroot}%{$python_sitelib}/dask/dataframe/__init__.py
+}
+%{python_compileall}
 %python_expand %fdupes %{buildroot}%{$python_sitelib}
 %endif
 
 %if %{with test}
 %check
+# move away from the import path
+mv dask dask.moved
 # different seed or mimesis version
 donttest="(test_datasets and test_deterministic)"
 # distributed/pytest-asyncio cancer is spreading
@@ -273,14 +326,14 @@
 donttest+="or (test_distributed and test_local_get_with_distributed_active)"
 donttest+="or (test_distributed and test_serializable_groupby_agg)"
 donttest+="or (test_distributed and test_await)"
-# NEP 29: There is no python36-dask-dataframe or -array because Tumbleweed dropped python36-numpy with 1.20
-python36_ignore="--ignore dask/dataframe --ignore dask/array"
 if [ $(getconf LONG_BIT) -eq 32 ]; then
   # Fails to convert datatype in obs constrained memory for 32-bit platforms
   donttest+="or (test_distributed and test_combo_of_layer_types)"
   donttest+="or (test_distributed and test_annotation_pack_unpack)"
+  # https://github.com/dask/dask/issues/7489
+  donttest+="or (test_distributed and test_blockwise_numpy_)"
 fi
-%pytest -ra -m "not network" -k "not ($donttest ${$python_donttest})" -n auto ${$python_ignore}
+%pytest --pyargs dask -ra -m "not network" -k "not ($donttest)" -n auto
 %endif
 
 %if !%{with test}
@@ -288,50 +341,47 @@
 %doc README.rst
 %license LICENSE.txt
 %{python_sitelib}/dask/
-%{python_sitelib}/dask-%{version}-py*.egg-info
+%{python_sitelib}/dask-%{version}*-info
 %exclude %{python_sitelib}/dask/array/
 %exclude %{python_sitelib}/dask/bag/
 %exclude %{python_sitelib}/dask/dataframe/
-%exclude %{python_sitelib}/dask/distributed.py*
+%exclude %{python_sitelib}/dask/diagnostics
+%exclude %{python_sitelib}/dask/delayed.py*
 %exclude %{python_sitelib}/dask/dot.py*
-%exclude %{python_sitelib}/dask/multiprocessing.py*
-%pycache_only %exclude %{python_sitelib}/dask/__pycache__/distributed.*
+%pycache_only %exclude %{python_sitelib}/dask/__pycache__/delayed*.pyc
 %pycache_only %exclude %{python_sitelib}/dask/__pycache__/dot.*
-%pycache_only %exclude %{python_sitelib}/dask/__pycache__/multiprocessing.*
 
 %files %{python_files all}
 %license LICENSE.txt
 
-%if "%python_flavor" != "python36"
 %files %{python_files array}
 %license LICENSE.txt
 %{python_sitelib}/dask/array/
-%endif
 
 %files %{python_files bag}
 %license LICENSE.txt
 %{python_sitelib}/dask/bag/
 
-%if "%python_flavor" != "python36"
 %files %{python_files dataframe}
 %license LICENSE.txt
 %{python_sitelib}/dask/dataframe/
-%endif
 
 %files %{python_files distributed}
 %license LICENSE.txt
-%{python_sitelib}/dask/distributed.py*
-%pycache_only %{python_sitelib}/dask/__pycache__/distributed.*
 
 %files %{python_files dot}
 %license LICENSE.txt
 %{python_sitelib}/dask/dot.py*
 %pycache_only %{python_sitelib}/dask/__pycache__/dot.*
 
-%files %{python_files multiprocessing}
+%files %{python_files diagnostics}
+%license LICENSE.txt
+%{python_sitelib}/dask/diagnostics/
+
+%files %{python_files delayed}
 %license LICENSE.txt
-%{python_sitelib}/dask/multiprocessing.py*
-%pycache_only %{python_sitelib}/dask/__pycache__/multiprocessing.*
+%{python_sitelib}/dask/delayed.py*
+%pycache_only %{python_sitelib}/dask/__pycache__/delayed*.pyc
 %endif
 
 %changelog

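The 2021.4.0 changelog at the top notes that the local scheduler now uses concurrent.futures (PR 6322). As a generic stdlib sketch of that building block, not dask's actual scheduler code:

```python
from concurrent.futures import ThreadPoolExecutor

def square(n):
    return n * n

# Executor.map distributes independent tasks across the pool and
# yields results in submission order; swapping ThreadPoolExecutor for
# ProcessPoolExecutor changes the execution backend, not the API.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(square, range(5)))

print(results)  # [0, 1, 4, 9, 16]
```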
++++++ dask-2021.3.0.tar.gz -> dask-2021.4.0.tar.gz ++++++
++++ 15850 lines of diff (skipped)
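Among the 2021.3.1 array additions listed above is sliding_window_view (PR 7234), which follows NumPy's function of the same name. A minimal NumPy sketch of that interface (that the dask version produces lazily chunked output is an assumption from the PR title):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

x = np.arange(6)

# Every contiguous window of length 3, as strided views (no copy).
windows = sliding_window_view(x, window_shape=3)
print(windows.tolist())  # [[0, 1, 2], [1, 2, 3], [2, 3, 4], [3, 4, 5]]
```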
