Package: src:umap-learn
Version: 0.5.4+dfsg-1
Severity: serious
Tags: ftbfs

Dear maintainer:

During a rebuild of all packages in unstable, your package failed to build:

--------------------------------------------------------------------------------
[...]
 debian/rules binary
dh binary --buildsystem=pybuild
   dh_update_autotools_config -O--buildsystem=pybuild
   dh_autoreconf -O--buildsystem=pybuild
   dh_auto_configure -O--buildsystem=pybuild
I: pybuild base:311: python3.11 setup.py config
running config
   dh_auto_build -O--buildsystem=pybuild
I: pybuild base:311: /usr/bin/python3 setup.py build
running build
running build_py
creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/umap
copying umap/plot.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/umap
copying umap/sparse.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/umap
copying umap/aligned_umap.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/umap
copying umap/layouts.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/umap
copying umap/utils.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/umap
copying umap/umap_.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/umap
copying umap/parametric_umap.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/umap
copying umap/validation.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/umap
copying umap/spectral.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/umap
copying umap/distances.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/umap
copying umap/__init__.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/umap
   debian/rules override_dh_auto_test
make[1]: Entering directory '/<<PKGBUILDDIR>>'
dh_auto_test -- --system=custom --test-args="PYTHONPATH={build_dir} {interpreter} -m pytest"
I: pybuild base:311: PYTHONPATH=/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build python3.11 -m pytest
============================= test session starts ==============================
platform linux -- Python 3.11.9, pytest-8.1.2, pluggy-1.5.0
rootdir: /<<PKGBUILDDIR>>
collected 209 items

umap/tests/test_aligned_umap.py ....s                                    [  2%]
umap/tests/test_chunked_parallel_spatial_metric.py sssssssssssssssssssss [ 12%]
sssssssssssssssssssss                                                    [ 22%]
umap/tests/test_composite_models.py .s..                                 [ 24%]
umap/tests/test_data_input.py .                                          [ 24%]
umap/tests/test_densmap.py .s..                                          [ 26%]
umap/tests/test_parametric_umap.py sssssss                               [ 30%]
umap/tests/test_plot.py s                                                [ 30%]
umap/tests/test_spectral.py ..                                           [ 31%]
umap/tests/test_umap_get_feature_names_out.py ....                       [ 33%]
umap/tests/test_umap_metrics.py ............s.................s......... [ 52%]
..                                                                       [ 53%]
umap/tests/test_umap_nn.py ..sssssss..                                   [ 58%]
umap/tests/test_umap_on_iris.py ............                             [ 64%]
umap/tests/test_umap_ops.py ...FFFFFFFF.....F..                          [ 73%]
umap/tests/test_umap_repeated_data.py .........                          [ 77%]
umap/tests/test_umap_trustworthiness.py ..........                       [ 82%]
umap/tests/test_umap_validation_params.py .............................. [ 97%]
......                                                                   [100%]

=================================== FAILURES ===================================
____________________ test_disconnected_data[True-jaccard-1] ____________________

num_isolates = 1, metric = 'jaccard', force_approximation = True

    @pytest.mark.parametrize("num_isolates", [1, 5])
    @pytest.mark.parametrize("metric", ["jaccard", "hellinger"])
    @pytest.mark.parametrize("force_approximation", [True, False])
    def test_disconnected_data(num_isolates, metric, force_approximation):
        options = [False, True]
        disconnected_data = np.random.choice(a=options, size=(10, 30), p=[0.6, 1 - 0.6])
        # Add some disconnected data for the corner case test
        disconnected_data = np.vstack(
            [disconnected_data, np.zeros((num_isolates, 30), dtype="bool")]
        )
        new_columns = np.zeros((num_isolates + 10, num_isolates), dtype="bool")
        for i in range(num_isolates):
            new_columns[10 + i, i] = True
        disconnected_data = np.hstack([disconnected_data, new_columns])
>       with pytest.warns(None) as w:

umap/tests/test_umap_ops.py:119:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = WarningsChecker(record=True), expected_warning = None, match_expr = None

    def __init__(
        self,
        expected_warning: Union[Type[Warning], Tuple[Type[Warning], ...]] = Warning,
        match_expr: Optional[Union[str, Pattern[str]]] = None,
        *,
        _ispytest: bool = False,
    ) -> None:
        check_ispytest(_ispytest)
        super().__init__(_ispytest=True)
        msg = "exceptions must be derived from Warning, not %s"
        if isinstance(expected_warning, tuple):
            for exc in expected_warning:
                if not issubclass(exc, Warning):
                    raise TypeError(msg % type(exc))
            expected_warning_tup = expected_warning
        elif isinstance(expected_warning, type) and issubclass(
            expected_warning, Warning
        ):
            expected_warning_tup = (expected_warning,)
        else:
>           raise TypeError(msg % type(expected_warning))
E           TypeError: exceptions must be derived from Warning, not <class 'NoneType'>

/usr/lib/python3/dist-packages/_pytest/recwarn.py:285: TypeError
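The root cause here is not umap-learn's own logic but a pytest API change: `pytest.warns(None)` was deprecated in pytest 7.0 and raises this TypeError under pytest 8. A minimal sketch of the kind of fix the test needs, recording warnings with the standard library instead (the function name and warning text are illustrative, not upstream's):

```python
import warnings

def embed_and_record_warnings():
    # Instead of `with pytest.warns(None) as w:` (removed in pytest 8),
    # record warnings with the standard library and inspect them afterwards.
    with warnings.catch_warnings(record=True) as w:
        warnings.simplefilter("always")
        # stand-in for the UMAP().fit(disconnected_data) call the test wraps
        warnings.warn("vertices disconnected from the manifold", UserWarning)
    return [str(rec.message) for rec in w]
```

pytest's `recwarn` fixture is the other drop-in replacement when the test only needs the recorded list.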
[The identical TypeError traceback repeats for the remaining test_disconnected_data
parametrizations: [True-jaccard-5], [True-hellinger-1], [True-hellinger-5],
[False-jaccard-1], [False-jaccard-5], [False-hellinger-1], [False-hellinger-5].]
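For reference, the corner-case input these tests build can be reproduced standalone. A hedged sketch with a seeded generator (the seed, function name, and use of `default_rng` are illustrative; the test itself draws from the global `np.random` state):

```python
import numpy as np

def make_disconnected_data(num_isolates=1, seed=42):
    # 10 random boolean rows, plus `num_isolates` all-zero rows that are
    # each given a private column, so they share no features with the rest
    # and are disconnected under set-based metrics like jaccard.
    rng = np.random.default_rng(seed)
    data = rng.choice([False, True], size=(10, 30), p=[0.6, 0.4])
    data = np.vstack([data, np.zeros((num_isolates, 30), dtype=bool)])
    new_columns = np.zeros((num_isolates + 10, num_isolates), dtype=bool)
    for i in range(num_isolates):
        new_columns[10 + i, i] = True
    return np.hstack([data, new_columns])
```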
____________________________ test_umap_update_large ____________________________

iris = {'data': array([[5.1, 3.5, 1.4, 0.2],
       [4.9, 3. , 1.4, 0.2],
       [4.7, 3.2, 1.3, 0.2],
       [4.6, 3.1, 1.5,... width (cm)', 'petal length (cm)', 'petal width (cm)'], 'filename': 'iris.csv', 'data_module': 'sklearn.datasets.data'}
iris_subset_model_large = UMAP(force_approximation_algorithm=True, min_dist=0.01, n_neighbors=10, random_state=42, tqdm_kwds={'bar_format': '{desc}: {percentage:3.0f}%| {bar} {n_fmt}/{total_fmt} [{elapsed}]', 'desc': 'Epochs completed', 'disable': True})
iris_selection = array([False,  True,  True, False,  True,  True,  True,  True,  True,
        True,  True,  True,  True, False,  True,...        True,  True,  True,  True,  True, False,  True, False,  True,
        True, False, False, False,  True, False])
iris_model_large = UMAP(force_approximation_algorithm=True, min_dist=0.01, n_neighbors=10, random_state=42, tqdm_kwds={'bar_format': '{desc}: {percentage:3.0f}%| {bar} {n_fmt}/{total_fmt} [{elapsed}]', 'desc': 'Epochs completed', 'disable': True})

    def test_umap_update_large(
        iris, iris_subset_model_large, iris_selection, iris_model_large
    ):
        new_data = iris.data[~iris_selection]
        new_model = iris_subset_model_large
        new_model.update(new_data)
        comparison_graph = scipy.sparse.vstack(
            [
                iris_model_large.graph_[iris_selection],
                iris_model_large.graph_[~iris_selection],
            ]
        )
        comparison_graph = scipy.sparse.hstack(
            [comparison_graph[:, iris_selection], comparison_graph[:, ~iris_selection]]
        )
        error = np.sum(np.abs((new_model.graph_ - comparison_graph).data))
>       assert error < 3.0  # Higher error tolerance based on approx nearest neighbors
E       assert 17.72839 < 3.0

umap/tests/test_umap_ops.py:262: AssertionError
=============================== warnings summary ===============================
umap/__init__.py:36
  /<<PKGBUILDDIR>>/umap/__init__.py:36: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
    import pkg_resources

umap/tests/test_chunked_parallel_spatial_metric.py:234
  /<<PKGBUILDDIR>>/umap/tests/test_chunked_parallel_spatial_metric.py:234: PytestUnknownMarkWarning: Unknown pytest.mark.benchmark - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.benchmark(

umap/tests/test_chunked_parallel_spatial_metric.py:258
  /<<PKGBUILDDIR>>/umap/tests/test_chunked_parallel_spatial_metric.py:258: PytestUnknownMarkWarning: Unknown pytest.mark.benchmark - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.benchmark(

umap/tests/test_chunked_parallel_spatial_metric.py:288
  /<<PKGBUILDDIR>>/umap/tests/test_chunked_parallel_spatial_metric.py:288: PytestUnknownMarkWarning: Unknown pytest.mark.benchmark - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.benchmark(

umap/tests/test_chunked_parallel_spatial_metric.py:312
  /<<PKGBUILDDIR>>/umap/tests/test_chunked_parallel_spatial_metric.py:312: PytestUnknownMarkWarning: Unknown pytest.mark.benchmark - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.benchmark(

umap/plot.py:20
  /<<PKGBUILDDIR>>/umap/plot.py:20: UserWarning: The umap.plot package requires extra plotting libraries to be installed.
      You can install these via pip using
      pip install umap-learn[plot]
      or via conda using
      conda install pandas matplotlib datashader bokeh holoviews colorcet scikit-image
    warn(

umap/tests/test_aligned_umap.py::test_aligned_update
  /<<PKGBUILDDIR>>/umap/aligned_umap.py:195: NumbaTypeSafetyWarning: unsafe cast from int64 to int32. Precision may be lost.
    if i in relation_dict:

umap/tests/test_composite_models.py: 7 warnings
umap/tests/test_densmap.py: 3 warnings
umap/tests/test_umap_on_iris.py: 8 warnings
umap/tests/test_umap_ops.py: 2 warnings
umap/tests/test_umap_trustworthiness.py: 9 warnings
  /<<PKGBUILDDIR>>/umap/umap_.py:1943: UserWarning: n_jobs value -1 overridden to 1 by setting random_state. Use no seed for parallelism.
    warn(f"n_jobs value {self.n_jobs} overridden to 1 by setting random_state. Use no seed for parallelism.")

umap/tests/test_umap_get_feature_names_out.py::test_get_feature_names_out_featureunion
umap/tests/test_umap_get_feature_names_out.py::test_get_feature_names_out_featureunion
umap/tests/test_umap_repeated_data.py::test_repeated_points_large_n
umap/tests/test_umap_validation_params.py::test_umap_too_many_neighbors_warns
umap/tests/test_umap_validation_params.py::test_umap_inverse_transform_fails_expectedly
  /<<PKGBUILDDIR>>/umap/umap_.py:2433: UserWarning: n_neighbors is larger than the dataset size; truncating to X.shape[0] - 1
    warn(

umap/tests/test_umap_metrics.py::test_hellinger
  /<<PKGBUILDDIR>>/umap/tests/test_umap_metrics.py:413: RuntimeWarning: invalid value encountered in sqrt
    dist_matrix = np.sqrt(dist_matrix)

umap/tests/test_umap_on_iris.py::test_precomputed_transform_on_iris
umap/tests/test_umap_on_iris.py::test_precomputed_sparse_transform_on_iris
umap/tests/test_umap_ops.py::test_multi_component_layout_precomputed
umap/tests/test_umap_ops.py::test_disconnected_data_precomputed[True-1]
umap/tests/test_umap_ops.py::test_disconnected_data_precomputed[False-1]
umap/tests/test_umap_trustworthiness.py::test_sparse_precomputed_metric_umap_trustworthiness
umap/tests/test_umap_validation_params.py::test_umap_update_bad_params
  /<<PKGBUILDDIR>>/umap/umap_.py:1857: UserWarning: using precomputed metric; inverse_transform will be unavailable
    warn("using precomputed metric; inverse_transform will be unavailable")

umap/tests/test_umap_on_iris.py::test_precomputed_transform_on_iris
umap/tests/test_umap_on_iris.py::test_precomputed_sparse_transform_on_iris
  /<<PKGBUILDDIR>>/umap/umap_.py:2959: UserWarning: Transforming new data with precomputed metric. We are assuming the input data is a matrix of distances from the new points to the points in the training set. If the input matrix is sparse, it should contain distances from the new points to their nearest neighbours or approximate nearest neighbours in the training set.
    warn(

umap/tests/test_umap_ops.py::test_multi_component_layout
umap/tests/test_umap_ops.py::test_multi_component_layout_precomputed
  /usr/lib/python3/dist-packages/sklearn/manifold/_spectral_embedding.py:301: UserWarning: Graph is not fully connected, spectral embedding may not work as expected.
    warnings.warn(

umap/tests/test_umap_ops.py::test_disconnected_data_precomputed[True-1]
  /<<PKGBUILDDIR>>/umap/umap_.py:126: UserWarning: A few of your vertices were disconnected from the manifold.  This shouldn't cause problems.
  Disconnection_distance = 1 has removed 3 edges.
  It has only fully disconnected 1 vertices.
  Use umap.utils.disconnected_vertices() to identify them.
    warn(

umap/tests/test_umap_ops.py::test_disconnected_data_precomputed[False-1]
  /<<PKGBUILDDIR>>/umap/umap_.py:126: UserWarning: A few of your vertices were disconnected from the manifold.  This shouldn't cause problems.
  Disconnection_distance = 1 has removed 32 edges.
  It has only fully disconnected 1 vertices.
  Use umap.utils.disconnected_vertices() to identify them.
    warn(

umap/tests/test_umap_validation_params.py::test_umap_inverse_transform_fails_expectedly
  /<<PKGBUILDDIR>>/umap/umap_.py:1879: UserWarning: gradient function is not yet implemented for dice distance metric; inverse_transform will be unavailable
    warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
FAILED umap/tests/test_umap_ops.py::test_disconnected_data[True-jaccard-1] - ...
FAILED umap/tests/test_umap_ops.py::test_disconnected_data[True-jaccard-5] - ...
FAILED umap/tests/test_umap_ops.py::test_disconnected_data[True-hellinger-1]
FAILED umap/tests/test_umap_ops.py::test_disconnected_data[True-hellinger-5]
FAILED umap/tests/test_umap_ops.py::test_disconnected_data[False-jaccard-1]
FAILED umap/tests/test_umap_ops.py::test_disconnected_data[False-jaccard-5]
FAILED umap/tests/test_umap_ops.py::test_disconnected_data[False-hellinger-1]
FAILED umap/tests/test_umap_ops.py::test_disconnected_data[False-hellinger-5]
FAILED umap/tests/test_umap_ops.py::test_umap_update_large - assert 17.72839 ...
====== 9 failed, 138 passed, 62 skipped, 56 warnings in 284.88s (0:04:44) ======
E: pybuild pybuild:389: test: plugin custom failed with: exit code=1: PYTHONPATH=/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build python3.11 -m pytest
dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p 3.11 --system=custom "--test-args=PYTHONPATH={build_dir} {interpreter} -m pytest" returned exit code 13
make[1]: *** [debian/rules:16: override_dh_auto_test] Error 25
make[1]: Leaving directory '/<<PKGBUILDDIR>>'
make: *** [debian/rules:12: binary] Error 2
dpkg-buildpackage: error: debian/rules binary subprocess returned exit status 2
--------------------------------------------------------------------------------

The above is just how the build ends and not necessarily the most relevant part.
If required, the full build log is available here:

https://people.debian.org/~sanvila/build-logs/202405/

About the archive rebuild: The build was made on virtual machines
of type m6a.large and r6a.large from AWS, using sbuild and a
reduced chroot with only build-essential packages.

If you could not reproduce the bug, please contact me privately, as I
am willing to provide ssh access to a virtual machine where the bug is
fully reproducible.

If this is really a bug in one of the build-depends, please use
reassign and affects, so that this is still visible in the BTS web
page for this package.

Thanks.
