Package: src:python-ewoksdata
Version: 0.6.0~rc0-5
Severity: serious
Tags: ftbfs forky sid

Dear maintainer:

During a rebuild of all packages in unstable, this package failed to build.

Below you will find the last part of the build log (probably the most
relevant part, but not necessarily). If required, the full build log
is available here:

https://people.debian.org/~sanvila/build-logs/202603/

About the archive rebuild: the build was made on AWS virtual machines,
using sbuild and a reduced chroot containing only build-essential packages.

If you cannot reproduce the bug, please contact me privately; I am
willing to provide ssh access to a virtual machine where the bug is
fully reproducible.

If this is really a bug in one of the build-depends, please use
reassign and add an affects on src:python-ewoksdata, so that it remains
visible on the BTS web page for this package.

Thanks.

--------------------------------------------------------------------------------
[...]
 debian/rules clean
dh clean --buildsystem=pybuild
   dh_auto_clean -O--buildsystem=pybuild
   dh_autoreconf_clean -O--buildsystem=pybuild
   dh_clean -O--buildsystem=pybuild
 debian/rules binary
dh binary --buildsystem=pybuild
   dh_update_autotools_config -O--buildsystem=pybuild
   dh_autoreconf -O--buildsystem=pybuild
   dh_auto_configure -O--buildsystem=pybuild
   dh_auto_build -O--buildsystem=pybuild
I: pybuild plugin_pyproject:142: Building wheel for python3.14 with "build" module
I: pybuild base:385: python3.14 -m build --skip-dependency-check --no-isolation --wheel --outdir /<<PKGBUILDDIR>>/.pybuild/cpython3_3.14
* Building wheel...
/usr/lib/python3/dist-packages/setuptools/dist.py:759: SetuptoolsDeprecationWarning: License classifiers are deprecated.

[... snipped ...]

    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
h5py/_debian_h5py_serial/_objects.pyx:54: in 
h5py._debian_h5py_serial._objects.with_phil.wrapper
    ???
h5py/_debian_h5py_serial/_objects.pyx:55: in 
h5py._debian_h5py_serial._objects.with_phil.wrapper
    ???
/usr/lib/python3/dist-packages/h5py/_debian_h5py_serial/_hl/dataset.py:1107: in 
__setitem__
    mspace = h5s.create_simple(selection.expand_shape(mshape))
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <h5py._debian_h5py_serial._hl.selections.SimpleSelection object at 
0x7fa3a19ba2d0>
source_shape = (0,)

    def expand_shape(self, source_shape):
        """Match the dimensions of an array to be broadcast to the selection
    
        The returned shape describes an array of the same size as the input
        shape, but its dimensions
    
        E.g. with a dataset shape (10, 5, 4, 2), writing like this::
    
            ds[..., 0] = np.ones((5, 4))
    
        The source shape (5, 4) will expand to (1, 5, 4, 1).
        Then the broadcast method below repeats that chunk 10
        times to write to an effective shape of (10, 5, 4, 1).
        """
        start, count, step, scalar = self._sel
    
        rank = len(count)
        remaining_src_dims = list(source_shape)
    
        eshape = []
        for idx in range(1, rank + 1):
            if len(remaining_src_dims) == 0 or scalar[-idx]:  # Skip scalar axes
                eshape.append(1)
            else:
                t = remaining_src_dims.pop()
                if t == 1 or count[-idx] == t:
                    eshape.append(t)
                else:
>                   raise TypeError("Can't broadcast %s -> %s" % (source_shape, self.array_shape))  # array shape
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                   TypeError: Can't broadcast (0,) -> (0, 10, 20)

/usr/lib/python3/dist-packages/h5py/_debian_h5py_serial/_hl/selections.py:265: 
TypeError
_____________ test_stack_dataset_writer[False-False-False-0.1-3-4] _____________

tmpdir = 
local('/tmp/pytest-of-sbuild/pytest-1/test_stack_dataset_writer_Fals45')
nstack = 4, npoints = 3, flush_period = 0.1, known_npoints = False
known_nstack = False, append_stacks_in_parallel = False

    @pytest.mark.parametrize("nstack", (1, 4))
    @pytest.mark.parametrize("npoints", (1, 3, 1000))
    @pytest.mark.parametrize("flush_period", (None, 0.1))
    @pytest.mark.parametrize("known_npoints", (True, False))
    @pytest.mark.parametrize("known_nstack", (True, False))
    @pytest.mark.parametrize("append_stacks_in_parallel", (True, False))
    def test_stack_dataset_writer(
        tmpdir,
        nstack,
        npoints,
        flush_period,
        known_npoints,
        known_nstack,
        append_stacks_in_parallel,
    ):
        expected = [list() for _ in range(nstack)]
        filename = str(tmpdir / "test.h5")
        if flush_period is None:
            sleep_time = None
        else:
            sleep_time = flush_period + 0.1
        isleep = (nstack * npoints) // 3
    
        kwargs = {"flush_period": flush_period}
        if known_npoints:
            kwargs["npoints"] = npoints
        if known_nstack:
            kwargs["nstack"] = nstack
    
        if append_stacks_in_parallel:
            itpoints = itertools.product(range(npoints), range(nstack))
        else:
            itpoints = itertools.product(range(nstack), range(npoints))
    
        with h5py.File(filename, mode="w") as f:
>           with dataset_writer.StackDatasetWriter(f, "data", **kwargs) as writer:
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_dataset_writer.py:75: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
ewoksdata/data/hdf5/dataset_writer.py:36: in __exit__
    self.flush_buffer()
ewoksdata/data/hdf5/dataset_writer.py:306: in flush_buffer
    self._dataset[i_dim0, istart_dim1 : istart_dim1 + n_dim1, ...] = (
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
h5py/_debian_h5py_serial/_objects.pyx:54: in 
h5py._debian_h5py_serial._objects.with_phil.wrapper
    ???
h5py/_debian_h5py_serial/_objects.pyx:55: in 
h5py._debian_h5py_serial._objects.with_phil.wrapper
    ???
/usr/lib/python3/dist-packages/h5py/_debian_h5py_serial/_hl/dataset.py:1107: in 
__setitem__
    mspace = h5s.create_simple(selection.expand_shape(mshape))
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <h5py._debian_h5py_serial._hl.selections.SimpleSelection object at 
0x7fa3a1540a70>
source_shape = (0,)

    def expand_shape(self, source_shape):
        """Match the dimensions of an array to be broadcast to the selection
    
        The returned shape describes an array of the same size as the input
        shape, but its dimensions
    
        E.g. with a dataset shape (10, 5, 4, 2), writing like this::
    
            ds[..., 0] = np.ones((5, 4))
    
        The source shape (5, 4) will expand to (1, 5, 4, 1).
        Then the broadcast method below repeats that chunk 10
        times to write to an effective shape of (10, 5, 4, 1).
        """
        start, count, step, scalar = self._sel
    
        rank = len(count)
        remaining_src_dims = list(source_shape)
    
        eshape = []
        for idx in range(1, rank + 1):
            if len(remaining_src_dims) == 0 or scalar[-idx]:  # Skip scalar axes
                eshape.append(1)
            else:
                t = remaining_src_dims.pop()
                if t == 1 or count[-idx] == t:
                    eshape.append(t)
                else:
>                   raise TypeError("Can't broadcast %s -> %s" % (source_shape, self.array_shape))  # array shape
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                   TypeError: Can't broadcast (0,) -> (0, 10, 20)

/usr/lib/python3/dist-packages/h5py/_debian_h5py_serial/_hl/selections.py:265: 
TypeError
___________ test_stack_dataset_writer[False-False-False-0.1-1000-4] ____________

tmpdir = 
local('/tmp/pytest-of-sbuild/pytest-1/test_stack_dataset_writer_Fals47')
nstack = 4, npoints = 1000, flush_period = 0.1, known_npoints = False
known_nstack = False, append_stacks_in_parallel = False

    @pytest.mark.parametrize("nstack", (1, 4))
    @pytest.mark.parametrize("npoints", (1, 3, 1000))
    @pytest.mark.parametrize("flush_period", (None, 0.1))
    @pytest.mark.parametrize("known_npoints", (True, False))
    @pytest.mark.parametrize("known_nstack", (True, False))
    @pytest.mark.parametrize("append_stacks_in_parallel", (True, False))
    def test_stack_dataset_writer(
        tmpdir,
        nstack,
        npoints,
        flush_period,
        known_npoints,
        known_nstack,
        append_stacks_in_parallel,
    ):
        expected = [list() for _ in range(nstack)]
        filename = str(tmpdir / "test.h5")
        if flush_period is None:
            sleep_time = None
        else:
            sleep_time = flush_period + 0.1
        isleep = (nstack * npoints) // 3
    
        kwargs = {"flush_period": flush_period}
        if known_npoints:
            kwargs["npoints"] = npoints
        if known_nstack:
            kwargs["nstack"] = nstack
    
        if append_stacks_in_parallel:
            itpoints = itertools.product(range(npoints), range(nstack))
        else:
            itpoints = itertools.product(range(nstack), range(npoints))
    
        with h5py.File(filename, mode="w") as f:
>           with dataset_writer.StackDatasetWriter(f, "data", **kwargs) as writer:
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_dataset_writer.py:75: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
ewoksdata/data/hdf5/dataset_writer.py:36: in __exit__
    self.flush_buffer()
ewoksdata/data/hdf5/dataset_writer.py:306: in flush_buffer
    self._dataset[i_dim0, istart_dim1 : istart_dim1 + n_dim1, ...] = (
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
h5py/_debian_h5py_serial/_objects.pyx:54: in 
h5py._debian_h5py_serial._objects.with_phil.wrapper
    ???
h5py/_debian_h5py_serial/_objects.pyx:55: in 
h5py._debian_h5py_serial._objects.with_phil.wrapper
    ???
/usr/lib/python3/dist-packages/h5py/_debian_h5py_serial/_hl/dataset.py:1107: in 
__setitem__
    mspace = h5s.create_simple(selection.expand_shape(mshape))
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <h5py._debian_h5py_serial._hl.selections.SimpleSelection object at 
0x7fa3a1490710>
source_shape = (0,)

    def expand_shape(self, source_shape):
        """Match the dimensions of an array to be broadcast to the selection
    
        The returned shape describes an array of the same size as the input
        shape, but its dimensions
    
        E.g. with a dataset shape (10, 5, 4, 2), writing like this::
    
            ds[..., 0] = np.ones((5, 4))
    
        The source shape (5, 4) will expand to (1, 5, 4, 1).
        Then the broadcast method below repeats that chunk 10
        times to write to an effective shape of (10, 5, 4, 1).
        """
        start, count, step, scalar = self._sel
    
        rank = len(count)
        remaining_src_dims = list(source_shape)
    
        eshape = []
        for idx in range(1, rank + 1):
            if len(remaining_src_dims) == 0 or scalar[-idx]:  # Skip scalar axes
                eshape.append(1)
            else:
                t = remaining_src_dims.pop()
                if t == 1 or count[-idx] == t:
                    eshape.append(t)
                else:
>                   raise TypeError("Can't broadcast %s -> %s" % (source_shape, self.array_shape))  # array shape
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                   TypeError: Can't broadcast (0,) -> (0, 10, 20)

/usr/lib/python3/dist-packages/h5py/_debian_h5py_serial/_hl/selections.py:265: 
TypeError
=========================== short test summary info ============================
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[True-True-True-None-1-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[True-True-True-None-3-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[True-True-True-None-1000-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[True-True-True-0.1-1-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[True-True-True-0.1-3-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[True-True-True-0.1-1000-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[True-True-False-None-1000-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[True-True-False-0.1-1-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[True-True-False-0.1-1000-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[True-False-True-None-1-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[True-False-True-None-3-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[True-False-True-0.1-1-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[True-False-True-0.1-3-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[True-False-True-0.1-1000-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[True-False-False-0.1-1-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-True-True-None-1-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-True-True-None-3-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-True-True-None-1000-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-True-True-0.1-1-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-True-True-0.1-3-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-True-True-0.1-1000-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-True-False-None-1000-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-True-False-0.1-1-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-True-False-0.1-3-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-True-False-0.1-1000-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-False-True-None-1-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-False-True-None-3-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-False-True-None-1000-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-False-True-0.1-1-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-False-True-0.1-3-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-False-True-0.1-1000-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-False-False-0.1-1-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-False-False-0.1-3-4]
FAILED tests/test_dataset_writer.py::test_stack_dataset_writer[False-False-False-0.1-1000-4]
======================== 34 failed, 78 passed in 28.85s ========================
E: pybuild pybuild:485: test: plugin pyproject failed with: exit code=1: cd '/<<PKGBUILDDIR>>/.pybuild/cpython3_3.13/build'; python3.13 -m pytest tests/test_contextiterator.py tests/test_data_hdf5.py tests/test_data_nexus.py tests/test_dataset_writer.py
dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p "3.14 3.13" --parallel=2 returned exit code 13
make: *** [debian/rules:6: binary] Error 25
dpkg-buildpackage: error: debian/rules binary subprocess failed with exit status 2
--------------------------------------------------------------------------------
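For what it's worth, the immediate cause appears to be ewoksdata's flush_buffer
writing a zero-length source array into the dataset. The expand_shape logic
quoted in the traceback can be reproduced standalone without h5py; the sketch
below is a simplified mirror of it (count and scalar stand in for h5py's
internal self._sel state, and count is used in place of self.array_shape in
the error message), showing why a (0,) source cannot be broadcast to a
(0, 10, 20) selection:

```python
def expand_shape(count, scalar, source_shape):
    # Simplified mirror of h5py's SimpleSelection.expand_shape from the
    # traceback above: count is the per-axis selection size, scalar marks
    # axes that were indexed with a scalar, source_shape is the array
    # being written.
    remaining_src_dims = list(source_shape)
    eshape = []
    for idx in range(1, len(count) + 1):
        if len(remaining_src_dims) == 0 or scalar[-idx]:  # skip scalar axes
            eshape.append(1)
        else:
            t = remaining_src_dims.pop()
            if t == 1 or count[-idx] == t:
                eshape.append(t)
            else:
                # h5py reports self.array_shape here; count stands in for it
                raise TypeError(
                    "Can't broadcast %s -> %s" % (source_shape, count)
                )
    return tuple(reversed(eshape))

# Docstring example from the traceback: (5, 4) expands to (1, 5, 4, 1).
print(expand_shape((10, 5, 4, 1), (False, False, False, True), (5, 4)))
# -> (1, 5, 4, 1)

# The failing case from the build log: the trailing source dim 0 matches
# neither 1 nor count[-1] == 20, so the write raises.
try:
    expand_shape((0, 10, 20), (False, False, False), (0,))
except TypeError as exc:
    print(exc)  # Can't broadcast (0,) -> (0, 10, 20)
```

In other words, once the buffer is empty, no source shape can satisfy the
non-zero trailing dimensions of the selection, so skipping the write when
the buffer has zero length would avoid the error.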
