Source: python-fastjsonschema
Version: 2.19.0-1
Severity: serious
Justification: FTBFS
Tags: trixie sid ftbfs
User: [email protected]
Usertags: ftbfs-20240313 ftbfs-trixie
Hi,

During a rebuild of all packages in sid, your package failed to build
on amd64.

Relevant part (hopefully):
> make[1]: Entering directory '/<<PKGBUILDDIR>>'
> dh_auto_test
> I: pybuild pybuild:308: cp -r /usr/share/json-schema-test-suite/ /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_fastjsonschema/build/JSON-Schema-Test-Suite
> I: pybuild base:305: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_fastjsonschema/build; python3.12 -m pytest tests
> ============================= test session starts ==============================
> platform linux -- Python 3.12.2, pytest-8.0.2, pluggy-1.4.0
> benchmark: 4.0.0 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
> rootdir: /<<PKGBUILDDIR>>
> plugins: benchmark-4.0.0
> collected 1884 items
>
> tests/benchmarks/test_benchmark.py ............ [  0%]
> tests/json_schema/test_draft04.py ...................................... [  2%]
> ...........xx........................................................... [  6%]
> ..................................................................x..... [ 10%]
> ........................................................................ [ 14%]
> .....................xx................................................. [ 17%]
> ................................................ [ 20%]
> tests/json_schema/test_draft06.py ...................................... [ 22%]
> ............................................................xx.......... [ 26%]
> ........................................................................ [ 30%]
> ..........................................................x............. [ 33%]
> ........................................................................ [ 37%]
> ........................................................................ [ 41%]
> ...xx................................................................... [ 45%]
> ................................. [ 47%]
> tests/json_schema/test_draft07.py ...................................... [ 49%]
> ............................................................xx.......... [ 53%]
> ........................................................................ [ 56%]
> ........................................................................ [ 60%]
> ..............x.....................xxxx................XXXXXxXXX....... [ 64%]
> ........................................................................ [ 68%]
> ..................................................................xx.... [ 72%]
> ........................................................................ [ 75%]
> ........................ [ 77%]
> tests/test_array.py .................................................... [ 79%]
> ............ [ 80%]
> tests/test_boolean.py ....... [ 80%]
> tests/test_boolean_schema.py ........ [ 81%]
> tests/test_common.py .................................. [ 83%]
> tests/test_compile_to_code.py ..... [ 83%]
> tests/test_composition.py .......... [ 84%]
> tests/test_const.py ....... [ 84%]
> tests/test_default.py .......... [ 84%]
> tests/test_examples.py .... [ 85%]
> tests/test_exceptions.py ............. [ 85%]
> tests/test_format.py ......................... [ 87%]
> tests/test_integration.py ................. [ 88%]
> tests/test_null.py ...... [ 88%]
> tests/test_number.py ................................................... [ 91%]
> ..................................................... [ 93%]
> tests/test_object.py ................................................... [ 96%]
> ........... [ 97%]
> tests/test_pattern_properties.py ...F [ 97%]
> tests/test_pattern_serialization.py . [ 97%]
> tests/test_security.py ..................... [ 98%]
> tests/test_string.py ........................F.. [100%]
>
> =================================== FAILURES ===================================
> _____________________ test_pattern_with_escape_no_warnings _____________________
>
> asserter = <function asserter.<locals>.f at 0x7f7f51003b00>
>
>     def test_pattern_with_escape_no_warnings(asserter):
>         value = {
>             'bar': {}
>         }
>
> >       with pytest.warns(None) as record:
>
> tests/test_pattern_properties.py:62:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> self = WarningsChecker(record=True), expected_warning = None, match_expr = None
>
>     def __init__(
>         self,
>         expected_warning: Optional[
>             Union[Type[Warning], Tuple[Type[Warning], ...]]
>         ] = Warning,
>         match_expr: Optional[Union[str, Pattern[str]]] = None,
>         *,
>         _ispytest: bool = False,
>     ) -> None:
>         check_ispytest(_ispytest)
>         super().__init__(_ispytest=True)
>
>         msg = "exceptions must be derived from Warning, not %s"
>         if expected_warning is None:
> >           warnings.warn(WARNS_NONE_ARG, stacklevel=4)
> E           pytest.PytestRemovedIn8Warning: Passing None has been deprecated.
> E           See https://docs.pytest.org/en/latest/how-to/capture-warnings.html#additional-use-cases-of-warnings-in-tests for alternatives in common use cases.
>
> /usr/lib/python3/dist-packages/_pytest/recwarn.py:281: PytestRemovedIn8Warning
> _____________________ test_pattern_with_escape_no_warnings _____________________
>
> asserter = <function asserter.<locals>.f at 0x7f7fb5947420>
>
>     def test_pattern_with_escape_no_warnings(asserter):
> >       with pytest.warns(None) as record:
>
> tests/test_string.py:77:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/_pytest/recwarn.py:281: in __init__
>     warnings.warn(WARNS_NONE_ARG, stacklevel=4)
> /usr/lib/python3/dist-packages/_pytest/runner.py:342: in from_call
>     result: Optional[TResult] = func()
> /usr/lib/python3/dist-packages/_pytest/runner.py:263: in <lambda>
>     lambda: ihook(item=item, **kwds), when=when, reraise=reraise
> /usr/lib/python3/dist-packages/pluggy/_hooks.py:501: in __call__
>     return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult)
> /usr/lib/python3/dist-packages/pluggy/_manager.py:119: in _hookexec
>     return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
> /usr/lib/python3/dist-packages/_pytest/threadexception.py:87: in pytest_runtest_call
>     yield from thread_exception_runtest_hook()
> /usr/lib/python3/dist-packages/_pytest/threadexception.py:63: in thread_exception_runtest_hook
>     yield
> /usr/lib/python3/dist-packages/_pytest/unraisableexception.py:90: in pytest_runtest_call
>     yield from unraisable_exception_runtest_hook()
> /usr/lib/python3/dist-packages/_pytest/unraisableexception.py:65: in unraisable_exception_runtest_hook
>     yield
> /usr/lib/python3/dist-packages/_pytest/logging.py:839: in pytest_runtest_call
>     yield from self._runtest_for(item, "call")
> /usr/lib/python3/dist-packages/_pytest/logging.py:822: in _runtest_for
>     yield
> /usr/lib/python3/dist-packages/_pytest/capture.py:882: in pytest_runtest_call
>     return (yield)
> /usr/lib/python3/dist-packages/_pytest/skipping.py:256: in pytest_runtest_call
>     return (yield)
> /usr/lib/python3/dist-packages/_pytest/runner.py:178: in pytest_runtest_call
>     raise e
> /usr/lib/python3/dist-packages/_pytest/runner.py:170: in pytest_runtest_call
>     item.runtest()
> /usr/lib/python3/dist-packages/_pytest/python.py:1831: in runtest
>     self.ihook.pytest_pyfunc_call(pyfuncitem=self)
> /usr/lib/python3/dist-packages/pluggy/_hooks.py:501: in __call__
>     return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult)
> /usr/lib/python3/dist-packages/pluggy/_manager.py:119: in _hookexec
>     return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
> /usr/lib/python3/dist-packages/_pytest/python.py:194: in pytest_pyfunc_call
>     result = testfunction(**testargs)
> tests/test_pattern_properties.py:62: in test_pattern_with_escape_no_warnings
>     with pytest.warns(None) as record:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> self = WarningsChecker(record=True), expected_warning = None, match_expr = None
>
>     def __init__(
>         self,
>         expected_warning: Optional[
>             Union[Type[Warning], Tuple[Type[Warning], ...]]
>         ] = Warning,
>         match_expr: Optional[Union[str, Pattern[str]]] = None,
>         *,
>         _ispytest: bool = False,
>     ) -> None:
>         check_ispytest(_ispytest)
>         super().__init__(_ispytest=True)
>
>         msg = "exceptions must be derived from Warning, not %s"
>         if expected_warning is None:
> >           warnings.warn(WARNS_NONE_ARG, stacklevel=4)
> E           pytest.PytestRemovedIn8Warning: Passing None has been deprecated.
> E           See https://docs.pytest.org/en/latest/how-to/capture-warnings.html#additional-use-cases-of-warnings-in-tests for alternatives in common use cases.
>
> /usr/lib/python3/dist-packages/_pytest/recwarn.py:281: PytestRemovedIn8Warning
>
> ------------------------------------------- benchmark: 12 tests -------------------------------------------
> Name (time in us)  Min  Max  Mean  StdDev  Median  IQR  Outliers  OPS (Kops/s)  Rounds  Iterations
> -----------------------------------------------------------------------------------------------------------
> test_benchmark_bad_values[value0]  2.2300 (1.0)  272.7280 (1.11)  2.4419 (1.0)  2.6955 (1.0)  2.3350 (1.0)  0.0540 (1.04)  219;3637  409.5159 (1.0)  75569  1
> test_benchmark_bad_values[value1]  2.3630 (1.06)  1,761.5200 (7.17)  2.7013 (1.11)  7.9608 (2.95)  2.4840 (1.06)  0.0520 (1.0)  235;4722  370.1919 (0.90)  79504  1
> test_benchmark_bad_values[value3]  2.7170 (1.22)  4,163.5730 (16.95)  4.0247 (1.65)  32.5655 (12.08)  2.8650 (1.23)  0.0690 (1.33)  538;5700  248.4679 (0.61)  72349  1
> test_benchmark_bad_values[value2]  2.8480 (1.28)  570.3250 (2.32)  3.2063 (1.31)  4.3837 (1.63)  3.0010 (1.29)  0.0720 (1.38)  305;2816  311.8842 (0.76)  69057  1
> test_benchmark_bad_values[value5]  3.6340 (1.63)  8,426.9510 (34.31)  5.0813 (2.08)  44.4455 (16.49)  3.8060 (1.63)  0.0790 (1.52)  212;3379  196.7994 (0.48)  44973  1
> test_benchmark_bad_values[value6]  4.6890 (2.10)  245.6420 (1.0)  5.1975 (2.13)  5.4545 (2.02)  4.8670 (2.08)  0.0890 (1.71)  288;1954  192.4012 (0.47)  46043  1
> test_benchmark_bad_values[value4]  5.0600 (2.27)  3,852.4290 (15.68)  8.6393 (3.54)  37.8905 (14.06)  5.3120 (2.27)  0.1500 (2.88)  850;5384  115.7505 (0.28)  40701  1
> test_benchmark_ok_values[value3]  6.6810 (3.00)  1,804.1470 (7.34)  7.6546 (3.13)  20.5684 (7.63)  6.9330 (2.97)  0.1200 (2.31)  134;1504  130.6407 (0.32)  33819  1
> test_benchmark_ok_values[value2]  6.6900 (3.00)  537.9700 (2.19)  7.4449 (3.05)  6.3472 (2.35)  6.9520 (2.98)  0.1140 (2.19)  481;2046  134.3202 (0.33)  43257  1
> test_benchmark_ok_values[value1]  6.7830 (3.04)  1,269.2420 (5.17)  7.7584 (3.18)  9.3403 (3.47)  7.0320 (3.01)  0.1250 (2.40)  629;2624  128.8924 (0.31)  45190  1
> test_benchmark_ok_values[value0]  6.8410 (3.07)  250.6740 (1.02)  7.5808 (3.10)  5.7703 (2.14)  7.0800 (3.03)  0.1210 (2.33)  201;759  131.9127 (0.32)  17215  1
> test_benchmark_bad_values[value7]  7.5160 (3.37)  1,705.9220 (6.94)  8.3049 (3.40)  11.7178 (4.35)  7.7820 (3.33)  0.1310 (2.52)  204;1414  120.4109 (0.29)  32031  1
> -----------------------------------------------------------------------------------------------------------
>
> Legend:
>   Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.
>   OPS: Operations Per Second, computed as 1 / Mean
> =========================== short test summary info ============================
> FAILED tests/test_pattern_properties.py::test_pattern_with_escape_no_warnings
> FAILED tests/test_string.py::test_pattern_with_escape_no_warnings - pytest.Py...
> ============ 2 failed, 1854 passed, 20 xfailed, 8 xpassed in 32.50s ============
> E: pybuild pybuild:389: test: plugin pyproject failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_fastjsonschema/build; python3.12 -m pytest tests
> I: pybuild pybuild:308: cp -r /usr/share/json-schema-test-suite/ /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_fastjsonschema/build/JSON-Schema-Test-Suite
> I: pybuild base:305: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_fastjsonschema/build; python3.11 -m pytest tests
> ============================= test session starts ==============================
> platform linux -- Python 3.11.8, pytest-8.0.2, pluggy-1.4.0
> benchmark: 4.0.0 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
> rootdir: /<<PKGBUILDDIR>>
> plugins: benchmark-4.0.0
> collected 1884 items
>
> tests/benchmarks/test_benchmark.py ............ [  0%]
> tests/json_schema/test_draft04.py ...................................... [  2%]
> ...........xx........................................................... [  6%]
> ..................................................................x..... [ 10%]
> ........................................................................ [ 14%]
> .....................xx................................................. [ 17%]
> ................................................ [ 20%]
> tests/json_schema/test_draft06.py ...................................... [ 22%]
> ............................................................xx.......... [ 26%]
> ........................................................................ [ 30%]
> ..........................................................x............. [ 33%]
> ........................................................................ [ 37%]
> ........................................................................ [ 41%]
> ...xx................................................................... [ 45%]
> ................................. [ 47%]
> tests/json_schema/test_draft07.py ...................................... [ 49%]
> ............................................................xx.......... [ 53%]
> ........................................................................ [ 56%]
> ........................................................................ [ 60%]
> ..............x.....................xxxx................XXXXXxXXX....... [ 64%]
> ........................................................................ [ 68%]
> ..................................................................xx.... [ 72%]
> ........................................................................ [ 75%]
> ........................ [ 77%]
> tests/test_array.py .................................................... [ 79%]
> ............ [ 80%]
> tests/test_boolean.py ....... [ 80%]
> tests/test_boolean_schema.py ........ [ 81%]
> tests/test_common.py .................................. [ 83%]
> tests/test_compile_to_code.py ..... [ 83%]
> tests/test_composition.py .......... [ 84%]
> tests/test_const.py ....... [ 84%]
> tests/test_default.py .......... [ 84%]
> tests/test_examples.py .... [ 85%]
> tests/test_exceptions.py ............. [ 85%]
> tests/test_format.py ......................... [ 87%]
> tests/test_integration.py ................. [ 88%]
> tests/test_null.py ...... [ 88%]
> tests/test_number.py ................................................... [ 91%]
> ..................................................... [ 93%]
> tests/test_object.py ................................................... [ 96%]
> ........... [ 97%]
> tests/test_pattern_properties.py ...F [ 97%]
> tests/test_pattern_serialization.py . [ 97%]
> tests/test_security.py ..................... [ 98%]
> tests/test_string.py ........................F.. [100%]
>
> =================================== FAILURES ===================================
> _____________________ test_pattern_with_escape_no_warnings _____________________
>
> asserter = <function asserter.<locals>.f at 0x7f3995a1a520>
>
>     def test_pattern_with_escape_no_warnings(asserter):
>         value = {
>             'bar': {}
>         }
>
> >       with pytest.warns(None) as record:
>
> tests/test_pattern_properties.py:62:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> self = WarningsChecker(record=True), expected_warning = None, match_expr = None
>
>     def __init__(
>         self,
>         expected_warning: Optional[
>             Union[Type[Warning], Tuple[Type[Warning], ...]]
>         ] = Warning,
>         match_expr: Optional[Union[str, Pattern[str]]] = None,
>         *,
>         _ispytest: bool = False,
>     ) -> None:
>         check_ispytest(_ispytest)
>         super().__init__(_ispytest=True)
>
>         msg = "exceptions must be derived from Warning, not %s"
>         if expected_warning is None:
> >           warnings.warn(WARNS_NONE_ARG, stacklevel=4)
> E           pytest.PytestRemovedIn8Warning: Passing None has been deprecated.
> E           See https://docs.pytest.org/en/latest/how-to/capture-warnings.html#additional-use-cases-of-warnings-in-tests for alternatives in common use cases.
>
> /usr/lib/python3/dist-packages/_pytest/recwarn.py:281: PytestRemovedIn8Warning
> _____________________ test_pattern_with_escape_no_warnings _____________________
>
> asserter = <function asserter.<locals>.f at 0x7f3995601b20>
>
>     def test_pattern_with_escape_no_warnings(asserter):
> >       with pytest.warns(None) as record:
>
> tests/test_string.py:77:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/_pytest/recwarn.py:281: in __init__
>     warnings.warn(WARNS_NONE_ARG, stacklevel=4)
> /usr/lib/python3/dist-packages/_pytest/runner.py:342: in from_call
>     result: Optional[TResult] = func()
> /usr/lib/python3/dist-packages/_pytest/runner.py:263: in <lambda>
>     lambda: ihook(item=item, **kwds), when=when, reraise=reraise
> /usr/lib/python3/dist-packages/pluggy/_hooks.py:501: in __call__
>     return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult)
> /usr/lib/python3/dist-packages/pluggy/_manager.py:119: in _hookexec
>     return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
> /usr/lib/python3/dist-packages/_pytest/threadexception.py:87: in pytest_runtest_call
>     yield from thread_exception_runtest_hook()
> /usr/lib/python3/dist-packages/_pytest/threadexception.py:63: in thread_exception_runtest_hook
>     yield
> /usr/lib/python3/dist-packages/_pytest/unraisableexception.py:90: in pytest_runtest_call
>     yield from unraisable_exception_runtest_hook()
> /usr/lib/python3/dist-packages/_pytest/unraisableexception.py:65: in unraisable_exception_runtest_hook
>     yield
> /usr/lib/python3/dist-packages/_pytest/logging.py:839: in pytest_runtest_call
>     yield from self._runtest_for(item, "call")
> /usr/lib/python3/dist-packages/_pytest/logging.py:822: in _runtest_for
>     yield
> /usr/lib/python3/dist-packages/_pytest/capture.py:882: in pytest_runtest_call
>     return (yield)
> /usr/lib/python3/dist-packages/_pytest/skipping.py:256: in pytest_runtest_call
>     return (yield)
> /usr/lib/python3/dist-packages/_pytest/runner.py:178: in pytest_runtest_call
>     raise e
> /usr/lib/python3/dist-packages/_pytest/runner.py:170: in pytest_runtest_call
>     item.runtest()
> /usr/lib/python3/dist-packages/_pytest/python.py:1831: in runtest
>     self.ihook.pytest_pyfunc_call(pyfuncitem=self)
> /usr/lib/python3/dist-packages/pluggy/_hooks.py:501: in __call__
>     return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult)
> /usr/lib/python3/dist-packages/pluggy/_manager.py:119: in _hookexec
>     return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
> /usr/lib/python3/dist-packages/_pytest/python.py:194: in pytest_pyfunc_call
>     result = testfunction(**testargs)
> tests/test_pattern_properties.py:62: in test_pattern_with_escape_no_warnings
>     with pytest.warns(None) as record:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> self = WarningsChecker(record=True), expected_warning = None, match_expr = None
>
>     def __init__(
>         self,
>         expected_warning: Optional[
>             Union[Type[Warning], Tuple[Type[Warning], ...]]
>         ] = Warning,
>         match_expr: Optional[Union[str, Pattern[str]]] = None,
>         *,
>         _ispytest: bool = False,
>     ) -> None:
>         check_ispytest(_ispytest)
>         super().__init__(_ispytest=True)
>
>         msg = "exceptions must be derived from Warning, not %s"
>         if expected_warning is None:
> >           warnings.warn(WARNS_NONE_ARG, stacklevel=4)
> E           pytest.PytestRemovedIn8Warning: Passing None has been deprecated.
> E           See https://docs.pytest.org/en/latest/how-to/capture-warnings.html#additional-use-cases-of-warnings-in-tests for alternatives in common use cases.
>
> /usr/lib/python3/dist-packages/_pytest/recwarn.py:281: PytestRemovedIn8Warning
>
> ------------------------------------------- benchmark: 12 tests -------------------------------------------
> Name (time in us)  Min  Max  Mean  StdDev  Median  IQR  Outliers  OPS (Kops/s)  Rounds  Iterations
> -----------------------------------------------------------------------------------------------------------
> test_benchmark_bad_values[value0]  2.2760 (1.0)  1,362.1070 (3.18)  2.7920 (1.0)  6.9679 (1.0)  2.4090 (1.0)  0.0590 (1.0)  586;5701  358.1626 (1.0)  87990  1
> test_benchmark_bad_values[value1]  2.4780 (1.09)  4,824.0580 (11.26)  3.1848 (1.14)  19.0235 (2.73)  2.5930 (1.08)  0.0610 (1.03)  617;6698  313.9893 (0.88)  91651  1
> test_benchmark_bad_values[value3]  2.8410 (1.25)  2,571.8130 (6.00)  3.7244 (1.33)  14.9156 (2.14)  2.9770 (1.24)  0.0700 (1.19)  835;6820  268.5004 (0.75)  93888  1
> test_benchmark_bad_values[value2]  2.9430 (1.29)  3,413.2920 (7.97)  3.8935 (1.39)  20.6834 (2.97)  3.0790 (1.28)  0.0690 (1.17)  444;3554  256.8413 (0.72)  48939  1
> test_benchmark_bad_values[value5]  3.7300 (1.64)  3,852.4800 (8.99)  4.9899 (1.79)  18.0363 (2.59)  3.9370 (1.63)  0.0980 (1.66)  889;5457  200.4039 (0.56)  73899  1
> test_benchmark_bad_values[value6]  4.7830 (2.10)  1,835.6110 (4.29)  6.8337 (2.45)  19.4516 (2.79)  5.0700 (2.10)  0.1370 (2.32)  1206;5865  146.3340 (0.41)  55898  1
> test_benchmark_bad_values[value4]  5.0660 (2.23)  611.8550 (1.43)  6.9300 (2.48)  12.8519 (1.84)  5.3530 (2.22)  0.1420 (2.41)  1003;4525  144.3005 (0.40)  47244  1
> test_benchmark_ok_values[value2]  6.7900 (2.98)  1,218.7670 (2.85)  8.3579 (2.99)  12.1616 (1.75)  7.0650 (2.93)  0.1350 (2.29)  869;3373  119.6479 (0.33)  46791  1
> test_benchmark_ok_values[value3]  6.8250 (3.00)  775.4520 (1.81)  8.6558 (3.10)  15.6545 (2.25)  7.0680 (2.93)  0.1400 (2.37)  291;1251  115.5300 (0.32)  15753  1
> test_benchmark_ok_values[value1]  7.0310 (3.09)  1,192.1910 (2.78)  9.2354 (3.31)  18.5442 (2.66)  7.3090 (3.03)  0.1540 (2.61)  1016;4337  108.2789 (0.30)  47235  1
> test_benchmark_ok_values[value0]  7.1130 (3.13)  428.3780 (1.0)  8.6488 (3.10)  11.1682 (1.60)  7.4160 (3.08)  0.1530 (2.59)  288;1100  115.6232 (0.32)  15162  1
> test_benchmark_bad_values[value7]  7.6260 (3.35)  677.1160 (1.58)  9.5713 (3.43)  12.4846 (1.79)  7.9840 (3.31)  0.1740 (2.95)  946;3340  104.4793 (0.29)  38348  1
> -----------------------------------------------------------------------------------------------------------
>
> Legend:
>   Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.
>   OPS: Operations Per Second, computed as 1 / Mean
> =========================== short test summary info ============================
> FAILED tests/test_pattern_properties.py::test_pattern_with_escape_no_warnings
> FAILED tests/test_string.py::test_pattern_with_escape_no_warnings - pytest.Py...
> ============ 2 failed, 1854 passed, 20 xfailed, 8 xpassed in 31.30s ============
> E: pybuild pybuild:389: test: plugin pyproject failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_fastjsonschema/build; python3.11 -m pytest tests
> dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p "3.12 3.11" returned exit code 13
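
The root cause is that pytest 8 removed support for `pytest.warns(None)` as a way of asserting that no warning is raised. A minimal sketch of the stdlib replacement the deprecation notice points to (the `compile_pattern` helper below is illustrative, not the package's actual code):

```python
import re
import warnings


def compile_pattern(pattern: str):
    # Hypothetical stand-in for the library call under test; any code
    # path that might emit warnings (e.g. a deprecated regex escape)
    # would go here.
    return re.compile(pattern)


def test_pattern_with_escape_no_warnings():
    # pytest 8 rejects `pytest.warns(None)`; the stdlib equivalent is to
    # record all warnings and then assert that none were emitted.
    with warnings.catch_warnings(record=True) as record:
        warnings.simplefilter("always")
        compile_pattern(r"\d+")
    assert len(record) == 0


test_pattern_with_escape_no_warnings()
print("no warnings recorded")
```

The same effect can be had with pytest's built-in `recwarn` fixture, which records warnings for the duration of the test without the deprecated `warns(None)` form.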
The full build log is available from:
http://qa-logs.debian.net/2024/03/13/python-fastjsonschema_2.19.0-1_unstable.log

All bugs filed during this archive rebuild are listed at:
https://bugs.debian.org/cgi-bin/pkgreport.cgi?tag=ftbfs-20240313;[email protected]
or:
https://udd.debian.org/bugs/?release=na&merged=ign&fnewerval=7&flastmodval=7&fusertag=only&fusertagtag=ftbfs-20240313&[email protected]&allbugs=1&cseverity=1&ctags=1&caffected=1#results

A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!

If you reassign this bug to another package, please mark it as 'affects'-ing
this package. See https://www.debian.org/Bugs/server-control#affects

If you fail to reproduce this, please provide a build log and diff it with mine
so that we can identify if something relevant changed in the meantime.