Script 'mail_helper' called by obssrc
Hello community,

here is the log from the commit of package python-cloudpickle for 
openSUSE:Factory checked in at 2024-11-27 22:10:38
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-cloudpickle (Old)
 and      /work/SRC/openSUSE:Factory/.python-cloudpickle.new.28523 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "python-cloudpickle"

Wed Nov 27 22:10:38 2024 rev:25 rq:1226826 version:3.1.0

Changes:
--------
--- /work/SRC/openSUSE:Factory/python-cloudpickle/python-cloudpickle.changes        2024-09-10 21:12:33.732359798 +0200
+++ /work/SRC/openSUSE:Factory/.python-cloudpickle.new.28523/python-cloudpickle.changes     2024-11-27 22:12:32.299475342 +0100
@@ -1,0 +2,11 @@
+Wed Nov 27 08:42:54 UTC 2024 - Ben Greiner <c...@bnavigator.de>
+
+- Update to 3.1.0
+  * Some improvements to make cloudpickle more deterministic when
+    pickling dynamic functions and classes, in particular with
+    CPython 3.13. (PR #524 and PR #534)
+  * Fix a problem with the joint usage of cloudpickle's
+    _whichmodule and multiprocessing. (PR #529)
+- Drop Fix-test_extract_class_dict-for-Python-313.patch
+
+-------------------------------------------------------------------
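
For readers who want to reproduce the `_whichmodule`/`multiprocessing` interaction addressed by PR #529, here is a minimal sketch mirroring the regression test added further down in this diff. It assumes cloudpickle and numpy are installed; the printed module name depends on the numpy major version, and on cloudpickle releases before 3.1.0 the lookup could be thrown off when `multiprocessing` had been imported (tracked upstream as issue #528).

    import multiprocessing  # importing this used to interfere with the module lookup
    import cloudpickle
    from numpy import exp

    # _whichmodule is the internal helper used to decide whether an object can be
    # pickled by reference; with the 3.1.0 fix it still resolves numpy's ufunc module.
    print(cloudpickle.cloudpickle._whichmodule(exp, exp.__name__))
    # Expected output: "numpy.core._multiarray_umath" (numpy 1.x)
    #              or  "numpy._core._multiarray_umath" (numpy 2.x)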

Old:
----
  Fix-test_extract_class_dict-for-Python-313.patch
  cloudpickle-3.0.0-gh.tar.gz

New:
----
  cloudpickle-3.1.0-gh.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ python-cloudpickle.spec ++++++
--- /var/tmp/diff_new_pack.FIMYhL/_old  2024-11-27 22:12:33.859540470 +0100
+++ /var/tmp/diff_new_pack.FIMYhL/_new  2024-11-27 22:12:33.863540638 +0100
@@ -18,14 +18,12 @@
 
 %{?sle15_python_module_pythons}
 Name:           python-cloudpickle
-Version:        3.0.0
+Version:        3.1.0
 Release:        0
 Summary:        Extended pickling support for Python objects
 License:        BSD-3-Clause
 URL:            https://github.com/cloudpipe/cloudpickle
Source:         https://github.com/cloudpipe/cloudpickle/archive/refs/tags/v%{version}.tar.gz#/cloudpickle-%{version}-gh.tar.gz
-# PATCH-FIX-UPSTREAM gh/cloudpipe/cloudpickle#534 - Fix test_extract_class_dict for Python 3.13 beta 1
-Patch:          Fix-test_extract_class_dict-for-Python-313.patch
 BuildRequires:  %{python_module base >= 3.8}
 BuildRequires:  %{python_module flit-core}
 BuildRequires:  %{python_module pip}

++++++ cloudpickle-3.0.0-gh.tar.gz -> cloudpickle-3.1.0-gh.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/cloudpickle-3.0.0/.github/workflows/testing.yml 
new/cloudpickle-3.1.0/.github/workflows/testing.yml
--- old/cloudpickle-3.0.0/.github/workflows/testing.yml 2023-10-13 
14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/.github/workflows/testing.yml 2024-10-11 
18:25:16.000000000 +0200
@@ -12,7 +12,7 @@
     steps:
     - uses: actions/checkout@v4
     - name: Set up Python 3.11
-      uses: actions/setup-python@v4
+      uses: actions/setup-python@v5
       with:
         python-version: 3.11
     - name: Install pre-commit
@@ -29,7 +29,7 @@
     strategy:
       matrix:
         os: [ubuntu-latest, windows-latest, macos-latest]
-        python_version: ["3.8", "3.9", "3.10", "3.11", "3.12", "pypy-3.9"]
+        python_version: ["3.8", "3.9", "3.10", "3.11", "3.12", "3.13", "pypy-3.9"]
         exclude:
           # Do not test all minor versions on all platforms, especially if they
           # are not the oldest/newest supported versions
@@ -50,7 +50,7 @@
     steps:
     - uses: actions/checkout@v4
     - name: Set up Python ${{ matrix.python_version }}
-      uses: actions/setup-python@v4
+      uses: actions/setup-python@v5
       with:
         python-version: ${{ matrix.python_version }}
         allow-prereleases: true
@@ -91,7 +91,7 @@
     steps:
     - uses: actions/checkout@v4
     - name: Set up Python
-      uses: actions/setup-python@v4
+      uses: actions/setup-python@v5
       with:
         python-version: ${{ matrix.python_version }}
     - name: Install project and dependencies
@@ -127,7 +127,7 @@
     steps:
     - uses: actions/checkout@v4
     - name: Set up Python
-      uses: actions/setup-python@v4
+      uses: actions/setup-python@v5
       with:
         python-version: ${{ matrix.python_version }}
     - name: Install project and dependencies
@@ -155,7 +155,7 @@
     steps:
     - uses: actions/checkout@v4
     - name: Set up Python
-      uses: actions/setup-python@v4
+      uses: actions/setup-python@v5
       with:
         python-version: ${{ matrix.python_version }}
     - name: Install downstream project and dependencies
@@ -180,7 +180,7 @@
     steps:
     - uses: actions/checkout@v4
     - name: Set up Python
-      uses: actions/setup-python@v4
+      uses: actions/setup-python@v5
       with:
         python-version: ${{ matrix.python_version }}
     - name: Install project and tests dependencies
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/cloudpickle-3.0.0/CHANGES.md 
new/cloudpickle-3.1.0/CHANGES.md
--- old/cloudpickle-3.0.0/CHANGES.md    2023-10-13 14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/CHANGES.md    2024-10-11 18:25:16.000000000 +0200
@@ -1,11 +1,23 @@
+3.1.0
+=====
+
+- Some improvements to make cloudpickle more deterministic when pickling
+  dynamic functions and classes, in particular with CPython 3.13.
+  ([PR #524](https://github.com/cloudpipe/cloudpickle/pull/524) and
+   [PR #534](https://github.com/cloudpipe/cloudpickle/pull/534))
+
+- Fix a problem with the joint usage of cloudpickle's `_whichmodule` and
+  `multiprocessing`.
+  ([PR #529](https://github.com/cloudpipe/cloudpickle/pull/529))
+
 3.0.0
 =====
 
 - Officially support Python 3.12 and drop support for Python 3.6 and 3.7.
   Dropping support for older Python versions made it possible to simplify the
-  code base signficantly, hopefully making it easier to contribute to and
+  code base significantly, hopefully making it easier to contribute to and
   maintain the project.
-  ([PR #515](https://github.com/cloudpipe/cloudpickle/pull/515))
+  ([PR #517](https://github.com/cloudpipe/cloudpickle/pull/517))
 
 - Fix pickling of dataclasses and their instances.
   ([issue #386](https://github.com/cloudpipe/cloudpickle/issues/386),
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/cloudpickle-3.0.0/ci/install_coverage_subprocess_pth.py 
new/cloudpickle-3.1.0/ci/install_coverage_subprocess_pth.py
--- old/cloudpickle-3.0.0/ci/install_coverage_subprocess_pth.py 2023-10-13 
14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/ci/install_coverage_subprocess_pth.py 2024-10-11 
18:25:16.000000000 +0200
@@ -9,8 +9,8 @@
 import coverage; coverage.process_startup()
 """
 
-filename = op.join(get_path('purelib'), 'coverage_subprocess.pth')
-with open(filename, 'wb') as f:
-    f.write(FILE_CONTENT.encode('ascii'))
+filename = op.join(get_path("purelib"), "coverage_subprocess.pth")
+with open(filename, "wb") as f:
+    f.write(FILE_CONTENT.encode("ascii"))
 
-print('Installed subprocess coverage support: %s' % filename)
+print("Installed subprocess coverage support: %s" % filename)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/cloudpickle-3.0.0/cloudpickle/__init__.py 
new/cloudpickle-3.1.0/cloudpickle/__init__.py
--- old/cloudpickle-3.0.0/cloudpickle/__init__.py       2023-10-13 
14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/cloudpickle/__init__.py       2024-10-11 
18:25:16.000000000 +0200
@@ -3,7 +3,7 @@
 
 __doc__ = cloudpickle.__doc__
 
-__version__ = "3.0.0"
+__version__ = "3.1.0"
 
 __all__ = [  # noqa
     "__version__",
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/cloudpickle-3.0.0/cloudpickle/cloudpickle.py 
new/cloudpickle-3.1.0/cloudpickle/cloudpickle.py
--- old/cloudpickle-3.0.0/cloudpickle/cloudpickle.py    2023-10-13 
14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/cloudpickle/cloudpickle.py    2024-10-11 
18:25:16.000000000 +0200
@@ -126,7 +126,7 @@
 
 
 def register_pickle_by_value(module):
-    """Register a module to make it functions and classes picklable by value.
+    """Register a module to make its functions and classes picklable by value.
 
     By default, functions and classes that are attributes of an importable
     module are to be pickled by reference, that is relying on re-importing
@@ -213,6 +213,7 @@
         # sys.modules
         if (
             module_name == "__main__"
+            or module_name == "__mp_main__"
             or module is None
             or not isinstance(module, types.ModuleType)
         ):
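
As context for the two hunks above (the docstring touch-up and the new "__mp_main__" exclusion), a brief usage sketch of the public register_pickle_by_value API follows; the module and attribute names are hypothetical and only for illustration.

    import cloudpickle
    import my_local_helpers  # hypothetical module that remote workers cannot import

    # Ask cloudpickle to serialize this module's functions and classes by value
    # instead of by reference, so unpickling does not try to re-import the module.
    cloudpickle.register_pickle_by_value(my_local_helpers)
    payload = cloudpickle.dumps(my_local_helpers.some_function)  # hypothetical function

    # Revert to the default pickle-by-reference behaviour when done.
    cloudpickle.unregister_pickle_by_value(my_local_helpers)
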
@@ -409,7 +410,10 @@
 
 def _extract_class_dict(cls):
     """Retrieve a copy of the dict of a class without the inherited method."""
-    clsdict = dict(cls.__dict__)  # copy dict proxy to a dict
+    # Hack to circumvent non-predictable memoization caused by string interning.
+    # See the inline comment in _class_setstate for details.
+    clsdict = {"".join(k): cls.__dict__[k] for k in sorted(cls.__dict__)}
+
     if len(cls.__bases__) == 1:
         inherited_dict = cls.__bases__[0].__dict__
     else:
@@ -533,9 +537,15 @@
     The "extra" variable is meant to be a dict (or None) that can be used for
     forward compatibility shall the need arise.
     """
+    # We need to intern the keys of the type_kwargs dict to avoid having
+    # different pickles for the same dynamic class depending on whether it was
+    # dynamically created or reconstructed from a pickled stream.
+    type_kwargs = {sys.intern(k): v for k, v in type_kwargs.items()}
+
     skeleton_class = types.new_class(
        name, bases, {"metaclass": type_constructor}, lambda ns: ns.update(type_kwargs)
     )
+
     return _lookup_class_or_track(class_tracker_id, skeleton_class)
 
 
@@ -694,8 +704,10 @@
     #   unpickling time by iterating over slotstate and calling setattr(func,
     #   slotname, slotvalue)
     slotstate = {
-        "__name__": func.__name__,
-        "__qualname__": func.__qualname__,
+        # Hack to circumvent non-predictable memoization caused by string interning.
+        # See the inline comment in _class_setstate for details.
+        "__name__": "".join(func.__name__),
+        "__qualname__": "".join(func.__qualname__),
         "__annotations__": func.__annotations__,
         "__kwdefaults__": func.__kwdefaults__,
         "__defaults__": func.__defaults__,
@@ -721,7 +733,9 @@
     )
     slotstate["__globals__"] = f_globals
 
-    state = func.__dict__
+    # Hack to circumvent non-predictable memoization caused by string interning.
+    # See the inline comment in _class_setstate for details.
+    state = {"".join(k): v for k, v in func.__dict__.items()}
     return state, slotstate
 
 
@@ -802,6 +816,19 @@
     # of the specific type from types, for example:
     # >>> from types import CodeType
     # >>> help(CodeType)
+
+    # Hack to circumvent non-predictable memoization caused by string interning.
+    # See the inline comment in _class_setstate for details.
+    co_name = "".join(obj.co_name)
+
+    # Create shallow copies of these tuple to make cloudpickle payload deterministic.
+    # When creating a code object during load, copies of these four tuples are
+    # created, while in the main process, these tuples can be shared.
+    # By always creating copies, we make sure the resulting payload is deterministic.
+    co_names = tuple(name for name in obj.co_names)
+    co_varnames = tuple(name for name in obj.co_varnames)
+    co_freevars = tuple(name for name in obj.co_freevars)
+    co_cellvars = tuple(name for name in obj.co_cellvars)
     if hasattr(obj, "co_exceptiontable"):
         # Python 3.11 and later: there are some new attributes
         # related to the enhanced exceptions.
@@ -814,16 +841,16 @@
             obj.co_flags,
             obj.co_code,
             obj.co_consts,
-            obj.co_names,
-            obj.co_varnames,
+            co_names,
+            co_varnames,
             obj.co_filename,
-            obj.co_name,
+            co_name,
             obj.co_qualname,
             obj.co_firstlineno,
             obj.co_linetable,
             obj.co_exceptiontable,
-            obj.co_freevars,
-            obj.co_cellvars,
+            co_freevars,
+            co_cellvars,
         )
     elif hasattr(obj, "co_linetable"):
         # Python 3.10 and later: obj.co_lnotab is deprecated and constructor
@@ -837,14 +864,14 @@
             obj.co_flags,
             obj.co_code,
             obj.co_consts,
-            obj.co_names,
-            obj.co_varnames,
+            co_names,
+            co_varnames,
             obj.co_filename,
-            obj.co_name,
+            co_name,
             obj.co_firstlineno,
             obj.co_linetable,
-            obj.co_freevars,
-            obj.co_cellvars,
+            co_freevars,
+            co_cellvars,
         )
     elif hasattr(obj, "co_nmeta"):  # pragma: no cover
         # "nogil" Python: modified attributes from 3.9
@@ -859,15 +886,15 @@
             obj.co_flags,
             obj.co_code,
             obj.co_consts,
-            obj.co_varnames,
+            co_varnames,
             obj.co_filename,
-            obj.co_name,
+            co_name,
             obj.co_firstlineno,
             obj.co_lnotab,
             obj.co_exc_handlers,
             obj.co_jump_table,
-            obj.co_freevars,
-            obj.co_cellvars,
+            co_freevars,
+            co_cellvars,
             obj.co_free2reg,
             obj.co_cell2reg,
         )
@@ -882,14 +909,14 @@
             obj.co_flags,
             obj.co_code,
             obj.co_consts,
-            obj.co_names,
-            obj.co_varnames,
+            co_names,
+            co_varnames,
             obj.co_filename,
-            obj.co_name,
+            co_name,
             obj.co_firstlineno,
             obj.co_lnotab,
-            obj.co_freevars,
-            obj.co_cellvars,
+            co_freevars,
+            co_cellvars,
         )
     return types.CodeType, args
 
@@ -1127,7 +1154,30 @@
         if attrname == "_abc_impl":
             registry = attr
         else:
+            # Note: setting attribute names on a class automatically triggers their
+            # interning in CPython:
+            # https://github.com/python/cpython/blob/v3.12.0/Objects/object.c#L957
+            #
+            # This means that to get deterministic pickling for a dynamic class that
+            # was initially defined in a different Python process, the pickler
+            # needs to ensure that dynamic class and function attribute names are
+            # systematically copied into a non-interned version to avoid
+            # unpredictable pickle payloads.
+            #
+            # Indeed the Pickler's memoizer relies on physical object identity to break
+            # cycles in the reference graph of the object being serialized.
             setattr(obj, attrname, attr)
+
+    if sys.version_info >= (3, 13) and "__firstlineno__" in state:
+        # Set the Python 3.13+ only __firstlineno__ attribute one more time, as it
+        # will be automatically deleted by the `setattr(obj, attrname, attr)` call
+        # above when `attrname` is "__firstlineno__". We assume that preserving this
+        # information might be important for some users and that it not stale in the
+        # context of cloudpickle usage, hence legitimate to propagate. Furthermore it
+        # is necessary to do so to keep deterministic chained pickling as tested in
+        # test_deterministic_str_interning_for_chained_dynamic_class_pickling.
+        obj.__firstlineno__ = state["__firstlineno__"]
+
     if registry is not None:
         for subclass in registry:
             obj.register(subclass)
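
The interning-related hunks above (the `"".join(k)` copies, the `sys.intern` of `type_kwargs` keys, and the copied `co_*` tuples) all target the same effect. The following small illustration, which is not code from the package, uses only the standard pickle module: the pickler's memo keys on object identity, so equal strings that are or are not the same object produce different opcode streams and therefore different payloads (CPython behaviour).

    import pickle

    a = "class_value"
    b = "".join("class_value")  # equal value, but a distinct (non-interned) object
    assert a == b and a is not b

    # When the same object appears twice, the second occurrence becomes a memo
    # lookup; when two distinct-but-equal strings appear, the second one is
    # re-encoded in full, so the two payloads differ byte-wise.
    payload_shared = pickle.dumps((a, a), protocol=5)
    payload_distinct = pickle.dumps((a, b), protocol=5)
    print(payload_shared == payload_distinct)  # False
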
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/cloudpickle-3.0.0/cloudpickle/cloudpickle_fast.py 
new/cloudpickle-3.1.0/cloudpickle/cloudpickle_fast.py
--- old/cloudpickle-3.0.0/cloudpickle/cloudpickle_fast.py       2023-10-13 
14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/cloudpickle/cloudpickle_fast.py       2024-10-11 
18:25:16.000000000 +0200
@@ -6,6 +6,7 @@
 
 See: tests/test_backward_compat.py
 """
+
 from . import cloudpickle
 
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/cloudpickle-3.0.0/dev-requirements.txt 
new/cloudpickle-3.1.0/dev-requirements.txt
--- old/cloudpickle-3.0.0/dev-requirements.txt  2023-10-13 14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/dev-requirements.txt  2024-10-11 18:25:16.000000000 +0200
@@ -9,7 +9,7 @@
 tornado
 # To be able to test numpy specific things
 # but do not build numpy from source on Python nightly
-numpy >=1.18.5; python_version <= '3.8'
+numpy >=1.18.5; python_version <= '3.12'
 # Code coverage uploader for Travis:
 codecov
 coverage
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/cloudpickle-3.0.0/tests/__init__.py 
new/cloudpickle-3.1.0/tests/__init__.py
--- old/cloudpickle-3.0.0/tests/__init__.py     2023-10-13 14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/tests/__init__.py     2024-10-11 18:25:16.000000000 +0200
@@ -0,0 +1,3 @@
+import pytest
+
+pytest.register_assert_rewrite("tests.testutils")
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/cloudpickle-3.0.0/tests/cloudpickle_test.py 
new/cloudpickle-3.1.0/tests/cloudpickle_test.py
--- old/cloudpickle-3.0.0/tests/cloudpickle_test.py     2023-10-13 
14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/tests/cloudpickle_test.py     2024-10-11 
18:25:16.000000000 +0200
@@ -29,6 +29,7 @@
 import pickle
 
 import pytest
+from pathlib import Path
 
 try:
     # try importing numpy and scipy. These are not hard dependencies and
@@ -48,10 +49,11 @@
 from cloudpickle.cloudpickle import _extract_class_dict, _whichmodule
 from cloudpickle.cloudpickle import _lookup_module_and_qualname
 
+from .testutils import subprocess_worker
 from .testutils import subprocess_pickle_echo
 from .testutils import subprocess_pickle_string
 from .testutils import assert_run_python_script
-from .testutils import subprocess_worker
+from .testutils import check_deterministic_pickle
 
 
 _TEST_GLOBAL_VARIABLE = "default_value"
@@ -108,7 +110,12 @@
             return "c"
 
     clsdict = _extract_class_dict(C)
-    assert sorted(clsdict.keys()) == ["C_CONSTANT", "__doc__", "method_c"]
+    expected_keys = ["C_CONSTANT", "__doc__", "method_c"]
+    # New attribute in Python 3.13 beta 1
+    # https://github.com/python/cpython/pull/118475
+    if sys.version_info >= (3, 13):
+        expected_keys.insert(2, "__firstlineno__")
+    assert list(clsdict.keys()) == expected_keys
     assert clsdict["C_CONSTANT"] == 43
     assert clsdict["__doc__"] is None
     assert clsdict["method_c"](C()) == C().method_c()
@@ -329,6 +336,25 @@
         g = pickle_depickle(f(), protocol=self.protocol)
         self.assertEqual(g(), 2)
 
+    def test_class_no_firstlineno_deletion_(self):
+        # `__firstlineno__` is a new attribute of classes introduced in Python 3.13.
+        # This attribute used to be automatically deleted when unpickling a class as a
+        # consequence of cloudpickle setting a class's `__module__` attribute at
+        # unpickling time (see https://github.com/python/cpython/blob/73c152b346a18ed8308e469bdd232698e6cd3a63/Objects/typeobject.c#L1353-L1356).
+        # This deletion would cause tests like
+        # `test_deterministic_dynamic_class_attr_ordering_for_chained_pickling` to fail.
+        # This test makes sure that the attribute `__firstlineno__` is preserved
+        # across a cloudpickle roundtrip.
+
+        class A:
+            pass
+
+        if hasattr(A, "__firstlineno__"):
+            A_roundtrip = pickle_depickle(A, protocol=self.protocol)
+            assert hasattr(A_roundtrip, "__firstlineno__")
+            assert A_roundtrip.__firstlineno__ == A.__firstlineno__
+
+
     def test_dynamically_generated_class_that_uses_super(self):
         class Base:
             def method(self):
@@ -1478,6 +1504,32 @@
         finally:
             sys.modules.pop("NonModuleObject")
 
+    def test_importing_multiprocessing_does_not_impact_whichmodule(self):
+        # non-regression test for #528
+        pytest.importorskip("numpy")
+        script = textwrap.dedent("""
+        import multiprocessing
+        import cloudpickle
+        from numpy import exp
+
+        print(cloudpickle.cloudpickle._whichmodule(exp, exp.__name__))
+        """)
+        script_path = Path(self.tmpdir) / "whichmodule_and_multiprocessing.py"
+        with open(script_path, mode="w") as f:
+            f.write(script)
+
+        proc = subprocess.Popen(
+            [sys.executable, str(script_path)],
+            stdout=subprocess.PIPE,
+            stderr=subprocess.STDOUT,
+        )
+        out, _ = proc.communicate()
+        self.assertEqual(proc.wait(), 0)
+        assert out.strip() in (
+            b"numpy.core._multiarray_umath",  # numpy 1
+            b"numpy._core._multiarray_umath",  # numpy 2
+        )
+
     def test_unrelated_faulty_module(self):
         # Check that pickling a dynamically defined function or class does not
         # fail when introspecting the currently loaded modules in sys.modules
@@ -1951,7 +2003,6 @@
 
             class A:
                 '''Updated class definition'''
-                pass
 
             assert not w.run(lambda obj_id: isinstance(lookup(obj_id), A), id1)
             retrieved1 = w.run(lookup, id1)
@@ -1983,6 +2034,144 @@
         """.format(protocol=self.protocol)
         assert_run_python_script(code)
 
+    def test_dynamic_func_deterministic_roundtrip(self):
+        # Check that the pickle serialization for a dynamic func is the same
+        # in two processes.
+
+        def get_dynamic_func_pickle():
+            def test_method(arg_1, arg_2):
+                pass
+
+            return cloudpickle.dumps(test_method)
+
+        with subprocess_worker(protocol=self.protocol) as w:
+            A_dump = w.run(get_dynamic_func_pickle)
+            check_deterministic_pickle(A_dump, get_dynamic_func_pickle())
+
+    def test_dynamic_class_deterministic_roundtrip(self):
+        # Check that the pickle serialization for a dynamic class is the same
+        # in two processes.
+        pytest.xfail("This test fails due to different tracker_id.")
+
+        def get_dynamic_class_pickle():
+            class A:
+                """Class with potential string interning issues."""
+
+                arg_1 = "class_value"
+
+                def join(self):
+                    pass
+
+                def test_method(self, arg_1, join):
+                    pass
+
+            return cloudpickle.dumps(A)
+
+        with subprocess_worker(protocol=self.protocol) as w:
+            A_dump = w.run(get_dynamic_class_pickle)
+            check_deterministic_pickle(A_dump, get_dynamic_class_pickle())
+
+    def test_deterministic_dynamic_class_attr_ordering_for_chained_pickling(self):
+        # Check that the pickle produced by pickling a reconstructed class definition
+        # in a remote process matches the pickle produced by pickling the original
+        # class definition.
+        # In particular, this test checks that the order of the class attributes is
+        # deterministic.
+
+        with subprocess_worker(protocol=self.protocol) as w:
+
+            class A:
+                """Simple class definition"""
+
+                pass
+
+            A_dump = w.run(cloudpickle.dumps, A)
+            check_deterministic_pickle(A_dump, cloudpickle.dumps(A))
+
+            # If the `__doc__` attribute is defined after some other class
+            # attribute, this can cause class attribute ordering changes due to
+            # the way we reconstruct the class definition in
+            # `_make_skeleton_class`, which creates the class and thus its
+            # `__doc__` attribute before populating the class attributes.
+            class A:
+                name = "A"
+                __doc__ = "Updated class definition"
+
+            A_dump = w.run(cloudpickle.dumps, A)
+            check_deterministic_pickle(A_dump, cloudpickle.dumps(A))
+
+            # If a `__doc__` is defined on the `__init__` method, this can
+            # cause ordering changes due to the way we reconstruct the class
+            # with `_make_skeleton_class`.
+            class A:
+                def __init__(self):
+                    """Class definition with explicit __init__"""
+                    pass
+
+            A_dump = w.run(cloudpickle.dumps, A)
+            check_deterministic_pickle(A_dump, cloudpickle.dumps(A))
+
+    def test_deterministic_str_interning_for_chained_dynamic_class_pickling(self):
+        # Check that the pickle produced by the unpickled instance is the same.
+        # This checks that there is no issue related to the string interning of
+        # the names of attributes of class definitions and names of attributes
+        # of the `__code__` objects of the methods.
+
+        with subprocess_worker(protocol=self.protocol) as w:
+            # Due to interning of class attributes, check that this does not
+            # create issues with dynamic function definition.
+            class A:
+                """Class with potential string interning issues."""
+
+                arg_1 = "class_value"
+
+                def join(self):
+                    pass
+
+                def test_method(self, arg_1, join):
+                    pass
+
+            A_dump = w.run(cloudpickle.dumps, A)
+            check_deterministic_pickle(A_dump, cloudpickle.dumps(A))
+
+            # Also check that memoization of string value inside the class does
+            # not cause non-deterministic pickle with interned method names.
+            class A:
+                """Class with potential string interning issues."""
+
+                arg_1 = "join"
+
+                def join(self, arg_1):
+                    pass
+
+            # Set a custom method attribute that can potentially trigger
+            # undeterministic memoization depending on the interning state of
+            # the string used for the attribute name.
+            A.join.arg_1 = "join"
+
+            A_dump = w.run(cloudpickle.dumps, A)
+            check_deterministic_pickle(A_dump, cloudpickle.dumps(A))
+
+    def test_dynamic_class_determinist_subworker_tuple_memoization(self):
+        # Check that the pickle produced by the unpickled instance is the same.
+        # This highlights some issues with tuple memoization.
+
+        with subprocess_worker(protocol=self.protocol) as w:
+            # Arguments' tuple is memoized in the main process but not in the
+            # subprocess as the tuples do not share the same id in the loaded
+            # class.
+            class A:
+                """Class with potential tuple memoization issues."""
+
+                def func1(self):
+                    pass
+
+                def func2(self):
+                    pass
+
+            A_dump = w.run(cloudpickle.dumps, A)
+            check_deterministic_pickle(A_dump, cloudpickle.dumps(A))
+
     @pytest.mark.skipif(
         platform.python_implementation() == "PyPy",
         reason="Skip PyPy because memory grows too much",
@@ -2315,6 +2504,12 @@
         inner_func = depickled_factory()
         assert inner_func() == _TEST_GLOBAL_VARIABLE
 
+    @pytest.mark.skipif(
+        sys.version_info < (3, 9),
+        reason="Can cause CPython 3.8 to segfault",
+    )
+    # TODO: remove this xfail when we drop support for Python 3.8. We don't
+    # plan to fix it because Python 3.8 is EOL.
     def test_recursion_during_pickling(self):
         class A:
             def __getattribute__(self, name):
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/cloudpickle-3.0.0/tests/cloudpickle_testpkg/_cloudpickle_testpkg/__init__.py
 
new/cloudpickle-3.1.0/tests/cloudpickle_testpkg/_cloudpickle_testpkg/__init__.py
--- old/cloudpickle-3.0.0/tests/cloudpickle_testpkg/_cloudpickle_testpkg/__init__.py    2023-10-13 14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/tests/cloudpickle_testpkg/_cloudpickle_testpkg/__init__.py    2024-10-11 18:25:16.000000000 +0200
@@ -27,19 +27,22 @@
     Relative import of functions living both inside modules and packages are
     tested.
     """
+
     def f():
         # module_function belongs to _cloudpickle_testpkg.mod, which is a
         # module
         from .mod import module_function
+
         return module_function()
 
     def g():
         # package_function belongs to _cloudpickle_testpkg, which is a package
         from . import package_function
+
         return package_function()
 
     return f, g
 
 
 some_singleton = _SingletonClass()
-T = typing.TypeVar('T')
+T = typing.TypeVar("T")
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/cloudpickle-3.0.0/tests/cloudpickle_testpkg/_cloudpickle_testpkg/mod.py 
new/cloudpickle-3.1.0/tests/cloudpickle_testpkg/_cloudpickle_testpkg/mod.py
--- old/cloudpickle-3.0.0/tests/cloudpickle_testpkg/_cloudpickle_testpkg/mod.py 
2023-10-13 14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/tests/cloudpickle_testpkg/_cloudpickle_testpkg/mod.py 
2024-10-11 18:25:16.000000000 +0200
@@ -20,7 +20,7 @@
 # module.  The following lines emulate such a behavior without being a compiled
 # extension module.
 
-submodule_name = '_cloudpickle_testpkg.mod.dynamic_submodule'
+submodule_name = "_cloudpickle_testpkg.mod.dynamic_submodule"
 dynamic_submodule = types.ModuleType(submodule_name)
 
 # This line allows the dynamic_module to be imported using either one of:
@@ -32,7 +32,7 @@
 # so this dynamic module could be binded to another name. This behavior is
 # demonstrated with `dynamic_submodule_two`
 
-submodule_name_two = '_cloudpickle_testpkg.mod.dynamic_submodule_two'
+submodule_name_two = "_cloudpickle_testpkg.mod.dynamic_submodule_two"
 # Notice the inconsistent name binding, breaking attribute lookup-based import
 # attempts.
 another_submodule = types.ModuleType(submodule_name_two)
@@ -41,9 +41,7 @@
 
 # In this third case, the module is not added to sys.modules, and can only be
 # imported using attribute lookup-based imports.
-submodule_three = types.ModuleType(
-    '_cloudpickle_testpkg.mod.dynamic_submodule_three'
-)
+submodule_three = types.ModuleType("_cloudpickle_testpkg.mod.dynamic_submodule_three")
 code = """
 def f(x):
     return x
@@ -53,9 +51,7 @@
 
 # What about a dynamic submodule inside a dynamic submodule inside an
 # importable module?
-subsubmodule_name = (
-    '_cloudpickle_testpkg.mod.dynamic_submodule.dynamic_subsubmodule'
-)
+subsubmodule_name = "_cloudpickle_testpkg.mod.dynamic_submodule.dynamic_subsubmodule"
 dynamic_subsubmodule = types.ModuleType(subsubmodule_name)
 dynamic_submodule.dynamic_subsubmodule = dynamic_subsubmodule
 sys.modules[subsubmodule_name] = dynamic_subsubmodule
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/cloudpickle-3.0.0/tests/cloudpickle_testpkg/setup.py 
new/cloudpickle-3.1.0/tests/cloudpickle_testpkg/setup.py
--- old/cloudpickle-3.0.0/tests/cloudpickle_testpkg/setup.py    2023-10-13 
14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/tests/cloudpickle_testpkg/setup.py    2024-10-11 
18:25:16.000000000 +0200
@@ -5,12 +5,12 @@
 
 
 setup(
-    name='cloudpickle_testpkg',
-    version='0.0.0',
-    description='Package used only for cloudpickle testing purposes',
-    author='Cloudpipe',
-    author_email='cloudp...@googlegroups.com',
-    license='BSD 3-Clause License',
-    packages=['_cloudpickle_testpkg'],
-    python_requires='>=3.8',
+    name="cloudpickle_testpkg",
+    version="0.0.0",
+    description="Package used only for cloudpickle testing purposes",
+    author="Cloudpipe",
+    author_email="cloudp...@googlegroups.com",
+    license="BSD 3-Clause License",
+    packages=["_cloudpickle_testpkg"],
+    python_requires=">=3.8",
 )
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/cloudpickle-3.0.0/tests/generate_old_pickles.py 
new/cloudpickle-3.1.0/tests/generate_old_pickles.py
--- old/cloudpickle-3.0.0/tests/generate_old_pickles.py 2023-10-13 
14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/tests/generate_old_pickles.py 2024-10-11 
18:25:16.000000000 +0200
@@ -8,6 +8,7 @@
 active cloudpickle branch to make sure that cloudpickle is able to depickle old
 cloudpickle files.
 """
+
 import sys
 
 from pathlib import Path
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/cloudpickle-3.0.0/tests/mock_local_folder/mod.py 
new/cloudpickle-3.1.0/tests/mock_local_folder/mod.py
--- old/cloudpickle-3.0.0/tests/mock_local_folder/mod.py        2023-10-13 
14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/tests/mock_local_folder/mod.py        2024-10-11 
18:25:16.000000000 +0200
@@ -5,6 +5,7 @@
 reference should instead flagged to cloudpickle for pickling by value: this is
 done using the register_pickle_by_value api exposed by cloudpickle.
 """
+
 import typing
 
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/cloudpickle-3.0.0/tests/test_backward_compat.py 
new/cloudpickle-3.1.0/tests/test_backward_compat.py
--- old/cloudpickle-3.0.0/tests/test_backward_compat.py 2023-10-13 
14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/tests/test_backward_compat.py 2024-10-11 
18:25:16.000000000 +0200
@@ -9,6 +9,7 @@
 few canonical use cases. Cloudpicke backward-compatitibility support remains a
 best-effort initiative.
 """
+
 import pickle
 
 import pytest
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/cloudpickle-3.0.0/tests/testutils.py 
new/cloudpickle-3.1.0/tests/testutils.py
--- old/cloudpickle-3.0.0/tests/testutils.py    2023-10-13 14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/tests/testutils.py    2024-10-11 18:25:16.000000000 +0200
@@ -1,9 +1,12 @@
 import sys
 import os
-import os.path as op
+import io
+import difflib
 import tempfile
+import os.path as op
 from subprocess import Popen, check_output, PIPE, STDOUT, CalledProcessError
 import pickle
+import pickletools
 from contextlib import contextmanager
 from concurrent.futures import ProcessPoolExecutor
 
@@ -213,6 +216,39 @@
         os.unlink(source_file)
 
 
+def check_deterministic_pickle(a, b):
+    """Check that two pickle output are bitwise equal.
+
+    If it is not the case, print the diff between the disassembled pickle
+    payloads.
+
+    This helper is useful to investigate non-deterministic pickling.
+    """
+    if a != b:
+        with io.StringIO() as out:
+            pickletools.dis(pickletools.optimize(a), out)
+            a_out = out.getvalue()
+            # Remove the 11 first characters of each line to remove the bytecode offset
+            # of each object, which is different on each line for very small differences,
+            # making the diff very hard to read.
+            a_out = "\n".join(ll[11:] for ll in a_out.splitlines())
+        with io.StringIO() as out:
+            pickletools.dis(pickletools.optimize(b), out)
+            b_out = out.getvalue()
+            b_out = "\n".join(ll[11:] for ll in b_out.splitlines())
+        assert a_out == b_out
+        full_diff = difflib.context_diff(
+            a_out.splitlines(keepends=True), b_out.splitlines(keepends=True)
+        )
+        full_diff = "".join(full_diff)
+        if len(full_diff) > 1500:
+            full_diff = full_diff[:1494] + " [...]"
+        raise AssertionError(
+           "Pickle payloads are not bitwise equal:\n"
+           + full_diff
+        )
+
+
 if __name__ == "__main__":
     protocol = int(sys.argv[sys.argv.index("--protocol") + 1])
     pickle_echo(protocol=protocol)
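
The new check_deterministic_pickle helper above leans on pickletools; as a reference, here is a standalone sketch of the same disassembly approach for inspecting a cloudpickle payload by hand. The function being pickled is arbitrary and only serves as an example.

    import io
    import pickletools
    import cloudpickle

    def example(x):  # arbitrary function to serialize
        return x + 1

    payload = cloudpickle.dumps(example)
    out = io.StringIO()
    pickletools.dis(pickletools.optimize(payload), out)
    # Each output line starts with the byte offset of the opcode, which is why the
    # helper above strips the first 11 characters of every line before diffing.
    print(out.getvalue()[:500])
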
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/cloudpickle-3.0.0/tox.ini 
new/cloudpickle-3.1.0/tox.ini
--- old/cloudpickle-3.0.0/tox.ini       2023-10-13 14:58:24.000000000 +0200
+++ new/cloudpickle-3.1.0/tox.ini       2024-10-11 18:25:16.000000000 +0200
@@ -1,5 +1,5 @@
 [tox]
-envlist = py{38, 39, 310, 311, 312, py3}
+envlist = py{38, 39, 310, 311, 312, 313, py3}
 
 [testenv]
 deps = -rdev-requirements.txt
