Łukasz Langa added the comment:

> What problem does PR 3508 try to solve?

The two test cases added demonstrate objects that were impossible to pickle 
before and now are.

> Swallowing all exceptions looks like an antipattern to me.

This is the only thing we can do when faced with custom `__getattr__()` 
implementations, especially since `copy.deepcopy()` creates new objects with 
`cls.__new__()`, something most class implementers won't expect.
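
To make the failure mode concrete, here is a minimal sketch; the class and 
attribute names are illustrative, not the actual test cases from the PR:

    import copy

    class Lazy:
        def __init__(self):
            self._data = {"answer": 42}

        def __getattr__(self, name):
            # Only called when normal lookup fails.  On a half-constructed
            # instance (created with cls.__new__() and no __init__, which is
            # how the copy/pickle machinery builds the new object), self._data
            # is itself missing, so this re-enters __getattr__ and ends in
            # RecursionError instead of the AttributeError that hasattr() and
            # getattr() are prepared to swallow.
            try:
                return self._data[name]
            except KeyError:
                raise AttributeError(name) from None

    # Before this change, the __setstate__ probe on the fresh cls.__new__()
    # instance raises RecursionError and the copy fails.
    copy.deepcopy(Lazy())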

> Rather than failing and allowing the programmer to fix the pickleability of 
> their class, this can silently produce incorrect pickle data.

Can you give me an example where this would lead to incorrect pickle data?

> By swallowing MemoryError and RecursionError, this changes the behavior of 
> objects in an environment with limited resources: low memory or a deep call 
> stack. This adds heisenbugs.

This is what `hasattr()` did in Python 2.  That is why the `RecursionError` 
example I added to the tests worked just fine under Python 2.
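
For reference, a minimal illustration of that behavioral difference (not taken 
from the patch itself):

    class Boom:
        def __getattr__(self, name):
            raise RuntimeError("lookup blew up")

    # Python 2: hasattr(Boom(), "x") swallowed *every* exception and simply
    #           returned False.
    # Python 3: hasattr() only treats AttributeError as "the attribute is
    #           absent"; the RuntimeError above propagates instead.
    print(hasattr(Boom(), "x"))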

----------
nosy: +lukasz.langa
stage: patch review -> needs patch

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue16251>
_______________________________________