[issue47234] PEP-484 "numeric tower" approach makes it hard/impossible to specify contracts in documentation
Thomas Fischbacher added the comment: Re AlexWaygood: If these PEP-484 related things were so obvious that they would admit a compact description of the problem in 2-3 lines, these issues would likely have been identified much earlier. We would not be seeing them now, given that Python by and large is a somewhat mature language. -- ___ Python tracker <https://bugs.python.org/issue47234> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue47121] math.isfinite() can raise exception when called on a number
Thomas Fischbacher added the comment:

Tim, the problem may well simply be due to the documentation of math.isfinite() being off here. This is what we currently have: https://docs.python.org/3/library/math.html#math.isfinite

===
math.isfinite(x)

Return True if x is neither an infinity nor a NaN, and False otherwise. (Note that 0.0 is considered finite.)

New in version 3.2.
===

If this were re-worded as follows (and corresponding changes were made to other such functions), everyone would know what the expectations and behavior are:

===
math.isfinite(x)

If `x` is a `float` instance, this evaluates to `True` if `x` is neither a float infinity nor a NaN, and `False` otherwise. If `x` is not a `float` instance, this evaluates to `math.isfinite(float(x))`.

New in version 3.2.
===

This would be an accurate defining description of the actual behavior. Note that, "thanks to PEP-484", the following abbreviated wording would currently be ambiguous:

===
math.isfinite(x)

If `x` is a float, this evaluates to `True` if `x` is neither a float infinity nor a NaN, and `False` otherwise. If `x` is not a float, this evaluates to `math.isfinite(float(x))`.

New in version 3.2.
===

("ambiguous" since "float" means different things as a static type and as a numbers class - and it is not clear which of the two would be referred to here.)

Changing/generalizing the behavior might be an interesting separate proposal, but I would argue that one would then want to change the behavior of quite a few other functions here as well, and all of this should then perhaps go into some other `xmath` (or so) module - a bit like it is with `cmath`. However, since the Python philosophy is not to rely on bureaucracy to enforce contracts (as C++, Java, etc. do), but instead to rely on people's ability to define their own contracts, making the math.isfinite() contract more accurate w.r.t. the actual behavior of the CPython implementation via extra clarification looks like a good thing to do, no?
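For what it's worth, the proposed wording can be mirrored by a small helper (`isfinite_as_documented` is a made-up name, not a stdlib function); under the re-worded contract it behaves identically to the current CPython `math.isfinite()`:

```python
import math

def isfinite_as_documented(x):
    # Hypothetical helper mirroring the re-worded contract above.
    if isinstance(x, float):
        # NaN is the only float unequal to itself; the two infinities
        # are the only other non-finite floats.
        return x == x and x != math.inf and x != -math.inf
    # Not a float instance: defer to the float conversion, which may
    # raise (e.g. OverflowError for 10**1000), just like math.isfinite.
    return math.isfinite(float(x))

print(isfinite_as_documented(0.0))       # True
print(isfinite_as_documented(math.nan))  # False
print(isfinite_as_documented(10))        # True
```

The point is not that this implementation is clever, but that the re-worded documentation is precise enough to be transcribed into code one-to-one.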
-- ___ Python tracker <https://bugs.python.org/issue47121> ___
[issue47234] PEP-484 "numeric tower" approach makes it hard/impossible to specify contracts in documentation
Thomas Fischbacher added the comment:

This is not a partial duplicate of https://bugs.python.org/issue47121 about math.isfinite(). The problem there is about a specific function whose documentation may be off - I'll comment separately on that. The problem here is: there is a semantic discrepancy between what the term 'float' means "at run time", such as in a check like:

    issubclass(type(x), float)

(I am deliberately writing it that way, given that isinstance() can, in general [but actually not for float], lie.)

...and what the term 'float' means in a statically-checkable type annotation like:

    def f(x: float) -> ... : ...

...and this causes headaches. The specific example ('middle_mean') illustrates the sort of weird situations that arise due to this. (I discovered this recently when updating some of our company's Python onboarding material, where the aspiration naturally is to be extremely accurate with all claims.)

So, basically, there is a choice to make between these options:

Option A: Give up on the idea that "we want to be able to reason with stringency about the behavior of code" / accept that there will be gaps between what code does and what we can reason about. (Not really an option, especially with an eye on "writing secure code requires being able to reason out everything with stringency".)

Option B: Accept the discrepancy and tell people that they have to be mindful of float-the-dynamic-type being a different concept from float-the-static-type.

Option C: Realize that having "float" mean different things for dynamic and static typing was not a great idea to begin with, and get everybody who wants to state things such as "this function parameter can be any instance of a real number type" to use the type `numbers.Real` instead (which may well need better support from tooling), and to express "can be int or float" as `Union[int, float]`.
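The discrepancy is easy to observe in a minimal sketch; a PEP-484 checker accepts the call `f(1)` below (annotation-`float` implicitly also admits `int`), while at run time no conversion happens at all:

```python
def f(x: float) -> float:
    # A static checker accepts f(1): per PEP-484, the annotation
    # `float` also admits `int`.
    return x

# At run time, however, float-the-type does not include int:
print(issubclass(int, float))  # False
print(type(f(1)))              # <class 'int'> - no conversion happened
print(type(f(1.0)))            # <class 'float'>
```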
Also, there is Option D: PEP-484 has quite a few other problems where the design does not meet rather natural requirements, such as: "I cannot introduce a newtype for 'a mapping where I know the key to be a particular enum-type, but the value is type-parametric' (so the new type would also be 1-parameter type-parametric)" - and this float-mess is merely one symptom of "maybe PEP-484 was approved too hastily and should have also been scrutinized by people from a community with more static typing experience".

Basically, Option B would spell out as: 'We expect users who use static type annotations to write code like this, and expect them to be aware of the fact that the four places where the term "float" occurs refer to two different concepts':

    def foo(x: float) -> float:
      """Returns the foo of the number `x`.

      Args:
        x: float, the number to foo.

      Returns:
        float, the value of the foo-function at `x`.
      """
      ...

...which actually is shorthand for...:

    def foo(x: float  # Note: means float-or-int
            ) -> float  # Note: means float-or-int
            :
      """Returns the foo of the number `x`.

      Args:
        x: the number to foo, an instance of the `float` type.

      Returns:
        The value of the foo-function at `x`, as an instance of the `float` type.
      """
      ...

Option C (and perhaps D) appear - to me - to be the only viable choices here. The pain with Option C is that it invalidates/changes the meaning of already-written code that claims to follow PEP-484, and the main point of Option D is all about: "If we have to cause a new wound and open up the patient again, let's try to minimize the number of times we have to do this."

Option C would amount to changing the meaning of...:

    def foo(x: float) -> float:
      """Returns the foo of the number `x`.

      Args:
        x: float, the number to foo.

      Returns:
        float, the value of the foo-function at `x`.
      """
      ...
...to "static type annotation float really means instance-of-float here" (I do note that issubclass(numpy.float64, float), so passing a numpy-float64 is expected to work here, which is good), and to ask people who want functions that can process more generic real numbers to announce this properly. So, we would end up with basically a list of different things that a function-sketch like the one above could turn into - depending on the author's intentions for the function, some major cases being perhaps:

(a) ("this is supposed to strictly operate on float")

    def foo(x: float) -> float:
      """Returns the foo of the number `x`.

      Args:
        x: the number to foo.

      Returns:
        the value of the foo-function at `x`.
      """

(b) ("this will eat any kind of real number")

    def foo(x: numbers.Real) -> numbers.Real:
      """...
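The `numbers.Real` spelling of case (b) already works for run-time checks today; static-checker support for the `numbers` ABCs is the part that may need tooling work. A quick sketch:

```python
import numbers

# int and float are both registered against the Real ABC:
print(isinstance(1, numbers.Real))       # True
print(isinstance(1.5, numbers.Real))     # True
# ...but complex is not a real number:
print(isinstance(1 + 2j, numbers.Real))  # False

def foo(x: numbers.Real) -> numbers.Real:
    # "this will eat any kind of real number"
    return x
```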
[issue43944] Processes in Python 3.9 exiting with code 1 when It's created inside a ThreadPoolExecutor
Thomas Grainger added the comment:

the problem is multiprocessing/process is calling threading._shutdown which tries to join its own thread, because concurrent.futures.thread._threads_queues contains the main thread in the subprocess

  File "/home/graingert/miniconda3/envs/dask-distributed/lib/python3.10/multiprocessing/process.py", line 333, in _bootstrap
    threading._shutdown()
  File "/home/graingert/miniconda3/envs/dask-distributed/lib/python3.10/threading.py", line 1530, in _shutdown
    atexit_call()
  File "/home/graingert/miniconda3/envs/dask-distributed/lib/python3.10/concurrent/futures/thread.py", line 31, in _python_exit
    t.join()
  File "/home/graingert/miniconda3/envs/dask-distributed/lib/python3.10/threading.py", line 1086, in join
    raise RuntimeError("cannot join current thread")

-- nosy: +graingert ___ Python tracker <https://bugs.python.org/issue43944> ___
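The final step of that chain is easy to reproduce in isolation; `Thread.join()` always refuses to join the thread it is called from (a minimal sketch, independent of multiprocessing):

```python
import threading

try:
    # The atexit hook in concurrent.futures ends up doing the
    # equivalent of this when the queued "thread" is the current one:
    threading.current_thread().join()
except RuntimeError as e:
    print(e)  # cannot join current thread
```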
[issue47166] Dataclass transform should ignore TypeAlias variables
Thomas MK added the comment: There is of course no hard reason for not using the global scope. I just often have enums (or other types) that are very closely linked to one class. And it makes sense to me then to have a TypeAlias in that class so that I don't have to import the enum separately. For normal classes, the nested TypeAlias works completely fine in the type checkers I tested (pyright and mypy). It's just dataclasses that are the problem. But I see now that there is a general wish to keep the implementation of dataclass simple, which I can understand. -- ___ Python tracker <https://bugs.python.org/issue47166> ___
[issue47234] PEP-484 "numeric tower" approach makes it hard/impossible to specify contracts in documentation
New submission from Thomas Fischbacher :

Here is a major general problem with python-static-typing as it is described by PEP-484: The approach described in https://peps.python.org/pep-0484/#the-numeric-tower negatively impacts our ability to reason about the behavior of code with stringency.

I would like to clarify one thing in advance: this is a real problem if we subscribe to some of the important ideas that Dijkstra articulated in his classic article "On the role of scientific thought" (e.g.: https://www.cs.utexas.edu/users/EWD/transcriptions/EWD04xx/EWD447.html). Specifically, this part:

"""
Let me try to explain to you, what to my taste is characteristic for all intelligent thinking. It is, that one is willing to study in depth an aspect of one's subject matter in isolation for the sake of its own consistency, all the time knowing that one is occupying oneself only with one of the aspects. We know that a program must be correct and we can study it from that viewpoint only; we also know that it should be efficient and we can study its efficiency on another day, so to speak. In another mood we may ask ourselves whether, and if so: why, the program is desirable. But nothing is gained —on the contrary!— by tackling these various aspects simultaneously. It is what I sometimes have called "the separation of concerns", which, even if not perfectly possible, is yet the only available technique for effective ordering of one's thoughts, that I know of. This is what I mean by "focussing one's attention upon some aspect": it does not mean ignoring the other aspects, it is just doing justice to the fact that from this aspect's point of view, the other is irrelevant. It is being one- and multiple-track minded simultaneously.
"""

So, "code should be easy to reason about".
Now, let us look at this function - I am here (mostly) following the Google Python style guide (https://google.github.io/styleguide/pyguide.html) for now:

=== Example 1, original form ===

def middle_mean(xs):
  """Compute the average of the nonterminal elements of `xs`.

  Args:
    `xs`: a list of floating point numbers.

  Returns:
    A float, the mean of the elements in `xs[1:-1]`.

  Raises:
    ValueError: If `len(xs) < 3`.
  """
  if len(xs) < 3:
    raise ValueError('Need at least 3 elements to compute middle mean.')
  return sum(xs[1:-1]) / (len(xs) - 2)

===

Let's not discuss performance, or whether it makes sense to readily generalize this to operate on other sequences than lists, but focus, following Dijkstra, on one specific concern here: guaranteed properties.

Given the function as it is above, I can make statements that are found to be correct when reasoning with mathematical rigor, such as this specific one that we will come back to:

=== Theorem 1 ===

If we have an object X that satisfies these properties...:

1. type(X) is list
2. len(X) == 4
3. all(type(x) is float for x in X)

...then we are guaranteed that `middle_mean(X)` evaluates to a value Y which satisfies:

- type(Y) is float
- Y == (X[1] + X[2]) * 0.5 or math.isnan(Y)

===

Now, following PEP-484, we would want to re-write our function, adding type annotations. Doing this mechanically would give us:

=== Example 1, with mechanically added type information ===

def middle_mean(xs: List[float]) -> float:
  """Compute the average of the nonterminal elements of `xs`.

  Args:
    `xs`: a list of floating point numbers.

  Returns:
    A float, the mean of the elements in `xs[1:-1]`.

  Raises:
    ValueError: If `len(xs) < 3`.
  """
  if len(xs) < 3:
    raise ValueError('Need at least 3 elements to compute middle mean.')
  return sum(xs[1:-1]) / (len(xs) - 2)

===

(We are also deliberately not discussing another question here: given this documentation and type annotation, should the callee be considered to be permitted to mutate the input list?)
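Theorem 1 can also be spot-checked mechanically for a concrete X; a quick sketch (re-stating `middle_mean` so the snippet is self-contained):

```python
def middle_mean(xs):
    # Body as in Example 1 above, docstring elided.
    if len(xs) < 3:
        raise ValueError('Need at least 3 elements to compute middle mean.')
    return sum(xs[1:-1]) / (len(xs) - 2)

# An X satisfying the theorem's three premises:
X = [10.0, 2.0, 4.0, 99.0]
assert type(X) is list and len(X) == 4
assert all(type(x) is float for x in X)

Y = middle_mean(X)
assert type(Y) is float
assert Y == (X[1] + X[2]) * 0.5  # 3.0 - the terminals 10.0, 99.0 are ignored
```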
So, given the above form, we now find that there seems to be quite a bit of redundancy here. After all, we have the type annotation but also repeat some typing information in the docstring. Hence, the obvious proposal here is to re-write the above definition again, obtaining:

=== Example 1, "cleaned up" ===

def middle_mean(xs: List[float]) -> float:
  """Compute the average of the nonterminal elements of `xs`.

  Args:
    `xs`: numbers to average, with terminals ignored.

  Returns:
    The mean of the elements in `xs[1:-1]`.

  Raises:
    ValueError: If `len(xs) < 3`.
  """
  if len(xs) < 3:
    raise ValueError('Need at least 3 elements to compute middle mean.')
  return sum(xs[1:-1]) / (len(xs) - 2)

===

But now, what does this change mean for the contract? Part of the "If arguments have these properties, then these are the guarantees"
[issue47166] Dataclass transform should ignore TypeAlias variables
New submission from Thomas MK :

The dataclass transformation ignores attributes that are annotated as ClassVar. I think it should also ignore attributes that are annotated as TypeAlias. Specifically, I have this usecase in mind:

class RunMode(Enum):
    release = auto()
    debug = auto()

@dataclass
class Run:
    Mode: TypeAlias = RunMode
    mode: Mode = Mode.release

-- components: Library (Lib) messages: 416368 nosy: thomkeh priority: normal severity: normal status: open title: Dataclass transform should ignore TypeAlias variables type: behavior versions: Python 3.10, Python 3.11, Python 3.7, Python 3.8, Python 3.9 ___ Python tracker <https://bugs.python.org/issue47166> ___
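At run time the `TypeAlias`-annotated name is currently picked up as a regular field, which `dataclasses.fields()` makes visible. A sketch of the reported behavior (on Pythons before 3.10, where `typing.TypeAlias` does not exist, a placeholder object is substituted so the snippet still runs; only the annotation's identity matters to the dataclass machinery):

```python
from dataclasses import dataclass, fields
from enum import Enum, auto

try:
    from typing import TypeAlias  # Python 3.10+
except ImportError:
    TypeAlias = object  # placeholder for older Pythons (assumption: demo only)

class RunMode(Enum):
    release = auto()
    debug = auto()

@dataclass
class Run:
    Mode: TypeAlias = RunMode
    mode: Mode = Mode.release

# 'Mode' is treated as a field rather than ignored like a ClassVar:
print([f.name for f in fields(Run)])  # ['Mode', 'mode']
```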
[issue47121] math.isfinite() can raise exception when called on a number
Thomas Fischbacher added the comment:

The problem with PEP-484 is that if one wants to use static type analysis, neither of these options is good:

- Use static annotations on functions, and additionally spec out expectations in docstrings. Do note that the two places where "float" is mentioned here refer to different concepts. This looks as if there were duplication, but there actually isn't, since the claims are different. This is confusing as hell.

def foo(x: float) -> float:
  """Foos the barbaz

  Args:
    x: float, the foobar

  Returns:
    float, the foofoo"""

The floats in the docstring give me a guarantee: "If I feed in a float, I am guaranteed to receive back a float". The floats in the static type annotation merely say "yeah, can be float or int, and I'd call it ok in these cases" - that's a very different statement.

- Just go with static annotations, drop mention of types from docstrings, and accept that we lose the ability to stringently reason about the behavior of code.

With respect to this latter option, I think we can wait for "losing the ability to stringently reason about the behavior of code" to cause major security headaches. That's basically opening up the door to many problems at the level of "I can crash the webserver by requesting the url http://lpt1".

-- ___ Python tracker <https://bugs.python.org/issue47121> ___
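The gap between the two readings is observable at run time: under the docstring reading the result is guaranteed to support float-only methods such as `.hex()`, under the annotation reading it is not. A sketch (the function body is an identity, purely for illustration):

```python
def foo(x: float) -> float:
    """Foos the barbaz (identity here, purely for illustration)."""
    return x

print(foo(1.5).hex())  # works: floats have a .hex() method
try:
    foo(1).hex()       # a PEP-484 checker accepts this call...
except AttributeError as e:
    print(e)           # ...but ints have no .hex() method
```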
[issue47121] math.isfinite() can raise exception when called on a number
New submission from Thomas Fischbacher :

>>> help(math.isfinite)

isfinite(x, /)
    Return True if x is neither an infinity nor a NaN, and False otherwise.

So, one would expect the following expression to return `True` or `False`. We instead observe:

>>> math.isfinite(10**1000)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
OverflowError: int too large to convert to float

(There likewise is a corresponding issue with other, similar, functions.)

This especially hurts since PEP-484 states that having a Sequence[float] `xs` does not allow us to infer that `all(issubclass(type(x), float) for x in xs)` actually holds - since a PEP-484 "float" actually does also include "int" (and still, issubclass(int, float) == False).

Now, strictly speaking, `help(math)` states that

DESCRIPTION
    This module provides access to the mathematical functions defined by the C standard.

...but according to "man 3 isfinite", the math.h "isfinite" is a macro and not a function - and the man page does not show type information for that reason.

-- messages: 416010 nosy: tfish2 priority: normal severity: normal status: open title: math.isfinite() can raise exception when called on a number ___ Python tracker <https://bugs.python.org/issue47121> ___
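Note that the failure comes from the implicit float conversion rather than from int input as such - ints that fit into a float convert silently. A quick check:

```python
import math

print(math.isfinite(10))       # True - the int converts to 10.0 first
print(math.isfinite(10**308))  # True - still within float range
try:
    math.isfinite(10**1000)
except OverflowError as e:
    print(e)                   # int too large to convert to float
```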
[issue46972] Documentation: Reference says AssertionError is raised by `assert`, but not all AssertionErrors are.
Thomas Fischbacher added the comment:

Addendum: Serhiy, I agree that my assessment was incorrect. It actually is unittest/mock.py that has quite a few 'raise AssertionError' that are not coming from an 'assert' keyword statement.

At a deeper level, the problem here is as follows: every programming language has to make an awkward choice: either it excludes some authors ("must be forklift certified"), or it adds a lot of bureaucratic scaffolding to have some mechanisms that allow code authors to enforce API contracts (as if this would help to "keep out the tide" of unprincipled code authors), or it takes a more relaxed perspective - as also Perl did - of "we are all responsible users" / "do not do this because you are not invited, not because the owner has a shotgun".

I'd call this third approach quite reasonable overall, but then the understanding is that "everybody treats documentation as binding and knows how to write good documentation". After all, we need to be able to reason about code, and in order to do that, it matters to have guarantees such as, for example: "Looking up a nonexistent key for a mapping by evaluating the_mapping[the_key] can raise an exception, and when it does, that exception is guaranteed to be an instance of KeyError".

Unfortunately, Python on the one hand emphasizes "responsible behavior" - i.e. "people know how to write and read documentation, and the written documentation creates a shared understanding between its author and reader" - but on the other hand is often really bad at properly documenting its interfaces. If I had to name one thing that really needs fixing in Python, it would be this.

-- ___ Python tracker <https://bugs.python.org/issue46972> ___
[issue45563] inspect.getframeinfo() doesn't handle frames without lineno
Change by Thomas Grainger : -- keywords: +patch nosy: +graingert nosy_count: 4.0 -> 5.0 pull_requests: +30134 stage: resolved -> patch review pull_request: https://github.com/python/cpython/pull/32044 ___ Python tracker <https://bugs.python.org/issue45563> ___
[issue47085] missing frame.f_lineno on JUMP_ABSOLUTE
New submission from Thomas Grainger :

the following code:

import sys
import dis
import pprint

def demo():
    for i in range(1):
        if i >= 0:
            pass

class Tracer:
    def __init__(self):
        self.events = []

    def trace(self, frame, event, arg):
        self.events.append((frame.f_lineno, frame.f_lasti, event))
        frame.f_trace_lines = True
        frame.f_trace_opcodes = True
        return self.trace

def main():
    t = Tracer()
    old_trace = sys.gettrace()
    try:
        sys.settrace(t.trace)
        demo()
    finally:
        sys.settrace(old_trace)
    dis.dis(demo)
    pprint.pp(t.events)

if __name__ == "__main__":
    sys.exit(main())

prints:

 7           0 LOAD_GLOBAL              0 (range)
             2 LOAD_CONST               1 (1)
             4 CALL_FUNCTION            1
             6 GET_ITER
        >>   8 FOR_ITER                 7 (to 24)
            10 STORE_FAST               0 (i)

 8          12 LOAD_FAST                0 (i)
            14 LOAD_CONST               2 (0)
            16 COMPARE_OP               5 (>=)
            18 POP_JUMP_IF_FALSE       11 (to 22)

 9          20 NOP
        >>  22 JUMP_ABSOLUTE            4 (to 8)

 7      >>  24 LOAD_CONST               0 (None)
            26 RETURN_VALUE

[(6, -1, 'call'),
 (7, 0, 'line'),
 (7, 0, 'opcode'),
 (7, 2, 'opcode'),
 (7, 4, 'opcode'),
 (7, 6, 'opcode'),
 (7, 8, 'opcode'),
 (7, 10, 'opcode'),
 (8, 12, 'line'),
 (8, 12, 'opcode'),
 (8, 14, 'opcode'),
 (8, 16, 'opcode'),
 (8, 18, 'opcode'),
 (9, 20, 'line'),
 (9, 20, 'opcode'),
 (None, 22, 'opcode'),
 (7, 8, 'line'),
 (7, 8, 'opcode'),
 (7, 24, 'opcode'),
 (7, 26, 'opcode'),
 (7, 26, 'return')]

but I'd expect (9, 22, 'opcode') instead of (None, 22, 'opcode')

-- messages: 415697 nosy: graingert priority: normal severity: normal status: open title: missing frame.f_lineno on JUMP_ABSOLUTE versions: Python 3.10, Python 3.11 ___ Python tracker <https://bugs.python.org/issue47085> ___
[issue47025] bytes do not work on sys.path
Thomas Grainger added the comment:

> I'd advocate for not supporting bytes paths and instead updating the
> documentation to require strings.

I've got PR GH-31934 started to do this

-- message_count: 8.0 -> 9.0 pull_requests: +30026 pull_request: https://github.com/python/cpython/pull/31934 ___ Python tracker <https://bugs.python.org/issue47025> ___
[issue47026] BytesWarning in zipimport paths on sys.path
New submission from Thomas Grainger :

importing from a bytes zipimport path on sys.path results in a BytesWarning: Comparison between bytes and string

running the reproducer with `python -b` shows:

    python -b zipfile_demo.py
    :1345: BytesWarning: Comparison between bytes and string

see also https://bugs.python.org/issue47025

-- components: Library (Lib) files: zipfile_demo.py messages: 415245 nosy: graingert priority: normal severity: normal status: open title: BytesWarning in zipimport paths on sys.path versions: Python 3.10, Python 3.11, Python 3.9 Added file: https://bugs.python.org/file50680/zipfile_demo.py ___ Python tracker <https://bugs.python.org/issue47026> ___
[issue47025] bytes do not work on sys.path
Thomas Grainger added the comment: zipimporter.zipimporter handles non-bytes paths here: https://github.com/python/cpython/blob/2cf7f865f099db11cc6903b334d9c376610313e8/Lib/zipimport.py#L65-L67 I think FileFinder should do the same -- ___ Python tracker <https://bugs.python.org/issue47025> ___
[issue47025] bytes do not work on sys.path
Thomas Grainger added the comment: interestingly bytes filenames pointing to zip files on sys.path do support bytes (see zipfile_demo.py) -- Added file: https://bugs.python.org/file50679/zipfile_demo.py ___ Python tracker <https://bugs.python.org/issue47025> ___
[issue47025] bytes do not work on sys.path
Thomas Grainger added the comment: https://docs.python.org/3/reference/import.html#path-entry-finders says "The encoding of bytes entries is determined by the individual path entry finders." see https://github.com/python/cpython/commit/82c1c781c7ee6496bd4c404b7ba972eed5dbcb12 -- ___ Python tracker <https://bugs.python.org/issue47025> ___
[issue47025] bytes do not work on sys.path
Thomas Grainger added the comment:

this is a regression from 3.2:

```
Python 3.2.6 (default, Jan 18 2016, 19:21:14)
[GCC 4.9.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import tempfile
>>> tempfile.TemporaryDirectory()
>>> v = _
>>> tmp_dir = str(v.__enter__())
>>> tmp_dir
'/tmp/tmpd4jzut'
>>> f = open(tmp_dir + "/module.py", "w")
>>> f.write("def function():\n    return 1\n")
29
>>> f.close()
>>> import sys
>>> sys.path.append(tmp_dir.encode())
>>> import module
>>> module
>>>
```

-- ___ Python tracker <https://bugs.python.org/issue47025> ___
[issue47025] bytes do not work on sys.path
Change by Thomas Grainger : -- keywords: +patch pull_requests: +29993 stage: -> patch review pull_request: https://github.com/python/cpython/pull/31897 ___ Python tracker <https://bugs.python.org/issue47025> ___
[issue47025] bytes do not work on sys.path
Change by Thomas Grainger : -- components: +Library (Lib) versions: +Python 3.10, Python 3.11, Python 3.9 ___ Python tracker <https://bugs.python.org/issue47025> ___
[issue47025] bytes do not work on sys.path
New submission from Thomas Grainger :

importing a module with bytes in `sys.path` fails with:

    File "<frozen importlib._bootstrap_external>", line 182, in _path_isabs
    TypeError: startswith first arg must be bytes or a tuple of bytes, not str

(see reproducer in attached demo.py)

however `sys.path` is documented as supporting bytes: "Only strings and bytes should be added to sys.path; all other data types are ignored during import." https://docs.python.org/3/library/sys.html?highlight=Only%20strings%20and%20bytes#sys.path

bytes are allowed in PathFinder._find_spec https://github.com/python/cpython/blob/2cf7f865f099db11cc6903b334d9c376610313e8/Lib/importlib/_bootstrap_external.py#L1460-L1462 but perhaps they should be ignored or explicitly fsdecoded?

see also:
https://bugs.python.org/issue32642
https://github.com/python/importlib_metadata/issues/372#issuecomment-1067799424

-- files: demo.py messages: 415233 nosy: graingert priority: normal severity: normal status: open title: bytes do not work on sys.path Added file: https://bugs.python.org/file50678/demo.py ___ Python tracker <https://bugs.python.org/issue47025> ___
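The "explicitly fsdecoded" option would amount to something like the following normalization at the top of the path-entry handling (a sketch of the idea, not the actual FileFinder code; `normalize_path_entry` is a made-up name):

```python
import os

def normalize_path_entry(entry):
    # Accept both str and bytes entries, decoding bytes with the
    # filesystem encoding - mirroring what zipimport already does.
    if isinstance(entry, bytes):
        return os.fsdecode(entry)
    return entry

print(normalize_path_entry(b"/tmp/pkgs"))  # '/tmp/pkgs'
print(normalize_path_entry("/tmp/pkgs"))   # '/tmp/pkgs'
```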
[issue46972] Documentation: Reference says AssertionError is raised by `assert`, but not all AssertionErrors are.
Thomas Fischbacher added the comment: The documentation of exceptions in the reference is one of the places that makes the life of users substantially harder than it ought to be, since the documentation appears to not have been written with the intent to give guarantees that users can expect correctly written code to follow. I would argue that "The reference documentation for X states that it gets raised under condition Y" generally should be understood as "this is a guarantee that also includes the guarantee that it is not raised under other conditions in correctly written code". Other languages often appear to be somewhat stricter w.r.t. interpreting the reference documentation as binding for correct code - and for Python, having this certainly would help a lot when writing code that can give binding guarantees. -- ___ Python tracker <https://bugs.python.org/issue46972> ___
[issue46972] Documentation: Reference says AssertionError is raised by `assert`, but not all AssertionErrors are.
New submission from Thomas Fischbacher :

The Python reference says:

(1) https://docs.python.org/3/library/exceptions.html#concrete-exceptions

exception AssertionError
    Raised when an assert statement fails.

(2) https://docs.python.org/3/reference/simple_stmts.html#the-assert-statement

"assert ..." is equivalent to "if __debug__: ..."

From this, one can infer the guarantee "the -O flag will suppress AssertionError exceptions from being raised".

However, there is code in the Python standard library that does a direct "raise AssertionError" (strictly speaking, in violation of (1)), and it is just reasonable to assume that other code following the design of that would then also want to do a direct "raise AssertionError". This happens e.g. in many methods defined in: unittest/mock.py

The most appropriate fix here may be to change the documentation to not say:

===
exception AssertionError
    Raised when an assert statement fails.
===

but instead:

===
exception AssertionError
    An assert[{add reference to `assert` definition}] statement fails, or a unit testing related assert{...}() callable detects an assertion violation.
===

-- messages: 414837 nosy: tfish2 priority: normal severity: normal status: open title: Documentation: Reference says AssertionError is raised by `assert`, but not all AssertionErrors are. ___ Python tracker <https://bugs.python.org/issue46972> ___
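The two sources of AssertionError can be seen side by side in a sketch: the first comes from the `assert` statement (and is compiled away under `python -O`), while mock's comes from an explicit raise and is unaffected by `-O`:

```python
from unittest import mock

# (1) AssertionError from the `assert` statement - compiled away by -O,
# so this block prints nothing under `python -O`:
try:
    assert False, "from the assert statement"
except AssertionError as e:
    print(e)

# (2) AssertionError raised explicitly inside unittest/mock.py - the
# -O flag has no effect on this one:
m = mock.Mock()
try:
    m.assert_called()
except AssertionError as e:
    print(e)
```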
[issue46960] Docs: Link from settrace to frame
New submission from Thomas Guettler :

https://docs.python.org/3.10/library/sys.html#sys.settrace

> Trace functions should have three arguments: frame, event, and arg. frame is the current stack frame.

It would be super cool if "current stack frame" could be a hyperlink to the docs about "frame".

-- messages: 414761 nosy: guettli priority: normal severity: normal status: open title: Docs: Link from settrace to frame ___ Python tracker <https://bugs.python.org/issue46960> ___
[issue43923] Can't create generic NamedTuple as of py3.9
Thomas Grainger added the comment:

The main advantage for my usecase is support for heterogeneous unpacking

On Sat, Mar 5, 2022, 6:04 PM Alex Waygood wrote:

> Alex Waygood added the comment:
>
> I sense we'll have to agree to disagree on the usefulness of NamedTuples in the age of dataclasses :)
>
> For me, I find the simplicity of the underlying idea behind namedtuples — "tuples with some properties bolted on" — very attractive. Yes, standard tuples are more performant, but it's great to have a tool in the arsenal that's essentially the same as a tuple (and is backwards-compatible with a tuple, for APIs that require a tuple), but can also, like dataclasses, be self-documenting. (You're right that DoneAndNotDoneFutures isn't a great example of this.)
>
> But I agree that this shouldn't be a priority if it's hard to accomplish; and there'll certainly be no complaints from me if energy is invested into making dataclasses faster.

-- ___ Python tracker <https://bugs.python.org/issue43923> ___
[issue46885] Ensure PEP 663 changes are reverted from 3.11
Change by Thomas Wouters : -- nosy: +twouters ___ Python tracker <https://bugs.python.org/issue46885> ___
[issue45390] asyncio.Task doesn't propagate CancelledError() exception correctly.
Thomas Grainger added the comment:

there could be multiple messages here perhaps it could be:

```
finally:
    # Must reacquire lock even if wait is cancelled
    cancelled = []
    while True:
        try:
            await self.acquire()
            break
        except exceptions.CancelledError as e:
            cancelled.append(e)
    if len(cancelled) > 1:
        raise ExceptionGroup("Cancelled", cancelled)
    if cancelled:
        raise cancelled[0]
```

-- ___ Python tracker <https://bugs.python.org/issue45390> ___
[issue46827] asyncio SelectorEventLoop.sock_connect fails with a UDP socket
Change by Thomas Grainger : -- keywords: +patch pull_requests: +29629 stage: -> patch review pull_request: https://github.com/python/cpython/pull/31499 ___ Python tracker <https://bugs.python.org/issue46827> ___
[issue46827] asyncio SelectorEventLoop.sock_connect fails with a UDP socket
New submission from Thomas Grainger:

the following code:

```
import socket
import asyncio

async def amain():
    with socket.socket(family=socket.AF_INET, proto=socket.IPPROTO_UDP, type=socket.SOCK_DGRAM) as sock:
        sock.setblocking(False)
        await asyncio.get_running_loop().sock_connect(sock, ("google.com", "443"))

asyncio.run(amain())
```

fails with:

```
Traceback (most recent call last):
  File "/home/graingert/projects/test_foo.py", line 9, in <module>
    asyncio.run(amain())
  File "/usr/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/lib/python3.10/asyncio/base_events.py", line 641, in run_until_complete
    return future.result()
  File "/home/graingert/projects/test_foo.py", line 7, in amain
    await asyncio.get_running_loop().sock_connect(sock, ("google.com", "443"))
  File "/usr/lib/python3.10/asyncio/selector_events.py", line 496, in sock_connect
    resolved = await self._ensure_resolved(
  File "/usr/lib/python3.10/asyncio/base_events.py", line 1395, in _ensure_resolved
    return await loop.getaddrinfo(host, port, family=family, type=type,
  File "/usr/lib/python3.10/asyncio/base_events.py", line 855, in getaddrinfo
    return await self.run_in_executor(
  File "/usr/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/lib/python3.10/socket.py", line 955, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno -7] ai_socktype not supported
```

-- components: asyncio messages: 413709 nosy: asvetlov, graingert, yselivanov priority: normal severity: normal status: open title: asyncio SelectorEventLoop.sock_connect fails with a UDP socket versions: Python 3.10, Python 3.11, Python 3.9 ___ Python tracker <https://bugs.python.org/issue46827> ___
[issue46824] use AI_NUMERICHOST | AI_NUMERICSERV to skip getaddrinfo thread in asyncio
Thomas Grainger added the comment:

hello, it's actually a bit of a roundabout context, but it was brought up on a tornado issue where I was attempting to port the asyncio optimization to tornado: https://github.com/tornadoweb/tornado/issues/3113#issuecomment-1041019287

I think it would be better to use this AI_NUMERICHOST | AI_NUMERICSERV optimization from trio everywhere instead.

-- ___ Python tracker <https://bugs.python.org/issue46824> ___
[issue46824] use AI_NUMERICHOST | AI_NUMERICSERV to skip getaddrinfo thread in asyncio
Change by Thomas Grainger : -- keywords: +patch pull_requests: +29627 stage: -> patch review pull_request: https://github.com/python/cpython/pull/31497 ___ Python tracker <https://bugs.python.org/issue46824> ___
[issue46824] use AI_NUMERICHOST | AI_NUMERICSERV to skip getaddrinfo thread in asyncio
New submission from Thomas Grainger:

Now that the getaddrinfo lock has been removed on all platforms, the numeric-only host resolution in asyncio could be moved back into BaseEventLoop.getaddrinfo.

-- components: asyncio messages: 413699 nosy: asvetlov, graingert, yselivanov priority: normal severity: normal status: open title: use AI_NUMERICHOST | AI_NUMERICSERV to skip getaddrinfo thread in asyncio type: enhancement ___ Python tracker <https://bugs.python.org/issue46824> ___
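A sketch of the optimization being discussed: with AI_NUMERICHOST | AI_NUMERICSERV, getaddrinfo() never performs a DNS lookup, so already-numeric input resolves immediately on the event loop thread and anything else fails fast, signalling that the slow executor path is needed:

```python
import socket

# AI_NUMERICHOST | AI_NUMERICSERV forbids name resolution entirely:
FLAGS = socket.AI_NUMERICHOST | socket.AI_NUMERICSERV

# Numeric host and port: resolved without any network I/O.
infos = socket.getaddrinfo("127.0.0.1", "443", flags=FLAGS)
print(infos[0][4])  # ('127.0.0.1', 443)

# A real hostname is rejected immediately instead of blocking on DNS,
# which is how the caller knows to fall back to a thread.
try:
    socket.getaddrinfo("google.com", "443", flags=FLAGS)
except socket.gaierror:
    print("non-numeric host rejected without a DNS lookup")
```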
[issue42752] multiprocessing Queue leaks a file descriptor associated with the pipe writer (#33081 still a problem)
Change by Thomas Grainger : -- nosy: +graingert, vstinner ___ Python tracker <https://bugs.python.org/issue42752> ___
[issue44863] Allow TypedDict to inherit from Generics
Thomas Grainger added the comment: there's a thread on typing-sig for this now: https://mail.python.org/archives/list/typing-...@python.org/thread/I7P3ER2NH7SENVMIXK74U6L4Z5JDLQGZ/#I7P3ER2NH7SENVMIXK74U6L4Z5JDLQGZ -- nosy: +graingert ___ Python tracker <https://bugs.python.org/issue44863> ___
[issue46522] concurrent.futures.__getattr__ raises the wrong AttributeError message
Thomas Grainger added the comment: this also applies to io and _pyio -- ___ Python tracker <https://bugs.python.org/issue46522> ___
[issue46522] concurrent.futures.__getattr__ raises the wrong AttributeError message
New submission from Thomas Grainger:

```
>>> import types
>>> types.ModuleType("concurrent.futures").missing_attribute
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: module 'concurrent.futures' has no attribute 'missing_attribute'
>>> import concurrent.futures
>>> concurrent.futures.missing_attribute
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/graingert/miniconda3/lib/python3.9/concurrent/futures/__init__.py", line 53, in __getattr__
    raise AttributeError(f"module {__name__} has no attribute {name}")
AttributeError: module concurrent.futures has no attribute missing_attribute
```

-- messages: 411611 nosy: graingert priority: normal pull_requests: 29069 severity: normal status: open title: concurrent.futures.__getattr__ raises the wrong AttributeError message versions: Python 3.10, Python 3.11, Python 3.9 ___ Python tracker <https://bugs.python.org/issue46522> ___
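The difference is the missing !r (repr) conversion in the module-level __getattr__ (PEP 562). A sketch with a hypothetical helper showing the quoting that matches the interpreter's default message:

```python
import types

# The default message (module without a __getattr__) quotes both names:
mod = types.ModuleType("concurrent.futures")
try:
    mod.missing_attribute
except AttributeError as e:
    default_msg = str(e)

# A module-level __getattr__ (PEP 562) reproduces it by using !r
# instead of interpolating the bare names (hypothetical helper):
def module_getattr(module_name, name):
    raise AttributeError(f"module {module_name!r} has no attribute {name!r}")

try:
    module_getattr("concurrent.futures", "missing_attribute")
except AttributeError as e:
    assert str(e) == default_msg
    print(default_msg)
```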
[issue46045] NetBSD: do not use POSIX semaphores
Thomas Klausner added the comment: Thanks for merging this, @serhiy.storchaka! -- ___ Python tracker <https://bugs.python.org/issue46045> ___
[issue46415] ipaddress.ip_{address, network, interface} raises TypeError instead of ValueError if given a tuple as address
Change by Thomas Cellerier : -- title: ipaddress.ip_{address,network,interface} raise TypeError instead of ValueError if given a tuple as address -> ipaddress.ip_{address,network,interface} raises TypeError instead of ValueError if given a tuple as address ___ Python tracker <https://bugs.python.org/issue46415> ___
[issue46415] ipaddress.ip_{address, network, interface} raise TypeError instead of ValueError if given a tuple as address
Change by Thomas Cellerier : -- keywords: +patch pull_requests: +28845 stage: -> patch review pull_request: https://github.com/python/cpython/pull/30642 ___ Python tracker <https://bugs.python.org/issue46415> ___
[issue46415] ipaddress.ip_{address, network, interface} raise TypeError instead of ValueError if given a tuple as address
New submission from Thomas Cellerier:

`IPv*Network` and `IPv*Interface` constructors accept a 2-tuple of (address description, netmask) as the address parameter. When the tuple-based address is used, errors are not propagated correctly through the `ipaddress.ip_*` helpers because the %-formatting now expects several arguments:

```
In [7]: ipaddress.ip_network(("192.168.100.0", "fooo"))
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input> in <module>
----> 1 ipaddress.ip_network(("192.168.100.0", "fooo"))

/usr/lib/python3.8/ipaddress.py in ip_network(address, strict)
     81         pass
     82
---> 83     raise ValueError('%r does not appear to be an IPv4 or IPv6 network' %
     84                      address)
     85

TypeError: not all arguments converted during string formatting
```

Compared to:

```
In [8]: ipaddress.IPv4Network(("192.168.100.0", "foo"))
---------------------------------------------------------------------------
NetmaskValueError                         Traceback (most recent call last)
<ipython-input> in <module>
----> 1 ipaddress.IPv4Network(("192.168.100.0", "foo"))

/usr/lib/python3.8/ipaddress.py in __init__(self, address, strict)
   1453
   1454         self.network_address = IPv4Address(addr)
-> 1455         self.netmask, self._prefixlen = self._make_netmask(mask)
   1456         packed = int(self.network_address)
   1457         if packed & int(self.netmask) != packed:

/usr/lib/python3.8/ipaddress.py in _make_netmask(cls, arg)
   1118             # Check for a netmask or hostmask in dotted-quad form.
   1119             # This may raise NetmaskValueError.
-> 1120             prefixlen = cls._prefix_from_ip_string(arg)
   1121             netmask = IPv4Address(cls._ip_int_from_prefix(prefixlen))
   1122             cls._netmask_cache[arg] = netmask, prefixlen

/usr/lib/python3.8/ipaddress.py in _prefix_from_ip_string(cls, ip_str)
    516             ip_int = cls._ip_int_from_string(ip_str)
    517         except AddressValueError:
--> 518             cls._report_invalid_netmask(ip_str)
    519
    520         # Try matching a netmask (this would be /1*0*/ as a bitwise regexp).

/usr/lib/python3.8/ipaddress.py in _report_invalid_netmask(cls, netmask_str)
    472     def _report_invalid_netmask(cls, netmask_str):
    473         msg = '%r is not a valid netmask' % netmask_str
--> 474         raise NetmaskValueError(msg) from None
    475
    476     @classmethod

NetmaskValueError: 'foo' is not a valid netmask
```

-- components: Library (Lib) messages: 410798 nosy: thomascellerier priority: normal severity: normal status: open title: ipaddress.ip_{address,network,interface} raise TypeError instead of ValueError if given a tuple as address type: behavior versions: Python 3.8 ___ Python tracker <https://bugs.python.org/issue46415> ___
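The underlying pitfall is easy to reproduce in isolation: the %-operator treats a tuple right-hand side as multiple format arguments, so the intended ValueError is never constructed. A sketch of the failure and of the usual fix (wrapping the value in a 1-tuple):

```python
address = ("192.168.100.0", "fooo")

# One %r placeholder but two tuple elements -> the % operation itself
# raises TypeError before the ValueError can be created:
try:
    raise ValueError('%r does not appear to be an IPv4 or IPv6 network' % address)
except TypeError as e:
    print(e)  # not all arguments converted during string formatting

# Wrapping the value in a 1-tuple formats the tuple itself:
msg = '%r does not appear to be an IPv4 or IPv6 network' % (address,)
print(msg)
```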
[issue46045] NetBSD: do not use POSIX semaphores
Thomas Klausner added the comment: ping - this patch needs a review -- ___ Python tracker <https://bugs.python.org/issue46045> ___
[issue46053] NetBSD: ossaudio support incomplete
Thomas Klausner added the comment: ping - this patch needs a review -- ___ Python tracker <https://bugs.python.org/issue46053> ___
[issue34602] python3 resource.setrlimit strange behaviour under macOS
Change by Thomas Klausner : -- nosy: +wiz nosy_count: 8.0 -> 9.0 pull_requests: +28694 pull_request: https://github.com/python/cpython/pull/30490 ___ Python tracker <https://bugs.python.org/issue34602> ___
[issue46308] Unportable test(1) operator in configure script
Change by Thomas Klausner : -- keywords: +patch pull_requests: +28693 stage: -> patch review pull_request: https://github.com/python/cpython/pull/30490 ___ Python tracker <https://bugs.python.org/issue46308> ___
[issue46308] Unportable test(1) operator in configure script
New submission from Thomas Klausner : The configure script uses the test(1) '==' operator, which is only supported by bash. The standard comparison operator is '='. -- components: Installation messages: 410120 nosy: wiz priority: normal severity: normal status: open title: Unportable test(1) operator in configure script type: compile error versions: Python 3.11 ___ Python tracker <https://bugs.python.org/issue46308> ___
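For illustration, a sketch of the portable spelling: the `==` form happens to work in bash's test builtin, but strictly POSIX /bin/sh implementations (such as NetBSD's) reject it.

```shell
#!/bin/sh
# POSIX test(1) uses a single '=' for string comparison; '==' is a
# bash/ksh extension and fails on other shells.
value="yes"
if [ "$value" = "yes" ]; then
    echo "portable comparison"
fi
```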
[issue42369] Reading ZipFile not thread-safe
Thomas added the comment:

@khaledk I finally got some time off, so here you go: https://github.com/1/ParallelZipFile

I cannot offer any support for a more correct implementation of the zip specification due to time constraints, but maybe the code is useful for you anyway.

-- ___ Python tracker <https://bugs.python.org/issue42369> ___
[issue38415] @asynccontextmanager decorated functions are not callable like @contextmanager
Thomas Grainger added the comment: actually it was already done in 13 months! -- ___ Python tracker <https://bugs.python.org/issue38415> ___
[issue38415] @asynccontextmanager decorated functions are not callable like @contextmanager
Change by Thomas Grainger : -- nosy: +graingert nosy_count: 3.0 -> 4.0 pull_requests: +28454 pull_request: https://github.com/python/cpython/pull/30233 ___ Python tracker <https://bugs.python.org/issue38415> ___
[issue46150] test_pathlib assumes "fakeuser" does not exist as user
New submission from Thomas Wouters:

test_pathlib contains, in PosixPathTest.test_expanduser, a check that expanduser on a nonexistent user will raise RuntimeError. Leaving aside the question of why that's a RuntimeError (which is probably too late to fix anyway), the test performs this check by assuming 'fakeuser' is a nonexistent user. This test will fail when such a user does exist. (The test already uses the pwd module for other reasons, so it certainly could check that first.)

-- components: Tests messages: 409030 nosy: twouters priority: normal severity: normal status: open title: test_pathlib assumes "fakeuser" does not exist as user versions: Python 3.10, Python 3.11, Python 3.9 ___ Python tracker <https://bugs.python.org/issue46150> ___
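A sketch of the suggested guard (hypothetical helper, not the actual test code): consult pwd before asserting that the user is absent. Note pwd is Unix-only.

```python
import pwd

def user_exists(name):
    # pwd.getpwnam raises KeyError for unknown users.
    try:
        pwd.getpwnam(name)
    except KeyError:
        return False
    return True

# Only run the "nonexistent user" assertion when the assumption holds:
candidate = "fakeuser"
if not user_exists(candidate):
    print(f"safe to test expanduser('~{candidate}') failure")
else:
    print(f"skipping: user {candidate!r} exists on this system")
```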
[issue34624] -W option and PYTHONWARNINGS env variable does not accept module regexes
Change by Thomas Gläßle : -- stage: patch review -> resolved status: open -> closed ___ Python tracker <https://bugs.python.org/issue34624> ___
[issue34624] -W option and PYTHONWARNINGS env variable does not accept module regexes
Thomas Gläßle added the comment:

Ok, it seems at least the incorrect documentation has been fixed in the meantime. I'm going to close this as there seems to be no capacity to deal with it.

-- ___ Python tracker <https://bugs.python.org/issue34624> ___
[issue45996] Worse error from asynccontextmanager in Python 3.10
Thomas Grainger added the comment:

> Actually I don't agree with Thomas's logic... his argument feels like consistency for its own sake.

Do you expect sync and async contextmanagers to act differently? Why would sync contextmanagers raise AttributeError and async contextmanagers raise a RuntimeError?

If it's sensible to guard against invalid re-entry for async contextmanagers then I think it's sensible to apply the same guard to sync contextmanagers.

-- ___ Python tracker <https://bugs.python.org/issue45996> ___
[issue46083] PyUnicode_FSConverter() has confusing reference semantics
New submission from Thomas Wouters:

The PyUnicode_FSConverter function has confusing reference semantics, and confusing documentation.

https://docs.python.org/3/c-api/unicode.html#c.PyUnicode_FSConverter says the output argument "must be a PyBytesObject* which must be released when it is no longer used." That seems to suggest one must pass a PyBytesObject to it, and indeed one of the error paths assumes an object was passed (https://github.com/python/cpython/blob/main/Objects/unicodeobject.c#L4116 -- 'addr' is called 'result' in the docs). Not passing a valid object would result in trying to DECREF NULL, or garbage.

However, the function doesn't actually use the object, and later in the function overwrites the value *without* DECREFing it, so passing a valid object would in fact cause a leak.

I understand the function signature is the way it is so it can be used with PyArg_ParseTuple's O& format, but there are reasons to call it directly (e.g. with METH_O functions), and it would be nice if the semantics were more clear.

-- components: C API messages: 408604 nosy: twouters priority: normal severity: normal status: open title: PyUnicode_FSConverter() has confusing reference semantics versions: Python 3.10, Python 3.11, Python 3.8, Python 3.9 ___ Python tracker <https://bugs.python.org/issue46083> ___
[issue1525919] email package content-transfer-encoding behaviour changed
Thomas Arendsen Hein added the comment:

Default python3 on Debian buster:

```
$ python3
Python 3.7.3 (default, Jan 22 2021, 20:04:44) [GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import email.mime.text
>>> mt = email.mime.text.MIMEText('Ta mère', 'plain', 'utf-8')
>>> print(mt.as_string())
Content-Type: text/plain; charset="utf-8"
MIME-Version: 1.0
Content-Transfer-Encoding: base64

VGEgbcOocmU=

>>> email.encoders.encode_quopri(mt)
>>> print(mt.as_string())
Content-Type: text/plain; charset="utf-8"
MIME-Version: 1.0
Content-Transfer-Encoding: base64
Content-Transfer-Encoding: quoted-printable

Ta=20m=C3=A8re
```

So the encoded text looks good now, but there are still duplicate headers.

Old output (python2.7) is identical to what Asheesh Laroia (paulproteus) reported for python2.5:

---
Content-Type: text/plain; charset="utf-8"
MIME-Version: 1.0
Content-Transfer-Encoding: base64
Content-Transfer-Encoding: quoted-printable

VGEgbcOocmU=3D
---

-- status: pending -> open ___ Python tracker <https://bugs.python.org/issue1525919> ___
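The duplicate header arises because assigning to an existing header on email.message.Message appends a second copy rather than replacing the first, and the encoders simply assign the new Content-Transfer-Encoding. A minimal sketch of the mechanism, and of the usual workaround (delete the stale header first):

```python
from email.message import Message

msg = Message()
msg['Content-Transfer-Encoding'] = 'base64'
# Assigning a header that already exists appends a second copy:
msg['Content-Transfer-Encoding'] = 'quoted-printable'
print(msg.get_all('Content-Transfer-Encoding'))
# ['base64', 'quoted-printable']

# del removes all copies; re-assigning then leaves a single value:
del msg['Content-Transfer-Encoding']
msg['Content-Transfer-Encoding'] = 'quoted-printable'
print(msg.get_all('Content-Transfer-Encoding'))
# ['quoted-printable']
```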
[issue30512] CAN Socket support for NetBSD
Change by Thomas Klausner : -- pull_requests: +28286 stage: -> patch review pull_request: https://github.com/python/cpython/pull/30066 ___ Python tracker <https://bugs.python.org/issue30512> ___
[issue46053] NetBSD: ossaudio support incomplete
Change by Thomas Klausner : -- keywords: +patch pull_requests: +28285 stage: -> patch review pull_request: https://github.com/python/cpython/pull/30065 ___ Python tracker <https://bugs.python.org/issue46053> ___
[issue46053] NetBSD: ossaudio support incomplete
New submission from Thomas Klausner:

When compiling Python on NetBSD, the ossaudio module is not enabled.

1. the code tries to export some #define constants that are not in the public OSS API (but that some other implementations provide)
2. on NetBSD, you need to link against libossaudio when using OSS

-- components: Extension Modules messages: 408349 nosy: wiz priority: normal severity: normal status: open title: NetBSD: ossaudio support incomplete type: enhancement versions: Python 3.11 ___ Python tracker <https://bugs.python.org/issue46053> ___
[issue46045] NetBSD: do not use POSIX semaphores
Change by Thomas Klausner : -- keywords: +patch pull_requests: +28272 stage: -> patch review pull_request: https://github.com/python/cpython/pull/30047 ___ Python tracker <https://bugs.python.org/issue46045> ___
[issue46045] NetBSD: do not use POSIX semaphores
New submission from Thomas Klausner:

On NetBSD by default, the following tests do not finish in > 1h:

1:07:13 load avg: 0.00 running: test_compileall (1 hour 7 min), test_multiprocessing_fork (1 hour 7 min), test_concurrent_futures (1 hour 6 min)

Defining HAVE_BROKEN_POSIX_SEMAPHORES fixes this, and they finish:

0:00:32 load avg: 10.63 [408/427/17] test_compileall passed
...
0:02:37 load avg: 3.04 [427/427/22] test_concurrent_futures passed (2 min 33 sec)

The last one fails: test_multiprocessing_fork, with most of the subtests failing like this:

```
ERROR: test_shared_memory_SharedMemoryServer_ignores_sigint (test.test_multiprocessing_fork.WithProcessesTestSharedMemory)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/scratch/lang/python310/work/Python-3.10.1/Lib/test/_test_multiprocessing.py", line 4006, in test_shared_memory_SharedMemoryServer_ignores_sigint
    sl = smm.ShareableList(range(10))
  File "/scratch/lang/python310/work/Python-3.10.1/Lib/multiprocessing/managers.py", line 1372, in ShareableList
    sl = shared_memory.ShareableList(sequence)
  File "/scratch/lang/python310/work/Python-3.10.1/Lib/multiprocessing/shared_memory.py", line 327, in __init__
    self.shm = SharedMemory(name, create=True, size=requested_size)
  File "/scratch/lang/python310/work/Python-3.10.1/Lib/multiprocessing/shared_memory.py", line 92, in __init__
    self._fd = _posixshmem.shm_open(
OSError: [Errno 86] Not supported: '/psm_b1ec903a'
```

I think this is a separate issue, so I'd like to define HAVE_BROKEN_POSIX_SEMAPHORES for now. This has been done in pkgsrc since at least python 2.7 (in 2011); I haven't dug deeper.

-- components: Interpreter Core messages: 408291 nosy: wiz priority: normal severity: normal status: open title: NetBSD: do not use POSIX semaphores type: behavior versions: Python 3.11 ___ Python tracker <https://bugs.python.org/issue46045> ___
[issue21459] DragonFlyBSD support
Thomas Klausner added the comment:

Not interested in this any longer, and Dragonfly's Dports doesn't carry this patch, so it's probably not needed anymore.

-- stage: -> resolved status: open -> closed ___ Python tracker <https://bugs.python.org/issue21459> ___
[issue21461] Recognize -pthread
Thomas Klausner added the comment:

I must confess, I don't know. This patch has been in pkgsrc since at least the import of the first python 2.7 package in 2011, and I haven't dug deeper. If you think it is unnecessary, I'll trust you. I've just removed it from the python 3.10 package in pkgsrc.

-- stage: patch review -> resolved status: open -> closed ___ Python tracker <https://bugs.python.org/issue21461> ___
[issue21461] Recognize -pthread
Thomas Klausner added the comment:

gcc supports this flag. According to the man page:

> This option should be used consistently for both compilation and linking. This option is supported on GNU/Linux targets, most other Unix derivatives, and also on x86 Cygwin and MinGW targets.

On NetBSD, using -pthread is the recommended method to enable thread support. clang on NetBSD also supports this flag. I don't have access to clang on other systems.

-- ___ Python tracker <https://bugs.python.org/issue21461> ___
[issue21461] Recognize -pthread
Change by Thomas Klausner : -- pull_requests: +28257 stage: -> patch review pull_request: https://github.com/python/cpython/pull/30032 ___ Python tracker <https://bugs.python.org/issue21461> ___
[issue46000] NetBSD curses compatibility
Thomas Klausner added the comment: Done: https://github.com/python/cpython/pull/29947 -- ___ Python tracker <https://bugs.python.org/issue46000> ___
[issue46000] NetBSD curses compatibility
New submission from Thomas Klausner:

The code in Modules/_cursesmodule.c assumes ncurses. The attached simple patch fixes this and works with both NetBSD curses and ncurses.

-- components: Extension Modules files: patch-Modules___cursesmodule.c messages: 407825 nosy: wiz priority: normal severity: normal status: open title: NetBSD curses compatibility type: behavior versions: Python 3.10 Added file: https://bugs.python.org/file50480/patch-Modules___cursesmodule.c ___ Python tracker <https://bugs.python.org/issue46000> ___
[issue45996] Worse error from asynccontextmanager in Python 3.10
Thomas Grainger added the comment:

you can see the analogous sync contextmanager issue on python3.6 with:

```
import logging
from contextlib import contextmanager

@contextmanager
def foo():
    yield

def test():
    f = foo()
    f.__enter__()
    f.__enter__()

test()
```

on python3.7+ you get the bpo-30306 behaviour:

```
Traceback (most recent call last):
  File "sync.py", line 14, in <module>
    test()
  File "sync.py", line 12, in test
    f.__enter__()
  File "/usr/lib/python3.8/contextlib.py", line 111, in __enter__
    del self.args, self.kwds, self.func
AttributeError: args
```

and on python3.6 you get the same sort of error you see now for asynccontextmanagers:

```
Traceback (most recent call last):
  File "sync.py", line 14, in <module>
    test()
  File "sync.py", line 12, in test
    f.__enter__()
  File "/usr/lib/python3.6/contextlib.py", line 83, in __enter__
    raise RuntimeError("generator didn't yield") from None
RuntimeError: generator didn't yield
```

-- ___ Python tracker <https://bugs.python.org/issue45996> ___
[issue45996] Worse error from asynccontextmanager in Python 3.10
Thomas Grainger added the comment:

I see the change here: https://github.com/python/cpython/commit/1c5c9c89ffc36875afaf4c3cc6a716d4bd089bbf#diff-e00601a380ba6c916ba4333277fe6ea43d2477804002ab1ae64480f80fec8e3aR177-R179

This is intentionally implementing https://bugs.python.org/issue30306 for asynccontextmanagers, which was initially missing.

-- ___ Python tracker <https://bugs.python.org/issue45996> ___
[issue45996] Worse error from asynccontextmanager in Python 3.10
Thomas Grainger added the comment:

ah I can repeat this on python3.8.10 trio but not python3.9.9 trio:

```
import logging

import trio
from contextlib import asynccontextmanager

@asynccontextmanager
async def foo():
    await trio.sleep(1)
    yield

async def test():
    async with trio.open_nursery() as n:
        f = foo()
        n.start_soon(f.__aenter__)
        n.start_soon(f.__aenter__)

trio.run(test)
```

```
Traceback (most recent call last):
  File "bar.py", line 17, in <module>
    trio.run(test)
  File "/home/graingert/.virtualenvs/osirium-main/lib/python3.8/site-packages/trio/_core/_run.py", line 1932, in run
    raise runner.main_task_outcome.error
  File "bar.py", line 15, in test
    n.start_soon(f.__aenter__)
  File "/home/graingert/.virtualenvs/osirium-main/lib/python3.8/site-packages/trio/_core/_run.py", line 815, in __aexit__
    raise combined_error_from_nursery
  File "/usr/lib/python3.8/contextlib.py", line 171, in __aenter__
    return await self.gen.__anext__()
RuntimeError: anext(): asynchronous generator is already running
```

-- ___ Python tracker <https://bugs.python.org/issue45996> ___
[issue45996] Worse error from asynccontextmanager in Python 3.10
Thomas Grainger added the comment:

or consider the trio version:

```
import logging

import trio
from contextlib import asynccontextmanager

@asynccontextmanager
async def foo():
    await trio.sleep(1)
    yield

async def test():
    async with trio.open_nursery() as n:
        f = foo()
        n.start_soon(f.__aenter__)
        n.start_soon(f.__aenter__)

trio.run(test)
```

```
Traceback (most recent call last):
  File "/home/graingert/projects/examples/bar.py", line 17, in <module>
    trio.run(test)
  File "/home/graingert/.virtualenvs/testing39/lib/python3.9/site-packages/trio/_core/_run.py", line 1932, in run
    raise runner.main_task_outcome.error
  File "/home/graingert/projects/examples/bar.py", line 15, in test
    n.start_soon(f.__aenter__)
  File "/home/graingert/.virtualenvs/testing39/lib/python3.9/site-packages/trio/_core/_run.py", line 815, in __aexit__
    raise combined_error_from_nursery
  File "/usr/lib/python3.9/contextlib.py", line 179, in __aenter__
    del self.args, self.kwds, self.func
AttributeError: args
```

-- ___ Python tracker <https://bugs.python.org/issue45996> ___
[issue45996] Worse error from asynccontextmanager in Python 3.10
Thomas Grainger added the comment:

I think `AttributeError: args` is the desired/expected behaviour, consider the sync version:

```
import logging
from asyncio import sleep, gather, run
from contextlib import asynccontextmanager, contextmanager

@contextmanager
def foo():
    yield

def test():
    f = foo()
    f.__enter__()
    f.__enter__()

test()
```

```
Traceback (most recent call last):
  File "/home/graingert/projects/example/sync.py", line 15, in <module>
    test()
  File "/home/graingert/projects/example/sync.py", line 13, in test
    f.__enter__()
  File "/usr/lib/python3.9/contextlib.py", line 117, in __enter__
    del self.args, self.kwds, self.func
AttributeError: args
```

-- ___ Python tracker <https://bugs.python.org/issue45996> ___
[issue43805] multiprocessing.Queue hangs when process on other side dies
Thomas Kluyver added the comment: It's not my decision, so I can't really say. But the Queue API is pretty stable, and exists 3 times over in Python (the queue module for use with threads, in multiprocessing and in asyncio). So I'd guess that anyone wanting to add to that API would need to make a compelling case for why it's important, and be prepared for a lot of wrangling over API details (like method names and exceptions). If you want to push that idea, you could try the ideas board on the Python discourse forum: https://discuss.python.org/c/ideas/6 . You might also want to look at previous discussions about adding a Queue.close() method: issue29701 and issue40888. -- ___ Python tracker <https://bugs.python.org/issue43805> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue43805] multiprocessing.Queue hangs when process on other side dies
Thomas Kluyver added the comment: I think this is expected. The queue itself doesn't know that one particular process is meant to put data into it. It just knows that there's no data to get, so .get() blocks as the docs say it should. This doesn't apply to issue22393, because the pool knows about its worker processes, so if one dies before completing a task, it can know something is wrong. You could add a method to 'half close' a queue, so it can only be used for receiving, but not sending. If you called this in the parent process after starting the child, then if the child died, the queue would know that nothing could ever put data into it, and .get() could error. The channels API in Trio allows this, and it's the same idea I've just described at the OS level in issue43806. -- nosy: +takluyver ___ Python tracker <https://bugs.python.org/issue43805> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
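In the absence of a half-close API, a common workaround is a sentinel: the producer always enqueues a "closed" marker, even on failure, so the consumer's .get() can terminate instead of blocking forever. A minimal sketch with the thread-based queue (the same pattern applies to multiprocessing.Queue):

```python
import queue
import threading

CLOSED = object()  # sentinel: "nothing will ever put data here again"

def producer(q):
    try:
        q.put(1)
        q.put(2)
    finally:
        q.put(CLOSED)  # signal closure even if the producer fails mid-way

def drain(q):
    # Consume until the sentinel arrives; .get() never blocks forever
    # as long as the sentinel is guaranteed to be enqueued.
    items = []
    while (item := q.get()) is not CLOSED:
        items.append(item)
    return items

q = queue.Queue()
t = threading.Thread(target=producer, args=(q,))
t.start()
result = drain(q)
t.join()
```

Caveat: for multiprocessing the sentinel only helps when the producer dies cleanly enough to run the `finally` block; a SIGKILL'd process still leaves the consumer blocked, which is exactly the gap a real half-close would fill.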
[issue43806] asyncio.StreamReader hangs when reading from pipe and other process exits unexpectedly
Thomas Kluyver added the comment: In the example script, I believe you need to close the write end of the pipe in the parent after forking:

```
cpid = os.fork()
if cpid == 0:
    ...  # Write to pipe (child)
else:
    # Parent
    os.close(ctx)
    ...  # Read from pipe
```

This is the same with synchronous code: os.read(prx, 1) also hangs. You only get EOF when nothing has the write end open any more. All the asyncio machinery doesn't really make any difference to this. For a similar reason, the code writing (the child, in this case) should close the read end of the pipe after forking. If the parent goes away but the child still has the read end open, then trying to write to the pipe can hang (if the buffer is already full). If the child has closed the read end, trying to write will give you a BrokenPipeError. -- nosy: +takluyver ___ Python tracker <https://bugs.python.org/issue43806> ___
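The EOF rule can be shown without fork: a reader on a pipe only sees EOF (a zero-length read) once every write-end file descriptor is closed; until then, reads block.

```python
import os

r, w = os.pipe()
os.write(w, b"hi")
data = os.read(r, 2)   # data already in the buffer is returned
os.close(w)            # close the only write end
eof = os.read(r, 1)    # now read returns b"" (EOF) instead of blocking
os.close(r)
```

After a fork, both processes hold copies of both descriptors, which is why each side must close the end it doesn't use before EOF or BrokenPipeError can be delivered.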
[issue45775] Implementation of colorsys.rgb_to_yuv and vice versa
Thomas Stolarski added the comment: I figured this would probably be the case, but since YIQ also requires a profile (and the FCC one it implements is pretty weird for digital work), I thought I'd give it a shot anyway. Would it be worth moving the test/formatting changes over to a different ticket or should we just leave them? -- ___ Python tracker <https://bugs.python.org/issue45775> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue45822] Py_CompileString does not respect the coding cookie with the new parser if flags are empty
Thomas Wouters added the comment: Py_CompileString() in Python 3.9 and later, using the PEG parser, appears to no longer honour source encoding cookies. A reduced test case:

```
#include "Python.h"
#include <stdio.h>

const char *src = (
    "# -*- coding: Latin-1 -*-\n"
    "'''\xc3'''\n");

int main(int argc, char **argv)
{
    Py_Initialize();
    PyObject *res = Py_CompileString(src, "some_path", Py_file_input);
    if (res) {
        fprintf(stderr, "Compile succeeded.\n");
        return 0;
    }
    else {
        fprintf(stderr, "Compile failed.\n");
        PyErr_Print();
        return 1;
    }
}
```

Compiling and running the resulting binary with Python 3.8 (or earlier):

```
% ./encoding_bug
Compile succeeded.
```

With 3.9 and PYTHONOLDPARSER=1:

```
% PYTHONOLDPARSER=1 ./encoding_bug
Compile succeeded.
```

With 3.9 (without the env var) or 3.10:

```
% ./encoding_bug
Compile failed.
  File "some_path", line 2
    '''�'''
       ^
SyntaxError: (unicode error) 'utf-8' codec can't decode byte 0xc3 in position 0: unexpected end of data
```

Writing the same bytes to a file and making python3.9 or python3.10 import them works fine, as does passing the bytes to compile():

```
Python 3.10.0+ (heads/3.10-dirty:7bac598819, Nov 16 2021, 20:35:12) [GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> b = open('encoding_bug.py', 'rb').read()
>>> b
b"# -*- coding: Latin-1 -*-\n'''\xc3'''\n"
>>> import encoding_bug
>>> encoding_bug.__doc__
'Ã'
>>> co = compile(b, 'some_path', 'exec')
>>> co
<code object <module> at 0x7f447e1b0c90, file "some_path", line 1>
>>> co.co_consts[0]
'Ã'
```

It's just Py_CompileString() that fails. I don't understand why, and I do believe it's a regression. -- nosy: +gregory.p.smith ___ Python tracker <https://bugs.python.org/issue45822> ___
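The working compile() path from the session above reduces to a few lines, confirming that the tokenizer applies the Latin-1 coding cookie when given bytes:

```python
# Same source bytes as in the C reproducer: the cookie says Latin-1,
# so the 0xc3 byte must decode to 'Ã' rather than fail as UTF-8.
src = b"# -*- coding: Latin-1 -*-\n'''\xc3'''\n"
code = compile(src, "some_path", "exec")
ns = {}
exec(code, ns)   # the module docstring ends up in ns["__doc__"]
```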
[issue45775] Implementation of colorsys.rgb_to_yuv and vice versa
Change by Thomas Stolarski : -- keywords: +patch pull_requests: +27763 stage: -> patch review pull_request: https://github.com/python/cpython/pull/29512 ___ Python tracker <https://bugs.python.org/issue45775> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue45775] Implementation of colorsys.rgb_to_yuv and vice versa
New submission from Thomas Stolarski : Since the implementation of `rgb_to_yiq` roughly 30 years ago now, the advent of HDTV has resulted in most broadcasting and video processing having moved towards Rec. 709 (or BT.709). While I know colorsys has been on the chopping block for a while, it seemed a little silly that what is now arguably the most common color space used in video editors isn't available as part of this module, despite being a matrix mapping like many of the others. YUV is a bit more contentious in its definition than YIQ is, but it is still widely used to refer to Rec. 709, and this is the ATSC standardization for HDTV. I've written a PR for both conversions and will add it to this ticket once it's up. -- components: Library (Lib) messages: 406067 nosy: thomas.stolarski priority: normal severity: normal status: open title: Implementation of colorsys.rgb_to_yuv and vice versa type: enhancement versions: Python 3.11 ___ Python tracker <https://bugs.python.org/issue45775> ___
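The PR itself isn't quoted here, but the conversion being proposed is a standard matrix mapping. A sketch using the Rec. 709 coefficients (function names and normalisation are assumptions; the actual PR may differ):

```python
def rgb_to_yuv(r, g, b):
    # Rec. 709 luma coefficients: Kr=0.2126, Kg=0.7152, Kb=0.0722 (assumed)
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    u = (b - y) / 1.8556   # scale factors put U, V in [-0.5, 0.5]
    v = (r - y) / 1.5748
    return y, u, v

def yuv_to_rgb(y, u, v):
    # Exact algebraic inverse of rgb_to_yuv above
    b = y + 1.8556 * u
    r = y + 1.5748 * v
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return r, g, b
```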
[issue45748] "import ctypes" segfaults on Python 3.6 and Ubuntu 21.10
New submission from Thomas Trummer : 3.7 and later are all working fine.

First good commit: 55fe1ae9708d81b902b6fe8f6590e2a24b1bd4b0
First bad commit: fdbd01151dbd5feea3e4c0316d102db3d2a2a412

git checkout v3.6.15

```
#0  0x76cc52a0 in PyCFuncPtr_new (type=0x559157f8, args=0x76ce6dd8, kwds=0x0) at /home/tom/pydev/cpython/Modules/_ctypes/_ctypes.c:3557
#1  0x556400f9 in type_call (type=0x559157f8, args=0x76ce6dd8, kwds=0x0) at Objects/typeobject.c:895
#2  0x555db4ca in _PyObject_FastCallDict (func=0x559157f8, args=0x76dc1f48, nargs=1, kwargs=0x0) at Objects/abstract.c:2331
#3  0x556b8e7c in call_function (pp_stack=pp_stack@entry=0x7fffcb58, oparg=, kwnames=kwnames@entry=0x0) at Python/ceval.c:4875
#4  0x556bc9d3 in _PyEval_EvalFrameDefault (f=, throwflag=) at Python/ceval.c:3335
#5  0x556b774c in PyEval_EvalFrameEx (throwflag=0, f=0x76dc1dc8) at Python/ceval.c:754
#6  _PyFunction_FastCall (co=, args=, nargs=nargs@entry=0, globals=) at Python/ceval.c:4933
#7  0x556b9269 in fast_function (kwnames=0x0, nargs=0, stack=, func=0x76d78378) at Python/ceval.c:4968
#8  call_function (pp_stack=pp_stack@entry=0x7fffccf8, oparg=, kwnames=kwnames@entry=0x0) at Python/ceval.c:4872
#9  0x556bc9d3 in _PyEval_EvalFrameDefault (f=, throwflag=) at Python/ceval.c:3335
#10 0x556b83c7 in PyEval_EvalFrameEx (throwflag=0, f=0x558d2248) at Python/ceval.c:754
#11 _PyEval_EvalCodeWithName (_co=_co@entry=0x76cdf660, globals=globals@entry=0x76dfa438, locals=locals@entry=0x76dfa438, args=args@entry=0x0, argcount=argcount@entry=0, kwnames=kwnames@entry=0x0, kwargs=0x0, kwcount=0, defs=0x0, defcount=0, kwdefs=0x0, closure=0x0, name=0x0, qualname=0x0, kwstep=2) at Python/ceval.c:4166
#12 0x556b9a25 in PyEval_EvalCodeEx (closure=0x0, kwdefs=0x0, defcount=0, defs=0x0, kwcount=0, kws=0x0, argcount=0, args=0x0, locals=0x76dfa438, globals=0x76dfa438, _co=0x76cdf660) at Python/ceval.c:4187
#13 PyEval_EvalCode (co=co@entry=0x76cdf660, globals=globals@entry=0x76dfa438, locals=locals@entry=0x76dfa438) at Python/ceval.c:731
#14 0x556b611d in builtin_exec_impl (module=, locals=0x76dfa438, globals=0x76dfa438, source=0x76cdf660) at Python/bltinmodule.c:983
#15 builtin_exec (module=, args=) at Python/clinic/bltinmodule.c.h:283
#16 0x5562b651 in PyCFunction_Call (func=func@entry=0x76eb7990, args=args@entry=0x76cdcd08, kwds=kwds@entry=0x76d6b288) at Objects/methodobject.c:126
#17 0x556c11af in do_call_core (kwdict=0x76d6b288, callargs=0x76cdcd08, func=0x76eb7990) at Python/ceval.c:5116
#18 _PyEval_EvalFrameDefault (f=, throwflag=) at Python/ceval.c:3404
#19 0x556b8d17 in PyEval_EvalFrameEx (throwflag=0, f=0x76dc2930) at Python/ceval.c:754
#20 _PyEval_EvalCodeWithName (_co=0x76eabdb0, globals=, locals=, args=, argcount=3, kwnames=0x0, kwargs=0x76d913c8, kwcount=0, kwstep=1, defs=0x0, defcount=0, kwdefs=0x0, closure=0x0, name=0x76e53ad0, qualname=0x76e53ad0) at Python/ceval.c:4166
```

-- components: ctypes messages: 405950 nosy: Thomas Trummer priority: normal severity: normal status: open title: "import ctypes" segfaults on Python 3.6 and Ubuntu 21.10 versions: Python 3.6 ___ Python tracker <https://bugs.python.org/issue45748> ___
[issue39247] dataclass defaults and property don't work together
Thomas added the comment:

> An example of multiple descriptors would be to have:
>
>     @cached_property
>     @property
>     def expensive_calc(self):
>         # Do something expensive

That's decorator chaining. The example you gave is not working code (try to return something from expensive_calc and print(obj.expensive_calc()); you'll get a TypeError). Correct me if I'm wrong, but I don't think you can chain descriptors the way you want unless the descriptors themselves have knowledge that they're acting on descriptors. E.g., given:

```
class Foo:
    @descriptorA
    @descriptorB
    def bar(self):
        return 5
```

You would need descriptorA to be implemented such that its __get__ method returns .__get__() of whatever it was wrapping (in this case descriptorB). Either way, at the class level (I mean the Foo class, the one we'd like to make a dataclass), all of this doesn't matter because it only sees the outer descriptor (descriptorA). Assuming the proposed solution is accepted, you would be able to do this:

```
@dataclass
class Foo:
    @descriptorA
    @descriptorB
    def bar(self):
        return some_value

    @bar.setter
    def bar(self, value):
        ...  # store value

    bar: int = field(descriptor=bar)
```

and, assuming descriptorA is compatible with descriptorB on both .__get__ and .__set__, as stated above, it would work the way you intend it to. -- ___ Python tracker <https://bugs.python.org/issue39247> ___
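To make the "descriptorA must delegate" point concrete, here is a toy outer descriptor whose __get__ explicitly calls the wrapped descriptor's __get__ (hypothetical names, not from the issue):

```python
class Doubler:
    """Outer descriptor that knows it wraps another descriptor."""
    def __init__(self, inner):
        self.inner = inner

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        # Delegate to the wrapped descriptor, then transform its value.
        return 2 * self.inner.__get__(obj, objtype)

class Foo:
    @property
    def bar(self):
        return 5
    bar = Doubler(bar)  # chain: Doubler wraps the property
```

This only works because Doubler was written with delegation in mind; stacking two ordinary descriptors (e.g. cached_property over property) has no such logic, which is why the quoted example raises TypeError.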
[issue39247] dataclass defaults and property don't work together
Thomas added the comment: Just to rephrase, because the explanation in my last message can be ambiguous: At dataclass construction time (when the @dataclass decorator inspects and enhances the class):

```
for field in fields:
    if descriptor := getattr(field, 'descriptor', None):
        setattr(cls, field.name, descriptor)
    elif default := getattr(field, 'default', None):
        setattr(cls, field.name, default)
```

Then at __init__ time:

```
for field in fields:
    if (
        (descriptor := getattr(field, 'descriptor', None))
        and (default := getattr(field, 'default', None))
    ):
        setattr(self, field.name, default)
    elif default_factory := getattr(field, 'default_factory', None):
        setattr(self, field.name, default_factory())
```

Now, this is just pseudo-code to illustrate the point; I know the dataclass implementation generates the __init__ on the fly by building its code as a string then exec'ing it. This logic would have to be applied to that generative code. I keep thinking I'm not seeing some obvious problem here, so if something jumps out let me know. -- ___ Python tracker <https://bugs.python.org/issue39247> ___
[issue39247] dataclass defaults and property don't work together
Thomas added the comment: Agreed on everything but that last part, which I'm not sure I understand:

> If we allow descriptor to accept an iterable as well you could have multiple descriptors just like normal.

Could you give an example of what you mean with a regular class?

I've had a bit more time to think about this and I think one possible solution would be to mix the idea of a "descriptor" argument to the field constructor and the idea of not applying regular defaults at __init__ time. Basically, at dataclass construction time (when the @dataclass decorator inspects and enhances the class), apply regular defaults at the class level, unless the field has a descriptor argument, in which case apply that instead at the class level. At __init__ time, apply default_factories only, unless the field has a descriptor argument, in which case do apply the regular default value. If the implementation changed in these two ways, we'd have code like this work exactly as expected:

```
from dataclasses import dataclass, field

@dataclass
class Foo:
    _bar: int = field(init=False)

    @property
    def bar(self):
        return self._bar

    @bar.setter
    def bar(self, value):
        self._bar = value

    # field is required,
    # uses descriptor bar for get/set
    bar: int = field(descriptor=bar)

    # field is optional,
    # default of 5 is set at __init__ time
    # using the descriptor bar for get/set
    bar: int = field(descriptor=bar, default=5)

    # field is optional,
    # default value is the descriptor instance,
    # it is set using the regular attribute setter
    bar: int = field(default=bar)
```

Not only does this allow descriptors to be used with dataclasses, it also fixes the use case of trying to have a descriptor instance as a default value, because the descriptor wouldn't be used to get/set itself. Although I should say, at this point, I'm clearly seeing this with blinders on to solve this particular problem... It's probable this solution breaks something somewhere that I'm not seeing. Fresh eyes appreciated :) -- ___ Python tracker <https://bugs.python.org/issue39247> ___
[issue39247] dataclass defaults and property don't work together
Thomas added the comment: Scratch that last one, it leads to problems when mixing descriptors with actual default values:

```
@dataclass
class Foo:
    bar = field(default=some_descriptor)
```

Technically this is a descriptor field without a default value, or at the very least the dataclass constructor can't know, because it doesn't know what field, if any, this delegates to. This means it will show up as optional in the __init__ signature, but it might not be.

```
@dataclass
class Foo:
    bar = field(default=some_descriptor, default_factory=lambda: 4)
```

This could be a solve for the above problem. The dataclass constructor would install the descriptor at the class level and assign 4 to the instance attribute in the __init__. It still doesn't tell the dataclass constructor whether a field is optional when its default value is a descriptor and no default_factory is passed. And it feels a lot more like a hack than anything else. So ignore my previous message. I'm still 100% behind the "descriptor" arg in the field constructor, though :)

PS: Sorry for the noise, I just stumbled onto this problem for the nth time and I can't get my brain to shut off. -- ___ Python tracker <https://bugs.python.org/issue39247> ___
[issue39247] dataclass defaults and property don't work together
Thomas added the comment: Thinking a little more about this, maybe a different solution would be to have default values installed at the class level by default, without being overwritten in the __init__ as is the case today. default_factory should keep being set in the __init__ as is the case today. With this approach:

```
@dataclass
class Foo:
    # assigns 4 to Foo.bar but not to foo.bar (bonus: __init__ will be faster)
    bar = field(default=4)

    # assigns some_descriptor to Foo.bar, so Foo().bar does a __get__ on the descriptor
    bar = field(default=some_descriptor)

    # assigns a new SomeDescriptor instance to every instance of Foo
    bar = field(default_factory=SomeDescriptor)

    # assigns the same descriptor object to every instance of Foo
    bar = field(default_factory=lambda: some_descriptor)
```

I don't think this change would break a lot of existing code, as the attribute overwrite that happens at the instance level in the __init__ is essentially an implementation detail. It also seems this would solve the current problem and allow for a cleaner way to assign a descriptor object as a default value. Am I not seeing some obvious problem here? -- ___ Python tracker <https://bugs.python.org/issue39247> ___
[issue39247] dataclass defaults and property don't work together
Thomas added the comment: Hello everyone, A quick look on SO and Google, plus this Python issue, plus this blog post and its comments: https://florimond.dev/en/posts/2018/10/reconciling-dataclasses-and-properties-in-python/ show that this is still a problem where dataclass users keep hitting a wall. The gist here seems to be that there are two ways to solve this:

- Have descriptors be treated differently when found as default values in the __init__. I like this solution. The argument against is that users might want to have the descriptor object itself as an instance attribute, and this solution would prevent them from doing it. I'd argue that, if the user's intention was to have the descriptor object as a default value, the current dataclass implementation allows it in a weird way: as shown above, it actually sets and gets the descriptor using the descriptor as its own getter/setter (although it makes sense when one thinks of how dataclasses are implemented, specifically "when" the dataclass modifies the class, it is nonetheless jarring at first glance).

- Add an "alias/name/public_name/..." keyword to the field constructor so that we could write `_bar: int = field(default=4, alias="bar")`. The idea here keeps the usage of this alias to the __init__ method, but I'd go further. The alias should be used everywhere we need to show the public API of the dataclass (repr, str, to_dict, ...). Basically, if a field has an alias, we only ever show / give access to the alias and essentially treat the original attribute name as a private name (i.e.: if the dataclass maintainer changes the attribute name, none of the user code should break).

I like both solutions for the given problem, but I still have a preference for the first, as it covers more cases that are not shown by the example code: what if the descriptor doesn't delegate to a private field on the class? It is a bit less common, but one could want to have a field in the init that delegates to a resource that is not a field on the dataclass. The first solution allows that, the second doesn't. So I'd like to propose a variation of the first solution that, hopefully, also solves the counter-argument to that solution:

```
@dataclass
class FileObject:
    _uploaded_by: str = field(init=False)

    @property
    def uploaded_by(self):
        return self._uploaded_by

    @uploaded_by.setter
    def uploaded_by(self, uploaded_by):
        print('Setter Called with Value ', uploaded_by)
        self._uploaded_by = uploaded_by

    uploaded_by: str = field(default=None, descriptor=uploaded_by)
```

Basically, add an argument to the field constructor that allows developers to tell the dataclass constructor that this field requires special handling: in the __init__, it should use the default value as it would for normal fields, but at the class level it should install the descriptor instead of the default value. What do you think? -- nosy: +Thomas701 ___ Python tracker <https://bugs.python.org/issue39247> ___
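For reference, the wall being described can be reproduced in a few lines with today's dataclasses: when a property is a field's default value, the generated __init__ passes the property object itself through the setter. The names below are illustrative.

```python
from dataclasses import dataclass

@dataclass
class FileObject:
    def _get(self):
        return self._uploaded_by

    def _set(self, value):
        self._uploaded_by = value

    # The property object becomes the field's default value.
    uploaded_by: str = property(_get, _set)

f1 = FileObject("alice")   # setter stores "alice": works as hoped
f2 = FileObject()          # setter receives the property object itself
```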
[issue25625] "chdir" Contex manager for pathlib
Change by Thomas Grainger : -- nosy: +graingert nosy_count: 10.0 -> 11.0 pull_requests: +27360 pull_request: https://github.com/python/cpython/pull/29091 ___ Python tracker <https://bugs.python.org/issue25625> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue29941] Confusion between asserts and Py_DEBUG
Change by Thomas Wouters : -- stage: -> resolved status: open -> closed ___ Python tracker <https://bugs.python.org/issue29941> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue45434] [C API] Clean-up the Python.h header file
Thomas Wouters added the comment: Victor, what's the benefit of doing this work? Are there real problems this fixes? I'm worried about the churn in third-party extensions, examples, tutorials, etc, especially for audiences that upon seeing a compiler error won't immediately realise they need to include stdlib.h themselves. (Also, since Python.h sets things like _POSIX_C_SOURCE and _XOPEN_SOURCE, including them in the wrong order can produce even more confusing errors, or errors that only appear on some platforms.) -- nosy: +twouters ___ Python tracker <https://bugs.python.org/issue45434> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue45435] delete misleading faq entry about atomic operations
Thomas Grainger added the comment: It's part of this discussion in https://mail.python.org/archives/list/python-...@python.org/thread/ABR2L6BENNA6UPSPKV474HCS4LWT26GY/#IAOCDDCJ653NBED3G2J2YBWD7HHPFHT6 and others in #python-dev, specifically https://github.com/python/cpython/blob/2f92e2a590f0e5d2d3093549f5af9a4a1889eb5a/Objects/dictobject.c#L2582-L2586 regarding whether any of the items are builtins or not: the FAQ entry lists "(L, L1, L2 are lists, D, D1, D2 are dicts, x, y are objects, i, j are ints)", so I read that to mean x and y are user-defined objects with user-defined comparison and equality methods. -- ___ Python tracker <https://bugs.python.org/issue45435> ___
[issue45435] delete misleading faq entry about atomic operations
Change by Thomas Grainger : -- keywords: +patch pull_requests: +27181 stage: -> patch review pull_request: https://github.com/python/cpython/pull/28886 ___ Python tracker <https://bugs.python.org/issue45435> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue45435] delete misleading faq entry about atomic operations
Change by Thomas Grainger : -- assignee: docs@python components: Documentation nosy: docs@python, graingert priority: normal severity: normal status: open title: delete misleading faq entry about atomic operations ___ Python tracker <https://bugs.python.org/issue45435> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue45423] SSL SNI varies when host contains port number
New submission from Thomas Hobson : Not entirely sure if this is intended. When using urllib.request.urlopen with a hostname and a varying port, the SNI information sent differs. To my understanding, the SNI info shouldn't include the port and should only include the actual host. Attached is an example script demonstrating the issue, where the only difference between the URLs is adding a port number. The server it points to is configured to only match "ci.hexf.me". -- assignee: christian.heimes components: SSL files: test.py messages: 403586 nosy: christian.heimes, hexf priority: normal severity: normal status: open title: SSL SNI varies when host contains port number type: behavior versions: Python 3.10, Python 3.9 Added file: https://bugs.python.org/file50338/test.py ___ Python tracker <https://bugs.python.org/issue45423> ___
[issue45390] asyncio.Task doesn't propagate CancelledError() exception correctly.
Thomas Grainger added the comment: afaik this is intentional https://bugs.python.org/issue31033 -- nosy: +graingert ___ Python tracker <https://bugs.python.org/issue45390> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue45279] avoid redundant _commit_removals pending_removals guard
Change by Thomas Grainger : -- keywords: +patch pull_requests: +26930 stage: -> patch review pull_request: https://github.com/python/cpython/pull/28546 ___ Python tracker <https://bugs.python.org/issue45279> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue45279] avoid redundant _commit_removals pending_removals guard
New submission from Thomas Grainger : refactor to avoid redundant _commit_removals pending_removals guard -- components: Library (Lib) messages: 402554 nosy: graingert priority: normal severity: normal status: open title: avoid redundant _commit_removals pending_removals guard versions: Python 3.10, Python 3.11, Python 3.9 ___ Python tracker <https://bugs.python.org/issue45279> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue45278] RuntimeError on race on weakset concurrent iteration
Change by Thomas Grainger : -- nosy: +graingert ___ Python tracker <https://bugs.python.org/issue45278> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue45276] avoid try 1000 in asyncio all_tasks by making weak collection .copy() atomic
Change by Thomas Grainger : -- keywords: +patch pull_requests: +26925 stage: -> patch review pull_request: https://github.com/python/cpython/pull/28541 ___ Python tracker <https://bugs.python.org/issue45276> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue45276] avoid try 1000 in asyncio all_tasks by making weak collection .copy() atomic
New submission from Thomas Grainger : The weak collections should have the same thread-safety guarantees as their strong-reference counterparts: e.g. dict.copy and set.copy are atomic, so the weak versions should be atomic as well. -- components: Interpreter Core, asyncio messages: 402544 nosy: asvetlov, graingert, yselivanov priority: normal severity: normal status: open title: avoid try 1000 in asyncio all_tasks by making weak collection .copy() atomic versions: Python 3.10, Python 3.11, Python 3.9 ___ Python tracker <https://bugs.python.org/issue45276> ___
[issue45259] No _heappush_max()
Change by Thomas : -- versions: +Python 3.10, Python 3.11, Python 3.6, Python 3.7, Python 3.8 ___ Python tracker <https://bugs.python.org/issue45259> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue45259] No _heappush_max()
Change by Thomas : -- nosy: +rhettinger, stutzbach -ThomasLee94 ___ Python tracker <https://bugs.python.org/issue45259> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue45259] No _heappush_max()
New submission from Thomas : There is no heappush function for a max-heap, even though the other supporting helper functions (such as _siftdown_max()) are already implemented. -- components: Library (Lib) messages: 402351 nosy: ThomasLee94 priority: normal severity: normal status: open title: No _heappush_max() versions: Python 3.9 ___ Python tracker <https://bugs.python.org/issue45259> ___
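Until a public max-heap API exists, the usual workaround avoids heapq's private _siftdown_max()/_heappop_max() helpers entirely: negate the keys and use the public min-heap functions.

```python
import heapq

# Max-heap via negation on the public heapq API.
heap = []
for value in [3, 1, 4, 1, 5]:
    heapq.heappush(heap, -value)

largest = -heap[0]             # peek the maximum
popped = -heapq.heappop(heap)  # pop the maximum
rest = [-heapq.heappop(heap) for _ in range(len(heap))]
```

This keeps heap invariants intact and works on every Python version, at the cost of negating on every push and pop (and of only working directly for keys that support unary minus).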