[issue35040] [functools] provide an async-compatible version of functools.lru_cache
Liran Nuna added the comment:

> A coroutine detection is a relatively slow check.
> I don't think we need to do it in `functools.lru_cache`.

Wouldn't a coroutine check only happen at decoration time? To solve this easily and efficiently, we only really need to wrap the result in `asyncio.ensure_future` if the decorated function is a coroutine function, and that would only happen when a result comes back from the decorated function, which should have minimal impact.

Of course, I don't know much about the internals of `lru_cache`, so my assumptions could be wrong. I should familiarize myself with the implementation and figure out how doable it would be.

----------

___
Python tracker <https://bugs.python.org/issue35040>
___
___
Python-bugs-list mailing list
Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
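(The attached test-case.py is not reproduced here. A minimal sketch of the `ensure_future` approach described above — the `async_lru_cache` name and the `fetch` demo function are hypothetical, not part of the stdlib — might look like this: cache the Future rather than the coroutine, since a Future can be awaited many times but a coroutine only once.)

```python
import asyncio
import functools

def async_lru_cache(maxsize=128):
    """Sketch: lru_cache the *Future*, not the coroutine object."""
    def decorator(coro_fn):
        @functools.lru_cache(maxsize=maxsize)
        def cached(*args, **kwargs):
            # ensure_future schedules the coroutine once and returns a
            # Future; awaiting a finished Future just returns its result.
            # Note: this requires a running event loop at call time.
            return asyncio.ensure_future(coro_fn(*args, **kwargs))
        return cached
    return decorator

call_count = 0

@async_lru_cache()
async def fetch(x):
    global call_count
    call_count += 1
    await asyncio.sleep(0)
    return x * 2

async def main():
    a = await fetch(21)
    b = await fetch(21)  # cached Future, awaited again without RuntimeError
    return a, b, call_count

result = asyncio.run(main())
print(result)  # → (42, 42, 1): same value twice, underlying coroutine ran once
```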
[issue35040] functools.lru_cache does not work with coroutines
New submission from Liran Nuna:

`lru_cache` is a very useful decorator, but it does not work well with coroutines, since a coroutine can only be awaited once. Take, for example, the attached code (test-case.py): it throws a RuntimeError because you cannot reuse an already awaited coroutine.

A solution would be to call `asyncio.ensure_future` on the result of the decorated function if a coroutine is detected.

----------
components: asyncio
files: test-case.py
messages: 328228
nosy: Liran Nuna, asvetlov, yselivanov
priority: normal
severity: normal
status: open
title: functools.lru_cache does not work with coroutines
versions: Python 3.5, Python 3.6, Python 3.7, Python 3.8
Added file: https://bugs.python.org/file47887/test-case.py
[issue33918] Hooking into pause/resume of iterators/coroutines
Liran Nuna added the comment:

> That's still doable with contextvars. You just need a custom mock-like object
> (or library) that stores its settings/state in a context variable.

contextvars only works with asyncio; what about the iterator case? In addition, you can't expect authors to re-implement a library just because it may or may not be used with asyncio. In my example, re-implementing mock/patch is quite a task just to get such basic functionality.

In other words, contextvars doesn't solve this issue; it just creates new issues to solve and causes code duplication.

----------
___
Python tracker <https://bugs.python.org/issue33918>
___
[issue33918] Hooking into pause/resume of iterators/coroutines
Liran Nuna added the comment:

> You should try to use the contextvars module that was specifically created to
> handle local context state (for tasks & coroutines).

Yury, from my original report:

> I'm aware that this particular problem could be solved with the new context
> variables introduced with python3.7, however it is just a simplification of
> our actual issue.

Not everything can use context variables. Imagine the context manager is mock.patch used in testing, and you want to run two tests in "parallel", each with a different mocked method. mock.patch isn't aware of `await`, so the patching will be incorrect. Those are just some of the situations where context variables don't solve the issue I'm describing.

----------
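(For reference, a minimal sketch of the contextvars behavior under discussion — all names here are illustrative. Each Task created by `asyncio.gather` gets its own copy of the context, so per-task state does not leak between tasks; but this isolation only helps code that actually reads the ContextVar, which is the crux of the objection above.)

```python
import asyncio
import contextvars

# per-task flag; the default applies to any context that never set() it
log_enabled = contextvars.ContextVar("log_enabled", default=False)

async def worker(name, enabled):
    log_enabled.set(enabled)       # only affects this task's context copy
    await asyncio.sleep(0)         # yield so the tasks interleave
    return name, log_enabled.get() # still sees its own value after resuming

async def main():
    # gather wraps each coroutine in a Task, copying the context at creation
    return await asyncio.gather(worker("a", True), worker("b", False))

results = asyncio.run(main())
print(results)  # → [('a', True), ('b', False)]: no cross-task leakage
```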
[issue33918] Hooking into pause/resume of iterators/coroutines
Liran Nuna added the comment:

I would like to stress that this issue happens with iterators as well; it is not unique to asyncio. I would like to propose four new magic methods for context managers to solve this: __pause__, __resume__, __apause__ and __aresume__, which would be called around each pause/resume before the coroutine/iterator continues. I'm not sure, however, if this is the correct venue for such a discussion.

----------
[issue33918] Hooking into pause/resume of iterators/coroutines
New submission from Liran Nuna:

An interesting property of async programming is that execution order is nondeterministic, and async functions "pause" and "resume" execution as events come in. This can play havoc with context managers, especially ones that wrap a global state change.

I can best explain it with code - see the attached file. If you were to run it, you'd notice that "Should be logged" does not get logged - this is because the context manager takes effect immediately and covers the entire batch (created by asyncio.gather).

Is there a way to hook into the pause/resume handling of coroutines so this kind of thing could be done correctly?

I'm aware that this particular problem could be solved with the new context variables introduced with python3.7, however it is just a simplification of our actual issue. Iterators also suffer from this problem, as `yield` pauses and resumes execution.

----------
components: Interpreter Core
files: async_context_managers.py
messages: 320101
nosy: Liran Nuna
priority: normal
severity: normal
status: open
title: Hooking into pause/resume of iterators/coroutines
type: behavior
versions: Python 3.4, Python 3.5, Python 3.6, Python 3.7, Python 3.8
Added file: https://bugs.python.org/file47646/async_context_managers.py
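(The attached async_context_managers.py is not reproduced here; a minimal sketch of the failure mode described — all names illustrative — could look like the following. The context manager flips global state, the task suspends inside the `with` block, and a concurrently running task's log call is silently swallowed while the flag is still set.)

```python
import asyncio

logging_disabled = False
messages = []

class suppress_logging:
    # A context manager wrapping a *global* state change -- the problem
    # case, since the change leaks into other tasks while we're suspended.
    def __enter__(self):
        global logging_disabled
        logging_disabled = True
    def __exit__(self, *exc):
        global logging_disabled
        logging_disabled = False

def log(msg):
    if not logging_disabled:
        messages.append(msg)

async def quiet_task():
    with suppress_logging():
        await asyncio.sleep(0)  # suspends while the flag is still True

async def loud_task():
    # Runs while quiet_task is paused inside its `with` block,
    # so the global flag wrongly suppresses this message too.
    log("Should be logged")

async def main():
    await asyncio.gather(quiet_task(), loud_task())

asyncio.run(main())
print(messages)  # → []: the message was swallowed by the other task's context
```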
[issue32204] async/await performance is very low
Liran Nuna <liran...@gmail.com> added the comment:

Yury, those speed improvements are awesome and I'm really excited about them; performance is slowly starting to match asynq, which would make migrating our code to async/await more feasible!

Today, Python 3.6.4 was released, and these performance improvements did not make it into that version. I'm not familiar with Python's release process - what are the steps or timeline for these to be backported to 3.6 and shipped in the next bugfix release, to avoid a lengthy wait for Python 3.7?

----------
___
Python tracker <rep...@bugs.python.org> <https://bugs.python.org/issue32204>
___
[issue32204] async/await performance is very low
Liran Nuna <liran...@gmail.com> added the comment:

Yury, thank you very much for circling back. I wish I could be more helpful in pursuing performance improvements.

----------
[issue32204] async/await performance is very low
Liran Nuna <liran...@gmail.com> added the comment:

> Also I suggest you to try uvloop

Sadly, uvloop does not offer any noticeable performance improvement for us. Our usage is very similar to the "benchmarks" I posted - we don't do any actual async I/O, because asynq does not offer that.

> I suggest you to profile it and post real profiles here

Gladly! Would you like profiles of Python within itself (i.e. made with `cProfile`) or gmon.out profiles? The latter would be a little more difficult to produce, since we run a web server that needs to accept traffic, but I do have plenty of `cProfile` profiles I could share with you.

> I just don't think that asyncio.gather can be the main bottleneck you have,
> it must be something else

I think my PR and the examples I have provided frame this issue differently: the problem is that the "sync" performance of async/await is very poor when it is used to execute things synchronously. The benchmarks are an example of what a request's lifetime looks like - there is a lot of scatter-gather to batch database queries that, for the time being, run synchronously, which the benchmark simulates with simple math operations.

To reiterate, I do not think `asyncio.gather` is the main performance bottleneck; I just do not know how to better identify the real one with my limited knowledge of CPython.

----------
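(For what it's worth, profiles of the kind mentioned can be captured with the stdlib `cProfile`/`pstats` modules; a minimal sketch, where the `work` coroutine is a stand-in for the real request handler, might look like:)

```python
import asyncio
import cProfile
import io
import pstats

async def unit(x):
    return x + 1  # placeholder for the real per-item work

async def work():
    # scatter-gather pattern similar to the batching described above
    await asyncio.gather(*(unit(i) for i in range(1000)))

profiler = cProfile.Profile()
profiler.enable()
asyncio.run(work())
profiler.disable()

# Render the 10 hottest entries by cumulative time into a string
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(10)
report = stream.getvalue()
print(report.splitlines()[0].strip())  # summary line, e.g. "... function calls ..."
```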
[issue32204] async/await performance is very low
Liran Nuna <liran...@gmail.com> added the comment:

> which makes them faster in some specific micro-benchmarks

I'm not talking about micro-benchmarks; we are actively using asynq in production. Recent efforts to migrate to async/await have hit a major performance problem - our response times nearly doubled. I agree that the PR offers little (or no) improvement, but I implore you to explore the performance bottlenecks in async/await.

----------
[issue32204] async/await performance is very low
Liran Nuna <liran...@gmail.com> added the comment:

The PR is merely a finding I ran into while benchmarking; in my runs I saw a consistent ~3-5% performance increase. At my company we are attempting to migrate from asynq to asyncio, but performance suffers (our servers' response times nearly double). I personally ran profiles and benchmarks and noticed this tiny bottleneck, which I knew how to fix, so I submitted a PR to the Python project in the hope that it helps.

----------
[issue32204] async/await performance is very low
Liran Nuna <liran...@gmail.com> added the comment:

Added a comparable benchmark in asynq.

----------
Added file: https://bugs.python.org/file47314/batch_asynq.py
[issue32204] async/await performance is very low
New submission from Liran Nuna <liran...@gmail.com>:

The performance of async/await is very low compared to code that implements similar functionality via iterators, such as Quora's asynq library (https://github.com/quora/asynq/tree/master/asynq). Based on my benchmarks, asynq is almost twice as fast as async/await. I found some low-hanging performance fruit while benchmarking (see the attached GitHub PR).

$ time python batch_asyncio.py
real    0m5.851s
user    0m5.760s
sys     0m0.088s

$ time python batch_asynq.py
real    0m2.999s
user    0m2.900s
sys     0m0.076s

----------
components: asyncio
files: batch_asyncio.py
messages: 307492
nosy: Liran Nuna, yselivanov
priority: normal
pull_requests: 4599
severity: normal
status: open
title: async/await performance is very low
type: performance
versions: Python 3.6
Added file: https://bugs.python.org/file47313/batch_asyncio.py
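(The attached batch_asyncio.py is not shown here. Based on the workload described elsewhere in the thread — scatter-gather over trivial math operations, no real I/O — a minimal sketch of this kind of micro-benchmark, with illustrative names, might look like:)

```python
import asyncio
import time

async def unit(x):
    # trivial "work": no real I/O, so mostly coroutine overhead is measured
    return x + 1

async def batch(n):
    # scatter-gather batching pattern, as in the asynq comparison
    return await asyncio.gather(*(unit(i) for i in range(n)))

start = time.perf_counter()
result = asyncio.run(batch(10_000))
elapsed = time.perf_counter() - start
print(len(result), result[:3], f"{elapsed:.3f}s")
```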
[issue29051] Improve error reporting involving f-strings (PEP 498)
Changes by Liran Nuna <liran...@gmail.com>:

----------
nosy: +Liran Nuna

___
Python tracker <rep...@bugs.python.org> <http://bugs.python.org/issue29051>
___