Paul Martin <[email protected]> added the comment:
I don't think changing the default executor is a good approach. What happens
if two or more thread pools are running at the same time? In that case they
will all use the same default executor anyway, so creating a new executor each
time seems like a waste.
Shutting down the default executor seems unnecessary and could impact
lower-level code that is using it. The default executor is shut down at the
end of asyncio.run() anyway.
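To make the sharing point concrete, here is a minimal sketch (my illustration,
not code from this issue) of how every run_in_executor(None, ...) call on a
loop funnels through that loop's single default executor, so two pools that
each swap in their own default via loop.set_default_executor() would simply
replace each other:

import asyncio
import concurrent.futures
import time

async def main():
    loop = asyncio.get_running_loop()

    # Swap in a "per pool" default executor (the approach argued against).
    pool_a = concurrent.futures.ThreadPoolExecutor()
    loop.set_default_executor(pool_a)
    first = loop.run_in_executor(None, time.sleep, 0.1)

    # A second pool doing the same thing silently replaces pool_a as the
    # default for every later run_in_executor(None, ...) call.
    pool_b = concurrent.futures.ThreadPoolExecutor()
    loop.set_default_executor(pool_b)
    second = loop.run_in_executor(None, time.sleep, 0.1)

    await asyncio.gather(first, second)
    pool_a.shutdown()
    pool_b.shutdown()

asyncio.run(main())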
I also think it would be good to have a synchronous entry point rather than
requiring a context manager. Having a ThreadPool per class instance would be a
common pattern, e.g.:
import asyncio
import concurrent.futures
import functools

class ThreadPool:
    def __init__(self, timeout=None):
        self.timeout = timeout
        self._loop = asyncio.get_event_loop()
        self._executor = concurrent.futures.ThreadPoolExecutor()

    async def close(self):
        # Executor.shutdown() is blocking and has no timeout parameter,
        # so run it in the default executor and bound the wait here.
        await asyncio.wait_for(
            self._loop.run_in_executor(None, self._executor.shutdown),
            timeout=self.timeout)

    async def __aenter__(self):
        return self

    async def __aexit__(self, *args):
        await self.close()

    def run(self, func, *args, **kwargs):
        call = functools.partial(func, *args, **kwargs)
        return self._loop.run_in_executor(self._executor, call)
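For illustration, a rough usage sketch of that per-instance pattern, building
on the class above (Worker and blocking_work are hypothetical names, not part
of any proposal here):

import asyncio
import time

def blocking_work(n):
    # Stand-in for a blocking call such as file or socket I/O.
    time.sleep(0.1)
    return n * 2

class Worker:
    def __init__(self):
        # Synchronous construction: no async context manager needed, the
        # pool simply lives as long as the owning object does.
        self.pool = ThreadPool(timeout=5)

    async def compute(self, n):
        return await self.pool.run(blocking_work, n)

    async def aclose(self):
        await self.pool.close()

async def main():
    worker = Worker()
    try:
        print(await asyncio.gather(worker.compute(1), worker.compute(2)))
    finally:
        await worker.aclose()

asyncio.run(main())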
I'm not sure if a new ThreadPoolExecutor really needs to be created for each
ThreadPool though.
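If the per-instance executor does turn out to be unnecessary, one possibility
(just a sketch under that assumption; none of these names exist anywhere) is
to have instances borrow a single shared executor:

import asyncio
import concurrent.futures
import functools

# One process-wide executor that every pool instance borrows instead of
# creating its own.
_shared_executor = concurrent.futures.ThreadPoolExecutor()

class SharedThreadPool:
    def __init__(self, executor=None):
        self._loop = asyncio.get_event_loop()
        # Reuse the shared executor unless the caller supplies one.
        self._executor = executor or _shared_executor

    def run(self, func, *args, **kwargs):
        call = functools.partial(func, *args, **kwargs)
        return self._loop.run_in_executor(self._executor, call)

The obvious trade-off is that a shared executor cannot be shut down per
instance, which may be exactly why a fresh ThreadPoolExecutor per ThreadPool
is attractive.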
----------
nosy: +primal
_______________________________________
Python tracker <[email protected]>
<https://bugs.python.org/issue32309>
_______________________________________