I have a script that uploads files to Google Drive. It presently performs the 
uploads serially, but I want to do the uploads in parallel--with a reasonable 
number of simultaneous uploads--and see if that improves performance. I think 
that a concurrent.futures.Executor is the best way to accomplish this task.
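
For concreteness, here is a minimal sketch of the parallel version; upload_file 
and the file list are placeholders standing in for the real Drive upload logic:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def upload_file(path):
    """Placeholder for the real Google Drive upload call."""
    return path

paths = [f"file_{i}.txt" for i in range(10)]

# A small, fixed pool keeps the number of simultaneous uploads reasonable.
with ThreadPoolExecutor(max_workers=4) as executor:
    futures = {executor.submit(upload_file, p): p for p in paths}
    for future in as_completed(futures):
        print("uploaded:", future.result())
```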

The trouble is that there's no reason for my script to keep queuing uploads 
while all of the Executor's workers are busy. If the number of files to upload 
is large enough, queuing all of them at once could cause the process to run out 
of memory, or at the very least to consume an excessive amount of it.

It might be a good idea to add a "block" option to Executor.submit 
(https://docs.python.org/3/library/concurrent.futures.html#concurrent.futures.Executor.submit)
that allows the caller to block until a worker is free to handle the request. 
I believe that this would be trivial to implement by passing the "block" option 
through to the Executor's internal Queue.put call 
(https://github.com/python/cpython/blob/242c26f53edb965e9808dd918089e664c0223407/Lib/concurrent/futures/thread.py#L176).
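
In the meantime, the blocking behavior can be approximated in user code. Because 
ThreadPoolExecutor's internal work queue is unbounded, one workaround is to gate 
submissions with a semaphore sized to the worker count. BlockingExecutor below 
is a hypothetical wrapper, not part of concurrent.futures:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

class BlockingExecutor:
    """Hypothetical wrapper: submit() blocks while all workers are busy."""

    def __init__(self, max_workers):
        self._executor = ThreadPoolExecutor(max_workers=max_workers)
        # One permit per worker; submit() waits until a worker frees up.
        self._semaphore = threading.BoundedSemaphore(max_workers)

    def submit(self, fn, *args, **kwargs):
        self._semaphore.acquire()
        try:
            future = self._executor.submit(fn, *args, **kwargs)
        except Exception:
            self._semaphore.release()
            raise
        # Release the permit when the task finishes, unblocking a waiter.
        future.add_done_callback(lambda _f: self._semaphore.release())
        return future

    def shutdown(self, wait=True):
        self._executor.shutdown(wait=wait)
```

With this wrapper the producer loop naturally throttles itself: the sixth 
submit() to a two-worker pool waits until one of the first five tasks finishes, 
so memory use stays bounded no matter how many files there are.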
_______________________________________________
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/6XKOC7AXQAPH4CQQRHNTVOZ5GZOZRJYL/
Code of Conduct: http://python.org/psf/codeofconduct/
