fireattack <human.p...@gmail.com> added the comment:

Hi gaoxinge, thanks for the reply.

I assume what you mean is that while shutdown() will wait for already-submitted 
jobs, it won't accept any new job/future after it is called.
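If so, a quick sketch of my understanding (the exact exception wording may 
differ) is:

```
from concurrent.futures import ThreadPoolExecutor

ex = ThreadPoolExecutor(max_workers=1)
ex.shutdown(wait=True)
# Once shutdown() has been called, new submissions are rejected;
# as far as I know this raises RuntimeError
# ("cannot schedule new futures after shutdown").
ex.submit(pow, 5, 2)
```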

That makes sense, but this still doesn't work:

```
from concurrent.futures import ThreadPoolExecutor
from time import sleep

def wait_on_future():
    sleep(1)
    f2 = executor.submit(pow, 5, 2)
    print(f2.result())
    executor.shutdown(wait=True)  # commenting this line out makes no difference

executor = ThreadPoolExecutor(max_workers=100)
f = executor.submit(wait_on_future)
```

This raises no error, but f2.result() is still not printed (`result()` blocks 
until the future completes). In fact, f2 is never executed at all (you can 
change pow to another function to see that).
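
For instance, here is a drop-in variant of the example above, with pow replaced 
by a small probe function (hypothetical name) whose side effect never shows up:

```
from concurrent.futures import ThreadPoolExecutor
from time import sleep

def probe():
    print("probe ran")   # never printed, so the task itself never runs
    return 25

def wait_on_future():
    sleep(1)
    f2 = executor.submit(probe)
    print(f2.result())   # never reached either

executor = ThreadPoolExecutor(max_workers=100)
f = executor.submit(wait_on_future)
```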

Please note that this example is adapted from an example in the official 
documentation: https://docs.python.org/3/library/concurrent.futures.html

In that example, the comment says "This will never complete because there is 
only one worker thread and it is executing this function.", which is correct. 
But it is somewhat misleading, because it implies the code would work if you 
increased the number of workers. As shown above, it still does NOT work once 
you increase the number of workers but also add a small delay.
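
For comparison, the docs example is roughly the following (quoting from memory, 
so the exact wording may differ); my version above only adds the sleep(1) 
delay, raises max_workers to 100, and optionally calls shutdown():

```
def wait_on_future():
    f = executor.submit(pow, 5, 2)
    # This will never complete because there is only one worker thread and
    # it is executing this function.
    print(f.result())

executor = ThreadPoolExecutor(max_workers=1)
executor.submit(wait_on_future)
```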

The point is, I have not found any way to keep an executor created in the main 
thread alive so that it can accept a future/job submitted from a child thread 
after a delay. I tried calling shutdown(), doing nothing, using a with 
statement, and using as_completed(); they all fail in a similar fashion (one 
variant is sketched below).
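
For example, the with-statement variant I tried looks roughly like this 
(sketch; the with block exits and begins shutting the executor down while the 
child thread is still sleeping), and f2's result is still never printed:

```
from concurrent.futures import ThreadPoolExecutor
from time import sleep

def wait_on_future():
    sleep(1)
    # By now the with block below has already exited and called shutdown()
    f2 = executor.submit(pow, 5, 2)
    print(f2.result())   # never printed

with ThreadPoolExecutor(max_workers=100) as executor:
    f = executor.submit(wait_on_future)
```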
