Dear Joao,

On 18.08.2021 at 14:36, Joao S. O. Bueno wrote:
As it stands, however, you simply have to change your approach:
instead of dividing your workload across the cores before starting, the
common approach is to set up worker processes, one per core (or
per processor thread), and use them as a pool of resources to which
you submit your processing work in chunks.
That way, if a worker happens to run on a faster core, it will be
done with its chunk earlier and accept more work before the
slower cores become available.
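
The approach described above can be sketched with `concurrent.futures.ProcessPoolExecutor` from the standard library; the chunk size and the `process_chunk` work function here are hypothetical placeholders for whatever CPU-bound work the real project does:

```python
import os
from concurrent.futures import ProcessPoolExecutor


def process_chunk(chunk):
    # Placeholder for the real CPU-bound work on one chunk.
    return sum(x * x for x in chunk)


def run(data, chunk_size=1000):
    # Split the workload into many small chunks rather than one
    # pre-assigned slice per core.
    chunks = [data[i:i + chunk_size]
              for i in range(0, len(data), chunk_size)]
    # One worker process per core; workers on faster cores finish
    # their chunks sooner and pull new ones, so the load balances
    # itself without any up-front partitioning.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        return list(pool.map(process_chunk, chunks))


if __name__ == "__main__":
    results = run(list(range(10_000)))
```

The key point is that chunks should be small enough that there are many more chunks than workers, so the pool can keep the faster cores busy.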

Thanks for your feedback and your idea about the alternative solution using a process pool.

I think I will use this for future projects. Thanks a lot.

Kind regards,
Christian
_______________________________________________
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/M7FG2RW2YP5ROOYFFEV5PH2DSSP6EJQM/
Code of Conduct: http://python.org/psf/codeofconduct/