Thanks for the replies.  I've got something basically working with 
concurrent.futures, so I guess I'll go with that.
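
For the record, here's roughly the shape of what I ended up with -- just a 
minimal sketch, where the titles list, the process() function and the 
max_workers value are placeholders of mine:

    import concurrent.futures

    import pywikibot
    from pywikibot import pagegenerators

    site = pywikibot.Site('en', 'wikipedia')
    titles = ['Foo', 'Bar', 'Baz']          # placeholder page titles

    def process(page):
        # content is already fetched by the preloading generator,
        # so this makes no extra API call
        return page.title(), len(page.text)

    # batch-retrieve page content from the API before handing pages out
    pages = pagegenerators.PreloadingGenerator(
        pywikibot.Page(site, title) for title in titles)

    with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
        for title, size in pool.map(process, pages):
            print(title, size)

Keeping the workers to read-only use of pages that are already loaded also 
sidesteps the thread-safety caveat mentioned below.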

I see there's a max_queue_size setting available in user_config.py.  I assume 
that is related to one or another of these examples?
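
My guess from a quick look at config.py is that it caps the queue behind 
asynchronous saves rather than anything on the read side, i.e. the path you 
would hit with something like this (title and summary are placeholders, and 
corrections welcome):

    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')
    page = pywikibot.Page(site, 'Sandbox')      # placeholder title
    page.text += '\ntest'
    # asynchronous=True hands the write off to pywikibot's background put
    # queue instead of blocking until the edit has been saved
    page.save(summary='test edit', asynchronous=True)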

> On Mar 27, 2023, at 5:24 PM, i...@gno.de wrote:
> 
> Pywikibot uses requests for its I/O methods. I have had a look at several 
> similar libraries with asyncio support, but none of them is supported long term.
> 
> It is good advice to ensure that pages are preloaded. Anyway, Pywikibot does 
> not use asyncio (yet), but it uses threads to save pages asynchronously. The 
> common BaseBot.treat() or BaseBot.treat_page() cannot be used asynchronously 
> because they are not thread-safe. 
> 
> You can find concurrent programming examples within the framework. 
> weblinkchecker, for example, uses threads to retrieve web pages in parallel. 
> archivebot is able to process all pages from a generator in parallel using 
> concurrent.futures. Other examples with concurrent.futures can be found in 
> the login, preload_sites, fixing_redirects and watchlist scripts.
> 
> I hope that helps a bit
> 
> Best
> xqt
> 
>> On 27.03.2023, at 22:06, John <phoenixoverr...@gmail.com> wrote:
>> 
>> 
>> I’ve not checked in the v3+ versions, but there used to be a preloading 
>> page generator that batch-retrieves pages from the API. You can then pass 
>> the preloaded page objects on to the parallel processing part.
>> 
>> On Mon, Mar 27, 2023 at 3:58 PM Roy Smith <r...@panix.com> wrote:
>> I need to issue a bunch of Page.get() requests in parallel. My 
>> understanding is that pywikibot uses the requests library, which is 
>> incompatible with asyncio, so that's out. So what do people use? 
>> Threading <https://docs.python.org/3.9/library/threading.html>? Or, I see 
>> there's an asyncio-friendly requests port 
>> <https://github.com/rdbhost/yieldfromRequests>. Is there a way to make 
>> pywikibot use that?

_______________________________________________
pywikibot mailing list -- pywikibot@lists.wikimedia.org
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/pywikibot@lists.wikimedia.org/message/U46M47IROTE2DDUX2FWHB3JOW4IQWZTA/
To unsubscribe send an email to pywikibot-le...@lists.wikimedia.org
