I am trying to run mw-zip to download a large book with lots of images from my
wiki. I have already throttled back the image download code such that it only
attempts to download one image at a time by changing:
self.image_download_pool = gevent.pool.Pool(10)
to
self.image_download_pool = gevent.pool.Pool(1)
in fetch.py.
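In case it helps clarify what I mean: the effect of that change is just to serialize the image downloads instead of running ten at once. The same idea in plain-stdlib terms (an illustration only, not mwlib's actual code; the fetch function and URLs here are made up):

```python
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    # Stand-in for a real image download; just echoes the url.
    return url

urls = ["img1.png", "img2.png", "img3.png"]

# max_workers=1 serializes the jobs, like gevent.pool.Pool(1);
# a larger value allows that many downloads to run concurrently.
with ThreadPoolExecutor(max_workers=1) as pool:
    results = list(pool.map(fetch, urls))

print(results)
```

I'd like to apply the same one-at-a-time throttle to whatever pool (or pools) handle the non-image fetches.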
Now I am getting "connection timed out" errors on the other downloads. There
are often a few of these during the process and they are not
consistent. I believe that it is swamping my network connection or my wiki by
trying to do too much stuff at once, similar to the image download problem I
had earlier and fixed. I am looking for a similar fix for the other fetches so
that it only attempts to do those one at a time, too. Either that, or is there
a way to lengthen the timeout value? Maybe add a call to
socket.setdefaulttimeout(n) somewhere; but, where?
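If setdefaulttimeout is the right approach, I imagine it would look something like this, called once before any fetching starts (the 120-second value is just a guess on my part):

```python
import socket

# Raise the process-wide default timeout applied to sockets created
# after this call; None would mean "never time out".
socket.setdefaulttimeout(120.0)

print(socket.getdefaulttimeout())
```

I'm not certain whether gevent's monkey-patched sockets honor this setting, or whether there is a better place in mwlib to set a per-request timeout.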
Any help with this problem would be appreciated. I can usually, eventually,
get the book to download without errors by retrying; but it is getting harder
and harder to get it to work as the book grows in size. I think I need to
throttle the downloads back, wait longer before timing out, or both. Any help?
Thanks,
William
--
You received this message because you are subscribed to the Google Groups
"mwlib" group.