https://bugzilla.wikimedia.org/show_bug.cgi?id=28146

--- Comment #9 from Brion Vibber <[email protected]> 2011-04-04 21:02:32 UTC ---
As a workaround, in r85377 I've changed DjVuImage::retrieveMetaData() so it
runs individual page texts through UtfNormal::cleanUp() rather than the entire
dumped document.
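A rough Python sketch of the idea behind the workaround (hypothetical names; `clean_up` stands in for MediaWiki's `UtfNormal::cleanUp()`, here approximated with NFC normalization): normalizing each page's text individually bounds peak memory by the largest single page rather than the whole multi-megabyte dump.

```python
import unicodedata

def clean_up(text: str) -> str:
    # Stand-in for UtfNormal::cleanUp(): normalize to NFC.
    return unicodedata.normalize("NFC", text)

def retrieve_meta_data(pages: list[str]) -> list[str]:
    # Per-page normalization: only one page's text is being
    # processed at a time, so peak memory stays proportional to
    # the largest page, not to the entire dumped document.
    return [clean_up(page) for page in pages]
```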

Verified: without the fix I run out of memory uploading the sample file at a
128M memory_limit; with the fix the upload succeeds.

This should still be fixed in UtfNormal itself: text with a heavy mix of ASCII
and non-ASCII uses a LOT of memory because it gets split into so many short
strings, which makes the preg_match_all() pass much worse in memory terms than
simply making a copy of the string.
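To illustrate the splitting behavior described above (a rough Python sketch, not the actual UtfNormal code): a preg_match_all()-style pass that breaks input into alternating ASCII and non-ASCII runs produces a fragment at every script boundary, and holding all of those short strings at once can cost far more than one extra copy of the input.

```python
import re

def split_runs(text: str) -> list[str]:
    # Split into alternating pure-ASCII and non-ASCII runs.
    # Heavily mixed text yields many tiny fragments, each with
    # its own per-string overhead on top of the character data.
    return re.findall(r'[\x00-\x7f]+|[^\x00-\x7f]+', text)
```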

Very long page texts may also hit limits in these situations (the dump data for
the DjVu file is about 3 megabytes of French text, not inconceivable for a
really long wiki page), and it would be nice to fix that.
