https://bugzilla.wikimedia.org/show_bug.cgi?id=65812

--- Comment #5 from [email protected] ---
{{:Sablon:összegtáblázat}} is the transclusion in question on huwiki. Prior to
the fix, it generated a 408K-token chunk in the tokenizer, and async-ttm slowed
down markedly after roughly 128K tokens had been processed. We traced this to a
slowdown in concatenation once the accumulator size crossed a threshold.
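The slowdown pattern described above is characteristic of repeated appends to a
single flat accumulator: each append may copy everything accumulated so far,
giving quadratic total cost. A minimal sketch of the difference (hypothetical
names, not Parsoid's actual code; real JS engines may optimize string
concatenation with ropes, so this is only an illustration of the general
pattern):

```javascript
// Quadratic pattern: each `+=` may copy the whole accumulator so far,
// so total work grows as O(n^2) in the number of tokens.
function accumulateFlat(chunks) {
  let buf = '';
  for (const c of chunks) {
    buf += c;
  }
  return buf;
}

// Linear pattern: collect chunk references and concatenate once at the end.
function accumulateChunked(chunks) {
  const parts = [];
  for (const c of chunks) {
    parts.push(c);
  }
  return parts.join('');
}

// Both produce the same result; only the cost profile differs.
const tokens = Array.from({ length: 1000 }, (_, i) => `t${i};`);
console.log(accumulateFlat(tokens) === accumulateChunked(tokens)); // true
```

The fix for this class of problem is typically to defer flattening until the
end (or to flatten in bounded-size batches), so that no single concatenation
operates on an accumulator that has grown past the problematic size.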

_______________________________________________
Wikibugs-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikibugs-l