2011-05-04 08:13, Tim Starling skrev:
> On 04/05/11 15:52, Andreas Jonsson wrote:
>> The time it takes to execute the code that glues together the regexps
>> will be insignificant compared to actually executing the regexps for any
>> article larger than a few hundred bytes.  This is at least the case for
>> the articles that are the easiest for the core parser, which are articles
>> that contain no markup.  The more markup, the slower it will run.  It is
>> possible that this slowdown will be lessened if compiled with HipHop.
>> But the top speed of the parser (in bytes/seconds) will be largely
>> unaffected.
> 
> PHP execution dominates for real test cases, and HipHop provides a
> massive speedup. See the previous HipHop thread.
> 
> http://lists.wikimedia.org/pipermail/wikitech-l/2011-April/052679.html
> 
> Unfortunately, users refuse to write articles consisting only of
> hundreds of kilobytes of plain text, they keep adding references and
> links and things. So we don't really care about the parser's "top speed".

We are talking about different things.  I don't consider callbacks made
when processing "magic words" or "parser functions" to be part of the
actual parsing.  The reference case of markup-free input is interesting to
me because it marks the maximum throughput of the MediaWiki parser, and is
the baseline you would compare alternative implementations against.  But,
obviously, if the Barack Obama article takes 22 seconds to render, there
are more severe problems than parser performance at the moment.

Best Regards,

/Andreas Jonsson

_______________________________________________
Wikitech-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikitech-l