Zak Greant (Foo Associates) wrote:
Greetings All,
I've been editing http://www.mediawiki.org/wiki/Unit_Testing (and am
happy for feedback and suggestions.)
Hello Zak,
Looks good overall, but there seems to be a bug with the
SeleniumFramework line :)
While editing, I took a look at the
Billinghurst wrote:
I am guessing that the search engine does not transclude pages before it
undertakes its indexing function. Is someone able to confirm that for me?
Is there a fix that anyone can suggest, or does anyone know where such an
issue can be raised beyond Bugzilla? Would a fix
This would fix it for the default MySQL search, although I'm not sure at
what cost in overhead. The Lucene backend uses OAIRepository for
incremental updates and builds whole indexes from the XML dumps, so the
expanded articles would need to be present in *both* of those.
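For the MySQL case I imagine the change would be something like the
sketch below (untested; $rawText and $title stand in for whatever the
indexing code path already has on hand):

    // Somewhere in the path that feeds the searchindex table (around
    // SearchUpdate): expand templates first, so that transcluded text
    // gets indexed along with the page's own wikitext.
    global $wgParser;
    $expanded = $wgParser->preprocess( $rawText, $title, new ParserOptions() );
    // ...then hand $expanded, rather than $rawText, to the existing
    // normalisation/tokenisation that builds the index row.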
Cheers, r.
2010/12/19 Platonides platoni...@gmail.com:
That operation should be fast, as it should be hitting the cache from
having just rendered it.
Calling $wgParser->preprocess() won't hit the parser cache, I don't
think. I also don't think preprocessed wikitext is cached at all, just
the HTML output.
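For reference, the call in question looks roughly like this ($wikitext
and $title are placeholders):

    global $wgParser;
    // preprocess() expands templates and parser functions and returns
    // wikitext, not HTML. The parser cache only stores the final HTML
    // that Parser::parse() produces, so this expansion is redone on
    // every call.
    $expanded = $wgParser->preprocess( $wikitext, $title, new ParserOptions() );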
To continue the discussion on how to improve the performance: would it be
possible to distribute the dumps as a 7z / gz / other format archive containing
multiple smaller XML files? It's quite tricky to split a very large XML file
into smaller valid XML files, and if the dumping process is already
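(For anyone tempted to split client-side anyway, here is a rough,
untested sketch using PHP's XMLReader; the chunk size and file names are
invented, and it punts on copying the <siteinfo> header into each chunk,
which is exactly the fiddly part:)

    // Stream <page> elements out of a huge dump and write them to
    // fixed-size chunk files.
    $reader = new XMLReader();
    $reader->open( 'pages-articles.xml' );

    // Skip forward to the first <page> element.
    while ( $reader->read() && $reader->localName !== 'page' );

    $buf = '';
    $count = 0;
    $chunk = 0;
    $pagesPerChunk = 10000; // arbitrary

    while ( $reader->localName === 'page' ) {
        $buf .= $reader->readOuterXml() . "\n";
        if ( ++$count % $pagesPerChunk === 0 ) {
            // A real splitter would also copy <siteinfo> into every
            // chunk so each file is a self-contained, valid dump.
            file_put_contents( sprintf( 'chunk-%04d.xml', $chunk++ ),
                "<mediawiki>\n" . $buf . "</mediawiki>\n" );
            $buf = '';
        }
        $reader->next( 'page' ); // jump to the next sibling <page>
    }
    if ( $buf !== '' ) {
        file_put_contents( sprintf( 'chunk-%04d.xml', $chunk ),
            "<mediawiki>\n" . $buf . "</mediawiki>\n" );
    }
    $reader->close();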
Diederik van Liere wrote:
To continue the discussion on how to improve the performance: would it be
possible to distribute the dumps as a 7z / gz / other format archive
containing multiple smaller XML files? It's quite tricky to split a very
large XML file into smaller valid XML files, and if
Which dump file is offered in smaller sub files?
Diederik van Liere wrote:
Which dump file is offered in smaller sub files?
http://download.wikimedia.org/enwiki/20100904/
Also see http://wikitech.wikimedia.org/view/Dumps/Parallelization
Okay, no clue how I could have missed that. My google skills failed me :)
thanks for the pointer!
best
Diederik
On Mon, 20 Dec 2010 at 00:21 +0100, Platonides wrote:
Diederik van Liere wrote:
Which dump file is offered in smaller sub files?
http://download.wikimedia.org/enwiki/20100904/
Also see http://wikitech.wikimedia.org/view/Dumps/Parallelization
Expect to see more of this.