I'm assuming you're asking about the 2.0 codebase.
The question isn't as simple as "how many pages". Another factor is "how
much history does each page have?" Another would be, "How do you intend to
use it?" There are many other factors as well, from bandwidth, to I/O, to
memory.
I've run some tests against flexwiki.com, which is broad (about 5000
topics), deep (some pages have a fair amount of history), and varied
(everything from simple pages to moderately complex WikiTalk). David has the
very large wiki he alluded to and is running similar tests on it, but we
haven't heard his results yet.
Right now, it looks like 2.0 is pretty fast for most scenarios. The one
place we know it sucks a bit is when writing WikiTalk to process all the
topics in a large namespace. I'm working on making that better.
Of course, any categorical statement about performance is inherently wrong.
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Astralis Lux
Sent: Sunday, September 09, 2007 3:15 PM
To: flexwiki-users@lists.sourceforge.net
Subject: [Flexwiki-users] Has anyone run a wiki with 10,000 pages?
Has anyone tracked the performance of large wikis? What are the results?