I run a MediaWiki 1.17.0 site for 3000 users (see the architecture below) and 
would appreciate some tips on improving performance. Specifically, what should 
we try next, given our current setup?  (I have read 
http://www.mediawiki.org/wiki/Manual:Performance_tuning.)

The platform is a single VMware virtual machine (CentOS Linux 5.6) with two 
CPUs (2.5GHz Opteron) and 3 GB RAM.  The whole MediaWiki/LAMP stack runs on 
this VM, including MySQL. This is on a fast intranet, so network speed is not 
an issue.  Other statistics include:

*       Page views per day: 22,000 (about 30 hits per minute during peak hours)
*       Edits per day: 1200
*       Users: 1800 registered editors and 1200 anonymous readers
*       Titles: 100,000
*       Revisions: 850,000
*       Page rendering time (based on the embedded HTML comment at the bottom 
of each page) is about 0.25 to 0.5 seconds today.
*       System load average usually runs between 1.00 and 4.00.  A little 
swapping occurs (around 135 MB of swap in use)
*       Free RAM (including buffers/cache): around 2.3 GB right now.
*       PHP 5.3.3, MySQL 5.0.77, Apache httpd 2.2.3

For caching, we use eAccelerator (huge improvement) and $wgMainCacheType = 
CACHE_ACCEL.

Another important detail: Unlike Wikipedia (and most other wikis), 
approximately 10,000 of our pages make live SQL queries to non-MediaWiki 
databases, pull in the results, and display them to the user. This is important 
for our business, and our users are accustomed to seeing up-to-the-second live 
data.  (So we have not investigated Squid, for example, which I think would 
cache the rendered pages and therefore lose the "up-to-the-second" live data.)
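In case it helps frame answers about caching: my assumption is that our 
extension could opt these pages out of MediaWiki's parser cache with something 
like the sketch below, so they always re-run the SQL. All the names except 
disableCache() are made up; doLiveSqlQuery and formatRowsAsWikitext stand in 
for our real code, and I have not verified this against 1.17, so corrections 
are welcome.

```php
// Hypothetical parser-function callback for our live-SQL extension.
// Everything except $parser->disableCache() is illustrative, not real code.
function efRenderLiveData( $parser, $tableName = '' ) {
	// Keep this page out of the parser cache so that every view
	// re-runs the SQL and shows up-to-the-second data.
	$parser->disableCache();

	$rows = doLiveSqlQuery( $tableName );  // stand-in for our query helper
	return formatRowsAsWikitext( $rows );  // stand-in for our formatter
}
```

If something similar could also emit Cache-Control headers that Squid honors, 
that might be the programmatic exclusion I am hoping for.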

The problems I am seeing are:

*       Sometimes an individual "Save Page" operation will sit for 20-30 
seconds before completing.
*       Occasionally some pages take a long time to render (10-15 seconds) for 
no discernible reason. (This is not due to the live SQL queries mentioned 
above.)

I'd like to eliminate these delays and decrease page rendering time to 0.1 
second or less.

I have determined that our extensions are not the bottleneck: after removing 
all of them, rendering speed stayed about the same.

Given our architecture, what's the best next step we should investigate to 
improve performance?

*       File cache or Squid?  (And is there some easy way to tell Squid to 
exclude our dynamic SQL pages? They all run a particular wiki extension, so if 
there's something programmatic we can do in the extension, that's great. I am 
not very familiar with Squid.)
*       memcached?
*       Increase number of CPUs?
*       Multiple front-end servers?
*       Change from a VM to a physical machine?
*       Move MySQL to a separate server (possibly physical)?
*       Something else?
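For reference, my understanding is that trying memcached would be a small 
LocalSettings.php change along these lines. The variables are standard 
MediaWiki globals, but the server address is just an example, and I don't know 
whether 1.17 needs anything beyond this:

```php
// Assumed LocalSettings.php change to switch the main cache to memcached.
// The address/port below is an example, not a recommendation.
$wgMainCacheType    = CACHE_MEMCACHED;
$wgMemCachedServers = array( '127.0.0.1:11211' );
```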

What additional measurements would be most helpful in making this decision?
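One measurement I could take myself is to turn on MediaWiki's debug log on a 
test copy and watch where the time goes during a slow "Save Page". I believe 
these are the relevant globals, but I'd appreciate confirmation that they 
behave this way in 1.17:

```php
// Temporary debug settings for a TEST wiki only -- these produce very
// verbose logs and should not be left on in production.
$wgDebugLogFile    = '/tmp/mw-debug.log'; // per-request debug trace
$wgDebugTimestamps = true;                // prefix entries with elapsed time
$wgDebugDumpSql    = true;                // log every SQL query issued
```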

Thanks for any advice,
DanB


_______________________________________________
MediaWiki-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
