Hi! I'm going to be involved in an event which will probably bring us about 2 million pageviews in one day... The server that is going to be hit will have a gigabit connection, and there are two load-balanced machines. But they are getting fairly old already.
Since it is just this one day, getting more hardware is out of the question; we have to make do with what we have. However, the content served will not change fast, so caching is going to be important. Also, the front page and the images on that page are going to be viewed far more than the pages underneath.

So, I had this idea: how about caching the most requested files in RAM, rather than on disk...? That would eliminate a quite significant bottleneck, wouldn't it? Has anybody done anything in that direction...?

Unfortunately, I don't know yet if I will get the job of setting this up, so I'm not sure AxKit will be used, but it is nevertheless something to think about just for the kicks... :-)

Cheers,

Kjetil
-- 
Kjetil Kjernsmo
Astrophysicist/IT Consultant/Skeptic/Ski-orienteer/Orienteer/Mountaineer
[EMAIL PROTECTED] [EMAIL PROTECTED] [EMAIL PROTECTED]
Homepage: http://www.kjetil.kjernsmo.net/
OpenPGP KeyID: 6A6A0BBC
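P.S. Just for the kicks, the idea can be sketched in a few lines of Python. This is only an illustration, not anything AxKit-specific: `RamFileCache` is a made-up name, and in practice the OS buffer cache or a front-end proxy would likely do much of this already. The point is simply that hot files get read from disk once and then served from memory:

```python
class RamFileCache:
    """Illustrative sketch: keep the most requested static files in RAM.

    The first request for a path reads the file from disk; every
    subsequent request is served straight from memory.
    """

    def __init__(self):
        self._store = {}   # path -> cached file bytes
        self.hits = 0      # requests served from RAM
        self.misses = 0    # requests that had to touch the disk

    def get(self, path):
        data = self._store.get(path)
        if data is None:
            # Cache miss: read from disk once, then keep it in RAM.
            self.misses += 1
            with open(path, "rb") as f:
                data = f.read()
            self._store[path] = data
        else:
            self.hits += 1
        return data
```

For a front page plus a handful of images, the whole working set would easily fit in memory, so after the first request the disk drops out of the picture entirely.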
