If you look at the graph at the bottom of the page, they claim their
Windows 2000 box can perform over 1 million requests per second with IIS!
http://www.zdnet.co.uk/pcmag/ne/2000/11/01.html
Now, I may be naive in my math (and if so I need to know why), but on a
machine that has a 100 Mbps card, if I serve 1 million pages in one second,
that means the size of each page (the web page I'm serving) is at most
100,000,000 bits (i.e. 100 Mb) / 1 million = 100 bits = 12.5 bytes, so roughly ~13 bytes.
So if they are saying they did 1 million requests per second on a 100 Mbps
connection, then that means their web pages were about 13 bytes long....
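To sanity-check the arithmetic, here is a quick back-of-the-envelope calculation (the link speed and request rate are just the numbers from the post, not measured values):

```python
# Max average response size at a given request rate on a 100 Mbps link.
link_bps = 100_000_000        # 100 Mbps = 100 million bits per second
requests_per_sec = 1_000_000  # claimed 1 million requests/second

bits_per_request = link_bps / requests_per_sec   # 100 bits per request
bytes_per_request = bits_per_request / 8         # convert bits to bytes

print(bytes_per_request)  # 12.5 bytes, i.e. ~13 bytes per page
```

Note this ignores all TCP/IP and HTTP header overhead, which would only make the usable payload per request even smaller.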
Are they nuts or am I interpreting the graph wrong?
Thanks
Lee