On Wednesday, 15 June 2016 at 20:43:55 UTC, deadalnix wrote:
Simple exercise. You have 100 000 servers. Your application suddenly become 1% slower. How angry is your CFO when he discovers how many new machines he needs to buy ?
Probably not angry at all. It's still just a 1% budget increase, which amounts to a rounding error. Say those 100K servers cost $2K each, or $200M for the lot; an extra $2M in capital costs doesn't mean much in that context. A bigger issue might be the ongoing extra cost for energy, which applies to all the machines, not just the new ones.

Look at it another way: anyone running 100_000 machines will certainly not be running them all flat out, to the point where a 1% increase pushes out a requirement for more machines. You need extra capacity anyway to handle the usual surges in the volume of business the servers are handling.

Look at it yet another way: sure, $2M is a big number in absolute terms, for most of us. But if I were that CFO, instead of yelling about the problem, I'd go to the CTO and tell him to take 100 machines out of service and have the developers use them to profile the application and find places where much more than 1% can be saved.
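The back-of-envelope arithmetic above can be checked in a few lines. This is just a sketch of the figures as assumed in the post ($2K per server, 100,000 servers, a 1% slowdown translating into 1% more capacity needed):

```python
# Assumed figures from the discussion, not real procurement data.
servers = 100_000
cost_per_server = 2_000            # USD per machine, assumed
slowdown = 0.01                    # 1% slower => ~1% more capacity needed

fleet_cost = servers * cost_per_server
extra_capex = fleet_cost * slowdown

print(f"Fleet cost:  ${fleet_cost:,}")        # total capital cost of the fleet
print(f"Extra capex: ${extra_capex:,.0f}")    # one-time cost of the 1% hit
```

At these assumed prices the fleet costs $200,000,000 and the 1% hit is $2,000,000, i.e. the "rounding error" the post describes.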
