https://bugzilla.wikimedia.org/show_bug.cgi?id=52253

--- Comment #7 from Tim Starling <[email protected]> ---
(In reply to comment #6)
> This may be a stupid question, but I got asked today and I didn't know the
> answer: if Wikimedia currently has a fairly large number of Web servers
> providing HTTP access, couldn't most of those servers be re-provisioned to
> serve HTTPS instead? I'm not sure why you would need 80 additional servers
> (not that the Wikimedia Foundation couldn't easily afford them, in any case).

You need more servers because serving HTTPS is more expensive than serving
HTTP, due to the encryption overhead: the TLS handshake and the per-byte
encryption both cost CPU. You could have the same servers doing both HTTP and
HTTPS, and I believe that is indeed the plan, but you would still need more of
them, because the CPU cost of serving an HTTPS connection is higher, so you
need more cores to sustain the same connection rate.
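As an aside, the scaling argument can be sketched with made-up numbers. The
per-connection CPU costs below are illustrative assumptions, not Wikimedia
measurements; the point is only that required cores scale linearly with the
per-connection cost:

```python
# Hypothetical illustration: cores needed scale with per-connection CPU cost.
# The cost figures are placeholders, not measured values.

def cores_needed(conn_per_sec, cpu_seconds_per_conn):
    """Cores required to sustain a given connection rate at a given CPU cost."""
    return conn_per_sec * cpu_seconds_per_conn

HTTP_COST = 0.001    # assumed CPU-seconds per plain-HTTP connection
HTTPS_COST = 0.010   # assumed cost with TLS handshake + encryption (10x)

rate = 10_000  # connections per second (illustrative)
print(cores_needed(rate, HTTP_COST))   # 10.0 cores for HTTP
print(cores_needed(rate, HTTPS_COST))  # 100.0 cores for HTTPS
```

With a 10x per-connection cost ratio, the same traffic needs 10x the cores,
which is the shape of the estimate below.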

Note that the 80 figure is just the current host count (9) multiplied by 9 and
rounded off. Ryan pointed out to me that 4 of those 9 servers are old and have
only 8 cores, whereas the newer servers have 24 cores. Since CPU is the
limiting factor, it's really the core count that should be multiplied by 9. We
have 152 cores currently doing HTTPS (4 x 8 + 5 x 24), so we would need an
extra 1368 (152 x 9), implying we need 57 additional 24-core servers.
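For the record, the arithmetic above checks out; this is just the figures from
the comment restated:

```python
# Core-count arithmetic from the comment: 4 old 8-core servers plus
# 5 newer 24-core servers, scaled by the same factor of 9 used for hosts.
old_servers, old_cores = 4, 8
new_servers, new_cores = 5, 24

current_cores = old_servers * old_cores + new_servers * new_cores
print(current_cores)  # 152

multiplier = 9  # same factor that turned 9 hosts into ~80 additional
extra_cores = current_cores * multiplier
print(extra_cores)  # 1368

extra_servers = extra_cores // new_cores  # expressed in 24-core servers
print(extra_servers)  # 57
```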

_______________________________________________
Wikibugs-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikibugs-l