On 27.03.2015 at 17:31, "Ola Fosheim Grøstad" <> wrote:
> On Friday, 27 March 2015 at 16:18:33 UTC, Sönke Ludwig wrote:
>> So what happens if 10 requests come in at the same time? Does moving
>> things around still help you? No.
>
> Load balancing is probabilistic in nature. Caching also makes it
> unlikely that you get 10 successive high computation requests.

You could say the same for the non-moving case. If you have a fully loaded node and mix request handling with lengthy computations like this, you'll run into the same problem no matter what. The simple solution is either to separate out lengthy computations (easy) or to split them into shorter parts using yield() (usually easy, too).
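To illustrate the second option: vibe.d's yield() hands control back to the scheduler between slices of a long computation, so short requests on the same thread don't starve. As a rough analogy (not vibe.d code — the handler names and chunk size below are made up for illustration), the same pattern in Python's asyncio uses `await asyncio.sleep(0)` as the cooperative yield point:

```python
import asyncio

# Hypothetical sketch: split a lengthy computation into short parts and
# yield between them, analogous to calling vibe.d's yield() in a fiber.
async def handle_heavy(data, chunk=10):
    result = 0
    for i in range(0, len(data), chunk):
        result += sum(data[i:i + chunk])  # one short slice of the work
        await asyncio.sleep(0)            # yield: let other tasks run

    return result

async def main():
    served = []

    async def quick_request(tag):
        served.append(tag)  # stands in for a short request handler

    heavy = asyncio.create_task(handle_heavy(list(range(100))))
    quick = asyncio.create_task(quick_request("short request"))
    total = await heavy
    await quick
    return total, served

total, served = asyncio.run(main())
print(total, served)
```

The short request gets scheduled and completes after the heavy handler's first yield, long before the heavy work finishes — which is exactly what keeps latency low for the other requests in the 10-at-once scenario.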

Caching *may* make it unlikely, but that completely depends on the application. If you have some kind of server-side image-processing web service with many concurrent users, you'd get a lot of computation-heavy requests with no opportunity for caching.

>> BTW, why would an event driven design be any better? You'd have
>> exactly the same issue.
>
> 1. No stack.

That reduces the memory footprint, but doesn't reduce latency.

> 2. Batching.

Can you elaborate?

> But it is more tedious.
