Hi there,

I was wondering if HAProxy helps in the following situation:

- We have a wiki site which is quite slow
- Regular users don't have many problems
- We also get crawled by a search bot, which creates many more concurrent 
connections than the hardware can handle
- As a result, service is degraded and users' browsers frequently time out

Given that we can't make the wiki itself faster, I was thinking we could solve 
this with a per-source-IP queue, which would ensure that a given source IP 
cannot have more than e.g. 3 requests active at the same time. Requests beyond 
that limit would be queued rather than served immediately.
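From skimming the documentation, I imagine it might look something like the 
sketch below, using a stick-table to count concurrent connections per source 
IP (untested, and the frontend/backend names and addresses are just 
placeholders I made up):

```
frontend wiki_front
    bind *:80
    # Track concurrent connections per source IP
    stick-table type ip size 100k expire 30s store conn_cur
    tcp-request connection track-sc0 src
    # Refuse sources that already have more than 3 active connections
    tcp-request connection reject if { sc0_conn_cur gt 3 }
    default_backend wiki_back

backend wiki_back
    # maxconn on the server makes HAProxy queue excess requests instead
    # of passing them all through to the slow wiki at once
    server wiki1 192.0.2.10:80 maxconn 20
```

Though I realise this rejects excess per-IP connections outright, while the 
maxconn queue is global rather than per source IP, which isn't quite what I 
described above.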

Is this possible?

Thanks,

Wout.
