I would probably do that sort of throttling at the OS level with iptables,
etc...
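
For example (untested, and the exact options will depend on your kernel and
iptables version), capping each source IP at 3 concurrent connections to port
80 with the connlimit match could look roughly like this:

    # Reject new TCP connections to port 80 from any source IP that
    # already has 3 connections open (connlimit counts per source IP
    # by default).
    iptables -A INPUT -p tcp --syn --dport 80 \
             -m connlimit --connlimit-above 3 \
             -j REJECT --reject-with tcp-reset

Note that this only caps connections, it doesn't queue the excess requests the
way you describe below -- the bot would simply see connection resets.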

That said, before that I would investigate why the wiki is so slow...
Something probably isn't configured right if it chokes with only a few
simultaneous accesses.  I mean, unless it's an embedded server with under 32MB
of RAM, the hardware should be able to handle that...


> -----Original Message-----
> From: Wout Mertens [mailto:wout.mert...@gmail.com]
> Sent: Sunday, November 15, 2009 9:57 AM
> To: haproxy@formilux.org
> Subject: Preventing bots from starving other users?
> 
> Hi there,
> 
> I was wondering if HAProxy helps in the following situation:
> 
> - We have a wiki site which is quite slow
> - Regular users don't have many problems
> - We also get crawled by a search bot, which creates many concurrent
> connections, more than the hardware can handle
> - Therefore, service is degraded and users usually have their browsers
> time out on them
> 
> Given that we can't make the wiki faster, I was thinking that we could
> solve this by having a per-source-IP queue, which would make sure that a
> given source IP cannot have more than e.g. 3 requests active at the
> same time. Requests beyond that would get queued.
> 
> 
> Is this possible?
> 
> Thanks,
> 
> Wout.

