Oops, my bad...  It's actually tc and not iptables.  Google "tc qdisc"
for some info.

You could allow your local IPs to go unrestricted, and throttle all other IPs
to 512kb/sec, for example.
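
Roughly something like this -- just a sketch, where eth0, the 192.168.0.0/16
local range, the 100mbit ceiling and the 512kbit rate are all placeholders
you'd adjust for your setup:

  # root HTB qdisc; anything unmatched falls into class 1:20
  tc qdisc add dev eth0 root handle 1: htb default 20
  # fast class for local clients
  tc class add dev eth0 parent 1: classid 1:10 htb rate 100mbit
  # throttled class for everyone else
  tc class add dev eth0 parent 1: classid 1:20 htb rate 512kbit
  # send replies destined for local addresses to the fast class
  tc filter add dev eth0 parent 1: protocol ip prio 1 u32 \
      match ip dst 192.168.0.0/16 flowid 1:10

Keep in mind tc shapes egress traffic on that interface, so this limits how
fast the wiki's responses go back out to non-local clients rather than
queueing the incoming requests themselves.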

What software is the wiki running on?  I assume it's not running under Apache
or there would be some ways to tune Apache.  As others have mentioned, telling
the crawlers to behave themselves or to ignore the wiki entirely with a robots
file is probably best.
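
If you go the robots route, a minimal robots.txt served from the site root
that tells well-behaved bots to skip the whole wiki would be:

  User-agent: *
  Disallow: /

If you'd rather slow them down than block them, some crawlers also honor a
"Crawl-delay: 10" line, though support for it is spotty (Google ignores it),
and badly behaved bots will ignore robots.txt entirely.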

> -----Original Message-----
> From: Wout Mertens [mailto:wout.mert...@gmail.com]
> Sent: Monday, November 16, 2009 7:31 AM
> To: John Lauro
> Cc: haproxy@formilux.org
> Subject: Re: Preventing bots from starving other users?
> 
> Hi John,
> 
> On Nov 15, 2009, at 8:29 PM, John Lauro wrote:
> 
> > I would probably do that sort of throttling at the OS level with
> > iptables, etc...
> 
> Hmmm.... How? I don't want to throw away the requests, just queue them.
> Looking at iptables rate limiting, it seems that you can only drop the
> request.
> 
> Then again:
> 
> > That said, before that I would investigate why the wiki is so slow...
> > Something probably isn't configured right if it chokes with only a few
> > simultaneous accesses.  I mean, unless it's an embedded server with
> > under 32MB of RAM, the hardware should be able to handle that...
> 
> Yeah, it's running pretty old software on a pretty old server. It
> should be upgraded but that is a fair bit of work; I was hoping that a
> bit of configuration could make the situation fair again...
> 
> Thanks,
> 
> Wout.
> 

