Hi,
On 03.05.2013 11:27, Dirk-Willem van Gulik wrote:
FWIW - the same sentiments were expressed when 'greylisting[1]' in
SMTP came into vogue. For small relays (speaking just from personal
experience and from the vantage of my own private tiny MTAs) that
has however not been the case.
Hi André
On 03.05.2013 14:37, André Warnier wrote:
Basically, after a few cycles like this, all his 100 pool connections
will be waiting for a response, and it would have no choice but either
to keep waiting, or to start killing the connections that have been
waiting more than a certain amount
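For illustration only (not from the thread): a minimal Python asyncio sketch of the dynamic described above. With a fixed pool of 100 connections and servers that hold every 404 for 2 seconds, the scanner's throughput is capped at pool size divided by delay, no matter how many targets it has. The hostnames and constants are made up.

import asyncio
import time

POOL_SIZE = 100         # the bot's connection pool, as in the example above
DELAYED_404_SECS = 2.0  # hypothetical server-side delay on each 404

async def probe(sem, host):
    # Each probe holds one pool slot for the full response time.
    async with sem:
        await asyncio.sleep(DELAYED_404_SECS)  # stand-in for a slow 404

async def main():
    sem = asyncio.Semaphore(POOL_SIZE)
    hosts = ["host-%d.example" % i for i in range(1000)]
    start = time.monotonic()
    await asyncio.gather(*(probe(sem, h) for h in hosts))
    # Throughput is capped at POOL_SIZE / DELAYED_404_SECS = 50 probes/s,
    # so 1000 hosts take ~20 s instead of ~2 s.
    print("%d probes took %.0fs" % (len(hosts), time.monotonic() - start))

asyncio.run(main())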
On 03.05.2013 06:35, Ben Reser wrote:
On Thu, May 2, 2013 at 4:53 PM, Guenter Knauf fua...@apache.org wrote:
isn't that one of the core issues - that folks who don't know what they do run
a webserver? And then, shouldn't these get punished with being hacked so that
they try to learn and finally
On 05/03/2013 07:24 AM, Ben Reser wrote:
On Tue, Apr 30, 2013 at 5:23 PM, André Warnier a...@ice-sa.com wrote:
Alternatives:
1) if you were running such a site (which I would still suppose is a
minority of the 600 million websites which exist), you could easily disable
the feature.
2) you
On 3 May 2013, at 10:55, Marian Marinov m...@yuhu.biz wrote:
If Apache by default delays 404s, this may have some effect in the first
month or two after the release of this change. But then the botnet
writers will learn and update their software.
I do believe that these guys are
really impact such clients which, by the
nature of what they are doing, are indeed expected to receive a lot of 404 responses.
I believe that this includes most URL-scanning bots, but extremely few legitimate
clients/users. I cannot prove that, but it seems to me a reasonable assumption
On 03.05.2013 11:38, André Warnier wrote:
I agree that 404s are legitimate responses.
And I agree that legitimate clients/users can expect to receive them.
But if they do receive them when appropriate, but receive them more slowly than
other kinds of responses, this is not
really breaking
Marian Marinov wrote:
On 05/03/2013 07:24 AM, Ben Reser wrote:
On Tue, Apr 30, 2013 at 5:23 PM, André Warnier a...@ice-sa.com wrote:
Alternatives:
1) if you were running such a site (which I would still suppose is a
minority of the 600 million websites which exist), you could easily
disable
On Fri, May 3, 2013 at 10:54 AM, André Warnier a...@ice-sa.com wrote:
So here is a challenge for the Apache devs: describe how a bot-writer could
update his software to avoid the consequences of the scheme that I am
advocating, without consequences for the effectiveness of their URL-scanning.
On 03 May 2013, at 11:54 AM, André Warnier a...@ice-sa.com wrote:
So here is a challenge for the Apache devs: describe how a bot-writer could
update his software to avoid the consequences of the scheme that I am
advocating, without consequences for the effectiveness of their URL-scanning.
Tom Evans wrote:
On Fri, May 3, 2013 at 10:54 AM, André Warnier a...@ice-sa.com wrote:
So here is a challenge for the Apache devs: describe how a bot-writer could
update his software to avoid the consequences of the scheme that I am
advocating, without consequences for the effectiveness of their
An interesting discussion. The admin of the server I use is rather strict
about malicious connections. His way of preventing continued malicious
connections is to route the source IP address (incoming) to 127.0.0.1 after
X errors reported from a single IP address within Y minutes.
From the logic
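A rough sketch (my own, not the admin's actual setup) of that kind of automatic blocking on a Linux host; the threshold, window, and log format are placeholders, and it blackholes the route rather than pointing it at 127.0.0.1:

import re
import subprocess
import time
from collections import defaultdict

ERROR_THRESHOLD = 10   # "X errors" - placeholder value
WINDOW_SECS = 600      # "Y minutes" - placeholder value
# Assumes Apache combined log format: client IP first, status after the request.
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" (\d{3}) ')

errors = defaultdict(list)  # source IP -> timestamps of recent 4xx responses
blocked = set()

def handle_log_line(line):
    m = LOG_LINE.match(line)
    if not m:
        return
    ip, status = m.groups()
    if not status.startswith("4") or ip in blocked:
        return
    now = time.time()
    errors[ip] = [t for t in errors[ip] if now - t < WINDOW_SECS] + [now]
    if len(errors[ip]) >= ERROR_THRESHOLD:
        # Null-route the offender (requires root); 'ip route add blackhole'
        # drops the traffic outright instead of routing it to 127.0.0.1.
        subprocess.run(["ip", "route", "add", "blackhole", ip], check=False)
        blocked.add(ip)

Fed from a tail of the access log, this amounts to a poor man's fail2ban.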
Christian Folini wrote:
André,
On Wed, May 01, 2013 at 02:47:55AM +0200, André Warnier wrote:
With respect, I think that you misunderstood the purpose of the proposal.
It is not a protection mechanism for any server in particular.
And installing the delay on one server is not going to achieve
On 02.05.2013 10:22, André Warnier wrote:
These tools must be downloaded separately, installed, configured and
maintained, all by
someone who knows what he's doing. And this means that, in the end (and as
the evidence
shows), only a tiny minority of webservers on the Internet will
On Wed, 2013-05-01 at 14:40 +0200, Graham Leggett wrote:
On 01 May 2013, at 1:51 PM, André Warnier a...@ice-sa.com wrote:
But *based on the actual data and patterns which I can observe on my
servers (not guesses), I think it might have an effect*.
Of course it might have an effect -
On Wed, 2013-05-01 at 21:15 +0200, Christian Folini wrote:
real-time blacklist lookup (-> ModSecurity's @rbl operator).
Try using that on busy servers (webhosts/ISPs)... might be fine for a
SOHO, but in a larger commercial world, forget it, the impact is far
far worse than the other
André,
On 02.05.2013 10:22, André Warnier wrote:
I'd like to say that I do agree with you, in that there are already many
tools to help defend one's servers against such scans, and against more
targeted attacks.
I have absolutely nothing /against/ these tools, and indeed installing
and
On Fri, 03 May 2013 01:53:01 +0200
Guenter Knauf fua...@apache.org wrote:
On 02.05.2013 10:22, André Warnier wrote:
But I am a bit at a loss as to what to do next. I could easily
enough install such a change on my own servers (they are all
running mod_perl). But then, if it shows that
On Fri, May 03, 2013 at 09:39:44AM +1000, Noel Butler wrote:
real-time blacklist lookup (-> ModSecurity's @rbl operator).
Try using that on busy servers (webhosts/ISPs)... might be fine for a
SOHO, but in a larger commercial world, forget it, the impact is far
far worse than the other
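For reference, @rbl is a real ModSecurity operator; a rule of the kind being discussed might look like the sketch below. The rule id and the Spamhaus zone are illustrative, not from the thread, and each lookup costs a DNS round trip per client, which is exactly the overhead objected to above.

# Deny requests from addresses listed on a DNS real-time blacklist.
SecRule REMOTE_ADDR "@rbl sbl-xbl.spamhaus.org" \
    "id:1000101,phase:1,deny,status:403,msg:'Source IP found on RBL'"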
On Wed, May 1, 2013 at 7:16 AM, André Warnier a...@ice-sa.com wrote:
If it tries just one URL per server, and walks off if the response takes
longer than some pre-determined value, then it all depends on what this
value is.
If the value is very small, then it will miss a larger proportion of
On Tue, Apr 30, 2013 at 5:23 PM, André Warnier a...@ice-sa.com wrote:
Alternatives:
1) if you were running such a site (which I would still suppose is a
minority of the 600 million websites which exist), you could easily disable
the feature.
2) you could instead return a redirect response,
On Thu, May 2, 2013 at 4:53 PM, Guenter Knauf fua...@apache.org wrote:
isn't that one of the core issues - that folks who don't know what they do run
a webserver? And then, shouldn't these get punished with being hacked so that
they try to learn and finally *know* what they do, and do it right
On 30/04/2013 21:38, Ben Laurie wrote:
On 30 April 2013 11:14, Reindl Harald h.rei...@thelounge.net wrote:
On 30.04.2013 12:03, André Warnier wrote:
As a general idea, then, anything which increases the delay to obtain a 404
response should
impact these bots much more than it impacts
that, if it is installed on enough webservers on the
Internet, may slow down the URL-scanning bots (hopefully a lot), and thereby
inconvenience their botmasters.
You need to consider the environment that a typical URL scanner runs in, the
open internet, which consists of vast swaths of machines
to any machine.
It is something that, if it is installed on enough webservers on the
Internet, may slow down the URL-scanning bots (hopefully a lot), and thereby
inconvenience their botmasters. Hopefully to the point where they would
decide that it is not worth scanning that way anymore
for about 35-40% of the DB size of all of our customers.
It is something that, if it is installed on enough webservers on the
Internet, may slow down the URL-scanning bots (hopefully a lot), and thereby
inconvenience their botmasters. Hopefully to the point where they would
decide
not a targeted attack, it is
simply someone looking for easy access to any machine.
It is something that, if it is installed on enough webservers on the
Internet, may slow down the URL-scanning bots (hopefully a lot), and thereby
inconvenience their botmasters. Hopefully to the point where
On Wed, May 1, 2013 at 10:37 AM, Ben Laurie b...@links.org wrote:
So your argument is that extra connections use resources in servers
but not clients?
I only care about the servers. However, the clients are most likely
constrained by CPU or network. Slowing down all the requests at the
server
On 01 May 2013, at 11:34 AM, Marian Marinov m...@yuhu.biz wrote:
Actually, what we are observing is the complete opposite of what you are saying.
Delaying spam bots, brute force attacks, and vulnerability scanners
significantly decreases the number of requests we get from them.
So, our
On 01.05.2013 11:37, Ben Laurie wrote:
Well, no, actually this is not accurate. You are assuming that these
bots are written using blocking I/O semantics; that if a bot is delayed
by 2 seconds when getting a 404 from your server, it is not able to do
anything else in those 2 seconds. This
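To make this point concrete, a minimal (made-up) Python asyncio sketch of a non-blocking scanner: a thousand probes are in flight at once, so a 2-second delay on each 404 costs the bot about 2 seconds of wall-clock time in total, not 2000.

import asyncio
import time

DELAY_404_SECS = 2.0  # the hypothetical server-side 404 delay

async def probe(host):
    # Stand-in for awaiting a delayed 404; nothing blocks while we wait.
    await asyncio.sleep(DELAY_404_SECS)

async def main():
    start = time.monotonic()
    await asyncio.gather(*(probe("host-%d.example" % i) for i in range(1000)))
    # Prints ~2s: the delays overlap instead of adding up.
    print("1000 probes in %.1fs" % (time.monotonic() - start))

asyncio.run(main())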
On 1 May 2013 11:11, Graham Leggett minf...@sharp.fm wrote:
On 01 May 2013, at 11:34 AM, Marian Marinov m...@yuhu.biz wrote:
Actually, what we are observing is the complete opposite of what you are
saying.
Delaying spam bots, brute force attacks, and vulnerability scanners
significantly
On 01.05.2013 13:14, Ben Laurie wrote:
The fact you cannot explain the evidence does not invalidate the evidence
what evidence does this thread have?
the whole idea of slowing down 404 responses is broken and must never be the default
on any setup, nor should it be implemented at all - period
On 01 May 2013, at 1:14 PM, Ben Laurie b...@links.org wrote:
The fact you cannot explain the evidence does not invalidate the evidence.
The evidence was just explained - a bot that does not get an answer quick
enough gives up and looks elsewhere.
The key words are looks elsewhere.
Jeez.
webservers on the
Internet, may slow down the URL-scanning bots (hopefully a lot), and thereby
inconvenience their botmasters. Hopefully to the point where they would
decide that it is not worth scanning that way anymore. And if it does not
inconvenience them enough to achieve that, at least it should
On 1 May 2013, at 13:31, Graham Leggett minf...@sharp.fm wrote:
The evidence was just explained - a bot that does not get an answer quick
enough gives up and looks elsewhere.
The key words are looks elsewhere.
For what it is worth - I've been experimenting with this (up till about 6
on our servers accounts for about 35-40% of the DB size of all of our
customers.
It is something that, if it is installed on enough webservers on the
Internet, may slow down the URL-scanning bots (hopefully a lot), and
thereby
inconvenience their botmasters. Hopefully to the point where
On 01.05.2013 13:51, André Warnier wrote:
There is so far one possible pitfall, which was identified by someone earlier
on this list: the fact that delaying
404 responses might have a bad effect on some particular kind of usage by
legitimate clients/users. So far, I
believe that such
On 01.05.2013 14:00, Reindl Harald wrote:
here you have something to read; learn that more and more attacks
are done this way, by exhausting resources without high bandwidth, and
THIS is the real problem server admins have to fight, not the noise
you see on your small site
On 05/01/2013 03:00 PM, Reindl Harald wrote:
On 01.05.2013 13:51, André Warnier wrote:
There is so far one possible pitfall, which was identified by someone earlier
on this list: the fact that delaying
404 responses might have a bad effect on some particular kind of usage by
legitimate
I think we're mixing three issues:
1) Prevent Starvation:
protecting a server from server-side/machine starvation (i.e. running
out of file descriptors, sockets, mbufs, whatever).
So here you are in the domain where there is no argument in terms of
protocol violation/bad
Dirk-Willem van Gulik wrote:
On 1 May 2013, at 13:31, Graham Leggett minf...@sharp.fm wrote:
The evidence was just explained - a bot that does not get an answer quick
enough gives up and looks elsewhere.
The key words are looks elsewhere.
For what it is worth - I've been experimenting with
On 05/01/2013 03:22 PM, André Warnier wrote:
Dirk-Willem van Gulik wrote:
On 1 May 2013, at 13:31, Graham Leggett minf...@sharp.fm wrote:
The evidence was just explained - a bot that does not get an answer quick
enough gives up and looks elsewhere.
The key words are looks elsewhere.
For
On 01.05.2013 14:09, Marian Marinov wrote:
On 05/01/2013 03:00 PM, Reindl Harald wrote:
and YES, making DoS attacks easier is treated as a security risk by any
professional auditor, and where I work a 'middle' threat rating means
fix it or shut down the customer's project, and the last time I got
On 01 May 2013, at 1:51 PM, André Warnier a...@ice-sa.com wrote:
But *based on the actual data and patterns which I can observe on my servers
(not guesses), I think it might have an effect*.
Of course it might have an effect - the really important question is whether it will have
a *useful* effect.
A
On Wednesday 01 May 2013, Graham Leggett wrote:
Of course it might have an effect - the really important question is
whether it will have a useful effect.
A bot that gives up scanning a box that by definition isn't
vulnerable to that bot (thus the 404) doesn't achieve anything
useful; the bot failed
situation: there are
hundreds of millions of webservers on the Internet which do /not/ implement any of these
tools. Which is one of the elements that makes running these URL-scanning bots a
profitable proposition, until now.
In contrast, my proposal would not require any expertise or any time
André,
On Wed, May 01, 2013 at 02:47:55AM +0200, André Warnier wrote:
With respect, I think that you misunderstood the purpose of the proposal.
It is not a protection mechanism for any server in particular.
And installing the delay on one server is not going to achieve much.
In fact I did
the good bots) are accessing mostly links which
work, so they
rarely get 404 Not Found responses. Malicious URL-scanning bots on the other
hand, by
the very nature of what they are scanning for, are getting many 404 Not Found
responses.
As a general idea, then, anything which increases the delay
On 30.04.2013 12:03, André Warnier wrote:
As a general idea, then, anything which increases the delay to obtain a 404
response should
impact these bots much more than it impacts legitimate users/clients.
How much?
Let us imagine for a moment that this suggestion is implemented in the
On 30 Apr 2013, at 12:03 PM, André Warnier a...@ice-sa.com wrote:
The only cost would be a relatively small change to the Apache webservers, which
is what my
suggestion consists of: adding a variable delay (say between 100 ms and 2000
ms) to any
404 response.
This would have no real effect.
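Not the httpd change being proposed, but a minimal illustration of the idea as Python WSGI middleware, using the 100-2000 ms range quoted above; everything else about it is invented for the sketch.

import random
import time

def delayed_404(app):
    # Wrap a WSGI app so any 404 is held back 100-2000 ms before it is sent.
    # Sketch only: assumes the app calls start_response before returning.
    def wrapper(environ, start_response):
        captured = {}
        def capture(status, headers, exc_info=None):
            captured["args"] = (status, headers, exc_info)
        body = app(environ, capture)
        status, headers, exc_info = captured["args"]
        if status.startswith("404"):
            time.sleep(random.uniform(0.1, 2.0))  # the proposed variable delay
        start_response(status, headers, exc_info)
        return body
    return wrapper

Note that the sleep ties up a server thread or worker for the duration, which is where the DoS objection raised elsewhere in the thread comes from.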
, and the kind of access made by legitimate HTTP users/clients:
legitimate
users/clients (including the good bots) are accessing mostly links which
work, so they
rarely get 404 Not Found responses. Malicious URL-scanning bots on the
other hand, by
the very nature of what they are scanning
On Tuesday, April 30, 2013, Christian Folini wrote:
But you can try it out for yourself easily with
2-3 ModSecurity rules and the pause directive.
Someone suggested the same idea to me and I tried it out on one of my
servers by setting PHP as the 404 handler and having it loop there. (which
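The 'pause directive' mentioned here is ModSecurity's pause action; a sketch of such a rule follows (the rule id and the 1500 ms value are mine, and exact action combinations vary by ModSecurity version):

# Hold back 404 responses; phase 3 runs when response headers are available.
# Note: pause blocks the httpd worker serving the request for the duration.
SecRule RESPONSE_STATUS "@streq 404" \
    "id:1000102,phase:3,pass,pause:1500"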
On Tue, Apr 30, 2013 at 3:03 AM, André Warnier a...@ice-sa.com wrote:
Let us imagine for a moment that this suggestion is implemented in the
Apache webservers,
and is enabled in the default configuration. And let's imagine that after a
while, 20% of
the Apache webservers deployed on the
On 30 April 2013 11:14, Reindl Harald h.rei...@thelounge.net wrote:
Am 30.04.2013 12:03, schrieb André Warnier:
As a general idea, then, anything which increases the delay to obtain a 404
response should
impact these bots much more than it impacts legitimate users/clients.
How much?
Let us
On 30 April 2013 11:29, Graham Leggett minf...@sharp.fm wrote:
On 30 Apr 2013, at 12:03 PM, André Warnier a...@ice-sa.com wrote:
The only cost would be a relatively small change to the Apache webservers,
which is what my
suggestion consists of: adding a variable delay (say between 100 ms and
On 30 Apr 2013, at 8:42 PM, Ben Laurie b...@links.org wrote:
This would have no real effect.
Bots are patient, slowing them down isn't going to inconvenience a bot in
any way. The simple workaround if the bot does take too long is to simply
send the requests in parallel.
Disagree.
2013/4/30 Graham Leggett minf...@sharp.fm
On 30 Apr 2013, at 12:03 PM, André Warnier a...@ice-sa.com wrote:
The only cost would be a relatively small change to the Apache webservers,
which is what my
suggestion consists of: adding a variable delay (say between 100 ms and
2000 ms) to any
On Tue, Apr 30, 2013 at 08:54:47PM +0200, Lazy wrote:
mod_security + simple scripts + ipset + iptables TARPIT in the raw table -
this way you would be able to efficiently block a very large number of
IP numbers, and
TARPIT will take care of
delaying new bot connections at minimal cost
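A sketch of the combination described here (the address and timeout are invented; TARPIT comes from xtables-addons and is shown in the filter table for simplicity rather than the raw table):

# A set of scanner addresses whose entries expire after an hour.
ipset create scanners hash:ip timeout 3600
# mod_security / log-watching scripts would add offenders, e.g.:
ipset add scanners 203.0.113.7
# Tarpit HTTP connections from listed addresses: the TCP handshake
# completes, then the connection is held open while the bot waits.
iptables -A INPUT -p tcp --dport 80 \
    -m set --match-set scanners src -j TARPIT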
On 30.04.2013 20:38, Ben Laurie wrote:
On 30 April 2013 11:14, Reindl Harald h.rei...@thelounge.net wrote:
no - this idea is very, very bad, and if you ever saw a
DDoS attack from tens of thousands of IP addresses on a
machine you maintain, you would not consider anything
which makes responses
Graham Leggett wrote:
On 30 Apr 2013, at 12:03 PM, André Warnier a...@ice-sa.com wrote:
The only cost would be a relatively small change to the Apache webservers, which
is what my
suggestion consists of: adding a variable delay (say between 100 ms and 2000
ms) to any
404 response.
This would
On Tue, Apr 30, 2013 at 4:09 PM, André Warnier a...@ice-sa.com wrote:
But I have been trying to figure out a real use case, where expecting 404
responses in the course of legitimate applications or website access would
be a normal thing to do, and I admit that I haven't been able to think of
to time we need to go through their
databases, and verify that the links which they have stored are still current.
So for these customers we are regularly running programs of the URL checker type. These
are in a way similar to URL-scanning bots, except that they target a longer list of URLs
Ben Reser wrote:
On Tue, Apr 30, 2013 at 4:09 PM, André Warnier a...@ice-sa.com wrote:
But I have been trying to figure out a real use case, where expecting 404
responses in the course of legitimate applications or website access would
be a normal thing to do, and I admit that I haven't been
Ben Laurie wrote:
On 30 April 2013 11:29, Graham Leggett minf...@sharp.fm wrote:
On 30 Apr 2013, at 12:03 PM, André Warnier a...@ice-sa.com wrote:
The only cost would be a relatively small change to the Apache webservers, which
is what my
suggestion consists of: adding a variable delay (say
misunderstood the purpose of the proposal.
It is not a protection mechanism for any server in particular.
And installing the delay on one server is not going to achieve much.
It is something that, if it is installed on enough webservers on the Internet, may slow
down the URL-scanning bots (hopefully a lot