André,
On 02.05.2013 10:22, André Warnier wrote:
I'd like to say that I do agree with you, in that there are already many
tools to help defend one's servers against such scans, and against more
targeted attacks.
I have absolutely nothing /against/ these tools, and indeed installing
and configuring such tools on a majority of webservers would do much
more for Internet security in general than my proposal ever would.

But at the same time, there is the rub, as you say yourself: "All that
is missing is enough people configuring their servers as you propose."

These tools must be downloaded separately, installed, configured and
maintained, all by someone who knows what he's doing.
Isn't that one of the core issues - that folks who don't know what they're doing run a webserver? And then, shouldn't they get punished by being hacked, so that they try to learn, finally *know* what they are doing, and do it right next time? ;-)

And this means that, in the end (and as the evidence shows), only a tiny
minority of webservers on the Internet will actually set up one of these
tools, and the vast majority of webservers will not.
And among the millions of webservers that don't, there will be enough
candidates for break-in to justify these URL scans, because URL-scanning
at this moment is cheap and really fast.

In contrast, my proposal is so simple from an Apache user's point of
view that I believe it could spread widely, with no other measure
than enabling it by default in the default Apache distribution (where
it could easily be turned off by whoever decides he doesn't want it).
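(Purely as an illustration of the idea, not the actual patch I have in mind: with a stock Apache one can already approximate it today by pointing ErrorDocument at a small CGI that sleeps before answering. The 10-second delay and the script name below are only placeholders.)

  # httpd.conf - route all 404s through a small delaying script
  ErrorDocument 404 /cgi-bin/slow404.pl

  #!/usr/bin/perl
  # slow404.pl - hold the connection for a while before sending the error page
  use strict;
  use warnings;

  sleep 10;                                  # the "tax" on each bogus URL probe
  print "Status: 404 Not Found\r\n";         # keep the real status code
  print "Content-Type: text/html\r\n\r\n";
  print "<html><body><h1>Not Found</h1></body></html>\n";

Of course the sleep also ties up one of my own workers (or mod_perl interpreters) for those 10 seconds per bogus request, which is precisely the kind of cost that would need measuring.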

If my purpose was merely to shield my own servers, then I would not
spend so much time trying to defend the merits of this proposal.
Instead, I would install one of these tools and be done with it.  I am
not doing so because, on the one hand, my servers - as far as I know, of
course - do not exhibit any of the flaws that the bots are scanning for,
and, on the other hand, because these traces in the logs provide me with
information about how they work.

I apologise if I repeat myself, and if I am perceived as "hot" sometimes.
It may be because of a modicum of despair.  I don't know what I was
expecting as a reaction to this proposal, but I am disappointed - maybe
wrongly so.
I was ready for criticism of the proposal, or for someone proving me
wrong, on the basis of real facts or calculations.  But what I am mostly
seeing so far are objections apparently based on a priori opinions
which my own factual observations show to be false.
Not only my own, though: the couple of people here who have contributed
based on real experience with real servers do not seem to contradict
my own findings.  So I am not totally despairing yet.

But I am a bit at a loss as to what to do next.  I could easily enough
install such a change on my own servers (they are all running mod_perl).
But then, if it turns out that the bots do slow down on my servers or avoid
them, that still doesn't quite provide enough evidence to prove that this
would benefit the Internet at large, does it?
No. But you wrote above that it's not your intention to protect yourself and your servers, but rather that you want to cure the world and make it possible for 'folks who don't know what they're doing' to run webservers, or???

OK, first let's again assume that you really get enough httpd developers here who support your idea, that we finally get such functionality into httpd, and - the even more unlikely case - that you get us to make it the default. But what do you expect to happen then? *If* this really happens, the code would go into trunk, which means unreleased code.

Currently we have two maintained release branches, 2.2.x and 2.4.x, where the first 2.4.x release happened about 15 months ago. I don't know the numbers, but I assume that after these 15 months only 1 or 2% have upgraded to 2.4.x, and that the vast majority is still running the 2.2.x releases, or even still 2.0.x. Maybe within the next 9 months another 2-3% will upgrade, so that we'd probably have 5% running the latest version 2 years after its first release.

Now let's further assume that the httpd project decides to release a 2.6.x within these next 9 months (which seems very unlikely, but who knows ...) which contains your slow-down code. Imagine for yourself how long it would take from now until your slow-down code was in use by default on at least 10% of servers: 3 years, 4 years, ...?? When would it start to show an effect? After 5 years? And are the bots in 5 years still the same as today? Furthermore, unless we are forced by security issues, there's no reason to break our policies and backport such a feature to the 2.4.x and 2.2.x branches ....

Yeah, now I imagine I've totally disappointed you with the above, but hey, that's the reality of how things work in the httpd project ...

OK, now let me throw in some of the things I can think of that you could still do in order to make the world's internet better within the next 5 years ... :-P

Instead of returning the 404s, give them what they ask for. For example, write a script which scans your logs and filters for those 404s (a rough sketch follows below); within a few days you should have a nice list of those bot URLs. Let the script automatically write/update a proxy config and proxy the bot requests to another dedicated test host, and play your games with them there ... send them an index page from a PHP script and let it return the bytes very slowly; or do it quickly and see what comes next, etc. etc. Mainly, try to study the bots so that you can really predict how they do things and how they behave in response to the kind of things you suggest. And even more interesting is to analyse the host which runs the bot: is it perhaps vulnerable itself? For example, what happens if you send it back a 1 GB index page? Perhaps it gets a buffer overflow? Is perhaps the backdoor for the bot control vulnerable?
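Here is a rough sketch of what I mean by such a log-scanning script - the log format, the regex and the honeypot host name are of course only assumptions, adapt them to your setup:

  #!/usr/bin/perl
  # bot404s.pl - collect the URLs that the scanners probe (they show up as
  # 404s in the access log) and emit a mod_proxy config that diverts exactly
  # those URLs to a dedicated test/honeypot box.
  use strict;
  use warnings;

  my %seen;
  while (my $line = <>) {
      # assumes the common/combined log format, e.g.
      # 1.2.3.4 - - [02/May/2013:10:22:33 +0200] "GET /phpmyadmin/ HTTP/1.1" 404 209
      next unless $line =~ m{"(?:GET|POST|HEAD) (\S+) HTTP/[0-9.]+" 404 };
      my $path = $1;
      next if $path =~ /\?/;        # keep it simple: skip URLs with query strings
      next if $seen{$path}++;       # one rule per probed URL
      print qq{ProxyPass "$path" "http://honeypot.internal.example$path"\n};
  }

Run it as, say,

  perl bot404s.pl /var/log/apache2/access_log > bot-urls.conf

then Include the generated file in the vhost (with mod_proxy and mod_proxy_http loaded), and from then on those scanners talk to your test host instead of the real server.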

All that reminds me of a worm attack which happened some years ago;
IIRC it was Nimda. After studying the beast a bit I was finally able to understand the weakness of the worm, and able to strike back (I know, this is illegal, and I don't recommend it - I just want to mention what's possible if you want to make the internet better ...). In order to give the botmaster control over the infected hosts, the worm first installed a backdoor, and that backdoor was then also the door through which to finally stop the beast and kill it: it was possible to take over control of an infected host, remove the worm code, and then close the backdoor ...
Isn't this the real thing you want to do?

Gün.

