Marian Marinov wrote:
On 05/03/2013 07:24 AM, Ben Reser wrote:
On Tue, Apr 30, 2013 at 5:23 PM, André Warnier <a...@ice-sa.com> wrote:
Alternatives:
1) if you were running such a site (which I would still suppose is a
minority of the 600 million websites which exist), you could easily disable
the feature.
2) you could instead return a redirect response to a page saying "that one
was sold, but look at these".
That may be even more friendly to search engines, and to customers.
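Alternative 2 above can be sketched in a few lines. This is only an illustration of the idea, not anything Apache ships: the paths and the SOLD_ITEMS set are invented for the example, and a real shop would pull them from its catalogue.

```python
# Hypothetical sketch of alternative 2: instead of a bare 404 for a
# sold item, redirect to a "that one was sold, but look at these" page.
# SOLD_ITEMS and all paths are made up for illustration.

SOLD_ITEMS = {"/items/1234", "/items/5678"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in SOLD_ITEMS:
        # A 302 keeps both customers and search engines on a useful page
        # rather than handing them a dead-end 404.
        start_response("302 Found", [("Location", "/items/sold")])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"item page"]
```

The same effect could of course be had with mod_rewrite rules in httpd itself; the WSGI form is just the shortest way to show the behaviour.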

My point isn't that there aren't alternatives, but that 404s are
legitimate responses that legitimate users can be expected to receive.
As such, in my opinion you'll find it nearly impossible to convince
people to degrade performance for them as a default. If it isn't a
default, you're hardly any better off than you are today, since it will
not be widely deployed.

If you want to see a case where server behavior has been tweaked in
order to combat miscreants, go take a look at SMTP. SMTP is no longer
simple, largely because of the various schemes people have undertaken
to stop spam. Despite all these schemes, spam still exists, and the
only effective counters have been:
1) Securing open relays.
2) Removing the botnets that are sending the spam.
3) Ultimately, improving the security of the vulnerable systems that
are sending the spam.

All the effort towards blacklists, SPF, DomainKeys, etc. has been,
IMHO, a waste of time. At best it has been a temporary roadblock.


If Apache by default delays 404s, this may have some effect in the first month or two after the release of this change.

I like that. So at least we are not at the "no effect" stage anymore. ;-)
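For concreteness, the scheme being debated, delaying the server's 404 responses, can be sketched as a small piece of wrapper code. This is only an illustration under assumed names (`tarpit_404`, `delay_seconds` are invented here); the actual proposal would live inside httpd itself, not in application middleware.

```python
import time

def tarpit_404(app, delay_seconds=1.0):
    """Wrap a WSGI app so that every 404 response is delayed.

    Sketch only: it shows the tarpit idea, not how httpd would
    implement it internally.
    """
    def wrapped(environ, start_response):
        status_seen = []

        def capturing_start_response(status, headers, exc_info=None):
            # Remember the status so we know whether to delay.
            status_seen.append(status)
            return start_response(status, headers, exc_info)

        body = app(environ, capturing_start_response)
        if status_seen and status_seen[0].startswith("404"):
            # Slow down bulk URL scanners; a legitimate user rarely
            # triggers 404s in volume, so the per-request cost is small.
            time.sleep(delay_seconds)
        return body
    return wrapped
```

The trade-off the thread is arguing about is visible right in the sketch: the delay costs the scanner time on every miss, but it also ties up a server worker for the duration of the sleep.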

But then the botnet
writers will learn and update their software.
I do believe that these guys monitor mailing lists like this one, or at least read the change logs of the most popular web servers. So I believe that such a change would have a very limited impact on the Internet as a whole, or would at least be combated fairly easily.


And I believe that the Apache developers are smart people, collectively as smart as or smarter than the bot writers. And one of the tenets of open-source software is that "security by obscurity is not security".

So here is a challenge for the Apache devs: describe how a bot writer could update his software to avoid the consequences of the scheme that I am advocating, without reducing the effectiveness of their URL scanning.


P.S. About discussing this on the dev list: I originally tried a couple of more discreet channels, but I was either ignored or sent back to the users' list, so I picked this list as somewhat in between. That being said, I believe that letting the bot writers know about such a change may actually help the scheme. If they do not find a good way to avoid its consequences, they might just decide to give up URL scanning and focus their efforts elsewhere. As far as I am concerned, that would be the biggest prize of all.
