I just checked my log files, and unfortunately, these don't look like Code
Red scans. I think they're coming from the Nimda worm. IIRC, Code Red would
request a "default.ida" file with a bunch of garbage appended to the
request. This worm instead tries a series of requests for files like
root.exe and cmd.exe under various script directories.
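From memory (so check it against your own logs), the two are easy to tell
apart with a quick grep:

# Code Red: one long GET for /default.ida stuffed with garbage/shellcode
grep 'default\.ida' /var/log/httpd/access_log

# Nimda: many short GETs for root.exe and cmd.exe under /scripts, /MSADC, etc.
grep -E 'root\.exe|cmd\.exe' /var/log/httpd/access_log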
I wrote a script to add IP addresses from which Code Red attacks originated
to my hosts.deny file, but I think that file only gets consulted when httpd
is called from xinetd, not when it is run as a standalone application.
#!/bin/bash
# Pull the client IP (first field) out of every Code Red request and
# append it to hosts.deny in the "daemon: client" form tcp_wrappers expects.
grep 'default\.ida' /var/log/httpd/access_log \
    | awk '{ print "ALL: " $1 }' \
    | sort -u >> /etc/hosts.deny
Does anyone know how this script might be modified to block Code Red/Nimda
scans? Ideally, I'd love to add each of these IP addresses to some type of
"DROP" rule in iptables - that way, they're forced to time out before moving
on to the next victim...
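Something like the following is what I have in mind, though it's completely
untested and assumes the default filter table and that blocking on the INPUT
chain is the right place for it:

#!/bin/bash
# Untested sketch: add a DROP rule for every address that has requested
# default.ida, root.exe or cmd.exe. Assumes iptables lives in /sbin and
# that the INPUT chain of the default filter table is appropriate here.
for ip in $(grep -E 'default\.ida|root\.exe|cmd\.exe' /var/log/httpd/access_log \
            | awk '{ print $1 }' | sort -u); do
    /sbin/iptables -A INPUT -s "$ip" -j DROP
done

One obvious wart is that re-running it would add duplicate rules, so it would
probably need to keep track of which addresses it has already blocked.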
Thanks.
Mike
----- Original Message -----
From: "Brian Curtis" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Wednesday, September 19, 2001 7:56 AM
Subject: Thwarting Code Red scans
> Hello list folk,
>
> Anyone have some pointers on thwarting what seem to be (or so I was
> told) Code Red scans?
>
> I'm seeing several hundred requests per second for cmd.exe and
> root.exe via Apache -- all at different addresses that our boxes
> serve.
>
> It's a pretty intelligent scan, if you ask me. You'll never see a
> request from the same zombied machine's IP more than once per second.
> Therefore, things like portsentry seem to be useless.
>
> I've put up with this ongoing scan for almost 24 hours now, and I'm
> starting to get a bit ticked off. Although it's not directly
> affecting our services yet, I feel helpless not knowing an intelligent
> counter-measure. In about 22 hours, I now have an error log from
> Apache that has passed the 100 MB mark. This file almost never goes
> over a meg or two in a week!
>
> I'm willing and ready to entertain any ideas on how I can prevent this
> bandwidth-sucking POS from hammering our boxes. I've thought about
> denying requests for these files in Apache, but you still have to
> accept the request and then serve the forbidden error, which seems
> like it might put more of a strain on things. I've also thought about
> extracting the IP addresses from the logfiles, but doing that one by
> one would be extremely time-consuming. Right now, I'm at a total loss.
>
> Thanks for any help you can provide.
>
> --
> Best regards,
> Brian Curtis
_______________________________________________
Seawolf-list mailing list
[EMAIL PROTECTED]
https://listman.redhat.com/mailman/listinfo/seawolf-list