Hello list folk,

Anyone have some pointers on thwarting what seem to be (or so I was
told) Code Red scans?

I'm seeing several hundred requests per second for cmd.exe and
root.exe via Apache -- aimed at all the different addresses that our
boxes serve.

It's a pretty intelligent scan, if you ask me.  You'll never see a
request from the same zombied machine's IP more than once per second,
so things like portsentry seem to be useless.

I've put up with this ongoing scan for almost 24 hours now, and I'm
starting to get a bit ticked off.  Although it's not directly
affecting our services yet, I feel helpless not knowing an intelligent
counter-measure.  In about 22 hours, the Apache error log has passed
the 100 MB mark.  This file almost never goes over a meg or two in a
week!

I'm willing and ready to entertain any ideas on how I can prevent this
bandwidth-sucking POS from hammering our boxes.  I've thought about
denying requests for these files in Apache, but you still have to
accept the request and serve the 403, which seems like it may put
more of a strain on things.  I've also thought about extracting the
IP addresses from the logfiles, but doing that one by one would be
extremely time consuming.  Right now, I'm at a total loss.
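
For what it's worth, here's the sort of thing I had in mind for the
extraction step -- a minimal sketch in Python, assuming the default
common/combined access log format (client IP in the first field) and
a log path that may well differ on your setup:

#!/usr/bin/env python
import re

# Assumed log location -- adjust for your installation.
LOGFILE = "/var/log/httpd/access_log"

# Requests for these files are the signature of the scan.
SCAN = re.compile(r"cmd\.exe|root\.exe")

ips = set()
for line in open(LOGFILE):
    if SCAN.search(line):
        # In common/combined log format the client IP is the
        # first whitespace-separated field.
        ips.add(line.split()[0])

# Emit one "Deny from" line per zombie, ready to paste into
# httpd.conf or an .htaccess -- or strip the prefix and feed
# the bare IPs to a firewall rule instead.
for ip in sorted(ips):
    print("Deny from " + ip)

That at least turns the one-by-one grovel into a single pass over the
log, though it doesn't help with the cost of still answering each hit.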

Thanks for any help you can provide.

-- 
Best regards,
 Brian Curtis



