I work at an edu (for a couple more weeks, at least), and we block hacked
wikis.  Why?  Because we understand that giving people a platform for attack
just isn't good policy.  We need not let the internet be controlled by the
bottom-feeders out of some ridiculous fear of "censorship" or of hindering
"academic freedom".

We still arrest people for inciting riots and making death threats; that
doesn't mean we are infringing on free speech, either.

On Fri, Aug 29, 2008 at 6:53 AM, Dave Ellingsberg <
[EMAIL PROTECTED]> wrote:

> Tell me how this works for a large site that has one piece of malware!  If
> badhost.com hosts every wiki ever written, and badguys.com slipped one SQL
> trick in and set up a redirect, then we should block everything on
> badhost.com?  It does not work that way in an edu domain; somebody will cry
> academic freedom and heads will roll.
>
> Blacklists have never been a solution!  Censorship is just censorship.
>
> foot.
>
> >>>
> From:   "Dan Drinnon" <[EMAIL PROTECTED]>
> To:     "'Chris Lee'" <[EMAIL PROTECTED]>, <
> botnets@whitestar.linuxbox.org>
> Date:   08/29/08 2:03 AM
> Subject:        Re: [botnets] URL formats
>
> Hello Everyone!
>
> First, my apologies for not doing my Lurk Time here -  I only started
> subscribing to this list today.
>
> Unfortunately, I do not have any lists to share, but I do have some
> ideas...
>
> We all know there have long been RBLs for spam sources on the net, and as
> an SMTP admin for a major ISP, I find them invaluable at keeping spam out
> by preventing tcp/25 connections from blacklisted IP addresses.
>
> Well, I am also a DNS admin, and in the past I have had to block queries
> to certain domains (mostly involving child pornography) by court order
> from various states in the U.S.
>
> I have one suggestion on the format of the shared data: it should include
> the domain, the host, and the full URL at a minimum, in CSV format.  For
> example:
>
> badhost.com, www.badhost.com, http://www.badhost.com/badfile.exe
>
> Then, with rsync, this data could be shared in near-real time and handled
> by administrators to suit their needs.  By parsing the data, a DNS
> administrator could write scripts and choose to set badhost.com to
> something like this in their named.conf:
>
> zone "badhost.com" {
>        type master;
>        file "empty.zone";
>        allow-update { none; };
> };
>
> And empty.zone would look something like this:
>
> @ 10800 IN SOA ns1.mydomain.com. root.mydomain.com. (
>               1 3600 1200 604800 10800 )
> @ 10800 IN NS ns1.mydomain.com.
>
> The end result is that a DNS query for any host in the list would return
> nothing.
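A minimal sketch of the parsing step described above, purely for illustration (the feed contents, file names, and script shape are made up, not any real deployment), turning one CSV line into a named.conf stanza like the one shown:

```python
import csv
import io

# Illustrative feed in the proposed CSV layout: domain, host, full URL.
feed = "badhost.com, www.badhost.com, http://www.badhost.com/badfile.exe\n"

# Stanza template matching the named.conf example above; "empty.zone"
# is the catch-all zone file that answers with no records.
ZONE_TEMPLATE = '''zone "%s" {
        type master;
        file "empty.zone";
        allow-update { none; };
};
'''

def stanzas(reader):
    """Yield one zone stanza per unique domain in the feed."""
    seen = set()
    for row in reader:
        domain = row[0].strip().lower()
        if domain and domain not in seen:
            seen.add(domain)
            yield ZONE_TEMPLATE % domain

config = "".join(stanzas(csv.reader(io.StringIO(feed))))
print(config)
```

In practice the output would be written to a file included from named.conf and the daemon reloaded, but that part is site-specific.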
>
> If the data were vast enough, and if enough administrators subscribed to
> and used this 'blacklist', it would have a real effect against end users
> hitting bad URLs.  I have no problem blocking bad domains, just as the
> maintainers of spam RBLs have no problem blocking bad IPs.  It is up to
> the bad host's ISP to take care of the problem in order to become
> unblocked.
>
> As far as http versus hxxp goes, it doesn't matter to me; I would only
> handle this information on the back end.
>
>
>
>
> -----Original Message-----
> From: botnets-[EMAIL PROTECTED]
> [mailto:botnets-[EMAIL PROTECTED] On Behalf Of Chris Lee
> Sent: Thursday, August 28, 2008 5:54 PM
> To: botnets@whitestar.linuxbox.org
> Subject: Re: [botnets] URL formats
>
> hxxp seems to be advantageous for a few reasons:
>  1. you can still cut and paste the url
>  2. the protocol handlers won't load it up if you accidentally click
> on it
>  3. you can add a protocol handler for hxxp for whatever you want
>  4. easier to recognize domains and patterns (than in rot13'd urls)
>  5. already widely accepted in spam fighting groups
>  6. trivial to do and undo with no exception cases
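Point 6 above really is trivial; a sketch in Python (the function names are mine, not part of any standard) that replaces only the leading scheme, so the mapping is one-to-one with no exception cases:

```python
def defang(url):
    """Replace a leading http/https scheme with hxxp/hxxps."""
    if url.startswith("https://"):
        return "hxxps://" + url[len("https://"):]
    if url.startswith("http://"):
        return "hxxp://" + url[len("http://"):]
    return url

def refang(url):
    """Undo defang(); only the leading scheme is ever touched."""
    if url.startswith("hxxps://"):
        return "https://" + url[len("hxxps://"):]
    if url.startswith("hxxp://"):
        return "http://" + url[len("hxxp://"):]
    return url

print(defang("http://www.badhost.com/badfile.exe"))
```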
>
> I figured I'd put down my thoughts to try to help a standard move
> forward.
>
>
> On Aug 28, 2008, at 7:07 PM, silky wrote:
>
> > On Fri, Aug 29, 2008 at 3:32 AM, Chris Burton <[EMAIL PROTECTED]> wrote:
> >> Hi,
> >> I was wondering if it would be more helpful if we could propose a
> >> "standard" for posting broken URLs, with some form of start/end
> >> indicator, to allow easier automated processing of the listings?
> >
> > I was thinking that it would be nice to post them just rot13'd.  Still
> > trivially decoded (I use the leetkey add-in in FF) but not picked up by
> > indexers, etc.  The advantage is that they can still be searched for
> > common patterns.
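The rot13 round trip suggested above is a one-liner in most languages; a Python sketch using the standard library's built-in codec:

```python
import codecs

url = "http://www.badhost.com/badfile.exe"
encoded = codecs.encode(url, "rot13")   # safe to paste in a mail body
decoded = codecs.decode(encoded, "rot13")
assert decoded == url                   # the transform is its own inverse
print(encoded)
```

Since rot13 only maps ASCII letters, the `://`, dots, and slashes pass through unchanged, which is why common patterns remain searchable in encoded form.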
> >
> >
> >> ChrisB.
> >
> >
> > --
> > noon silky
> > http://www.themonkeynet.com/armada/
> > _______________________________________________
> > botnets@, the public's dumping ground for maliciousness
> > All list and server information are public and available to law
> > enforcement upon request.
> > http://www.whitestar.linuxbox.org/mailman/listinfo/botnets
>