begin quoting [EMAIL PROTECTED] as of Thu, Sep 15, 2005 at 05:49:29PM -0700:
> On Wed, Sep 14, 2005 at 06:27:36PM -0700, Mike Marion wrote:
> ...
> > http://www.ranum.com/security/computer_security/editorials/dumb/
> > 
> > Should be required reading for anyone that wants to ...
> > work in any level of IT
> ...
> Most of it is excellent, but I disagree strongly with two of
> his ideas.
 
Heh.

> He claims that security experts should not learn to hack
> into systems, because it's a waste of time to keep up with
> the latest developments and because the secure systems he
> advocates oftentimes can't be cracked in an instructive way.
> This is all true, but he is omitting important ideas.  I

The problem with this is that when you apply that thinking to something
like cryptology, it turns out to be bunk.  Those who design cryptosystems
damn well better know a little about cryptanalysis -- otherwise, they'll
end up with a usable cryptosystem only by accident.

Perhaps you don't need to be an *accomplished* or *gifted* cryptanalyst,
but you should know how to break a cryptosystem so you can make it stronger.
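
To make that concrete, here's a minimal sketch (the cipher and function
names are my own toy illustration, not any real library): even the
simplest frequency analysis breaks a Caesar shift, which is exactly the
attacker's-eye knowledge a designer needs before claiming a scheme is
strong.

```python
from collections import Counter
import string

def caesar_encrypt(plaintext, shift):
    """Toy 'cryptosystem': shift each letter by `shift` positions."""
    out = []
    for ch in plaintext.lower():
        if ch in string.ascii_lowercase:
            out.append(chr((ord(ch) - ord('a') + shift) % 26 + ord('a')))
        else:
            out.append(ch)
    return ''.join(out)

def break_caesar(ciphertext):
    """Recover the shift by assuming the most frequent letter was 'e'.

    Works on English text of any reasonable length -- no key required,
    which is the whole point: the design is weak, and you only see how
    weak it is by trying to break it.
    """
    letters = [c for c in ciphertext if c in string.ascii_lowercase]
    most_common = Counter(letters).most_common(1)[0][0]
    return (ord(most_common) - ord('e')) % 26

msg = "everyone expects the enemy to see every weakness in the defenses"
ct = caesar_encrypt(msg, 7)
print(break_caesar(ct))
```

The attack needs no secret material at all, which is what separates a
toy cipher from a real one.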

> believe that security experts can sometimes gain valuable
> general background by learning to break some security
> measures.  The targets need not be modern or complete systems.

Yes.

> They could be historically
> or theoretically important security measures. They could be
> single layers of modern security systems where a real system
> would have multiple layers of security.  It also introduces
> a valuable element of realism to study real criminals.

Yes.

> None of this implies that you are obligated
> to admire the criminals or defend only against specific
> attacks that have already occurred in the real world.

Admiration doesn't necessarily require approval. :)
 
> I also feel that the idea that you can't obtain security by
> educating users is presented in a particularly bad way.  I

I hold that unless you educate your users, you're _doomed_.  Without
that education, you can't tell whether a user is doing what they mean
to do or just being stupid.

It's just a matter of how much and what sort of education you provide.

> admit that security by educating users is usually a futile
> attempt to prevent mistakes that are human nature.  My

You want to structure the system so that human nature doesn't *cause*
problems.  But to think that average people can use complicated technology
without training is... silly.  Sometimes people are self-trained, mostly
by trial and error -- but that's still training.

What we don't want is users self-training into bad habits, and then
training the users around them in those same habits.

> objection is that the users are portrayed as having no use
> for powerful tools rather than being portrayed as unable
> to reliably recognize or safely use gratuitously dangerous ones.

Some users should not be trusted with any tool that isn't the equivalent
of a smoothly rounded, padded, and lightweight hammer.   Other users
could be trusted with the equivalent of a chainsaw.  The problem with
classifying a huge group of people as "users" is that you can then easily
indicate how every one of 'em is stupid in some way and therefore none
of 'em are to be trusted at all.

With anything.

Ever.

> Sometimes only the person who actually does repetitious work
> knows what needs to be automated.  In such cases they may
> be the right person to program a little bit to solve the
> problem.  I have seen people who were regarded as ordinary
> clerical workers do this successfully.

The computer is wonderful for taking tedious, repetitious tasks and
automating them.  Depriving users of this capability is a terrible thing.
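
A hypothetical sketch of the kind of small automation a clerical worker
might write (the timesheet data and function name are invented for
illustration): totalling hours per person that would otherwise be
tallied by hand.

```python
from collections import defaultdict

def total_hours(rows):
    """Sum hours per person from (name, hours) pairs, e.g. a parsed
    timesheet.  A few lines of code replace a tedious manual tally."""
    totals = defaultdict(float)
    for name, hours in rows:
        totals[name] += hours
    return dict(totals)

timesheet = [("alice", 3.5), ("bob", 2.0), ("alice", 4.0)]
print(total_hours(timesheet))
```

The person doing the tally is often the only one who knows the data
well enough to automate it correctly.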

-Stewart "But don't let users operate under false assumptions" Stremler


-- 
[email protected]
http://www.kernel-panic.org/cgi-bin/mailman/listinfo/kplug-list
