On Fri, May 9, 2008 at 3:42 PM, Gary McGraw <[EMAIL PROTECTED]> wrote:
> Hi andy (and everybody),
>
> Indeed.  I vote for personal computer liberty over guaranteed iron clad 
> security any day.  For amusing and shocking rants on this subject google up 
> some classic Ross Anderson.  Or heck, I'll do it for you:
> http://www.cl.cam.ac.uk/~rja14/tcpa-faq.html

I've heard this point for years, and yet when we actually look at ways
of solving the consistent problems of software security, we always
come back to tamper-proof/restricted-rights as a pretty reasonable
starting point.

I don't know whether this mailing list is really the place for me to
advocate about this, but every time we get into a situation where we
talk about high reliability (electronic voting, for example), people
are up in arms that we haven't followed strict practices to make sure
the machines don't get hacked and aren't hackable even by experts:
hardened hardware, trusted computing bases, etc.

But if you want to try to apply the same engineering principles to
protecting an individual's assets, such as their home computer or bank
account credentials, then you're trampling on their freedom.

I don't really see how we can viably have both.  Sure, we're looking at
all sorts of things like sandboxing and whatnot, but given
multi-purpose computing and the conflicting goals of absolute freedom
and defense against highly motivated attackers, we're going to have to
make some choices, aren't we?

I don't disagree that all of these technologies can be misused.  Most
can.  We've all read the Risks columns for years about ways to screw
things up.

At the same time individual computers don't exist in isolation.  They
are generally part of an ecosystem (the internet) and as such your
polluting car causes my acid rain and lung cancer.  Strict liability
isn't the right solution to this sort of public policy problem,
regulation is.  That regulation and control can take many forms, some
good, some bad.

I don't see the problem getting fixed though without some substantial
reworking of the ecosystem.  Some degree of freedom may well be a
casualty.

Please don't think I'm actually supporting the general decrease in
liberty overall.  At the same time I'm pretty sure that traffic laws
are a good idea, speed limits are a good idea, even though they
restrict individual freedoms.  In the computing space I'm OK with
allowing people to opt out, but only if in doing so they don't pose a
manifest danger to others.  Balancing freedom against restriction
isn't easy, of course, and I'm not suggesting it is.  I'm merely
suggesting that none of the research we've ever done in the area
points to our current model (relying on users to make choices about
what software to use) being promising.

How to make this happen without it turning into a debacle is of course
the tricky part.

-- 
Andy Steingruebl
[EMAIL PROTECTED]
_______________________________________________
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
as a free, non-commercial service to the software security community.
_______________________________________________

Reply via email to