[SC-L] No general-purpose computer, or everything under surveillance?

2008-05-13 Thread David A. Wheeler
Dan Geer said:
The general-purpose computer must die or we must put everything under 
surveillance. Either option is ugly, but 'all of the above' would be 
lights-out for people like me, people like you, people like us. We're 
playing for keeps now. 
http://www.acmqueue.org/modules.php?name=Content&pa=showpage&pid=436

I completely disagree with the way that people will likely interpret 
this quote.  We do NOT need to throw away our general-purpose computers, 
nor do we need to submit to Orwellian total population surveillance (by 
governments or by corporations).

What particularly worries me is that some large companies would benefit 
from approaches that eliminated competition in the name of security: 
"You have to standardize on product X, and lock things down so that no 
nasty alternative products are executed!"  Yet that is a primary part 
of the problem.  In our current world, many people believe they CANNOT 
pick a more secure product, because it's not compatible with what 
everyone else is using.  At least in some cases, people WILL pick a 
product because it has better security (see the rise of Firefox, and how 
it finally caused Microsoft to wake up and start fixing Internet 
Explorer)... but look how hard it has been for a freely-available 
program, implementing mostly-documented standards, to compete.

If you interpret the terms "general purpose" and "surveillance" 
differently, i.e., limit applications to least privilege and locally 
monitor their behavior, then I'd agree.  But 
this is another way of saying we need to implement least privilege and 
local monitoring, which are well-established security principles.  And 
it's already happening, e.g.:
* Development is already moving away from general-purpose tools.  Most 
desktop and server software development should NOT be done in C or C++; 
they're too low-level and provide inadequate protection against 
mistakes.  Instead, they should voluntarily use languages that aren't 
QUITE as general-purpose, because they automatically prevent many 
mistakes from turning into security problems (e.g., through automatic 
memory management).  People are already moving towards such languages; 
we need to build more assurance into them, but the opportunity is there.
* Deployment is already moving away from general-purpose privileges. 
SELinux lets people define very fine-grained privileges, so that a 
program does NOT have arbitrary rights.  OLPC goes even further; its 
security model is remarkable and worth learning from.
* Observing behavior (and making decisions based on it) is ALREADY 
what some systems and networks do.
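
To illustrate the memory-management point in the first bullet (a
hypothetical sketch, not part of the original post): in C, writing past
the end of a buffer silently corrupts adjacent memory, while a language
with automatic bounds checking turns the same mistake into a contained,
detectable error.

```python
# Sketch: automatic bounds checking turns a coding mistake into a
# contained error instead of silent memory corruption.
# (Illustrative only; names here are invented for the example.)

def copy_into(buffer, data):
    """Copy data into a fixed-size buffer, as naive C code might."""
    for i, byte in enumerate(data):
        buffer[i] = byte  # raises IndexError past the end, rather than
                          # overwriting adjacent memory as C would
    return buffer

buf = [0] * 4
assert copy_into(buf, [1, 2, 3]) == [1, 2, 3, 0]   # in-bounds copy is fine

try:
    copy_into([0] * 4, [1, 2, 3, 4, 5])  # five values into four slots
    caught = False
except IndexError:
    caught = True  # the overflow is caught, not silently exploitable
assert caught
```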
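The fine-grained-privilege idea in the second bullet can be sketched in
miniature (a hypothetical illustration; real SELinux type-enforcement
policy is far richer): a confined program receives only the specific
rights its policy grants, and everything else is denied by default.

```python
# Sketch of fine-grained privileges in the spirit of SELinux-style
# policy: deny by default, allow only what the policy names.
# (All names here are invented for the example.)

class PolicyError(PermissionError):
    pass

class ConfinedProgram:
    def __init__(self, name, allowed):
        self.name = name
        self.allowed = frozenset(allowed)  # e.g. {"read:/var/www", "bind:80"}

    def request(self, action):
        if action not in self.allowed:
            # default-deny: anything not explicitly granted is refused
            raise PolicyError(f"{self.name}: '{action}' denied by policy")
        return f"{self.name}: '{action}' permitted"

webserver = ConfinedProgram("httpd", {"read:/var/www", "bind:80"})
assert webserver.request("bind:80").endswith("permitted")

try:
    webserver.request("read:/etc/shadow")  # not in its policy
    denied = False
except PolicyError:
    denied = True
assert denied
```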
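The behavior-observation point in the third bullet can likewise be
sketched (a hypothetical example): a local monitor compares what a
program actually does against a declared or learned baseline and flags
any deviation for a decision.

```python
# Sketch of local behavior monitoring: flag actions that fall outside
# a program's declared or observed baseline. (Hypothetical example.)

def monitor(baseline, observed_actions):
    """Return the observed actions that deviate from the baseline."""
    return [a for a in observed_actions if a not in baseline]

baseline = {"open:config", "connect:update-server"}
anomalies = monitor(baseline, ["open:config", "connect:bank-site"])
assert anomalies == ["connect:bank-site"]  # only the deviation is flagged
```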

But the difference is who is in final control.  In the end, the users of 
computers should be in final control, not their makers, or we have given 
up essential liberty.  We can develop systems which provide suites of 
more specialized privileges to particular functions, without giving up 
essential liberty.  We have a long way to go in actually DOING this, but 
the opportunity is there.

I do not think we need to give up our liberty just to obtain some 
security. Benjamin Franklin already explained what happens to such people.

--- David A. Wheeler



___
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
as a free, non-commercial service to the software security community.
___


Re: [SC-L] No general-purpose computer, or everything under surveillance?

2008-05-13 Thread Andy Steingruebl
On Tue, May 13, 2008 at 1:51 PM, David A. Wheeler [EMAIL PROTECTED] wrote:

  If you interpret the definition of these terms of general purpose and
  surveillance differently, i.e., limit applications to least
  privilege, and locally monitor their behavior, then I'd agree.  But
  this is another way of saying we need to implement least privilege and
  local monitoring, which are well-established security principles.  And
  it's already happening, e.g.:

That's fine in principle.  Have we ever seen a usable system based on
these principles that users didn't reject/hate?  Look at the general
press and people's perceptions of the security of Leopard vs. Vista.  We
can complain all we want about UAC and perhaps the "constant
nagging," but as Apple's commercials so clearly pointed out, people
hate their computer explicitly and publicly trying to keep them safe.

  * Deployment is already moving away from general-purpose privileges.
  SELinux lets people define very fine-grained privileges, so that a
  program does NOT have arbitrary rights.  OLPC goes even further; its
  security model is remarkable and worth learning from.

That's great, but all of these schemes rely on:

 - Expert users to configure a policy for new software
 - Each piece of software to ship with a correct least-privilege
configuration (how do we get the malware authors to do this?)
 - A user who doesn't choose to override the default security settings
so they can see the dancing hamsters

  * Observing behavior (and making decisions based on them) is ALREADY
  what some systems and network systems do.

Same here.  We're still light years away from being able to do this in
practice.  We can't tell that the new financial management software
you just downloaded is supposed to ask for your bank password, and
that the game you just downloaded shouldn't.  And users aren't
generally informed enough to make these kinds of decisions either,
especially given the user interfaces we typically give them.

Don't forget all of the wonderful fun we've had over the years getting
people not to open executables sent via email, not to visit sites with
a self-signed SSL certificate, to check for the lock icon in their
browser, to make sure that their wireless settings don't allow them to
connect to random wireless access points, etc.

  But the difference is who is in final control.  In the end, the users of
  computers should be in final control, not their makers, or we have given
  up essential liberty.

I don't think you're fundamentally wrong, in that I'm not (and I can't
speak for others) in favor of removing the controls completely.  But
we ought to be shipping systems whose fundamental defaults are easier
to use, more secure, and really hard to override.  Compare IE6/FF2 to
IE7/FF3 on this front.  Sure, you can still visit the site with the
self-signed certificate, and you can still visit a site that they've
categorized as a phishing site.  But it isn't quite as easy as it used
to be, and I'd say that's a good thing.

If you own a table saw, it comes with a blade guard.  It's probably a
good idea that it does.  If you really want to, you can remove it, and I
don't really feel the need to stop you.  Unless I'm paying for your
insurance, that is.  Your car also comes with pollution controls.
These pollution controls often inhibit your max speed, acceleration,
etc.  They are really hard, or impossible, to disable.  They also
make our environment cleaner.

Which is the right analogy for the personal computer?

-- 
Andy Steingruebl
[EMAIL PROTECTED]


Re: [SC-L] No general-purpose computer, or everything under surveillance?

2008-05-13 Thread Gunnar Peterson
  But the difference is who is in final control.  In the end, the users of
  computers should be in final control, not their makers, or we have given
  up essential liberty.  We can develop systems which provide suites of
  more specialized privileges to particular functions, without giving up
  essential liberty.  We have a long way to go in actually DOING this, but
  the opportunity is there.
 


I believe the point of Dan Geer's paper is not that these are desired 
outcomes so much as realistic outcomes. If you cannot provide effective 
security (and we're not) and people are relying more and more on 
computers for real world things (and they are), then someone else (who 
is not a geek) is going to come in and more or less arbitrarily assign 
risk and responsibility to parties. For example (quoting Dan Geer's paper):

We've done this before—Regulation Z of the Truth in Lending Act of 1968 
says that the most a consumer can lose from misuse of a credit card is 
$50. The consumer can be an idiot, but can't lose more than $50. 
Consumers are, in fact, not encouraged to self-protect by such a 
limit—quite the opposite (and $50 in 1968 would be $275 today). No, if 
there is to be a preemption, the intelligence it requires will be based 
on a duty of surveillance that is assigned to various “deep pockets.” 
The countermeasures, in other words, are not risk-sensitive to where the 
risk naturally lies but risk-sensitive to where it is assigned. Look out 
side effects, here we come.

Something like Regulation Z may not come to pass in information 
security, but if I were a betting man, I think it's a more likely outcome 
in the real world than a combination of the principle of least privilege + 
perfect code + 4 billion highly trained users; none of which I have seen.

-gp