Re: [SC-L] How do we improve s/w developer awareness?

2004-12-03 Thread David A. Wheeler
der Mouse said:
>>Changing liability laws on the other hand is a simple solution.
> 
> But at what price?  It would kill off open source completely, as far as
> I can see, in the jurisdiction(s) in question.  (How many open source
> projects could afford to defend a liability suit even if they (a)
> wanted to and (b) had a won case?)
> 
> Of course, if you don't mind that, go right ahead.  You don't say where
> you are, but looking over your message I see reason to think it's the
> USA, and I long ago wrote off the USA as a place to write code.  I
> think it could be a very good thing for the USA to try such laws; it
> would give us hard data about what their effect is, rather than the
> speculation (however well-informed) that's all we have to go on now -
> and it quite likely would have the pleasant side effect of pushing most
> open source projects out into the free (or at least freer) world.

One solution would be to exempt open source software from such
liability, under the theory that:
* If anyone can read & analyze the source code, the real security
   of the product is much easier for customers to determine
   (directly or by hiring someone).  Part of the problem now is that
   vendors always say they're wonderful... and if it's closed source,
   it's too difficult to determine if that's true.
   It's usually pretty easy to determine if an OSS program is
   poorly designed or well designed for security.
* If anyone can change the program & redistribute those changes,
   then anyone can fix things THEY think are problems immediately,
   instead of waiting for a vendor (who may not bother or take
   years).  After all, the customer has the urgency, not the vendor.
* Requiring liability for open source software would kill a great deal
   of innovation, since a common model for distributing ideas &
   promulgating standards is to distribute an
   open source software implementation.  The Internet would probably
   never have gotten far, except that the BSD TCP/IP stack could be
   freely copied into arbitrary places (as code or as concept).
   The same for DNS, web servers, etc.
* If it's NOT open source, then the vendor is probably charging a
   per-use or per-year fee, and thus can afford insurance, lawsuits, etc.
   This often isn't true for OSS; a project could separately charge
   for liability insurance, but because that's optional, the insured
   pool becomes much smaller than "all users".

There's even a precedent for this: U.S. export crypto laws essentially
give a free pass (in most cases) to open source software.
And in general, for important things the U.S. often requires either
disclosure or liability; this approach gives consumers a choice.

I'm not certain that liability laws for vendors are the way to go.
Generally such laws just make people buy insurance for the
lawsuits; if nothing else changes, that just means that the
lawyers get rich and nothing useful happens.  Sometimes, the
insurance companies impose requirements to get that insurance.
But those requirements are to reduce the insurance company risk,
not to improve the innovation or capability of products, or
even the security of the products as viewed by an end-user.
It'd be easy to kill the baby on this road.

If you go down this path,
you'd probably be better off creating a nice, narrow list
of common security flaws, and then saying that a vendor can be
sued only for flaws on THAT list.  That way, rather than facing
something vague, vendors can at least eliminate a finite set of
problems that's short and manageable.  Not perfect, but a start.

Another approach is to permit suits against people who CHOSE
the product.  This has some
advantages, but it's not perfect either.  The problem here is that
popular products generally get a free pass ("everyone else
chose this shoddy product!"), which means that this can
_disincentivize_ vendors of popular products from fixing their
wares, and it can disincentivize competition ("no one would
be willing to risk using my new product because they might get sued").

Sigh.  Nothing is simple!

Anyway, just a few thoughts.

--- David A. Wheeler




[SC-L] Former cybersecurity czar: Code-checking tools needed

2004-12-03 Thread Jose Nazario
FYI ...


jose nazario, ph.d. [EMAIL PROTECTED]
http://monkey.org/~jose/
http://infosecdaily.net/


http://www.computerworld.com/securitytopics/security/story/0,10801,97988,00.html

By Grant Gross
DECEMBER 02, 2004
IDG NEWS SERVICE

WASHINGTON -- Software vendors need automated tools that look for bugs
in their code, but it may be a decade before many of those tools are
mature and widely used, said the former director of cybersecurity for
the U.S. Department of Homeland Security.

Creating software assurance tools was one long-term focus of the DHS
National Cybersecurity Division during Amit Yoran's tenure there,
Yoran said today during the E-Gov Institute Homeland Security and
Information Assurance Conferences in Washington.

About 95% of software bugs come from 19 "common, well-understood"
programming mistakes, Yoran said, and his division pushed for
automation tools that comb software code for those mistakes.

"Today's developers ... oftentimes don't have the academic discipline
of software engineering and software development and training around
what characteristics would create flaws in the program or lead to
bugs," Yoran said.

Government research into some such tools is in its infancy, however,
he added. "This cycle will take years if not decades to complete," he
said. "We're realistically a decade or longer away from the fruits of
these efforts in software assurance."
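
To make that concrete: in its very simplest form, a tool that "combs
software code" for common mistakes just scans the source text for a fixed
list of well-known risky constructs.  The article does not enumerate the 19
mistakes Yoran refers to, so the handful of classic C pitfalls below is
purely illustrative, and the following Python sketch is a toy, not one of
the tools being described (real analyzers parse the code and track data
flow rather than matching text):

    import re
    import sys

    # A few classic, widely documented risky C calls.  Illustrative only;
    # NOT the actual list of 19 mistakes referred to in the article.
    RISKY_CALLS = {
        "gets":    "reads unbounded input into a buffer (overflow)",
        "strcpy":  "copies with no length check on the destination",
        "sprintf": "formats with no length check on the destination",
        "system":  "runs a shell command; dangerous with untrusted input",
    }

    PATTERN = re.compile(r"\b(" + "|".join(RISKY_CALLS) + r")\s*\(")

    def scan(path):
        """Warn about each line that appears to call a risky function."""
        with open(path, errors="replace") as src:
            for lineno, line in enumerate(src, start=1):
                for match in PATTERN.finditer(line):
                    name = match.group(1)
                    print("%s:%d: %s() - %s"
                          % (path, lineno, name, RISKY_CALLS[name]))

    if __name__ == "__main__":
        # Usage: python scan.py file1.c file2.c ...
        for filename in sys.argv[1:]:
            scan(filename)

A pattern matcher like this is cheap but shallow; the parsing, type and
data-flow analysis that make such warnings trustworthy are what the article
expects to take a decade or more to mature.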

Yoran, who resigned from his DHS position in September after being on
the job for a year, hinted at why he left, but sidestepped a question
about the reasons. In the private sector, he had a "real objective" on
how to move forward, he said.

"When you move into a strategic and somewhat ill-defined role of
'protect cyberspace,' that's a very difficult mission to get your arms
around," he said. "You show up to work on a Monday morning, you're
ready to put your fingers to the keyboard, you've got a team of folks
working with you, what do you do ... to secure cyberspace from within
the Department of Homeland Security?"

Most Internet resources are owned by the private sector, and the U.S.
government has been hesitant to pass cybersecurity mandates, noted
Yoran, former vice president of worldwide managed security services at
Symantec Corp. With no operational or regulatory control over most of
the Internet, the goal of securing cyberspace at DHS was difficult, he
said.

Asked if that lack of authority was a reason for leaving the post,
Yoran said his successor will need to "look at go-forward issues" in
cybersecurity that the division can best address.

Yoran, however, defended President George W. Bush's National Strategy
to Secure Cyberspace, released in February 2003. The strategy, which
sets out five major cybersecurity recommendations, did not advocate
regulation, and the White House took the right approach in developing
those recommendations by consulting with private industry, Yoran said.

"As the Department of Homeland Security ... implementing the national
strategy is not our job; it's not our responsibility," he said. "It's
the nation's job, it's the international technology community's job
and responsibility. We can just help."

The national strategy and efforts at DHS can help move cybersecurity
efforts beyond the current "cat and mouse game" of finding
vulnerabilities, assessing whether to patch them, and patching them
when the problems become painful to companies, Yoran said. He
predicted a "radical transformation" in the cybersecurity field within
two to four years as more companies and government agencies accept
technologies such as Web services, remote Internet access and RFID
(radio frequency identification) tags.

"In the next two to three years, you won't be able to define where
your network begins and ends," Yoran said. "The paradigms we rely on
today for protecting our information -- stronger firewalls, more
accurate intrusion detection -- those types of technologies will be
required, but they will be solving an increasingly small percentage of
the challenges that are going to be facing us."



RE: [SC-L] How do we improve s/w developer awareness?

2004-12-03 Thread Peter Amey

[snip]
> 
> Remember that little incident in 2000 when the London millennium bridge
> was closed immediately after opening due to excessive wobbling when
> people walked across it?  I can't guarantee that my recollection is
> accurate, but I'm sure they were trying to put this down to that
> software classic, a 'Design feature'.

The Millennium Bridge wobble is indeed instructive.  Engineering is usually
a conservative profession that places great emphasis on codifying and
learning from past mistakes.  Much bridge design work uses
well-established, trustworthy principles.  The Millennium Bridge designers
deliberately pushed the boundaries to produce something novel and exciting.
Never before had a suspension bridge had the suspension and decking in the
same plane (i.e. the deck doesn't "hang" from the suspension, it's balanced
on/between the suspension).  The result was strong enough but had
unexpected dynamics, i.e. it wobbled!

I am confident that this experience is already in the textbooks, standard
data tables and CAD tools.  The engineering body of knowledge has been
added to and the problem should not recur.

This is where the software community can learn:

1.  We are appalling at learning from previous mistakes (other than in 
perfecting our ability to repeat them!)
2.  We routinely push the boundaries of what we try and achieve by leaping 
instead of stepping.
3.  We routinely adopt novel and untried technology in preference to proven 
and trustworthy alternatives.  Indeed, mature technology often seems to be 
rejected precisely because it is not new, novel or exciting enough.

The Millennium Bridge made news precisely because such engineering failures
are rare; software engineering failures make the news because they are so
common that the papers would be empty if they weren't reported!

[snip]

Peter

