At 02:19 AM 8/3/99, Peter Gutmann wrote:

>[1] There isn't any rule of thumb for the work involved in attaining the higher
>    assurance levels because it's done so rarely, although in terms of cost and
>    time I've seen an estimate of $40M for an A1 Multics (it never eventuated)
>    and DEC's A1 security kernel took nearly a decade to do, with 30-40 people
>    working on it at the end (just before it was cancelled).  A lot of this
>    overhead was due to the fact that this hadn't been done much and there was
>    a lot of research work involved, an estimate I've had for doing a
>    commercial-product A1 system now would be about 3-5 years (probably closer
>    to 5), ramping up from an initial 10 to 30 people at the end, and costing
>    maybe $15-20M.

ObCrypto: we all face the problem of judging whether or not a particular
implementation meets particular security objectives. Evaluation techniques
like formal assurance provide a candidate set of tools, so this is somewhat
worth examining. There is a particular bias towards formal methods in
several communities.

I'm currently putting together a paper that outlines in detail the labor
costs of the LOCK program, a government-funded project to build a
Unix-compatible A1 system. It was started in the late 80s, about when the
VMS program died, and sucked up between $20M and $30M before a descendant
was put into operation as the Standard Mail Guard. It was never formally
evaluated at A1 or anything else, though large chunks of assurance
evidence were reviewed by government representatives.

The A1 formal assurance stuff added a 58% premium to the development of
LOCK TCB code. That premium focused almost entirely on the effectiveness
of the multilevel security (MLS) mechanisms. MLS has not proved useful
enough to find its way into many applications, military or non-military.
Unfortunately, the processes developed for A1 assurance are extremely
difficult to adapt to non-MLS applications.
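
To make that concrete: the formal models behind A1-style assurance are
built around mandatory label checks of roughly the following shape. This
is a minimal sketch of the classic Bell-LaPadula rules, not the actual
LOCK model, and the level names and functions are purely illustrative:

    # Minimal sketch of Bell-LaPadula-style MLS checks -- illustrative
    # only, not the LOCK formal model.
    LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2,
              "TOP SECRET": 3}

    def can_read(subject_level, object_level):
        # Simple security property: no read up.
        return LEVELS[subject_level] >= LEVELS[object_level]

    def can_write(subject_level, object_level):
        # *-property: no write down.
        return LEVELS[subject_level] <= LEVELS[object_level]

    # A SECRET process may read CONFIDENTIAL data but not write to it.
    assert can_read("SECRET", "CONFIDENTIAL")
    assert not can_write("SECRET", "CONFIDENTIAL")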

In other words, developers of a non-MLS mechanism need to do R&D into how
to formally model and specify their security requirements. But there is
nobody out there with the clout to review such a newly created model and
judge its fidelity to reality. The Orange Book/TCSEC/NCSC approach provided
somewhat canned answers to basic things and defined the review process for
more complex things, but outside of that framework you have nowhere to go.
At present there's no way to evaluate anything past EAL4 except perhaps by
going to NSA, which will probably make demands that no commercial product
can justify.

ObCrypto: I've seen occasional mentions of strategies for verifying
cryptographic protocols, but I've never seen anything really practical
published about it other than Gus Simmons' paper in CACM. Unfortunately,
his conclusion was that we still need to develop such techniques, which
should perhaps be characterized as "formalized paranoia." The only things
I've seen in practice are lists of rules like the ones I published in
"Internet Cryptography" as "Security Requirements" for various techniques,
products, and sites. NSA's "Functional Security Requirements
Specifications" for crypto devices take much the same approach, and rely
on point-by-point explanations of how a given thing complies with each
requirement.
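
A hedged sketch of what that point-by-point style looks like in practice;
the requirements and compliance notes below are hypothetical examples,
not the actual lists from "Internet Cryptography" or from NSA's documents:

    # Sketch of point-by-point compliance: each requirement is paired
    # with an explanation of how (or whether) the product meets it.
    # Requirements and notes are hypothetical examples.
    checklist = [
        ("Keys come from an adequately seeded random source",
         "Uses the platform RNG; seeding procedure documented"),
        ("Traffic keys never reach persistent storage in the clear",
         "Keys held only in locked memory pages; no swap exposure"),
        ("Integrity checks are keyed separately from encryption",
         None),  # not yet addressed
    ]

    for requirement, compliance in checklist:
        status = "OK " if compliance else "GAP"
        print("[%s] %s" % (status, requirement))
        if compliance:
            print("      " + compliance)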


Rick.
[EMAIL PROTECTED]
"Internet Cryptography" at http://www.visi.com/crypto/
