Jerrold Leichter <[EMAIL PROTECTED]> writes:

>There was also an effort in England that produced a verified chip.  Quite
>impressive, actually - but I don't know if anyone actually wanted the chip
>they (designed and) verified.

The Viper.  Because it needed to be formally verifiable, its designers had to
leave out most of the things that people are used to in modern CPUs and that
make writing an OS easy, leading to a vaguely early-60s level of CPU
architecture that would probably have been unpleasant to program for anyone
used to modern CPUs, and that required expensive custom development of almost
everything from scratch (you can't run Linux on that one).  Eventually the
project went into a meltdown over what had actually been accomplished (for
example, is verifying a set of 4-bit slices the same as verifying a 32-bit
CPU?), and the ensuing legal battles led to the demise of the company that was
to exploit it commercially (there's a lot more to it than that, including a
fair bit of politics; that's a cut-down version to save space).

>Very few real efforts were made to actually produce a "provably correct" OS.

There were actually quite a few efforts, starting in the 1970s, some of which
went on much longer than the 9-year VAX VMM effort.  The PSOS -> SAT -> LOCK ->
SMG lineage (it may be called something else again now) has been going for
about 25 years.  However, this is a really complex topic (way too much to
cover here), so I'll cheat a bit and refer anyone who's really that interested
in the problems people ran into to Chapter 4 of "Cryptographic Security
Architecture Design and Verification", to save me having to paraphrase 40
pages of text here.

The point of my post wasn't to start yet another round of formal-methods
bashing, but to point to an example of measuring what we know how to measure
even if there are strong indicators that this isn't the best way to do it.


The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]