Re: [Cryptography] The Case for Formal Verification

2013-09-24 Thread Derek Jones
Tim,

 With all due respect, most of the points you make are ridiculous.

Could you please explain why you think they are ridiculous?

 For example, you point out that the certified C compiler will not
 make any guarantees about code that relies on undefined behavior.
 Well, of course! Being certified doesn't magically fix your specification!
 Certified just says the implementation matches the specification!

Where did the word 'certified' come from?  I have not said anything
about magically fixing a specification.
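
To be concrete about the undefined behaviour point (a minimal sketch,
not taken from any CompCert documentation): signed integer overflow has
no defined meaning in C, so no compiler correctness theorem constrains
what the compiled program does with it.

#include <limits.h>
#include <stdio.h>

/* Sketch only: overflowing a signed int is undefined behaviour, so a
   verified compiler's correctness theorem, which covers only programs
   with defined source semantics, promises nothing about this one. */
int main(void)
{
    int x = INT_MAX;
    printf("%d\n", x + 1);  /* undefined: any observable result */
    return 0;
}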

The CompCert people are claiming a formally verified compiler, and I
think this claim is, to say the least, very misleading.

 And to suggest that such a project is misguided because it places
 blind trust in coq (and because it is written in ocaml?!) shows a

I did not say that this project was misguided.

 misunderstanding of the proof tools. There is a very small core of
 trust that you need to have faith in and it is backed by solid theory

Yes, the axioms.  These need to be small in number and 'obviously'
correct; something that cannot be said of the source code of coq.
The nearest example I can think of is the Lisp subset written in
itself in under 100 lines (my memory is vague on this point).

 and is a much more solid foundation for building on than just about
 any other software we have in computer science. I don't see how in
 any way you can compare the f2c translator to this effort.

Why not?  What work has been done on the coq+other tools that has not
been done on f2c?

I describe work that was done to try to show the correctness of
a Model Implementation for C here:
http://shape-of-code.coding-guidelines.com/2011/05/02/estimating-the-quality-of-a-compiler-implemented-in-mathematics/

[My original post to this list bounced, so I am reposting it here now;
it came before the message it is replying to.]


Re: [Cryptography] The Case for Formal Verification

2013-09-22 Thread Tim Newsham
 The CompCert people are claiming a formally verified compiler, and I
 think this claim is, to say the least, very misleading.

I don't find it misleading at all, and I think perhaps the
confusion lies in your notion of what formally verified means.

 And to suggest that such a project is misguided because it places
 blind trust in coq (and because it is written in ocaml?!) shows a
 misunderstanding of the proof tools. There is a very small core of
 trust that you need to have faith in and it is backed by solid theory

 Yes, the axioms.  These need to be small in number and 'obviously'
 correct; something that cannot be said of the source code of coq.
 The nearest example I can think of is the Lisp subset written in
 itself in under 100 lines (my memory is vague on this point).

What I am saying is that you do not need to trust the coq code.
There is actually very little of the coq tool that you do need to
trust: the part of coq that performs type checking.  The type
checker is what verifies proofs.  It is a small part of coq; most
of coq is not trusted and generates terms that still must pass the
type checker.  Neither do you need to put a lot of faith in the
ocaml compiler or the cpu, or spend a lot of time worrying about
cosmic rays flipping bits during computation.  Yes, some flaw
could potentially lead to an incorrect computation on a certain cpu
or a certain memory bit in a certain machine during a certain
computation.  But the coq type checker has been compiled on many
machines and with many versions of the ocaml library, and the odds
of a random error behaving in a way that just happens to validate
an invalid proof term are minuscule (as in, I would be more worried
about you spontaneously tunneling into my living room due to
quantum effects to resume the rest of this conversation).

There is a small part of coq that you DO have to trust.  If this part
is incorrect, it could invalidate all of the proofs.  However, this part
of coq is backed by strong theory and has had many eyes on it.
And while it is possible that an error here could lead to a bad proof,
it is by no means assured.
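
To make the architecture concrete (a toy sketch, not Coq's actual
calculus: a tiny implicational logic with invented names), the whole
shape of such a kernel fits in one routine.  Anything, however buggy,
may construct proof terms; a term only counts as a proof of a formula
if the small infer() below assigns it that formula.

#include <stdio.h>
#include <stdlib.h>

/* Formulas of a toy logic: propositional variables and implications. */
typedef struct Form {
    enum { FVAR, FIMP } tag;
    char name;                  /* FVAR: which variable */
    struct Form *lhs, *rhs;     /* FIMP: lhs -> rhs */
} Form;

/* Proof terms: hypothesis references (de Bruijn indices), implication
   introduction (TLAM) and implication elimination (TAPP). */
typedef struct Term {
    enum { THYP, TLAM, TAPP } tag;
    int idx;                    /* THYP: distance to the hypothesis */
    Form *ann;                  /* TLAM: formula of the new hypothesis */
    struct Term *a, *b;         /* TLAM: body in a; TAPP: a applied to b */
} Term;

typedef struct Ctx { Form *f; struct Ctx *up; } Ctx;

static int form_eq(const Form *x, const Form *y)
{
    if (x->tag != y->tag) return 0;
    if (x->tag == FVAR) return x->name == y->name;
    return form_eq(x->lhs, y->lhs) && form_eq(x->rhs, y->rhs);
}

/* The entire trusted kernel: the formula proved by t, or NULL. */
static Form *infer(Ctx *ctx, Term *t)
{
    switch (t->tag) {
    case THYP:
        for (int i = t->idx; ctx; ctx = ctx->up, i--)
            if (i == 0) return ctx->f;
        return NULL;
    case TLAM: {
        Ctx ext = { t->ann, ctx };
        Form *body = infer(&ext, t->a);
        if (body == NULL) return NULL;
        Form *imp = malloc(sizeof *imp);
        imp->tag = FIMP; imp->lhs = t->ann; imp->rhs = body;
        return imp;
    }
    case TAPP: {
        Form *f = infer(ctx, t->a), *x = infer(ctx, t->b);
        if (!f || !x || f->tag != FIMP || !form_eq(f->lhs, x)) return NULL;
        return f->rhs;
    }
    }
    return NULL;
}

int main(void)
{
    Form a = { FVAR, 'A', NULL, NULL };
    Term hyp = { THYP, 0, NULL, NULL, NULL };
    Term id = { TLAM, 0, &a, &hyp, NULL };   /* \x:A. x  proves  A -> A */
    Form *got = infer(NULL, &id);
    if (got && got->tag == FIMP &&
        form_eq(got->lhs, &a) && form_eq(got->rhs, &a))
        puts("proof of A -> A accepted");
    else
        puts("proof rejected");
    return 0;
}

Everything outside infer() -- parsers, tactics, proof search -- can be
wrong without an invalid proof being accepted; at worst a valid one is
rejected.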


-- 
Tim Newsham | www.thenewsh.com/~newsham | @newshtwit | thenewsh.blogspot.com


Re: [Cryptography] The Case for Formal Verification

2013-09-19 Thread Derek Jones
Perry E. Metzger perry at piermont.com writes:

 CompCert is a fine counterexample, a formally verified C compiler:
 http://compcert.inria.fr/
 It works by having a formal spec for C, and a formal spec for the
 machine language output. The theorem they prove is that the

The claim of CompCert being a formally verified C compiler is wildly overstated:
http://shape-of-code.coding-guidelines.com/2013/03/10/verified-compilers-and-soap-powder-advertising/

It is a good start, but they still have lots of work to do.


Re: [Cryptography] The Case for Formal Verification

2013-08-30 Thread Jerry Leichter
On Aug 29, 2013, at 7:00 PM, Phillip Hallam-Baker wrote:
 ...The code synthesis scheme I developed was an attempt to address the 
 scaling problem from the other end. The idea being that to build a large 
 system you create a very specific programming language that is targeted at 
 precisely that class of problems. Then you write a back end that converts the 
 specification into code for that very restricted domain. If you want a formal 
 proof you have the synthesizer generate it from the specification as well. 
 That approach finesses the problem of having to validate the synthesizer 
 (which would likely take category theory) because only the final target code 
 need be validated
Many years back, I did some work with Naftaly Minsky at Rutgers on his idea of 
law-governed systems.  The idea was that you have an operational system that 
is wrapped within a law enforcement system, such that all operations relevant 
to the law have to go through the enforcer.  Then you write the law to 
specify certain global properties that the implementation must always exhibit, 
and leave the implementation to do what it likes, knowing that the enforcer 
will force it to remain within the law.
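
The shape of that arrangement is simple to sketch (invented names, and
C rather than the Prolog the real system used; illustration only):

#include <stdio.h>
#include <string.h>

/* An action some component proposes to perform. */
typedef struct { const char *actor; const char *op; } Action;

/* The "law": a global predicate over proposed actions.  Here, one
   toy rule: the actor "untrusted" may never perform "delete". */
static int law_permits(const Action *a)
{
    return !(strcmp(a->actor, "untrusted") == 0 &&
             strcmp(a->op, "delete") == 0);
}

/* The enforcer: the only path by which actions reach the system, so
   the implementation behind it may do whatever it likes. */
static void enforce(const Action *a, void (*perform)(const Action *))
{
    if (law_permits(a))
        perform(a);
    else
        printf("refused: %s may not %s\n", a->actor, a->op);
}

static void do_action(const Action *a)
{
    printf("%s performs %s\n", a->actor, a->op);
}

int main(void)
{
    Action read_ok = { "untrusted", "read" };
    Action del_bad = { "untrusted", "delete" };
    enforce(&read_ok, do_action);
    enforce(&del_bad, do_action);
    return 0;
}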

You can look at this in various ways in modern terms: as a generalized 
security kernel, or as the reification of the attack surface of the system.

Minsky's interests were more on the software engineering side, and he and a 
couple of grad students eventually put together a law-governed software 
development environment, which could control such things as how modules were 
allowed to be coupled.  (The work we did together was on an attempt to add a 
notion of obligations to the law, so that you could not just forbid certain 
actions, but also require others - e.g., if you receive message M, you must 
within t seconds send a response; otherwise the law enforcer will send one for 
you.  I'm not sure where that went after I left Rutgers.)

While we thought this kind of thing would be useful for specifying and proving 
security properties, we never looked at formal proofs.  (The law of the system 
was specified in Prolog.  We stuck to a simple subset of the language, which 
could probably have been handled easily by a prover.  Still, hardly transparent 
to most programmers!)
-- Jerry


Re: [Cryptography] The Case for Formal Verification

2013-08-29 Thread Phillip Hallam-Baker
On Thu, Aug 29, 2013 at 4:46 PM, Perry E. Metzger pe...@piermont.com wrote:

 Taking a break from our discussion of new privacy enhancing protocols,
 I thought I'd share something I've been mumbling about in various
 private groups for a while. This is almost 100% on the security side
 of things, and almost 0% on the cryptography side of things. It is
 long, but I promise that I think it is interesting to people doing
 security work.


Whit Diffie was meant to be working on formal methods when he came up with
public key crypto...

My D.Phil. thesis was on applying formal methods to a large, non-trivial
real-time system (raw input bandwidth was 6Tb/sec, the immediate
forerunner to the LHC data acquisition scheme). My college tutor was Tony
Hoare, but I was in the nuclear physics department because they had the
money to build the machine.

The problem I saw with formal methods was that the skills required were
already at the limit of what Oxford University grad students were capable
of, and building systems large enough to matter looked like it would take
tools like category theory, which is even more demanding.

The code synthesis scheme I developed was an attempt to address the scaling
problem from the other end. The idea being that to build a large system you
create a very specific programming language that is targeted at precisely
that class of problems. Then you write a back end that converts the
specification into code for that very restricted domain. If you want a
formal proof you have the synthesizer generate it from the specification as
well. That approach finesses the problem of having to validate the
synthesizer (which would likely take category theory) because only the
final target code need be validated.
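
To illustrate the shape of the idea (a toy sketch with an invented
specification format; it is not the actual tool):

#include <stdio.h>

/* A toy "specification language": each entry names a message in some
   restricted protocol domain and lists its fields. */
typedef struct { const char *name; const char *fields[4]; } Message;

static const Message spec[] = {
    { "Hello",  { "version", "nonce", NULL } },
    { "Finish", { "digest", NULL } },
};

/* The back end: walk the specification and emit ordinary C for it.
   Only this output, not the generator, need be validated. */
static void synthesize(const Message *m)
{
    printf("typedef struct {\n");
    for (int i = 0; m->fields[i]; i++)
        printf("    unsigned %s;\n", m->fields[i]);
    printf("} %s;\n\n", m->name);
}

int main(void)
{
    for (unsigned i = 0; i < sizeof spec / sizeof spec[0]; i++)
        synthesize(&spec[i]);
    return 0;
}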


That synthesizer is the code I re-implemented in C after leaving VeriSign
and released on SourceForge earlier this year, and it is the tool I used
to build the JSON Schema tool.

I would probably have released it earlier, only I met this guy at CERN who
had some crazy idea about hypertext.


-- 
Website: http://hallambaker.com/
___
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography