Robert Jueneman wrote:

[snip]

> To solve this problem, I proposed the notion of "strong but brittle
> cryptography" - cryptography that would be designed to break very
> easily, at a defined time in the future, but be very robust until that
> time.
>
> This would involve say 100 institutions very interested in preserving
> historical records (museums, universities, religious institutions,
> newspapers, government archivists, etc.) coming together and generating
> a set of very strong ECC key pairs, using K out of N secret sharing
> techniques, where the K shares needed to recover the key might be say
> 70.
>
> Those keys would be embedded in certificates with validity periods of
> 10, 20, 30, ... 100 years.
>
> Every ten years, the institutions would come together and reconstitute
> the key that was due to expire on that particular date, and then publish
> it to the world.
>
> Records intended to be kept secret until that date would be very secure,
> but after that date would be easily readable by anyone, with no
> cryptanalysis required.
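For what it's worth, the K-out-of-N splitting described above is exactly what Shamir secret sharing provides: the key becomes the constant term of a random degree-(K-1) polynomial over a finite field, and any K shares reconstruct it by Lagrange interpolation while K-1 shares reveal nothing. A minimal sketch (illustrative Python only, not hardened code; the prime, function names, and parameters here are my own choices, not anything from Robert's proposal):

```python
# Toy Shamir secret sharing over GF(P) - illustrative only.
import random

P = 2**127 - 1  # a Mersenne prime; all arithmetic is mod P

def make_shares(secret, k, n):
    """Split `secret` into n shares; any k of them reconstruct it."""
    # Random polynomial of degree k-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):   # Horner evaluation
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(123456789, k=70, n=100)
assert recover(shares[:70]) == 123456789              # any 70 suffice
assert recover(random.sample(shares, 70)) == 123456789
```

With 70 shares the degree-69 polynomial is determined exactly; with 69 or fewer, every candidate secret remains equally likely, which is what makes the "31 refusers can block release" arithmetic below unavoidable rather than a design flaw. (The modular inverse via three-argument `pow` needs Python 3.8+.)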
A most excellent idea. This way we might find out the "truth" about Coventry, Pearl Harbor, the JFK assassination, 9/11, etc., one day.

(Warning, bad pun ahead.) The key problem is that it is a minority-controlled release. Using your 70-of-100 metric, 31 institutions could potentially deny release by refusing to participate, if I understand how your idea would work in practice. The other, lesser, problem is ensuring that enough of the conveners attend every ten years to make the changes needed. Given that not every institution survives as long as the University of Bologna, that could become an issue. I think both could be resolved by a variety of reverse tontine rules.

The other issue, which is true of all cryptographic functions, is unanticipated vectors of attack. As with the conveners issue, I suspect a carefully thought-out, multi-level encryption structure could protect against this, but the real problem is that one would need to make sure that *all* the old data that still needs to be protected is migrated to the next algorithm. If even one copy exists that has not been migrated, then you might as well not have bothered to migrate any data at all when the algorithm is cracked. The consequence is that data would be released before its end-of-secrecy date.

That might or might not be a big deal, depending on the nature of the data. For example, 9/11 data released at 40 years rather than 50 would probably not be all that big a deal, but DNA records released too soon might be a big deal if insurance companies are allowed to discriminate based on them.

Well worth looking at and thinking about as an approach nevertheless.

Best,

Allen

_______________________________________________
FDE mailing list
[email protected]
http://www.xml-dev.com/mailman/listinfo/fde
