### Re: Merkle Signature Scheme is the most secure signature scheme possible for general-purpose use

On Sat, 4 Sep 2010 10:45:48 +1000 (EST), Dave Horsfall d...@horsfall.org wrote:

> Funny you should mention that. Back in the late 70s, a work colleague
> suggested that the Unix crypt() function was a ring (we both had
> mathematical backgrounds), which gave me the idea of repeatedly
> encrypting the encrypted root password.

You mean a group. It is known that DES is not a group, but that result comes from a different use of the algorithm than the somewhat unusual way DES is invoked in the crypt(3) algorithm, and I don't know whether that modifies the result. Nonetheless, I suspect it would be surprising if it turned out to be a group even as used in crypt(3).

Perry

--
Perry E. Metzger  pe...@piermont.com

---
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majord...@metzdowd.com
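A toy sketch of what the group question means for repeated encryption (this is not DES; it just uses small random permutations standing in for a keyed cipher). If the cipher's permutations formed a group under composition, then encrypting twice, E_k2(E_k1(x)), would always equal E_k3(x) for some single key k3, and iterating encryption would add nothing:

```python
import random

# Hypothetical illustration (not real DES): model a block cipher as a
# family of random permutations of a tiny domain, one permutation per key.
DOMAIN_SIZE = 256
NUM_KEYS = 8
rng = random.Random(42)

# One random permutation of the domain per key.
ciphers = {k: rng.sample(range(DOMAIN_SIZE), DOMAIN_SIZE)
           for k in range(NUM_KEYS)}

def compose(p, q):
    """Return the permutation x -> p(q(x)), i.e. encrypt with q then p."""
    return [p[q[x]] for x in range(DOMAIN_SIZE)]

# Check closure: does any double encryption equal a single encryption?
closed = 0
for k1 in ciphers:
    for k2 in ciphers:
        c = compose(ciphers[k2], ciphers[k1])
        if any(c == ciphers[k3] for k3 in ciphers):
            closed += 1

# For random permutations, closure is astronomically unlikely -- which is
# what "not a group" buys you: double encryption genuinely enlarges the
# set of reachable permutations instead of collapsing back to one key.
print(f"{closed} of {NUM_KEYS * NUM_KEYS} compositions stayed in the family")
```

The same closure test is (conceptually) how one would check whether iterating crypt()-style encryption can be collapsed to a single application.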

### Re: Merkle Signature Scheme is the most secure signature scheme possible for general-purpose use

On 01/09/2010 22:45, Zooko O'Whielacronx wrote:

> On Wed, Sep 1, 2010 at 2:55 PM, Ben Laurie b...@links.org wrote:
>> Or, to put it another way, in order to show that a Merkle signature is
>> at least as good as any other, then you'll first have to show that an
>> iterated hash is at least as secure as a non-iterated hash (which seems
>> like a hard problem, since it isn't).
>
> I'm not sure that I agree with you that the security of a hash function
> used once on an arbitrarily large message is likely to be better than the
> security of a hash function used a few times iteratively on its own
> outputs.

That's the whole point: a hash function used on an arbitrary message produces one of its possible outputs. Feed that hash back in and it produces one of a subset of its possible outputs. Each time you do this, you lose a little entropy (I can't remember how much, but I do remember David Wagner explaining it to me when I discovered this for myself quite a few years ago). So, on that basis alone, I reject the "most secure possible" argument.

> But regardless of that, I think the fair comparison here is: show that an
> iterated hash is more likely to have preimage resistance than a
> non-iterated hash is to have collision resistance. And I think it is quite
> clear that for any real hash function such as MD5, SHA-1, Tiger, RIPEMD,
> SHA-2, and the SHA-3 candidates this does hold! What do you think of that
> argument?

I think I've failed to understand why you think collisions are not a problem for Merkle trees. Also, regardless, you are now talking probabilities, so a claim of "most secure possible" still doesn't apply.

Merkle trees: probably the best signature in the world? :-)

Cheers,

Ben.

--
http://www.apache-ssl.org/ben.html           http://www.links.org/

"There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit." - Robert Woodruff

### Re: Merkle Signature Scheme is the most secure signature scheme possible for general-purpose use

On 09/03/2010 03:45 AM, Ben Laurie wrote:

> That's the whole point - a hash function used on an arbitrary message
> produces one of its possible outputs. Feed that hash back in and it
> produces one of a subset of its possible outputs. Each time you do this,
> you lose a little entropy (I can't remember how much, but I do remember
> David Wagner explaining it to me when I discovered this for myself quite
> a few years ago).

I found this to be interesting:

Danilo Gligoroski, Vlastimil Klima: "Practical consequences of the aberration of narrow-pipe hash designs from ideal random functions", IACR ePrint Report 2010/384. http://eprint.iacr.org/2010/384.pdf

The theoretical loss is -log2(1 - 1/e), about 0.66 bits of entropy, per log2(N) for N additional iterations. This assumes that there is no systematic correlation between the hash input and the calculation of the output, which is not really a good assumption for the MDs and SHAs in current use: they accept, process, and output vectors of 32- or 64-bit words, even preserving their order to some extent. So it would seem reasonable to expect that, to the extent that these actual functions differ from an ideal random function, they could easily have the type of systematic bias which would be amplified through repeated iteration. In some simulations I ran with randomly generated mappings, the observed value would at times wander over 1.0 bits of entropy per log2(N).

It seems like this entropy loss could be largely eliminated by hashing the previous two intermediate results on each iteration instead of just one. But this basically amounts to widening the data path, so perhaps it would be cheating for the purposes of this discussion.

- Marsh
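A minimal re-creation of the kind of simulation Marsh describes (the mapping, seed, and domain size here are arbitrary choices, not his): iterate a randomly generated mapping on a 2^16-element set and measure the entropy lost after N iterations, expressed as bits per log2(N):

```python
import math
import random

# A random, non-bijective mapping on a 2^16-element set stands in for
# an "ideal" narrow-pipe hash iterated on its own output.
n = 1 << 16
rng = random.Random(2010)
f = [rng.randrange(n) for _ in range(n)]

image = set(range(n))
iterations = 0
ratios = {}
for target in (2, 4, 8, 16, 32, 64):
    while iterations < target:
        image = {f[x] for x in image}   # image of the N-fold iterate
        iterations += 1
    loss = 16 - math.log2(len(image))          # bits of entropy lost
    ratios[target] = loss / math.log2(target)  # normalized per log2(N)
    print(f"N={target:2d}: {ratios[target]:.2f} bits lost per log2(N)")
```

For a truly random mapping the printed ratios hover around the theoretical figure; a mapping with systematic input/output correlation could drift higher, which is the concern raised above about real compression functions.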