Re: combining entropy

2008-10-29 Thread Ben Laurie
On Tue, Oct 28, 2008 at 7:55 PM, Leichter, Jerry [EMAIL PROTECTED] wrote: 2. The Byzantine model. Failed modules can do anything, including cooperating by exchanging arbitrary information and doing infinite computation. So in the Byzantine model I can

Re: combining entropy

2008-10-29 Thread Bill Stewart
This isn't enough. Somehow, you have to state that the values emitted on demand in any given round i (where a round consists of exactly one demand on all N members and produces a single output result) cannot receive any input from any other members. Otherwise, if N=2 and member 0 produces true
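
The cancellation this condition guards against is easy to exhibit. A minimal sketch in Python (function and variable names are mine, not Stewart's): if member 1 simply echoes member 0's output, the XOR of the two member words is all zeros, no matter how good member 0 is.

    import secrets

    def xor_combine(words):
        # Bitwise XOR of equal-length byte strings.
        out = bytes(len(words[0]))
        for w in words:
            out = bytes(a ^ b for a, b in zip(out, w))
        return out

    member0 = secrets.token_bytes(20)  # 160 bits of genuine randomness
    member1 = member0                  # member 1 echoes member 0's output
    print(xor_combine([member0, member1]).hex())  # all zeros: entropy gone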

Re: combining entropy

2008-10-28 Thread Leichter, Jerry
On Sat, 25 Oct 2008, John Denker wrote: On 10/25/2008 04:40 AM, IanG gave us some additional information. Even so, it appears there is still some uncertainty as to interpretation, i.e. some uncertainty as to the requirements and objectives. I hereby propose a new scenario. It is

Re: combining entropy

2008-10-28 Thread John Denker
On 10/28/2008 09:43 AM, Leichter, Jerry wrote: We start with a group comprising N members (machines or persons). Each of them, on demand, puts out a 160 bit word, called a member word. We wish to combine these to form a single word, the group word, also 160 bits in length.
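
The hash-based combiner discussed elsewhere in the thread can be sketched as follows (Python; SHA-1 is used only because its 160-bit output matches the group-word length, and this is an illustration rather than any poster's exact construction):

    import hashlib
    import secrets

    def group_word(member_words):
        # Concatenate all N member words, then compress them into a
        # single 160-bit group word. If even one member word is
        # unpredictable, the hash input as a whole is unpredictable.
        h = hashlib.sha1()
        for w in member_words:
            h.update(w)
        return h.digest()  # 20 bytes = 160 bits

    members = [secrets.token_bytes(20) for _ in range(4)]  # N = 4
    print(group_word(members).hex())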

Re: combining entropy

2008-10-28 Thread Leichter, Jerry
On Tue, 28 Oct 2008, John Denker wrote: On 10/28/2008 09:43 AM

Re: combining entropy

2008-10-27 Thread Ben Laurie
On Sat, Oct 25, 2008 at 12:40 PM, IanG [EMAIL PROTECTED] wrote: Jonathan Katz wrote: I think it depends on what you mean by N pools of entropy. I can see that my description was a bit weak, yes. Here's a better view, incorporating the feedback: If I have N people, each with a single

Re: combining entropy

2008-10-27 Thread John Denker
On 10/25/2008 04:40 AM, IanG gave us some additional information. Even so, it appears there is still some uncertainty as to interpretation, i.e. some uncertainty as to the requirements and objectives. I hereby propose a new scenario. It is detailed enough to be amenable to formal analysis. The

Re: combining entropy

2008-10-27 Thread John Denker
Alas, on 10/25/2008 01:40 PM, I wrote: To summarize: In the special sub-case where M=1, XOR is as good as it gets. In all other cases I can think of, the hash approach is much better. I should have said that in the special sub-case where the member word has entropy density XX=100% _or_ in

Re: combining entropy

2008-10-27 Thread Sandy Harris
John Denker [EMAIL PROTECTED] wrote: To say the same thing in more detail: Suppose we start with N generators, each of which puts out a 160 bit word containing 80 bits of _trusted_ entropy. That's a 50% entropy density. So you need a 2:1 or heavier compression that won't lose entropy. If
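
The 2:1 compression step can be sketched like this (Python; SHA-1 chosen only for its 160-bit output, and the ~80 bits per word is the trusted-entropy assumption stated above):

    import hashlib

    def condense_2to1(word_a, word_b):
        # word_a, word_b: 160-bit member words, each assumed to carry
        # about 80 bits of trusted entropy. Their 320-bit concatenation
        # then carries about 160 bits, which SHA-1 compresses into a
        # 160-bit output at close to 100% entropy density.
        assert len(word_a) == len(word_b) == 20
        return hashlib.sha1(word_a + word_b).digest()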

Re: combining entropy

2008-10-27 Thread Jonathan Katz
On Sat, 25 Oct 2008, John Denker wrote: On 10/25/2008 04:40 AM, IanG gave us some additional information. Even so, it appears there is still some uncertainty as to interpretation, i.e. some uncertainty as to the requirements and objectives. I hereby propose a new scenario. It is detailed

Re: combining entropy

2008-10-25 Thread John Denker
On 10/24/2008 03:40 PM, Jack Lloyd wrote: Perhaps our seeming disagreement is due to a differing interpretation of 'trusted'. I took it to mean that at least one pool had a min-entropy above some security bound. You appear to have taken it to mean that it will be uniform random? Thanks, that

Re: combining entropy

2008-10-25 Thread IanG
Jonathan Katz wrote: I think it depends on what you mean by N pools of entropy. I can see that my description was a bit weak, yes. Here's a better view, incorporating the feedback: If I have N people, each with a single pool of entropy, and I pool each of their contributions together

Re: combining entropy

2008-10-24 Thread John Denker
On 09/29/2008 05:13 AM, IanG wrote: My assumptions are: * I trust no single source of Random Numbers. * I trust at least one of the sources. * no particular difficulty with lossy combination. If I have N pools of entropy (all same size X) and I pool them together with XOR,
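
The property the XOR construction leans on fits in a few lines (Python sketch; the junk values stand in for arbitrarily bad but independent sources):

    import secrets

    trusted = secrets.randbits(160)  # the one pool assumed good
    junk_a, junk_b = 0, 0xFFFF       # worthless, but independent of trusted
    combined = trusted ^ junk_a ^ junk_b
    # XOR with any value that is independent of `trusted` is a bijection,
    # so `combined` is still uniform over 160-bit values.
    print(hex(combined))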

Re: combining entropy

2008-10-24 Thread Jonathan Katz
[Moderator's note: top posting is not tasteful. --Perry] I think it depends on what you mean by N pools of entropy. Are you assuming that one of these sources is (pseudo)random, but you don't know which one? Are you assuming independence of these different sources? If both these

Re: combining entropy

2008-10-24 Thread Ben Laurie
On Mon, Sep 29, 2008 at 1:13 PM, IanG [EMAIL PROTECTED] wrote: If I have N pools of entropy (all same size X) and I pool them together with XOR, is that as good as it gets? Surely not. Consider N pools each of size 1 bit. Clearly you can do better than the 1 bit your suggestion would yield.
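
A sketch of the point (Python; the eight sample bits are made up): XOR collapses N one-bit pools into a single bit, while concatenating the bits before hashing can preserve up to N bits of their joint entropy.

    import hashlib
    from functools import reduce

    bits = [1, 0, 1, 1, 0, 1, 0, 0]  # N = 8 one-bit pools

    xor_result = reduce(lambda a, b: a ^ b, bits)  # at most 1 bit of entropy

    packed = bytes(bits)                    # keep all 8 bits distinct
    hashed = hashlib.sha1(packed).digest()  # retains up to 8 bits of entropy
    print(xor_result, hashed.hex())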

Re: combining entropy

2008-10-24 Thread Stephan Neuhaus
On Oct 24, 2008, at 14:29, John Denker wrote: On 09/29/2008 05:13 AM, IanG wrote: My assumptions are: * I trust no single source of Random Numbers. * I trust at least one of the sources. * no particular difficulty with lossy combination. If I have N pools of entropy (all same

Re: combining entropy

2008-10-24 Thread Wouter Slegers
L.S., If I have N pools of entropy (all same size X) and I pool them together with XOR, is that as good as it gets? My assumptions are: * I trust no single source of Random Numbers. * I trust at least one of the sources. * no particular difficulty with lossy combination. I

Re: combining entropy

2008-10-24 Thread Thierry Moreau
IanG wrote: If I have N pools of entropy (all same size X) and I pool them together with XOR, is that as good as it gets? My assumptions are: * I trust no single source of Random Numbers. * I trust at least one of the sources. * no particular difficulty with lossy combination.

Re: combining entropy

2008-10-24 Thread Jon Callas
On Sep 29, 2008, at 5:13 AM, IanG wrote: If I have N pools of entropy (all same size X) and I pool them together with XOR, is that as good as it gets? My assumptions are: * I trust no single source of Random Numbers. * I trust at least one of the sources. * no particular

Re: combining entropy

2008-10-24 Thread Jack Lloyd
On Fri, Oct 24, 2008 at 10:23:07AM -0500, Thierry Moreau wrote: Do you really trust that no single source of entropy can have knowledge of the other source's output, so it can surreptitiously correlate its own? I.e., you are also assuming that these sources are *independent*. I do not

Re: combining entropy

2008-10-24 Thread John Denker
On 10/24/2008 01:12 PM, Jack Lloyd wrote: is a very different statement from saying that lacking such an attacker, you can safely assume your 'pools of entropy' (to quote the original question) are independent in the information-theoretic sense. The question, according to the original

Re: combining entropy

2008-10-24 Thread Jack Lloyd
On Fri, Oct 24, 2008 at 03:20:24PM -0700, John Denker wrote: On 10/24/2008 01:12 PM, Jack Lloyd wrote: is a very different statement from saying that lacking such an attacker, you can safely assume your 'pools of entropy' (to quote the original question) are independent in the