On Tue, Oct 28, 2008 at 7:55 PM, Leichter, Jerry
[EMAIL PROTECTED] wrote:
2. The Byzantine model. Failed modules can do anything
including cooperating by exchanging arbitrary
information and doing infinite computation.
So in the Byzantine model I can
This isn't enough. Somehow, you have to state that the values emitted
on demand in any given round i (where a round consists of exactly one
demand on all N members and produces a single output result) cannot
receive any input from any other members. Otherwise, if N=2 and member
0 produces true
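A minimal sketch (my own illustration, not from the thread) of why this emission-order constraint matters: if a Byzantine member may see the other members' words before emitting its own, it can force an XOR-combined group word to any value it likes.

```python
import secrets

N = 4
WORD_BITS = 160

def xor_combine(words):
    """Combine member words into a group word by XOR."""
    result = 0
    for w in words:
        result ^= w
    return result

# Honest members emit independent random words.
honest = [secrets.randbits(WORD_BITS) for _ in range(N - 1)]

# A Byzantine member that receives the others' outputs as input can
# choose its own word so the group word equals any target it likes.
target = 0xDEADBEEF
byzantine = xor_combine(honest) ^ target

group_word = xor_combine(honest + [byzantine])
assert group_word == target  # the adversary fully controls the output
```

This is exactly the failure the round definition is meant to rule out: each member's word must be fixed before it can depend on any other member's word.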
On 10/28/2008 09:43 AM, Leichter, Jerry wrote:
| We start with a group comprising N members (machines or
| persons). Each of them, on demand, puts out a 160 bit
| word, called a member word. We wish to combine these
| to form a single word, the group word, also 160 bits
| in length.
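As a baseline for this scenario, the straightforward combiner XORs the N member words column-wise; a sketch under my own naming, not from the thread:

```python
import secrets
from functools import reduce
from operator import xor

N = 5
WORD_BYTES = 20  # 160 bits

def group_word(member_words):
    """XOR N 160-bit member words into a single 160-bit group word."""
    assert all(len(w) == WORD_BYTES for w in member_words)
    # zip(*...) pairs up corresponding bytes; reduce XORs each column.
    return bytes(reduce(xor, column) for column in zip(*member_words))

members = [secrets.token_bytes(WORD_BYTES) for _ in range(N)]
gw = group_word(members)
assert len(gw) == WORD_BYTES
```

The appeal of XOR under the "trust at least one source" assumption is that if any single member word is uniform and independent of the rest, the group word is uniform; the thread's dispute is over what happens when those conditions fail.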
On Tue, 28 Oct 2008, John Denker wrote:
| Date: Tue, 28 Oct 2008 12:09:04 -0700
| From: John Denker [EMAIL PROTECTED]
| To: Leichter, Jerry [EMAIL PROTECTED],
| Cryptography cryptography@metzdowd.com
| Cc: IanG [EMAIL PROTECTED]
| Subject: Re: combining entropy
|
| On 10/28/2008 09:43 AM
On 10/25/2008 04:40 AM, IanG gave us some additional information.
Even so, it appears there is still some uncertainty as to
interpretation, i.e. some uncertainty as to the requirements
and objectives.
I hereby propose a new scenario. It is detailed enough to
be amenable to formal analysis. The
Alas on 10/25/2008 01:40 PM, I wrote:
To summarize: In the special sub-case where M=1, XOR
is as good as it gets. In all other cases I can think
of, the hash approach is much better.
I should have said that in the special sub-case where
the member word has entropy density XX=100% _or_ in
John Denker [EMAIL PROTECTED] wrote:
To say the same thing in more detail: Suppose we start
with N generators, each of which puts out a 160 bit word
containing 80 bits of _trusted_ entropy. That's a 50%
entropy density.
So you need a 2:1 or heavier compression that won't lose
entropy. If
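One way to read the 2:1 compression here — a sketch of my own, assuming the hash approach means concatenate-then-hash with a 160-bit hash such as SHA-1:

```python
import hashlib
import secrets

WORD_BYTES = 20  # 160-bit member words, each assumed to carry ~80 bits
                 # of trusted entropy (50% density)

def compress(word_a, word_b):
    """Concatenate two 160-bit words (320 bits in, ~160 bits of trusted
    entropy) and hash down to a single 160-bit output."""
    return hashlib.sha1(word_a + word_b).digest()

a = secrets.token_bytes(WORD_BYTES)
b = secrets.token_bytes(WORD_BYTES)
out = compress(a, b)
assert len(out) * 8 == 160
```

Unlike XOR, the hash mixes every input bit into every output bit, so (under the usual idealized-hash assumption) entropy from either word is preserved even when the two words are correlated.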
On 10/24/2008 03:40 PM, Jack Lloyd wrote:
Perhaps our seeming disagreement is due to a differing interpretation
of 'trusted'. I took it to mean that at least one pool had a
min-entropy above some security bound. You appear to have taken it to
mean that it will be uniform random?
Thanks, that
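For reference, the min-entropy reading of "trusted" mentioned above: a source's min-entropy is H∞ = -log2(max_x Pr[x]), the conservative measure set by the single most likely outcome. A quick sketch:

```python
from math import log2

def min_entropy(probs):
    """H_inf = -log2(max probability): the worst-case entropy measure."""
    return -log2(max(probs))

# A uniform 8-outcome source has 3 bits of min-entropy.
print(min_entropy([1/8] * 8))            # 3.0
# A biased source is judged by its most likely outcome alone.
print(min_entropy([0.5] + [0.5/7] * 7))  # 1.0
```

A source can have high Shannon entropy yet low min-entropy, which is why a min-entropy bound is the weaker (and more realistic) trust assumption than "uniform random".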
Jonathan Katz wrote:
I think it depends on what you mean by N pools of entropy.
I can see that my description was a bit weak, yes. Here's a better
view, incorporating the feedback:
If I have N people, each with a single pool of entropy,
and I pool each of their contributions together
On 09/29/2008 05:13 AM, IanG wrote:
My assumptions are:
* I trust no single source of Random Numbers.
* I trust at least one source of all the sources.
* no particular difficulty with lossy combination.
If I have N pools of entropy (all same size X) and I pool them
together with XOR,
[Moderator's note: top posting is not tasteful. --Perry]
I think it depends on what you mean by N pools of entropy.
Are you assuming that one of these sources is (pseudo)random, but you
don't know which one? Are you assuming independence of these different
sources? If both these
On Mon, Sep 29, 2008 at 1:13 PM, IanG [EMAIL PROTECTED] wrote:
If I have N pools of entropy (all same size X) and I pool them
together with XOR, is that as good as it gets?
Surely not. Consider N pools each of size 1 bit. Clearly you can do
better than the 1 bit your suggestion would yield.
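The 1-bit case can be made concrete with a sketch (my own, not from the thread): XOR collapses N independent 1-bit pools to a single bit of output, while concatenating the pools first lets a hash condition up to N bits of input entropy into a fixed-size output.

```python
import hashlib
import secrets

N = 128
bits = [secrets.randbits(1) for _ in range(N)]

# XOR yields exactly one output bit, whatever N is.
parity = 0
for b in bits:
    parity ^= b
assert parity in (0, 1)

# Concatenating preserves up to N bits of entropy for the hash
# to condense (one byte per bit here, for simplicity).
concatenated = bytes(bits)
digest = hashlib.sha1(concatenated).digest()
assert len(digest) * 8 == 160  # retains up to min(N, 160) input bits
```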
L.S.,
If I have N pools of entropy (all same size X) and I pool them
together with XOR, is that as good as it gets?
My assumptions are:
* I trust no single source of Random Numbers.
* I trust at least one source of all the sources.
* no particular difficulty with lossy combination.
I
On Fri, Oct 24, 2008 at 10:23:07AM -0500, Thierry Moreau wrote:
Do you really trust that no single source of entropy can have knowledge of
the other source's output, so it can surreptitiously correlate its own?
I.e., you are also assuming that these sources are *independent*.
I do not
On 10/24/2008 01:12 PM, Jack Lloyd wrote:
is a very different statement from saying that
lacking such an attacker, you can safely assume your 'pools of
entropy' (to quote the original question) are independent in the
information-theoretic sense.
The question, according to the original