On 2 Aug 2002, Paul Crowley wrote:
I meant to say, another example of a believed one-way function that is
guaranteed to be able to produce any output is one based on the
difficulty of discrete log:
f(x) = g^x mod p
is bijective if the domain and range are both 1..p-1, but finding preimages
is the discrete log problem.
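At toy scale, the bijectivity claimed here is easy to check directly. A minimal sketch; p = 23 and g = 5 are illustrative parameters (g is a generator mod 23), nothing like cryptographic size:

```python
# Check that f(x) = g^x mod p hits every value in 1..p-1 exactly once
# when g generates the multiplicative group mod p.
# p = 23, g = 5 are toy parameters; real discrete-log settings use
# primes thousands of bits long.
p, g = 23, 5
outputs = {pow(g, x, p) for x in range(1, p)}
assert outputs == set(range(1, p))  # a bijection on 1..p-1
```

Inverting the map, i.e. recovering x from g^x mod p, is exactly the discrete log problem, which is what makes f (believed) one-way.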
David Wagner [EMAIL PROTECTED] writes:
I don't know of any good cryptographic hash function
that comes with a proof that all outputs are possible.
What about the scheme
Pad -> Encipher -> Contract
described at
http://www.monmouth.com/~jsd/turbid/paper/turbid.htm#sec-uniform-hash
1) There were some very interesting questions such as
-- whether one can construct a hash function that
generates all possible codes.
-- ditto, generating them as uniformly as possible.
-- Whether off-the-shelf hash functions such as SHA-1
have such properties.
The answers are
David Wagner [EMAIL PROTECTED] writes:
I don't know of any good cryptographic hash function that comes with
a proof that all outputs are possible. However, it might not be too
hard to come up with plausible examples. For example, if we apply the
Luby-Rackoff construction (i.e., 3 rounds of
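Wagner's example can be sketched concretely as a 3-round Feistel (Luby-Rackoff) network; the SHA-1-based round function and the names `F`/`feistel3` are stand-ins of mine, not anything from the thread. The point is structural: the construction is a permutation, so every output value has a preimage by construction.

```python
import hashlib

def F(key: bytes, half: bytes) -> bytes:
    # Round function: a hash-based stand-in for the PRF in Luby-Rackoff.
    return hashlib.sha1(key + half).digest()[:len(half)]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def feistel3(keys, block: bytes) -> bytes:
    # 3-round Feistel over a 2n-byte block.  Each round is invertible,
    # so the whole map is a permutation: all outputs are possible.
    n = len(block) // 2
    L, R = block[:n], block[n:]
    for k in keys:
        L, R = R, xor(L, F(k, R))
    return L + R

def feistel3_inv(keys, block: bytes) -> bytes:
    # Undo the rounds in reverse key order.
    n = len(block) // 2
    L, R = block[:n], block[n:]
    for k in reversed(keys):
        L, R = xor(R, F(k, L)), L
    return L + R

keys = [b'k1', b'k2', b'k3']
msg = b'ABCDEFGH'
assert feistel3_inv(keys, feistel3(keys, msg)) == msg
```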
On Tue, 30 Jul 2002, James A. Donald wrote:
Randomness is of course indefinable. A random oracle is however
definable.
If SHA-1 is indistinguishable from a random oracle without prior
knowledge of the input, then we would like to prove that for an
attacker to make use of the loss of entropy
David Wagner [EMAIL PROTECTED] writes:
I once wrote a short note about the relevance of this to IPSec:
http://www.cs.berkeley.edu/~daw/my-posts/using-prngs
There's another way to avoid this problem, which is to separate the nonce RNG
and crypto RNG, so that an attacker seeing the nonce RNG
At 03:18 PM 7/29/2002 -0700, David Wagner wrote:
I don't even think anyone has analyzed the entropy preservation of a
theoretically perfect random oracle
Well, I know this particular point wasn't central to your email, but
I'm not sure I agree with you on this small point. I believe it
David Wagner said,
The problem can really be divided into two parts:
1. Is our entropy crunching algorithm secure when used with
a random oracle instead of SHA1?
2. Does SHA1 behave enough like a random oracle that the answer
to question 1. is in any way relevant to the real
On 30 Jul 2002 at 17:02, Amir Herzberg wrote:
I found that when trying to explain and define hash functions
and their properties, I didn't find a satisfactory definition
for the `randomness` properties.
Randomness is of course indefinable. A random oracle is however
definable.
If
Amir Herzberg wrote:
But there's a big difference: the random oracle `assumption` is clearly not
valid for SHA-1 (or any other specific hash function).
Well, the random oracle model has problems, but I think those problems
are a bit more subtle than just an assumption that is true or false.
So
On 2002-07-27, David Wagner uttered to [EMAIL PROTECTED]:
In particular, there is no function f which doesn't waste entropy,
unless |Y|=1. The proof is simple enough that you can probably find it
with a moment's contemplation, but let me know if you want to see it.
I would be led to
On 2002-07-28, Sampo Syreeni uttered to David Wagner:
[Answering to my own mail. Sorry.]
and discard every 1/(p(x)-1/256)'th sample with value x.
Actually the pedantic solution would be to put an arithmetic
compressor/coder between the input and output, using the best model we've
got. That
An example: presume we take a simple first order statistical model. If our
input is an 8-bit sample value from a noise source, we will build a 256
bin histogram. When we see an input value, we look its probability up in
the model, and discard every 1/(p(x)-1/256)'th sample with value x. When
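The discard rule quoted above can be phrased as ordinary rejection sampling: accept a sample x with probability p_min/p(x), so every value survives at the rate of the rarest one. A sketch under that reading (the function name and the exact acceptance formula are my paraphrase, not the poster's formula):

```python
import random
from collections import Counter

def flatten(samples):
    # First-order model: empirical probabilities from a 256-bin histogram.
    counts = Counter(samples)
    p = {x: c / len(samples) for x, c in counts.items()}
    p_min = min(p.values())
    # Accept x with probability p_min / p(x): common values are thinned
    # until all values survive at the same rate, i.e. roughly uniformly.
    return [x for x in samples if random.random() < p_min / p[x]]

random.seed(0)  # deterministic for the illustration
biased = [0] * 750 + [1] * 250
flat = flatten(biased)
# The rare value (1) is always kept; the common value (0) is thinned
# down to roughly the same count.
```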
Barney Wolff wrote:
This leads me to ask what may be a laughably naive question:
Do we even know that the popular hash functions can actually generate
all 2^N values of their outputs?
It seems very unlikely that they can generate all 2^N outputs
(under current knowledge). However, they satisfy
Yes, but is there any lower bound known? 2^(N-1)? Or worse?
Clearly a hash with N-bit output that can't generate all 2^N
values can never actually have N bits of entropy. Depending on how
many forbidden values there are, this may or may not be significant.
Of course if the forbidden values
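One heuristic answer to "how many forbidden values": if an N-bit hash on N-bit inputs is modeled as a uniformly random function, the expected fraction of unreachable outputs is (1 - 2^-N)^(2^N), which tends to 1/e, about 36.8%. A seeded toy simulation (N = 12 is an illustrative width, not a real digest size):

```python
import random

random.seed(1)
N = 12
size = 2 ** N
# Model an ideal N-bit-to-N-bit hash as a uniformly random function:
# each of the 2^N inputs maps to an independent random output.
hits = {random.randrange(size) for _ in range(size)}
missing = 1 - len(hits) / size
# Expected missing fraction: (1 - 1/2^N)^(2^N) -> 1/e ~ 0.368.
```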
Oh dear. On re-reading your message I now suspect that what you asked
is not what I originally thought you asked. I see two questions here:
Q1: If we cycle through all N-bit messages, are all
2^N output values possible?
Q2: If we cycle through all messages (possibly very long
or
Barney Wolff asked:
Do we even know that the popular hash functions can actually generate
all 2^N values of their outputs?
David Wagner replied:
It seems very unlikely that they can generate all 2^N outputs
(under current knowledge).
I was temporarily astonished, but he clarified as
3) For a one-way hash function one should not expect a _constructive_
proof that it generates all possible codes; such a construction
would violate the one-way property.
Nitpick: the last statement does not seem quite right to me. I'm thinking
of the notion of a one-way permutation. For
On Mon, 29 Jul 2002, David Wagner wrote:
DES, being extremely hardware friendly, can be (ab)used to
make a strong one-way hash. (E.g., raw input into both key and data maps
56+64 -> uniformly distributed 64 bits.)
However, when used in this way, DES is not an especially good hash
2) I can't prove that a standard hash function such as SHA1
generates all possible codes, but I consider it likely. It would
be quite shocking if a strong hash function such as SHA1 generated
fewer codes than a weak function such as H0.
I think you could do a probabilistic
At 12:20 PM -0700 7/29/02, David Honig wrote:
Whether there is a need for very high bandwidth RNGs was discussed
on cypherpunks a few months ago, and no examples were found.
(Unless you're using something like a one-time pad where you need
a random bit for every cargo bit.) Keeping in mind that
To test a hash function h() whose range is S,
let F be the set of balanced functions from S -> {0, 1}. (Balanced
meaning that each f in F maps exactly half of S to 0 and half to 1.)
If you can contrive to choose many members of F at random, and compose
them with h for many arguments of h,
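At toy scale the proposed test looks like the following; `toy_hash` (an 8-bit truncation of SHA-1) stands in for the h() under test, and all names are mine. If h's outputs are close to uniform, each random balanced f composed with h should score close to 1/2:

```python
import hashlib
import random

random.seed(2)
S = list(range(256))  # toy output range: 8-bit "hash" values

def toy_hash(x: int) -> int:
    # Stand-in for the h() under test: first byte of SHA-1.
    return hashlib.sha1(bytes([x])).digest()[0]

def random_balanced_function(S):
    # A random balanced f: S -> {0, 1}: exactly half of S maps to 1.
    ones = set(random.sample(S, len(S) // 2))
    return lambda s: 1 if s in ones else 0

# Compose h with many random balanced f's; each tally counts how many
# of 256 inputs land in f's "1" half.  Near-uniform hash outputs give
# tallies near 128.
tallies = []
for _ in range(20):
    f = random_balanced_function(S)
    tallies.append(sum(f(toy_hash(x)) for x in range(256)))
avg = sum(tallies) / len(tallies)
```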
The reason for batching entropy input is to prevent someone who has
broken your system once from discovering each small entropy input by
exhaustive search. (There was a nice paper pointing this out; if
someone has the reference...)
I believe you are referring to the state compromise
David Wagner wrote:
Amir Herzberg wrote:
So I ask: is there a definition of this `no wasted entropy` property, which
hash functions can be assumed to have (and tested for), and which ensures
the desired extraction of randomness?
None that I know of. I'm not aware of much work in the crypto literature on this.
At 11:24 AM 7/25/02 -0400, John S. Denker wrote:
And most particularly I do not
care if the analog threshold of my soundcard shifts slightly
(as a function of recent history, temperature, phase of the
moon, or whatever).
A change in the analogue threshold of your digitizing step
will change
In several posts to the list, e.g. by Wagner and Denker, it is proposed to
use a `strong crypto hash function` h(), e.g. SHA1, to extract entropy from
a physical source, i.e.
+ Source -- Digitizer -- good hash
Namely, if the sample from the digitizer (of the physical source) is x, the
I wrote:
a) if the hash function happens to have a property I call no
wasted entropy then the whitening stage is superfluous (and
you may decide to classify the hash as non-simple);
David Honig responded:
Not wasting entropy does not mean that a function's output
is white, i.e. uniformly
Amir Herzberg wrote:
So I ask: is there a definition of this `no wasted entropy` property, which
hash functions can be assumed to have (and tested for), and which ensures
the desired extraction of randomness?
That's the right question.
The answer I give in the paper is
A cryptologic
- Original Message -
From: John S. Denker [EMAIL PROTECTED]
To: David Honig [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]; 'Hannes R. Boehm' [EMAIL PROTECTED]; 'Ian
Hill' [EMAIL PROTECTED]; [EMAIL PROTECTED]
Sent: Saturday, July 27, 2002 10:52 AM
Subject: Re: building a true RNG
I wrote
- Original Message -
From: [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Sent: Tuesday, July 23, 2002 1:59 PM
Subject: Re: building a true RNG (was: Quantum Computing ...)
You cannot measure entropy retrospectively. You need to have a
theory as to where the entropy
David Honig helped focus the discussion by advocating the
block diagram:
Source -- Digitizer -- Simple hash -- Whitener (e.g., DES)
Let me slightly generalize this to:
! Source -- Digitizer -- hash -- Whitener (e.g., DES)
i.e. we defer the question of whether the hash is simple or not.
I
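The generalized block diagram can be sketched end to end. Here `os.urandom` merely stands in for Source -> Digitizer (the part a real design must actually justify with an entropy bound), and the whitener stage is left out, since whether it is needed is exactly the question being deferred:

```python
import hashlib
import os

def digitized_samples(nbytes: int = 512) -> bytes:
    # Stand-in for Source -> Digitizer: os.urandom plays the raw noise
    # source here; a real design reads a soundcard, framegrabber, etc.
    return os.urandom(nbytes)

def rng_block() -> bytes:
    # Hash stage: compress a large, partially random sample down to a
    # 20-byte SHA-1 output.  A whitener (e.g. DES) could follow, if the
    # hash is deemed "simple" enough to need one.
    return hashlib.sha1(digitized_samples()).digest()

block = rng_block()
assert len(block) == 20
```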
At 08:39 AM 7/23/02 +0200, Eugen Leitl wrote:
On Mon, 22 Jul 2002, David Honig wrote:
Yes, it is a joke. However, it is also a viable if low-bandwidth
entropy source. I disagree that you need to be able to model
I've got a framegrabber with a 640x480 24 bit/pixel camera. It doesn't
At 10:59 PM 7/22/02 -0700, [EMAIL PROTECTED] wrote:
Entropy is not quite a physical quantity -- rather it is on the
slippery edge between being a physical thing and a philosophical
thing. If you are not careful, you will slip into a deep epistemic
bog and find yourself needing to ask how do
Eugen Leitl wrote:
Is there any point in compressing the video before running it through a
cryptohash?
No. (assuming you're talking about lossless compression)
In general, any invertible transformation neither adds nor subtracts
entropy, and hence is extremely unlikely to make any difference
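The point is just that Shannon entropy depends on the probabilities of outcomes, not their labels; an invertible transform only relabels. A minimal numeric check:

```python
import math

def shannon_entropy(dist) -> float:
    # H = -sum p log2 p over the outcome probabilities.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# An invertible (lossless) transform relabels outcomes without changing
# their probabilities, so entropy is unchanged -- compressing before
# hashing can make the data shorter, but not more random.
dist = {'a': 0.5, 'b': 0.25, 'c': 0.25}
relabeled = {'x': 0.5, 'y': 0.25, 'z': 0.25}  # same probabilities, new names
assert shannon_entropy(dist) == shannon_entropy(relabeled)  # both 1.5 bits
```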
- Original Message -
From: Eugen Leitl [EMAIL PROTECTED]
Subject: Re: building a true RNG (was: Quantum Computing ...)
I've got a framegrabber with a 640x480 24 bit/pixel camera. It doesn't
compress, is rather noisy, and since self-adjusting I get the maximum
entropy at maximum
John S. Denker [EMAIL PROTECTED] writes:
Is there any point in compressing the video before running it through a
cryptohash?
There might be a minor point, namely computational efficiency.
I can't believe any compression software could be as fast as just
feeding the signal straight into
On 24 Jul 2002, Paul Crowley wrote:
I can't believe any compression software could be as fast as just
feeding the signal straight into SHA-1.
I haven't tried this, but assuming I'm digitizing dark video and only get
noise in the lower significant bits I can just mask out the constant
(zero)
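The masking idea as a sketch; `keep_bits = 2` is an assumed width for where the noise lives in a dark frame, not a measured figure:

```python
import hashlib

def mask_noise_bits(samples: bytes, keep_bits: int = 2) -> bytes:
    # Keep only the low-order bits of each 8-bit sample; in a dark frame
    # the constant high bits carry no entropy and can be dropped before
    # hashing.
    mask = (1 << keep_bits) - 1
    return bytes(s & mask for s in samples)

frame = bytes([0b10000010, 0b11110001, 0b00000011, 0b01000100])
masked = mask_noise_bits(frame)
digest = hashlib.sha1(masked).digest()  # hash only the noisy bits
assert masked == bytes([2, 1, 3, 0])
```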
On 22 Jul 2002 at 15:39, David Honig wrote:
You should be able to use any source which you know is not a
PRNG as the entropy-source in a true RNG. You should be able to
use entropy (and stat tests) to measure the source entropy after
digitization.
You cannot measure entropy
John S. Denker [EMAIL PROTECTED] writes:
Source -- Digitizer -- Simple hash -- Whitener (e.g., DES)
OK, we have DES as an example of a whitener.
-- Can somebody give me an example of a simple hash
that performs irreversible compression of the required
kind?
I can give you a number of
Eugen Leitl wrote:
... framegrabber with a 640x480 24 bit/pixel camera. It doesn't
compress, is rather noisy, and since self-adjusting I get the maximum
entropy at maximum darkness.
OK. Evidently it's dominated by thermal noise, not to
be confused with the Poisson noise recently featured
On Mon, 22 Jul 2002, John S. Denker wrote:
David Honig wrote yet another nice note:
I'm not trying to be dense, but I'm totally not
understanding the distinction here. The following
block diagram is excellent for focussing the discussion,
(thanks):
Source -- Digitizer -- Simple hash --
On Tue, 23 Jul 2002, John S. Denker wrote:
-- I am told (but don't understand) that there might exist
a weaker hash that somehow does require whitening. This
is the point of the conversation. Please address this
point if you can.
Perhaps they were referring to something like
David Honig wrote:
The thread here has split into QM True Randomness and
what do you need to build a true RNG...
Yup.
Specifically: The executive summary of the
principles of operation of my generator is:
-- use SHA-1, which is believed to be resistant
to collisions, even under
At 04:24 PM 7/22/02 -0400, John S. Denker wrote:
For the humor-impaired, let me point out that the lava
lamp is a joke. What it conspicuously lacks is a proof
of correctness -- that is, a nonzero lower bound on the
entropy rate of the raw data.
Yes, it is a joke. However, it is also a
David Honig wrote yet another nice note:
So work in a Faraday cage...
Tee, hee. Have you ever worked in a Faraday cage?
Very expensive. Very inconvenient.
Depending on what whitening means; see below.
You can imagine simple-hashing (irreversible compression)
as distinct from