Cryptography-Digest Digest #898, Volume #8 Wed, 13 Jan 99 11:13:04 EST
Contents:
Re: Practical True Random Number Generator (Mok-Kong Shen)
Re: Metaphysics Of Randomness (Alan DeKok)
Re: On the Generation of Pseudo-OTP (R. Knauer)
Re: HIGH ENTROPY ENCRYPTION IS A MUST!! (Alan DeKok)
Re: On the Generation of Pseudo-OTP (R. Knauer)
Re: On the Generation of Pseudo-OTP (Mok-Kong Shen)
Re: Practical True Random Number Generator (R. Knauer)
Re: On the Generation of Pseudo-OTP (R. Knauer)
Re: On the Generation of Pseudo-OTP (Mok-Kong Shen)
----------------------------------------------------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Practical True Random Number Generator
Date: Wed, 13 Jan 1999 15:19:30 +0100
R. Knauer wrote:
>
> On Wed, 13 Jan 1999 14:06:13 +0100, Mok-Kong Shen
> <[EMAIL PROTECTED]> wrote:
>
> >Since I am not physicist, I have to pose stupid questions.
>
> Why would you say that? It is very easy sometimes for
> non-practitioners to ask brilliant questions that practitioners would
> never think of because they are prejudiced. All forward movement in
> science comes about when someone questions the status quo, and that is
> usually done by people who have not snuggled down comfortably in the
> middle of the herd.
>
> >For the commonly employed radioactive materials, over the time spans
> >in which they are usually employed, how much does this decay effect
> >influence (on average) the ratio of (two) measurements of successive
> >intervals demarcated by the time points at which the detector registers
> >the particles in question?
>
> You would have to do an analysis like the one presented earlier today
> by one poster.
>
> >Is this effect smaller than, equal to, or greater in order of
> >magnitude than the effect of errors of the apparatus (those that are
> >commonly employed in such experiments) and the effect of errors in
> >utilizing radio time signals?
>
> That obviously depends on the design and performance of the apparatus.
>
> >If the first case applies and
> >the decay effect is very much smaller, then we can well neglect
> >the decay effect in the present context (i.e. the suggestion of
> >KloroX of 12 Jan to modify the program code to remove the bias
> >for two successive intervals would be unnecessary).
>
> You need to provide a proof for that.
Since I am not a physicist, I do not have the needed knowledge, in
particular the data, to prove or disprove that. That's why I asked
physicists to provide the magnitudes of such errors in real-world
physical experiments (well, for the present debate let's say the
most expensive and hence most sophisticated experiments of the type
in question that have been done to date). If the
magnitude of the decay effect is much less, then the proof is
immediate, isn't it?
So which of the readers who are physicists would be kind enough
to give the required data?
> There are other errors besides the reduction in the amount of the
> radioisotope present over time. Whatever these might be, ranging from
> detector latency to electronic drift in the timing circuits, the
> technique under consideration removes bias in the output. It is
> difficult to parameterize these systematic errors without knowing more
> about the specific design of the device.
>
> What is important is to recognize that such systematic errors do
> exist, so it is necessary to do something about them regardless of
> their significance in any particular design. Better safe than sorry,
> especially when the cost for that safety is negligible.
>
> BTW, toggling the output like that is similar to the XOR operation,
> which is used to remove bias from bitstreams computationally.
Perhaps I should say I am fond of 'Occam's razor' in the general
sense. If something is not necessary, then drop it. The added
coding suggested by KloroX might be 'theoretically' (or 'pedantically')
satisfying. If it contributes nothing in practice as found by
a real-world error analysis, then why add that complication to the
program?
M. K. Shen
------------------------------
From: [EMAIL PROTECTED] (Alan DeKok)
Subject: Re: Metaphysics Of Randomness
Date: 13 Jan 1999 09:40:33 -0500
In article <[EMAIL PROTECTED]>,
R. Knauer <[EMAIL PROTECTED]> wrote:
> [ ... ]
>One conclusion from these considerations is that a number that is
>generated algorithmically, no matter what the algorithm might be, is
>not a true random number - because such a number always has some
>reason for its existence, a reason that you could possibly discover if
>you worked at it long enough or hard enough. That is why only the
>TRNG-based OTP cryptosystem is provably secure - all other systems
>have an underlying calculable reason behind their encryption schemes.
                    ^
                    theoretically
If you're talking about the limits of mathematics and calculation,
it's worth including a discussion on theory versus practice. For
example, analytical methods versus numerically stable methods.
Most mathematical algorithms have analytic forms which assume
infinite precision. Computational versions of these algorithms,
however, are often numerically unstable, and thus nearly useless in
everyday life. The solution is to switch to a mathematically uglier
method which is numerically stable.
In this context, the theoretically "underlying calculable reason"
may still be incalculable in real life.
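As a small illustration (my own example, nothing from the quoted
post), take the textbook quadratic root formula: exact on paper, but
catastrophically cancelling in floating point unless reformulated:

/* My own example, not from the quoted post: the two real roots of
   a*x^2 + b*x + c when b^2 >> 4*a*c.  The textbook formula is exact
   on paper but cancels catastrophically in double precision for the
   small root; the reformulated version is numerically stable. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    double a = 1.0, b = 1.0e8, c = 1.0;   /* roots ~ -1e8 and ~ -1e-8 */
    double s = sqrt(b * b - 4.0 * a * c);

    /* analytic formula: (-b + s) subtracts two nearly equal numbers */
    double naive_small = (-b + s) / (2.0 * a);

    /* stable reformulation: never subtract nearly equal numbers */
    double q = -0.5 * (b + copysign(s, b));
    double stable_small = c / q;

    printf("naive  small root: %.17g\n", naive_small);
    printf("stable small root: %.17g\n", stable_small);
    return 0;
}

The "ugly" version gets the small root right to full precision; the
analytic one loses roughly half its digits.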
>That means that almost all of classical cryptography (the OTP being
>the sole exception) is not about security but is about obscurity,
>because if the cryptanalyst were able to find the "reason" behind a
>given encryption scheme, either formally or experimentally, he could
>break the cipher.
Agreed. Many cryptographic systems have been broken because people
find simpler "reasons" for the output than the 2^k complexity of the
key size.
This is an interesting problem. Is there a theoretically useful,
but practically useless encryption algorithm which *provably* is as
complex as it claims to be? Such an algorithm could be used as a
baseline for testing the strengths of new algorithms.
Right now, cipher design is a "black art". Narrowing the bounds of
the art may enable it to become better understood, and more of
a math/science/engineering discipline.
>It is indeed true that "obscurity does not imply security", in more
>ways than one.
If it's obscure enough to be practically incalculable, the obvious
assumption is that it's secure, too. While history has proven this
assumption wrong in some cases (e.g. factoring large numbers), it's
better than the alternative.
Alan DeKok.
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Wed, 13 Jan 1999 14:25:29 GMT
Reply-To: [EMAIL PROTECTED]
On 13 Jan 1999 13:29:28 GMT, [EMAIL PROTECTED] (John Briggs)
wrote:
>If you pick the number first, then the compression algorithm then
>you can compress down to arbitrarily small sizes.
Not if you must use that algorithm to reproduce the number, as is
required in Chaitin's algorithmic complexity theory. Reduction is one
criterion and reproducibility is another.
>Note that the Chaitin's notion of algorithmic complexity is equivalent
>to choosing the _decompression_ algorithm first. Then the number.
>And then finding the minimum string that decompresses to that number.
I surely did not get that from a reading of his papers.
He states clearly that the complexity of a number can be reduced if
the algorithm accomplishes two things simultaneously:
1) It compresses the number.
2) It reproduces the number.
For example, let's say the number is 111...1, of N bits in length. If
I employ the following algorithm:
for (i = 0; i < N; i++) printf("1");
I have significantly reduced the complexity of the number - that is, I
have "compressed" it, or algorithmically reduced it - while providing an
algorithm to reproduce it at the same time.
The reason this algorithm reduces the complexity of the number
is that it takes only log2(N) bits to represent the coding of N in the
for( ) loop. The additional bits of overhead for the rest of the code
become negligible as N gets larger.
Therefore I have:
1) algorithmically "compressed" a number of size N to a number (the
code fragment itself) of size log2(N) + C, and
2) I have reproduced the number.
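Spelled out as a complete program (a sketch to make the counting
concrete - only the literal for N grows with the size of the number,
everything else is constant overhead):

/* A sketch of the same idea as a complete program that reproduces the
   N-bit string 111...1.  Only the literal for N grows (about log2(N)
   bits); the rest is the constant overhead C. */
#include <stdio.h>

int main(void)
{
    long N = 1000000;                /* the only part that grows with N */

    for (long i = 0; i < N; i++)
        putchar('1');
    putchar('\n');
    return 0;
}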
Calculating the bit parity of a number is not relevant here.
>No. For any number, there is an algorithm that compresses it down
>to a length of (at most) one. That's any number. Not just some
>numbers. Not just a trivial minority. Not just an overwhelming
>majority. Any number.
You cannot do that to the uncomputable real numbers, of which there
are uncountably many in any finite interval.
But if you don't believe that, try compressing Chaitin's Omega and
show us what you get. We await your result with great anticipation.
HINT: Chaitin constructed Omega deliberately so it could not be
algorithmically reduced.
Bob Knauer
"Since the politician never believes what he says, he is surprised
when others believe him."
--Charles De Gaulle
------------------------------
From: [EMAIL PROTECTED] (Alan DeKok)
Subject: Re: HIGH ENTROPY ENCRYPTION IS A MUST!!
Date: 13 Jan 1999 09:48:58 -0500
In article <77hvuq$oa4$[EMAIL PROTECTED]>,
Markus Kuhn <[EMAIL PROTECTED]> wrote:
>
>Entropy is a pretty lousy concept for measuring
>security of crypto parameters, a very commonly
>made mistake: [ ... ]
>
>A key with 513 bit entropy sounds pretty secure, yet you can
>easily break 50% of all messages by guessing that they
>have been encrypted with the all zero key. Entropy is
>somewhat irrelevant here, the maximum probability
>(or minimum decision content) of every possible key
>is of much more concern for security.
Is there a standard metric for measuring bias? e.g. something like:
bias = (key size - entropy) / (key size)
A bias of '0' would describe a uniform system, and a bias of
'1' would describe a completely biased system. In your example, the
bias is about 50%, which is pretty bad.
Hmm... by this metric, RSA keys would also have a high bias.
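To check the arithmetic on the quoted example (assuming a 1024-bit
key space - my guess, since the key size was snipped above - with the
all-zero key used half the time and the other keys uniform):

/* Back-of-the-envelope check of the quoted example.  ASSUMPTION (mine,
   the key size was snipped above): a 1024-bit key space where the
   all-zero key is used with probability 1/2 and all other keys share
   the remaining probability uniformly. */
#include <stdio.h>

int main(void)
{
    double k = 1024.0;                       /* key size in bits (assumed) */

    /* H = -(1/2)*log2(1/2) - (2^k - 1) * p * log2(p), with
       p = (1/2)/(2^k - 1); the second term is (1/2)*log2(2*(2^k - 1)),
       which is almost exactly (1/2)*(k + 1). */
    double entropy = 0.5 + 0.5 * (k + 1.0);  /* about 513 bits */
    double bias = (k - entropy) / k;         /* about 0.5      */

    printf("entropy ~ %.1f bits, bias ~ %.2f\n", entropy, bias);
    return 0;
}

which reproduces the ~513 bits of entropy and a bias of about 0.5.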
Alan DeKok.
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Wed, 13 Jan 1999 14:35:58 GMT
Reply-To: [EMAIL PROTECTED]
On 13 Jan 1999 13:41:37 GMT, [EMAIL PROTECTED] (John Briggs)
wrote:
>You made a claim that "the best compression algorithm is the one that
>produces the smallest output".
And, in the context of Chaitin's algorithmic complexity theory, that
compression algorithm must reproduce the original number, or it is not
relevant. That is made manifestly clear in Chaitin's papers.
>This claim leaves the input string
>unspecified. That's not good.
Why is that not good? Chaitin's theory applies to any number.
>So I asked you to clarify. Which
>input string?
Chaitin's theory applies to any input string - it does not matter.
But keep in mind that there are many numbers that cannot be
algorithmically reduced, not even by one bit. Chaitin's Omega is one
such number.
>When evaluating a compression algorithm, do you choose the compression
>algorithm first and then the number to be compressed? Or do you
>first choose the number to be compressed and then the algorithm?
In Chaitin's theory, the algorithm can be chosen after you have given
the number. But whatever algorithm is chosen, it must reproduce the
number.
>It's a simple question. The name "Chaitin" does not appear in
>the question. The term "algorithmic complexity" does not appear
>in the question.
It surely did when I first brought this up. I am not talking about
just any compression, I am talking about the specific theory of Greg
Chaitin.
>I'm not asking Chaitin. I'm not asking me. I'm asking you.
And I have given my response. It would help if you would come into
these discussions from the beginning.
>But I'll bet you don't even understand the question.
Now, that is a completely unwarranted ad hominem, which has just made
you totally irrelevant for further discussions.
At least I have read Chaitin's papers, something you obviously
have never done.
<plonk>
Bob Knauer
"Since the politician never believes what he says, he is surprised
when others believe him."
--Charles De Gaulle
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: On the Generation of Pseudo-OTP
Date: Wed, 13 Jan 1999 15:05:21 +0100
R. Knauer wrote:
>
> On Tue, 12 Jan 1999 12:26:32 +0100, Mok-Kong Shen
> <[EMAIL PROTECTED]> wrote:
>
> >So there is not much practical value in insisting on perfect security,
> >on obtaining (ideal) OTP.
>
> There is practical value in using the "perfect OTP" (as you call it)
> as a standard, just as there is practical value in using a Perfect
> Circle as a standard.
The context clearly indicates that my 'obtaining (ideal) OTP' means
actually generating a concrete (ideal) OTP, not in the sense of
using it as a theoretical concept in connection with the (also
theoretical) concept of perfect security.
M. K. Shen
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Practical True Random Number Generator
Date: Wed, 13 Jan 1999 15:19:15 GMT
Reply-To: [EMAIL PROTECTED]
On Wed, 13 Jan 1999 15:19:30 +0100, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:
>So which of the readers who are physicists would be kind enough
>to give the required data?
I believe one poster has already supplied an example. But you do not
need to be a physicist to do the analysis for yourself - just use the
radioactive decay law.
N(t) = N(0) exp( - t/T), where T is a parameter related directly to
the half life T1/2 of the isotope. In fact you can figure out what
that relationship is by noting: N(T1/2) = 1/2 * N(0).
But you must provide other details to do the analysis - for example,
details about your detector and associated electronics. If the half
life is short compared to the response time of the detector, then
there will be errors due to latency, which can be cancelled by
debiasing the output.
In general, such a detailed analysis gets rather complicated, so it is
best just to put de-biasing in the design and go on from there.
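To make the idea concrete, here is a toy simulation (my own sketch,
not KloroX's actual code): exponentially distributed inter-decay
intervals per the decay law above, one raw bit per pair of intervals,
and a simple von Neumann style de-biasing pass, which is one way to do
the kind of de-biasing I mean.

/* A toy simulation (mine, not KloroX's code) of the generator under
   discussion: exponential inter-decay intervals, one raw bit per pair
   of intervals, then von Neumann de-biasing. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

/* Draw an exponentially distributed inter-decay interval with mean T.
   The decay-law parameter T relates to the half-life as T = T1/2 / ln 2. */
static double interval(double T)
{
    double u = (rand() + 1.0) / ((double)RAND_MAX + 2.0);   /* u in (0,1) */
    return -T * log(u);
}

int main(void)
{
    double T_half = 30.0;                /* half-life, arbitrary time units */
    double T = T_half / log(2.0);
    int prev = -1;                       /* pending raw bit for de-biasing   */
    int emitted = 0;

    srand(1);
    while (emitted < 64) {
        /* one raw bit: compare two successive inter-decay intervals */
        double t1 = interval(T);
        double t2 = interval(T);
        int raw = (t1 > t2) ? 1 : 0;

        /* von Neumann de-biasing on pairs of raw bits:
           01 -> emit 0, 10 -> emit 1, 00 and 11 -> discard.
           In this idealized simulation the raw bits are already unbiased;
           in a real device, drift and latency are what make this worthwhile. */
        if (prev < 0) {
            prev = raw;
        } else {
            if (prev != raw) {
                putchar('0' + prev);
                emitted++;
            }
            prev = -1;
        }
    }
    putchar('\n');
    return 0;
}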
>Perhaps I should say I am fond of 'Occam's razor' in the general
>sense. If something is not necessary, then drop it.
Just remember Einstein's warning - that you cannot lop off so much
that the resulting theory doesn't provide an explanation.
Occam's Razor modified by Einstein states that the correct explanation
is the simplest one that is consistent with the facts, not just the
simplest one.
>The added
>coding suggested by KloroX might be 'theoretically' (or 'pedantically')
>satisfying. If it contributes nothing in practice as found by
>a real-world error analysis, then why add that complication to the
>program?
The addition of debiasing in the design of the TRNG could be looked
upon as "good design practice", like putting some extra rivets in the
construction of a steel bridge. As long as it does not cost all that
much and does not cause more problems than it purports to solve, then
it can do no harm and might do some good.
There is an art to design, one that is based on experience. Designers
don't spend lots of time justifying their designs to the nth decimal
place - they just design their circuits based on good design practice.
In the case of this radioactive decay TRNG, which requires an interval
timer, it is good design practice to de-bias the output just because
it is good design practice.
Bob Knauer
"Since the politician never believes what he says, he is surprised
when others believe him."
--Charles De Gaulle
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Wed, 13 Jan 1999 15:37:22 GMT
Reply-To: [EMAIL PROTECTED]
On Wed, 13 Jan 1999 14:59:01 +0100, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:
>> The statement you made above has been challenged many times before.
>Please give a pointer to literature. I'll be interested to know more
>of the issue.
I already have, several times in fact.
I would recommend you start with Hofstadter's "Godel, Escher, Bach",
then read Roger Penrose's two recent books, then study Greg Chaitin's
papers. Although that is certainly far from exhaustive, each is
accessible to the layman.
There are several other books to read, but if you read those you will
get the general idea. Another area to keep in mind is Fuzzy Set Theory
(see Bart Kosko, "Fuzzy Thinking" for a layman's book and G. J. Klir
and B. Yuan, "Fuzzy Sets and Fuzzy Logic", for a hardcore treatment)
which is just emerging on the scene.
> Absolute assertions are possible if they are deduced from axioms
> by logical rules.
What do you mean by an assertion being "possible" - do you mean that
it is true?
>This means if you find a true absolute assertion then it must have
>been deduced from axioms by logical rules.
No! You still miss the point. There are assertions that are true which
cannot be proven using logical rules.
For example, a given computer program either halts or it does not
halt, but in general you cannot prove which it does using logical
rules. The only way you can know is to run it: if it halts you know
that it halts, and if it hasn't halted yet you can only suspect that
it might never halt.
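A concrete illustration (my own example): nobody has proved whether
the following loop halts for every starting value, yet for any
particular value you can run it and watch what happens:

/* My own illustration: nobody has proved from first principles whether
   this loop halts for every starting value n (the Collatz problem),
   although for any particular n tried so far it does halt. */
#include <stdio.h>

int main(void)
{
    unsigned long long n = 27;       /* try other starting values */

    while (n != 1)
        n = (n % 2 == 0) ? n / 2 : 3 * n + 1;

    printf("halted\n");
    return 0;
}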
You could analyze it with your mind, but even that is not infallible,
as most of the code from MicroShaft demonstrates so convincingly.
>This does NOT imply that
>all true absolute assertions CAN be deduced from axioms by logical
>rules.
I am confused. You seem to be saying the same thing and its
contradictory opposite.
What makes "a true absolute assertion" (statement one above) any
different from "all true absolute assertions" (statement two above)?
Bob Knauer
"Since the politician never believes what he says, he is surprised
when others believe him."
--Charles De Gaulle
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: On the Generation of Pseudo-OTP
Date: Wed, 13 Jan 1999 15:57:54 +0100
R. Knauer wrote:
>
> Formal mathematics is not as "rigorous" as once believed at the turn
> of the century.
Perhaps we have somewhat different interpretations of the word
'rigorous'. In my opinion all formal mathematical methods are rigorous.
The only thing that Goedel and others found is that these are not
'all-powerful', i.e. their capability is limited, not infinite.
>
> >But if that phrase 'If the .... random' cannot be true in the real
> >world,
>
> But it can be - radioactive decay is 100% random.
You can 'define' it that way. Otherwise, since the definition
of randomness cannot be practically verified (because of Kolmogorov
complexity etc.), that statement is not verifiable.
> That's as random as it gets.
No problem in accepting this. The statement is however a 'relative'
assertion!
> The only way you can know for sure that your bitstream is even close
> to truly random is to look at the generator. Therefore you need some
> method to analyze each proposed algorithmic generator on a case by
> case basis to decide just how close to random the output is.
No. You can use statistical tests. I prefer Maurer's universal test.
There is a battery of other tests, if you can afford the
expenditure.
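To make concrete what I mean, here is a rough sketch of Maurer's
universal test (my own simplified coding of the procedure in Maurer's
1992 paper; the parameter choices below are mine, and a serious use
would also compute the expected value and variance from the paper and
apply a proper threshold):

/* A rough sketch of Maurer's universal statistical test (my own
   simplified coding of the procedure in Maurer's 1992 paper).
   Block length L = 8, Q = 10 * 2^L initialization blocks; parameter
   choices are mine.  A serious implementation should also check that
   every block value occurs during initialization and compute the
   variance/threshold from the paper. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define L 8                   /* block length in bits            */
#define Q (10 * (1 << L))     /* number of initialization blocks */

/* bits[] holds 0/1 values, n of them.  Returns the test statistic,
   which should be near 7.18 for L = 8 if the input is close to random. */
static double maurer_statistic(const unsigned char *bits, size_t n)
{
    size_t nblocks = n / L;
    if (nblocks <= (size_t)Q)
        return -1.0;                       /* not enough data */

    size_t K = nblocks - Q;
    long last[1 << L] = {0};               /* last occurrence of each block value */
    double sum = 0.0;

    for (size_t i = 1; i <= nblocks; i++) {
        unsigned value = 0;
        for (int j = 0; j < L; j++)        /* assemble the i-th L-bit block */
            value = (value << 1) | bits[(i - 1) * L + j];

        if (i <= (size_t)Q) {
            last[value] = (long)i;         /* initialization phase */
        } else {
            sum += log2((double)((long)i - last[value]));
            last[value] = (long)i;
        }
    }
    return sum / (double)K;
}

int main(void)
{
    /* demo input: bits from rand(); in practice one would feed the
       bitstream of the generator under examination */
    size_t n = (Q + 20000) * (size_t)L;
    unsigned char *bits = malloc(n);
    if (bits == NULL)
        return 1;
    srand(12345);
    for (size_t i = 0; i < n; i++)
        bits[i] = (unsigned char)(rand() & 1);

    printf("Maurer statistic (L=%d): %.4f (expect about 7.18 for random input)\n",
           L, maurer_statistic(bits, n));
    free(bits);
    return 0;
}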
>
> If you can provide such methods, formal or experimental, for
> characterizing specific algorithmic generators, please let us know.
> But remember that you cannot apply the criteria to the output streams
> per se - the criteria must be applied to the algorithm itself.
>
> >By hardware noises I mean noises (randomness) from physical processes.
> >If you exclude these, where do you obtain your 'TRNG'?
>
> Radioactive decay is usually not considered "hardware noise".
But the fluctuations you derive from the data are noise, or are meant
to be noise.
>
> Removing bias from bitstreams can be done reliably with algorithms,
> such as Knuth's famous one. See FIPS 140-1 and Schneier.
>
> Supposedly it is correlation that is the difficult thing to remove
> algorithmically from bitstreams.
Do you understand bias to be simply deviation from the mean value,
or something else? If you understand the term in the general (proper)
sense, then in the present context anything deviating from white noise
is a bias, in particular non-negligible auto-correlations.
>
> >Hardware noise may indeed be the best source in the practical sense.
>
> What do you mean by the term "hardware noise"?
Said above. Any fluctuating time series (supposed to approach white
noise) that is obtained from a physical measurement.
>
> >But that anyway doesn't preclude an 'intended approximation to an ideal
> >OTP' from being, say, the second best source and thus having utility in
> >practice because of cost advantages, etc.
>
> I agree that such sources should be considered. But I also would
> insist that there be some kind of certification that such sources are
> secure to some level, otherwise why use them?
>
> In my estimation, the trick is to find a way to remove all correlation
> that is inherent in these sources. Using algorithmic means to remove
> correlation is suspect because algorithms introduce their own form of
> correlation into bitstreams - so the expert cryptographers around here
> tell us (and I have no reason to doubt their purported wisdom).
>
> If Strong Mixing (FIPS 140-1) is the proper method for removing
> correlation, then I wonder if anyone has proven the level of security
> for such methods.
I initiated this thread with the sole purpose of gathering experts'
recommendations on removing auto-correlations.
>
> >I indicated certain viable techniques in the original article.
> >Whether they are good has ultimately to be verified by experiment.
> >But I thought that there is certainly value in knowing what merits the
> >individual techniques (or additional proposals) may possibly have
> >according to experts' evaluations before doing extensive experiments,
> >which could be expensive without the guidance of experts' opinions.
> >That's why I initiated the present thread.
>
> I would like to see you give reasons why you believe those techniques
> are secure, even if they are not "perfectly secure".
Basically this: The more your sequence approaches white noise, the
less chance the analyst has to do inference. So if the techniques
turn out to be effective in reducing the auto-correlations to
negligible values, then they are practically useful for achieving
a plausible (never error-free) level of security.
>
> Until you do, or appeal to someone else who can, it just looks like
> you grabbed a few methods out of a hat and slapped them together. You,
> I and all the rest of the crypto world, expert and amateur alike, know
> that such a practice is extremely dangerous. Some of the most complex
> algorithms ever invented, even those concocted by Giants in the crypto
> field, have turned out to be vulnerable to simple attacks. Schneier
> has any number of instances of such in his main book - and we are
> talking about Big Giants too.
Why not try, instead of patiently waiting for a message from God
telling one some absolutely certain way to accomplish the job?
>
> I do not believe you can hide behind this euphemism "approximation"
> much longer. At some point you are going to have to come clean and
> show us (or get someone to show us) just how much of an
> "approximation" your proposed methods are.
I referred in many places to statistical tests, in particular the
auto-correlation coefficients.
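To be explicit, by auto-correlation coefficients I mean something like
the following minimal sketch (my own coding):

/* A minimal sketch (mine): lag-k auto-correlation coefficient of a bit
   sequence, with bits mapped to +1/-1.  For a sequence close to white
   noise the coefficients should be near zero for every lag k > 0. */
#include <stdio.h>
#include <stdlib.h>

static double autocorrelation(const unsigned char *bits, size_t n, size_t k)
{
    double sum = 0.0;
    for (size_t i = 0; i + k < n; i++) {
        int a = bits[i]     ? 1 : -1;
        int b = bits[i + k] ? 1 : -1;
        sum += (double)(a * b);
    }
    return sum / (double)(n - k);
}

int main(void)
{
    enum { N = 100000 };
    static unsigned char bits[N];

    /* demo input from rand(); in practice, feed the candidate sequence */
    srand(7);
    for (size_t i = 0; i < N; i++)
        bits[i] = (unsigned char)(rand() & 1);

    for (size_t k = 1; k <= 4; k++)
        printf("lag %zu: %+.4f\n", k, autocorrelation(bits, N, k));
    return 0;
}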
> By attaching your proposals to a loaded term like "pseudo-OTP", you
> have tried to characterize them as "approximations" to the OTP, with
> the implication that they are only "slightly" less perfect than the
> OTP. To bolster that, you try to point out that even TRNGs are not
> perfect, therefore it is unfair to claim that your proposals are no
> good because they are imperfect "approximations".
That's why I was careful enough to use the term 'intended
approximation'. O.k?
>
> But all that is disingenuous, since it has been argued by expert
> cryptographers that all stream ciphers except the OTP are very poor in
> terms of security. If you claim that your stream ciphers are very good
> "approximations" to the OTP, then you will have to provide reasons for
> us to accept that claim.
If statistical tests can't show unacceptable deviations from white
noise, that should be enough. Your physical devices ALSO can't offer
anything better than that in 'demonstrating' the security they provide!
M. K. Shen
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************