Cryptography-Digest Digest #873, Volume #8 Sat, 9 Jan 99 18:13:04 EST
Contents:
Re: On the Generation of Pseudo-OTP ("Trevor Jackson, III")
Re: On the Generation of Pseudo-OTP ("Trevor Jackson, III")
Re: On the Generation of Pseudo-OTP (R. Knauer)
Twofish vs DES? ("TERACytE")
DES? ("TERACytE")
Re: On the Generation of Pseudo-OTP (Paul L. Allen)
Re: Secure Method Intact ("Trevor Jackson, III")
Re: Practical True Random Number Generator ("Trevor Jackson, III")
----------------------------------------------------------------------------
Date: Sat, 09 Jan 1999 16:26:00 -0500
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: On the Generation of Pseudo-OTP
R. Knauer wrote:
> There are cryptographers who would disagree that a hash is suitable to
> "randomize" the output of a faulty TRNG. The reason is that all
> algorithmic procedures introduce non-random characteristics such as
> periodicity and correlation by their very nature.
Wrong. The use of the universal term "all" is a danger signal. There are
algorithmic procedures that introduce zero periodicity and correlation. I.e.,
there are algorithms that do not reduce the entropy of the data.
> I would have thought that the scheme for removing correlation that
> Schneier discusses in his main book would have done the trick, since
> it involved the simple "mixing" of two or more streams. But someone
> who claimed to know what he was talking about said that is not the
> case.
>
> >While this system is not perfect it is much better than everything that
> >could be done in software: Whatever is done using software is
> >reproducable.
>
> So is hashing. Once the attacker discovers that you are hashing the
> output of a faulty TRNG, he can presumably use that against you.
>
> Bob Knauer
That depends on the quality of the hashing. There is no fundamental reason
why a biased TRNG cannot be massaged into a good one. Let us assume that
biased means there is less than 1 bit of entropy per bit of data. Let us
assume that a good TRNG produces data bits each containing one bit of entropy,
as far as we can tell.
Given these assumptions, it is certainly possible to use the raw output of a
flawed TRNG to produce a smaller amount of perfectly entropic (unpredictable)
data. A key to this process is the nature of the flaw in the biased TRNG.
Presumably it must be known, or the attacker could not use it. If it is known,
it can be accounted for.
If the flaw is not known we have a separate case in which we cannot correct
the flaw, but someone someday might exploit it. This is related to the problem
of proving randomness, which can only be accomplished in the negative.
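One well-known construction of this kind, not named in the thread but a natural illustration, is von Neumann's debiasing trick: if the flaw is a fixed per-bit skew and successive bits are independent, pairing the bits and keeping only the mismatched pairs yields exactly unbiased output at a reduced rate. A minimal sketch in Python (the 70% bias is an assumed example figure):

```python
import random

def von_neumann_extract(bits):
    """Pair up bits; emit 0 for (0,1), 1 for (1,0), discard (0,0) and (1,1).
    For independent bits with any fixed bias p, the two surviving pair
    types each occur with probability p*(1-p), so the output is unbiased."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

random.seed(1)  # deterministic demo
biased = [1 if random.random() < 0.7 else 0 for _ in range(100_000)]  # 70% ones
unbiased = von_neumann_extract(biased)
# Only ~42% of the pairs survive (2 * 0.7 * 0.3), but the survivors
# are ~50/50 -- the known bias has been "accounted for" at a rate cost.
```

Note the caveat: this corrects a fixed bias only if the bits really are independent. Correlated flaws need different treatment, which is exactly the "unknown flaw" problem above.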
------------------------------
Date: Sat, 09 Jan 1999 16:33:59 -0500
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: On the Generation of Pseudo-OTP
Paul L. Allen wrote:
> Yes, if I see the output is a million zeroes I will suspect that the
> generator is broken. But it *could* really be random output and the
> very next term in the sequence is non-zero. You have no way of knowing
> for sure - that sequence of a million zeroes is no less likely than
> any other sequence of a million digits.
Well, are there other numbers we should test for? If the one million bits
are the binary expansion of pi, should we expect a problem? If the one
million binary digits are (N-1)/N ones for N>2, should we expect a problem?
The statistical answer based on your comment re the probability of each
possible sequence of a million bits is unrelated to the actual
probabilities. Anyone building a generator who sees flatline output of
any significant length is going to suspect a stuck-at fault. Why? Because
the probability of a stuck-at fault is greater than the probability of a
random flatline output.
Now, is the probability of pi in binary larger than the probability of
some fault generating that as a special value? I doubt it, but the nature
of the generator would have to be considered before the question is
answerable.
Point is that there are two standards here, and at all times we are
required to use the more pessimistic (re Murphy) of the two.
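The stuck-at comparison can be put in rough numbers. A sketch, with the hardware fault rate an assumed illustrative figure rather than a measured one:

```python
from math import log10

# Probability that a fair bit source emits n zeros in a row is 2**-n.
n = 1_000_000
log10_p = -n * log10(2)  # about -301030
print(f"P(one million zeros from a fair source) ~ 10^{log10_p:.0f}")

# Even a very reliable part -- say one stuck-at fault per 10**9
# device-hours (assumed figure) -- is more probable by hundreds of
# thousands of orders of magnitude.  Murphy's (pessimistic) standard
# therefore says: suspect the hardware.
```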
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Sat, 09 Jan 1999 22:02:58 GMT
Reply-To: [EMAIL PROTECTED]
On Sat, 09 Jan 1999 16:01:04 -0500, "Trevor Jackson, III"
<[EMAIL PROTECTED]> wrote:
>So use the RNG convention. POTP for the fake and TOTP for the real
>thing. We accept the fact that there are good PRNGs and bad PRNGs.
>Similarly we should expect there to be more or less good POTPs. It's a
>terminology issue not a fundamental issue.
I do not care what terminology is employed as long as it is not
misleading. In the case of the term "pseudo-OTP", I consider it very
misleading because it implies that there are good OTPs and not so good
OTPs.
There is only one possible OTP cryptosystem that deserves that name,
which is referred to simply as the OTP cryptosystem because there is
no other kind. All pretenders to the OTP are called Stream Ciphers.
The term "pseudo-OTP" is an oxymoron.
>No. From the first principles of information theory, information --
>entropy -- is identical to unpredictability.
You must be careful not to confuse unpredictability in crypto with
obscurity.
For example, according to Greg Chaitin (op. cit.), a sequence that is
irreducible in size algorithmically is random in his theory. That is,
there is no way to decide formally what each of the bits is supposed
to be.
That seems to imply that if I take some plaintext and compress it with
the best compression algorithm available, that it is therefore
unpredictable. But we know better, because all that one needs to do is
uncompress it and the plaintext is exposed.
In fact it should be possible using text compression to achieve 1 bit
of entropy per bit of compressed data. But that hardly means the text
is unpredictable. High entropy may be a necessary condition, but it
appears not to be sufficient.
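The compression point is easy to demonstrate. A sketch using zlib as a stand-in for "the best compression algorithm available" (the plaintext is a toy example):

```python
import zlib

plaintext = b"attack at dawn " * 20
packed = zlib.compress(plaintext, 9)

# The packed bytes look statistically dense -- close to the entropy
# ceiling for this source -- yet they are perfectly predictable to
# anyone who knows the algorithm:
assert zlib.decompress(packed) == plaintext
assert len(packed) < len(plaintext)
```

High measured entropy per bit is thus no guarantee of cryptographic unpredictability: the "key" here is merely the public decompression algorithm.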
>A really good TRNG has 1 bit
>of entropy per bit of data.
But is that the same as saying that a TRNG is capable of outputting
every possible sequence of a given length equiprobably? A hash
algorithm can increase entropy to 1 bit in principle, yet it is not
totally secure for crypto.
If you can say the same about a stream cipher, then it qualifies as a
suitable generator for OTPs. But can you say that about stream
ciphers? Not in general you can't.
That is the whole purpose of this thread - to explore alternatives to
the TRNG. I am curious to know if a "digit extraction generator" would
qualify.
> But that is based on very careful sampling of
>the physical process. Even quantum events indicate correlations.
Not all do.
Some kinds of radioactive decay are completely uncorrelated. They are
the result of "spontaneous emission" caused by "vacuum fluctuations",
the latter of which are not real external events. The same is true of
spectral emission from the excited state of an atom.
>Some of which appear to violate the speed of light rule.
Those are not *real* - they are "virtual" processes. Real external
world processes must obey special relativity.
There is an area where such an apparent violation appears to be
involved with non-local phenomena, such as in the experiments of Alain
Aspect. But that is still quite controversial, as it is mixed in with
Bell's Theorem and Hidden Variables.
>Now text also has entropy. Raw text has less than one bit of entropy per
>bit of data, but there is entropy -- unpredictability -- available in the
>text. If you choose a poor sampling mechanism or over-sample the data
>you'll get less entropy per bit than the desired 1:1. But there is no
>question that entropy is available.
The patterns in text are sufficient to break book ciphers trivially.
>Thus, at the fundamental level there is no difference. Any differences
>are part of the respective sampling methods not the source of the entropy.
I do not see what you are saying. You seem to be saying that there is
no fundamental difference between an OTP and a book cipher. Yet there
is most definitely a fundamental difference - the OTP is provably
secure and the book cipher is trivially breakable.
>Please amplify the nature of the error in light of the comments above re
>terminology and fundamentals.
I have tried many times. Please refer to my comments above.
>> We already discussed/debated that whole matter a year ago and
>> concluded that a text cipher, no matter how you buggered it, is still
>> fundamentally different from an OTP generated by a TRNG.
>Wrong,
Then we wait with great anticipation for you to show us otherwise. But
I warn you that this road is littered with the corpses of many failed
attempts over the years.
>If this conclusion is correct then we have an important addition to the
>science of information theory. What are the units by which we measure the
>amount of magical correlation in a text stream? Do newspaper text, poetry,
>source code, and TV scripts all share the same degree of this special
>correlation?
Please clarify what you mean here. You seem to be saying that text
does not have any correlations, which is patently untrue. Perhaps some
examples are in order.
>This stuff that is inseparable from text reminds me of the magical fluid
>that was part of all matter and that created heat when rubbed
>(philosgin?). Does it have any other special properties?
Phlogiston, IIRC.
Ironically, as happens all the time in science, such alchemy had a
grain of truth in it. On the way to inventing thermodynamics there was
a theory of "chryos" (a mystical substance which produced cold), that
led to some mathematics which were later incorporated into
thermodynamics.
BTW, did you know that Isaac Newton practiced alchemy in earnest? He
spent inordinate amounts of his time in a fully equipped "laboratory"
working on alchemy, trying to find the philosopher's stone and all. The
theory of gravity that he concocted to explain planetary motion
incorporated a concept straight out of the mysticism of alchemy -
namely, "action at a distance".
Before Newton, there was no such concept in science, only in alchemy. In
fact there was then no concept of the "infinitesimal", which is
central to calculus. There were furious debates over such a concept,
and one was not allowed to speak of something that vanished for fear
of excommunication by the Catholic Church. There is a book on all this
called "Isaac Newton: The Last Sorcerer".
Contemporary scientists will be considered the sorcerers of our era
when viewed by scientists a few centuries in the future. I am certain
they will consider our quantum and relativity theories as just so much
silly mysticism.
Bob Knauer
"We hold that each man is the best judge of his own interest."
--John Adams
------------------------------
From: "TERACytE" <[EMAIL PROTECTED]>
Subject: Twofish vs DES?
Date: Sat, 9 Jan 1999 15:07:17 -0700
Which is better? Twofish or DES?
------------------------------
From: "TERACytE" <[EMAIL PROTECTED]>
Subject: DES?
Date: Sat, 9 Jan 1999 15:06:23 -0700
Does anyone know of where I can get detailed information about DES
weaknesses & shortcomings?
------------------------------
From: [EMAIL PROTECTED] (Paul L. Allen)
Subject: Re: On the Generation of Pseudo-OTP
Date: Sat, 9 Jan 1999 21:26:27 +0000
Reply-To: [EMAIL PROTECTED]
In article <[EMAIL PROTECTED]>
[EMAIL PROTECTED] (R. Knauer) writes:
> On Sat, 9 Jan 1999 17:51:36 +0000, [EMAIL PROTECTED] (Paul L.
> Allen) wrote:
>
> >> We discussed this exact thing at length a year ago and concluded that
> >> what you say is theoretically correct, namely that 000...0 is a valid
> >> output, but it is so unlikely for any practical length sequence that
> >> it must be taken as a sign of malfunction.
>
> >*Any* pattern of output you specify is equally unlikely. All zeroes
> >is as likely, or unlikely as all ones, 123451234512345, 314159265359,
> >2718281828,... Well, they're all equally likely a priori. A posteriori
> >is another matter - whatever turned up, turned up.
>
> That is correct.
>
> The significance of a sequence with all 0s or all 1s is that it is
> indicative of common electronic problems like a shorted or open output
> and serves as prima facie evidence of a likely malfunction.
I'd be looking for a *lot* more indicators of malfunction than just all ones
or all zeros. An open output into high-Z inputs without pull-ups is going to
pick up all sorts of radiated crap and will probably look random. I'd also
be looking for bad screening resulting in power-line and/or computer digital
signal pick-up by internal stages. Something as simple as all zeros or all
ones is just one of many failure modes. An integrating A-D converter that
fails could conceivably produce nice ramps, or aperiodic samples of a
sawtooth that look random.
> >Even ignoring the strange English, I don't know what you're getting at.
>
> You will have to read Chaitin's papers to see what I am getting at.
> Chaitin probes the fundamentals of number theory and asks what are the
> limits of decidability. He is following in the footsteps of Godel and
> Turing. If you want to be well-versed in the topic of randomness,
> Chaitin is a must read.
>
> http://www.umcs.maine.edu/~chaitin/
> http://www.cs.auckland.ac.nz/CDMTCS/chaitin/
From what I could make out of your post, it looked like he used
compressibility as one of his tests. That's not valid on finite samples
of infinite output - think of Huffman compression where you set up
the compression based on "the quick brown fox jumps over the lazy red
dog" and then use it on a real corpus of English work - it would perform
very badly. For compressing English, because it's non-random, you can
get away with approximating by using a large sample to derive your
compression table. For dealing with output from a TRNG you're just
fooling yourself - make the sample a little longer and your compression
table would be less efficient than leaving the random data alone.
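The asymmetry is easy to see empirically. A sketch with zlib standing in for a corpus-trained Huffman coder (the text and the seeded noise are toy inputs):

```python
import random
import zlib

english = b"the quick brown fox jumps over the lazy dog " * 100

random.seed(0)  # deterministic "TRNG output" for the demo
noise = bytes(random.getrandbits(8) for _ in range(len(english)))

# Redundant text compresses dramatically; data that is already random
# is essentially incompressible (zlib may even expand it slightly).
print(len(english), len(zlib.compress(english, 9)))
print(len(noise), len(zlib.compress(noise, 9)))
```

So compressibility is a one-sided test: shrinkage proves non-randomness, but failure to shrink a finite sample proves nothing.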
> >There is nothing I know of in probability theory that says you cannot
> >toss an unbiased coin 100 times in a row and get 100 heads. Not only that,
> >getting 100 heads is neither more nor less likely than any other sequence
> >of heads and tails you care to name.
>
> That is correct. What is your point?
A hundred heads, a million zeroes...
> >> Therefore if a TRNG starts outputting unlikely sequences like 000...0
> >> or 111...1, for any decent length N, that means it is broken, even
> >> though such sequences are theoretically possible.
>
> >I'd *suspect* it was broken. I'd check the circuitry. I'd see what
> >another model produced. But, unlike you, I would not conclude that
> >a TRNG outputting all zeroes *must* be broken.
>
> I think we are saying the same thing. When I say "must" I mean
> "extremely likely", not "necessarily". Your use of the term "suspect"
> is a bit too weak for me, although it is technically correct.
A million zeroes is possible, valid output from a TRNG. I'd suspect
the TRNG of being broken, but the problem is where you draw the line -
10, 100, 1,000, a million, a googolplex?
> > When you say "unlikely
> >sequences like 000...0" you are making a fundamental error. That sequence
> >is no more or less likely than any other sequence you could name of the
> >same length.
>
> I do not believe I meant to imply anything like that. My use of the
> term "unlikely" is consistent with the meaning I intended -
> vanishingly small probability of occurrence.
Nope.
> A sequence of 1 million 0s has a "vanishingly small" probability of
> occurrence, even though it is equiprobable with all other 1 million
> bit sequences.
Either you're saying that you're going to decide your TRNG is broken
whatever million-digit sequence it produces (because they're all
vanishingly unlikely) or you meant to say that you considered a million
zeroes less likely than any other million-digit sequence. Either view
is wrong. The million-zero sequence is no more or less likely than
any other sequence of the same length. Calling it an "unlikely sequence"
is misleading and incorrect in the context you use it.
> I suspect you are being a little overbearing in your enforcement of
> the strict meanings of terms. This is not a refereed journal we post
> to - most people know exactly what is meant by "unlikely" in the
> context of the discussions here. In fact, that term was used widely
> by real cryptanalysts in our discussions a year ago on this topic.
Many people here don't have a clue.
> >> But when we discussed the prospect of testing for other "non-random"
> >> sequences, for example ones with abnormal bias, posters here said that
> >> it is not correct to filter such sequences or you will begin to leak
> >> information.
>
> >That's the point. Every so often, a TRNG will produce the run "0", less
> >often it will produce the run "00", less often than that, it will produce
> >the run "000", etc. If you eliminated runs of n zeroes (for whatever value
> of n you think indicates brokenness) you would be biasing the output of
> >the TRNG.
>
> If the run of all 0s of any practical length were eliminated it would
> not bias the TRNG in any practical way.
Of course it would. You bugger up the runs property of the TRNG. It
is no longer statistically random.
> The significance of all 0s or all 1s has to do with the fact that they
> are signals of electronic problems and "must" not be ignored.
>
> >> You should ask how much information is leaked by interpreting that a
> >> sequence of all 0s or all 1's means a broken TRNG. I believe the
> >> answer is that no information is leaked - that is, by stopping the
> >> TRNG and perhaps fixing it, you do not compromise the total
> >> unbreakability of your OTP system. IOW, being safe doesn't cost you
> >> anything in terms of security.
>
> >See above. You're decreasing the randomness from 1 bit of randomness
> >per bit of data to something less. How much less depends on how many
> >zeroes you have to see in succession before you filter them out, and what
> >other sequences you filter out for similar reasons.
>
> See above. The only sequences we are proposing to filter are those
> with all 0s and all 1s. That is not going to impact the security of
> the TRNG for realistic sequence lengths.
You hope. It does impose a statistical bias on the generator. Probably
not enough to cause a significant problem if you set your limit on
all-ones or all-zeroes high enough. You hope.
> And, as mentioned several times in this discussion, you cannot filter
> any other biased sequences or you WILL mess up the TRNG.
You can't filter *any* sequence or you mess up the TRNG.
> I think we are in complete agreement on all issues here
I don't think so.
> if you will concede that runs of all 0s and all 1s signal a likely broken
> TRNG if the sequence is of practical length - say 10,000 bits, which is not
> that many to buffer while a skew test is being performed.
I'd say it's a possible indicator of brokenness. There's no way of
filtering them out without compromising the statistical properties of your
TRNG to some extent.
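The fingerprint Paul describes can be shown with a small simulation, an illustrative sketch rather than anything proposed in the thread: a stream from which the all-zeros and all-ones blocks have been removed is detectably non-random precisely because those patterns never occur.

```python
import random

random.seed(42)  # deterministic demo

# Model the TRNG output as 100,000 independent 8-bit samples.
blocks = [random.getrandbits(8) for _ in range(100_000)]

# "Filter the runs": drop every all-zeros (0x00) and all-ones (0xFF) block.
filtered = [b for b in blocks if b not in (0x00, 0xFF)]

# Raw stream: every byte value appears about 1/256 of the time (~390 here).
# Filtered stream: 0x00 and 0xFF never appear at all -- a statistical
# fingerprint an attacker can test for.
print(blocks.count(0x00), filtered.count(0x00))
```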
--Paul
------------------------------
Date: Sat, 09 Jan 1999 17:40:07 -0500
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Secure Method Intact
Actually yes, I had heard of it, but only in reference to the most secure
possible real-time, zero-cost, hands-free form of obfuscation: language
translation. The best known case of this is the use of Navajo radio operators
during WWII. The article describing the technique and its importance alluded to
the superior methods you described.
Now this belongs on the "what is left to invent?" thread. The inventor of the
"wife decoder" is going to be a hero forever.
Note that we have existential proofs of the feasibility of the concept. These
proofs are even constructive!
hapticz wrote:
> after fiddling with des, blowfish, rsa and others, i have discovered a
> truly secure process.
>
> it is within my wife's head, it is completely secure and public. only she and
> her women friends understand how it works and are able to utilize it with no
> hindrance to/from any known government. it allows them to communicate
> across vast reaches of topics and cultures yet requires no visible
> paraphernalia other than the air between their mouth and ears.
>
> i have no clue as to how they are able to totally obfuscate the most mundane
> of ideas, topics and concepts into a stream of completely random utterings
> and vocalizations.
>
> maybe some day they will reveal this marvelous system to me, but for now,
> oh well...
>
> has anyone else noticed this??
>
> --
> best regards
> [EMAIL PROTECTED]
------------------------------
Date: Sat, 09 Jan 1999 17:50:23 -0500
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Practical True Random Number Generator
The name for the process you are suggesting be measured is "Brownian
motion". The problem is that at any given temperature and pressure, it is
quite regular at the molecular level.
We'd also have to be careful that the sensors do not influence the
molecules being measured. Devices this small would be smaller even than
the nanomachines people are trying to build.
Anthony Stephen Szopa wrote:
> Practical True Random Number Generator
>
> I think I have just come up with a practical device to generate true
> random numbers.
>
> Electronically attach a small closed container, such as a hollow sphere
> or cube, to a computer. The inner surface of the container should be
> covered with sensitive high density micro sensors. A suitable gas
> should fill the hollow container. This gas of suitable density should
> be heated to a suitable temperature under a suitable pressure. The
> molecules colliding off the inner walls should be registered by the
> micro sensors lining the inner surface of the container.
>
> It is these registered random collisions over time that should generate
> your random bits.
>
> I'm sure someone with basic skills could have a very good working model
> in a few weeks if not sooner.
>
> Please give me one of your first production models or at least your
> detailed plans so I can make one myself.
>
> Thanks.
>
> [EMAIL PROTECTED]
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************