Cryptography-Digest Digest #876, Volume #8 Sun, 10 Jan 99 13:13:04 EST
Contents:
Re: --- sci.crypt charter: read before you post (weekly notice) ([EMAIL PROTECTED])
Re: On the Generation of Pseudo-OTP ("Trevor Jackson, III")
Re: What is left to invent? (R. Knauer)
Re: Practical True Random Number Generator (R. Knauer)
Re: coNP=NP Made Easier? (Lasse Reichstein Nielsen)
Re: How to deduce EXE header contents and other questions. (Anthony Naggs)
Re: On the Generation of Pseudo-OTP (R. Knauer)
Re: On the Generation of Pseudo-OTP (R. Knauer)
Re: On the Generation of Pseudo-OTP (R. Knauer)
Re: SCOTT19U ENTROPY (Tim Redburn)
Re: On the Generation of Pseudo-OTP (Kurt Wismer)
Re: On the Generation of Pseudo-OTP (R. Knauer)
----------------------------------------------------------------------------
From: [EMAIL PROTECTED]
Subject: Re: --- sci.crypt charter: read before you post (weekly notice)
Date: Sun, 10 Jan 1999 15:09:00 GMT
On 10 Jan 1999 06:00:37 GMT, [EMAIL PROTECTED] (D. J.
Bernstein) wrote:
>A common myth is that sci.crypt is USENET's catch-all crypto newsgroup.
>It is not. It is reserved for discussion of the _science_ of cryptology,
>including cryptography, cryptanalysis, and related topics such as
>one-way hash functions.
>
>Use talk.politics.crypto for the _politics_ of cryptography, including
>Clipper, Digital Telephony, NSA, RSADSI, the distribution of RC4, and
>export controls.
>
>What if you want to post an article which is neither pure science nor
>pure politics? Go for talk.politics.crypto. Political discussions are
>naturally free-ranging, and can easily include scientific articles. But
>sci.crypt is much more limited: it has no room for politics.
Kind of funny--does *anyone* follow this? My guess would be that at
least half of the posts here don't belong here according to the faq.
The ironic part is that *at least* one of the contributors to the faq
posts off-topic (according to the faq) almost daily.
Mike
Decrypt [EMAIL PROTECTED] with ROT13 for email address.
NOTICE TO BULK EMAILERS: Pursuant to US Code,
Title 47, Chapter 5, Subchapter II, 227, any
and all nonsolicited commercial E-mail sent to
this address is subject to a download and archival
fee in the amount of $500 US. E-mailing denotes
acceptance of these terms.
------------------------------
Date: Sun, 10 Jan 1999 09:38:57 -0500
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: On the Generation of Pseudo-OTP
fungus wrote:
> "Trevor Jackson, III" wrote:
> >
> > So use the RNG convention. POTP for the fake and TOTP for the real
> > thing.
>
> Whose convention is that?
>
> A POTP is a "stream cipher".
Yes. So? Is an OTP *not* a stream cipher???
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: What is left to invent?
Date: Sun, 10 Jan 1999 15:37:25 GMT
Reply-To: [EMAIL PROTECTED]
On Sun, 10 Jan 1999 03:20:16 +0000, Nicko van Someren
<[EMAIL PROTECTED]> wrote:
>An efficient reproducible Random Oracle.
>(For those out there unfamiliar with the term, a random oracle
>is a mythical device which when presented with a question it
>gives a truly random answer, with the proviso that if the question
>has been asked before it gives the same answer as last time.
>Many crypto proofs are only really proofs if your hash functions
>are random oracles. You can build the perfect stream cipher
>with a random oracle just by putting your key into a counter,
>taking the output of the counter into the RO, and using the
>output of the RO as the cipher pad. You can also make
>unbreakable block ciphers using ROs. Very handy things, but
>no one has ever made one.)
Is there anything written on this subject, on the Web possibly? I
can't seem to find anything in Schneier's main book about it.
Bob Knauer
"We hold that each man is the best judge of his own interest."
--John Adams
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Practical True Random Number Generator
Date: Sun, 10 Jan 1999 15:38:49 GMT
Reply-To: [EMAIL PROTECTED]
On Sat, 09 Jan 1999 17:50:23 -0500, "Trevor Jackson, III"
<[EMAIL PROTECTED]> wrote:
>The name for the process you are suggesting be measured is "brownian
>motion". The problem is that at any given temperature and pressure, it is
>quite regular at the molecular level.
That's news to me - do you have references for that assertion?
Bob Knauer
"We hold that each man is the best judge of his own interest."
--John Adams
------------------------------
From: [EMAIL PROTECTED] (Lasse Reichstein Nielsen)
Crossposted-To: sci.math,comp.theory
Subject: Re: coNP=NP Made Easier?
Date: 10 Jan 1999 16:15:42 GMT
In <776fqc$oh4$[EMAIL PROTECTED]> [EMAIL PROTECTED] (Mike McCarty)
writes:
>In article <[EMAIL PROTECTED]>,
>Matt Brubeck <[EMAIL PROTECTED]> wrote:
>)In article <[EMAIL PROTECTED]>, rosi <[EMAIL PROTECTED]> wrote:
>)
>)> Three things. First, you perhaps did not consider my other question.
>)> That is if NDTM is real, NP problems can be solved in P time.
>)
>)Well, a nondeterministic Turing machine (NDTM) is a Turing machine which,
>)at a given state and input, may have several potential transitions.
>)Therefore it may not behave the same when given the same input more than
>)once. The NDTM is just a theoretical model of computation; a "real" NDTM
>)would be useless, since its output is unpredictable.
Nah, nondeterministic algorithms aren't necessarily unusable, if there is
a known probability of the answer being correct. Then it's just called a
probabilistic algorithm instead :)
>This isn't the definition *we* used.
>)Given an instance of a decision problem, if the answer is "yes," an NDTM
>)may or may not accept, but if the answer is "no," then it definitely does
>)not accept. "NP time" means that there exists some NDTM for which there is
>Oh, I see. You used the word "unpredictable" in a way I am not
>accustomed. I also had to reparse "Therefore it may not behave the same
>when given the same input more than once." I thought that you were
>saying "For every given input string the machine has the possibility to
>produce a different result on successive runs."
So did I. The second sentence is related to when an NDTM is said to accept
an input (if there is a possible execution that accepts) or reject it
(there is no possible execution that accepts), which is then used to define
NP.
>[snip]
>)In short: NDTMs are theoretical constructs only.
>Interesting. I have used virtual NDTMs for solving real life problems.
>In fact, I have written (and use) LEX and GREP type programs for
>pattern matching which implement NDTMs.
I think you might be confusing some things here. Regular expressions are
usually recognized by Finite Automata (FAs), and nondeterministic
FAs (NFAs) are no more powerful than FAs, so you can translate NFAs to
FAs, which can be implemented "easily". Of course TMs are MORE powerful
than FAs (as language acceptors), so you could implement LEX and GREP
with TMs, but why bother?
>[snip]
>)Side-note: Quantum computing, which is still largely theoretical, may
>)someday allow the construction of real devices which behave like
>)non-deterministic models of computation, and solve NP-hard problems in
>)polynomial time. However, this is still unclear; it also doesn't change
>)the theoretical bases for questions about P, NP, and coNP, though it does
>)change the practical consequences of the answers.
>This is true. But OTOH, we *already* have hardware which can be
>programmed to act as an NDTM. Quantum computers will simply be more
>efficient (hopefully, anyway).
Yes, we can run nondeterministic algorithms on traditional computers,
but we cannot implement the acceptance criterion (accept if it is possible)
in the time bound we give for nondeterministic time classes. That is:
no computer we have built (non-quantum) can decide an NP-hard problem in
a polynomially bounded time unless P=NP (assuming our computers are
polynomially related to DTMs, which I believe they all are).
The problem here is that an NDTM is defined as a DTM with a bounded
choice of transitions, but the languages/problems decided by NDTMs are
defined by quantification over ALL possible executions of the NDTM. We
can build NDTMs (if we have access to some random source), but we
cannot yet check for the possibility of acceptance without exponential
blowup, as we have to check ALL the exponentially many executions.
Quantum computers might solve this for a subset of NP, but so far it
doesn't look like it's all of it, and therefore specifically no NP-hard
problems.
>[snip]
>)Existence proofs and examples of NDTMs can be found for well-known NP
>)problems in algorithms and computational theory textbooks, references, and
>)journals. I see no need to waste bandwidth and time by typing in big
>)state/transition tables for NDTMs.
>(To original poster:)
>For a reference, see Aho and Ullman "Syntax Directed Language
>Translation" (or something like that, the "Dragon" book). They give
>examples of NDTMs used to recognize regular expressions of various
>types.
My Dragon book is somewhere else right now, but I'm pretty sure they used
(N)FAs for regular expressions and (nondeterministic) pushdown automata for
the context-free parts (making it even simpler by using LALR(1) grammars only).
They didn't have to use the power of NDTMs at all.
/L
--
Lasse Reichstein Nielsen - [EMAIL PROTECTED]
This message may be reproduced freely for non commercial purposes.
"... but where do you want to go tomorrow?"
------------------------------
From: Anthony Naggs <[EMAIL PROTECTED]>
Crossposted-To: comp.lang.asm.x86
Subject: Re: How to deduce EXE header contents and other questions.
Date: 10 Jan 1999 16:33:41 GMT
After much consideration decided to share these wise words:
>I am in need of a way to deduce at least 13 consecutive bytes (better if
>more) of a DOS EXE file, knowing only its exact size (in bytes). This is
>required for decrypting a ZIP file to which I had lost a password. Brute
>force and dictionary-based approaches are of no help (I use long, unguessable
>alphanumeric sequences) and I am using the "known-text" algorithm,
>implemented in a program found at:
I'm afraid you have misunderstood. Unfortunately to break a zip file's
encryption you need to know 12 or 13 consecutive bytes of the
*compressed* version of the file. (With the best published attack on
Zip files.)
--
"Happy happy happy. Joy joy joy." - Stimpy
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Sun, 10 Jan 1999 17:11:45 GMT
Reply-To: [EMAIL PROTECTED]
On Sun, 10 Jan 1999 09:38:57 -0500, "Trevor Jackson, III"
<[EMAIL PROTECTED]> wrote:
>Yes. So? Is an OTP *not* a stream cipher???
Yes. So?
Bob Knauer
"We hold that each man is the best judge of his own interest."
--John Adams
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Sun, 10 Jan 1999 16:57:54 GMT
Reply-To: [EMAIL PROTECTED]
On Sun, 10 Jan 1999 09:32:43 -0500, "Trevor Jackson, III"
<[EMAIL PROTECTED]> wrote:
>Actually, I think this is erroneous.
Nope, it is not.
>The name OneTimePad specifies a single-use key.
That is not all it specifies. It specifies a True Random Number as the
pad. Otherwise it would not be provably secure.
>The best possible key is one in which each key element (char
>or bit) is independent of the others.
The best possible key is one generated by a TRNG, which is a physical
device capable of outputting all possible sequences of a given finite
length equiprobably. That includes outputting sequences that have bits
which are not "independent" of one another, such as in the sequence
111...1.
>But having typists pound on keyboards,
>as the Russians did, does not produce independent key elements. They had a
>True-OTP until they started reusing keys, but their key quality was nowhere
>near perfect.
The Russians *thought* they had a random source, and apparently they
had something that was very close since Venona required the reuse of
pads to crack their ciphers.
>Thus I believe that a true OTP is one with a single-use key. The quality of
>the key does not turn a single-use key system into an inferior stream cipher.
>It is still a true OTP.
Nope. An OTP uses a random pad, otherwise it would not be provably
secure.
>It is not that simple. The key phrases you used were "all one needs to do is
>uncompress it" and "it should be possible to achieve 1 bit of entropy per bit
>of compressed data." Inside a compressed archive there exists all kinds of
>structure to guide the decompression process. If you strip out only the
>compressed text you will not be able to uncompress it without some loss of
>fidelity.
I suppose that anything that does not result in one bit output would
fall under that critique. After all, a hash that gives one bit output
(e.g., parity) has no information whatsoever in it regarding the
original text.
>I'm not sure 100% entropy density is possible in a lossless system.
You mean strong mixing (see RFC1750) could not increase entropy to
100%?
>Note that
>a hash is a lossy system, so it is easily possible to get 100% entropy density.
>In fact the statement that there is irreducible structure to text implies that
>100% entropy density is literally impossible.
Did you mean "irreducible" or its opposite?
>Now which way do you want to have it?
I would like it to be fundamentally correct, if such a thing is
possible. It may not be possible if some of the concepts of Greg
Chaitin are universally applicable to number theory.
>Here you have to provide an example. The claim that a hash contains structure
>is identical to the claim that the entropy density is less than 100%. Can you
>show any system with 100% entropy density that is not crypto-secure?
I did not make the original assertion about hash functions, someone
else who represented himself as an expert cryptographer on sci.crypt
did. And no one else bothered to challenge him, so I accepted it
tentatively.
If you can show that a certain class of hash functions is provably
secure, please do so.
>I believe that you cannot. 100% entropy density means that the bits are
>unrelated. Literally unpredictable. I suspect that the criticism you got in
>previous discussions, which I have not reviewed, was based on the claim that
>you had failed to reach 100% entropy density, rather than the claim that 100%
>entropy density may be inadequate.
Nope, it was a simple assertion that hash functions could not be
considered to be cryptographically secure because they are based on
internal states which can be reproduced.
BTW, what algorithmic method do you propose, besides hash functions,
to produce 100% entropy from text?
>Well, we have no idea whether there are any correlations. This is a far cry
>from the claim that we can prove that there are none.
The claim is shorthand for: "At the present level of theoretical
understanding...". Of course, if you believe in Hidden Variables, then
there are correlations in all quantum processes.
But for today, QM maintains that there are indeed random physical
processes. For example, the wavefunction for a single particle in free
space is exp(i(k.r-wt)), where the dot is to be taken as the vector
dot product operator, k is the wave vector, and w is the angular
frequency. The latter are related to physical quantities as p = h_bar k
and E = h_bar w, h_bar being Planck's constant divided by 2 pi. The
variables r and t are the vector position and time respectively.
If you take the conjugate square of that wavefunction you get a
constant, which says that the probability of finding the particle at a
particular place at a particular time is the same for all places and
times.
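Written out (plane-wave normalization subtleties aside), the
conjugate-square step above is just:

```latex
\psi(\mathbf{r},t) = e^{\,i(\mathbf{k}\cdot\mathbf{r} - \omega t)},
\qquad
|\psi|^{2} = \psi^{*}\psi
           = e^{-i(\mathbf{k}\cdot\mathbf{r} - \omega t)}\,
             e^{\,i(\mathbf{k}\cdot\mathbf{r} - \omega t)} = 1,
\qquad
\mathbf{p} = \hbar\mathbf{k},\quad E = \hbar\omega .
```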
That means that the particle is capable of existing at any place at
any time equiprobably. That latter statement is very much the same as
the definition of a TRNG. There are quantum mechanical processes
that are completely random in the same sense as crypto-grade
randomness.
>It would be nice to force the universe to match our theories, but it just ain't
>so.
But they are the best we have to go on. If man waited around for the
Perfect Circle to manifest itself in the physical world, we would
still be living in caves.
>There is a class of dual emission events in which two high energy
>particles are emitted in exactly opposite directions. The properties of the
>particles are related. If you measure one particle its wave function
>collapses, and so does the other particle!. The interesting thing about this
>is that the collapse of the second particle's wave function occurs at the same
>time as that of the first particle. No matter how far apart they are. This
>appears to violate the speed limit laws.
That is the experiment of Alain Aspect that I was describing earlier.
It is tied up intimately with Bell's Theorem regarding non-locality of
hidden variable theories. Best leave that stuff to quantum
cryptography when it emerges on the scene. Classical crypto is enough
for us mere mortals right now.
>The technical term for such correlation among particles is entanglement.
>Entangled particles have correlated behavior. Yet entanglement is created
>every time a pair of particles interact. So (I speculate), in theory the entire
>cosmos may consist of heavily entangled particles. Thus there is room for an
>arbitrarily large amount of correlation at the quantum level.
I only claimed that there are *some* processes in QM that are
completely random.
>We just can't prove it or disprove it. Yet.
That is why it is best left to speculation and not brought into
classical crypto.
>> There is an area where such an apparent violation appears to be
>> involved with non-local phenomena, such as in the experiments of Alain
>> Aspect. But that is still quite controversial as it is mixed in with
>> Bell's Theorem and Hidden Variables.
>I do not recognize these references, but that is just my ignorance.
Regardless, you do know something about them as is evidenced by your
comments above. They are described in Roger Penrose's recent two books
on physics, to name one reference.
>Raw text, sure. But not in text that has been hashed to 100% entropy density. By
>definition, the bits of the hash are independent.
Again, I ask you to provide a class of hash functions which will
provably produce 100% entropy from any input.
>A perfect OTP used correctly is provably not subject to analysis. This is a
>much weaker condition than true provable security.
I agree with the first statement, but I do not understand the second
one. Provable security is limited to analysis, and does not involve
protocol. In terms of protocol the OTP system sucks big time. That is
the reason for this thread - to find a way that gets around the
problems with key management in the standard OTP system.
>An imperfect OTP (with lousy key quality such as the Russians generated) is
>still an OTP. It is theoretically breakable.
Then it is not really an OTP, it is just another stream cipher. The
OTP system is provably secure.
>An incorrectly used OTP (key recycling) is certainly breakable.
Then it is not an OTP.
>Of course natural text has correlations. Predictable bits. Less than 100%
>entropy density. However, raw natural text does contain information.
>Entropy. Unpredictable bits. It is certainly possible to sample text to
>extract it and use it as a key for an OTP or other cipher. The only hard part
>about it is judging how much entropy exists in the original text.
I never said otherwise.
In fact I have said (a year ago and repeated here) that one might be
able to build a secure stream cipher from the least significant bits
of text, after antiskewing and decorrelation. The hangup was on the
proposed method of decorrelation. Strong mixing of different streams,
such as recommended in RFC 1750 (which includes hashing), is supposed
to produce a correlation-free stream, but others criticized that
method and I could not argue against them. Perhaps you are in a
position to clarify this sticking point.
>If you can specify the required theoretical density enhancement I believe I can
>specify a theoretical procedure to realize it.
The OTP system demands that the pad be constructed from a TRNG, which
is a physical device capable of outputting all possible sequences of a
given finite length equiprobably. If you can come up with a procedure
that does the same thing in a different manner, then you have come up
with something that can be used as a substitute for a physical TRNG in
the OTP system.
We wait with great anticipation, albeit with a proper amount of
skepticism.
>Well, Newton had a hand in those debates. A nefarious one. He and Leibnitz
>(sp?) both claimed to have invented the fundamental concepts of calculus.
And both did. That has happened before in the history of science and
mathematics.
>While Leibnitz had precedence (as seen from history; at the time comm delays
>provided enough skew to cloud the issue), Newton had a bully pulpit to
>strengthen his claim.
Newton was a fierce infighter for the glory of fundamental truth
because he sincerely believed that he was doing the work of God. He
could not accept a lesser position in the discovery of divine truth. A
lesser position in the divine plan he believed he was a major part of
would have crushed his brilliant mind beyond repair.
>He won the PR war. Later in life he stated that he greatly enjoyed "breaking
>Leibnitz's heart". Feet of clay...
He also tangled with the Englishman Robert Hooke and some other
geniuses on the continent. To Newton, "Not Invented Here" was
theologically impossible.
Bob Knauer
"We hold that each man is the best judge of his own interest."
--John Adams
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Sun, 10 Jan 1999 17:09:53 GMT
Reply-To: [EMAIL PROTECTED]
On Sun, 10 Jan 1999 06:29:39 -1000, Bill <[EMAIL PROTECTED]> wrote:
>Which compression algorithm would you recommend?
The one which reduces the plaintext the most in size.
(See Chaitin, op. cit.).
Bob Knauer
"We hold that each man is the best judge of his own interest."
--John Adams
------------------------------
From: [EMAIL PROTECTED] (Tim Redburn)
Subject: Re: SCOTT19U ENTROPY
Date: Sun, 10 Jan 1999 17:06:47 GMT
I've been looking at the source for
scott19u and at first glance I can see where
the value 9205076 could come from.
Scott19u takes each 19-bit word in turn and then
mods it with its position. If you take the entropy of this result
to simply be the entropy of log<base 2>(position) then
you get an entropy of 9205076.
However, when performing x mod n, unless n is a factor
of x, then the resulting entropy will be slightly less than
log<base 2>(n).
If I've worked it out correctly then the entropy of y mod n, where
y is a perfectly random number between 0 and x, can be
calculated as follows :
assumptions : x is the maximum number that is going to be used -
(in the case of scott19u this
is 524288 ( = 2^19) the value in T19L in the
source)
a = x mod n
b = (x - a) / n
entropy = (n - a) * ((b/x) log<base2> (b/x))
+ a * (((b+1)/x) log<base2>((b+1)/x))
using this formula in the code below, I get a value of 9187574 bits
for the entropy of the key before being used for the generation
of the S-Box, so I believe that (assuming my maths is correct) this
value is a maximum for the S-Box entropy.
Please note that I have only checked as far as the key setup -
i.e. what happens to the key before it is used to generate the
permutation. I have not yet checked if the generation of the
permutation removes any more entropy.
Also I haven't checked either the formula or the code properly, so
corrections are greatly appreciated. I derived the formula from Shannon's
paper, page 11. (The numbers seem reasonable, though.)
So in answer to the question, yes 8000000+ bits is probably correct,
but I think you need to be more precise, to give people
confidence in your algorithm. I suggest 9187574 bits.
////////////////////////////////////////////////////////////
#include <cmath>
#include <iostream>

#define T19LL (1245184)
#define T19L  (0x80000)   /* = 2^19, the word range used by scott19u */

/* Entropy (in bits, as a negative quantity) of y mod n for y uniform
   on [0, x): with a = x mod n and b = (x - a) / n, a residues occur
   b + 1 times each and the other n - a residues occur b times each.  */
double entropy(unsigned x, unsigned n)
{
    double a = x % n;
    double b = (x - a) / n;   /* whole number of times n goes into x */
    return (n - a) * ((b / x) * std::log(b / x)) / std::log(2.0)
         + a * (((b + 1) / x) * std::log((b + 1) / x)) / std::log(2.0);
}

int main()
{
    long double sum = 0;
    for (int i = 0; i < T19L - 2; i++)
        sum += entropy(T19L, T19L - i - 1);
    std::cout << (unsigned)(-sum) << std::endl;
    return 0;
}
//////////////////////////////////////////////////////////////
On Sat, 09 Jan 1999 14:19:12 GMT, [EMAIL PROTECTED] wrote:
> The entity that prodded me into getting a web page, who
>goes by the name "Horst O.." (can't spell rest of name),
>has finally realized that its calculation of entropy bits
>H(D) is wrong. He wants it changed. So I am willing to do
>this; the question is what should I replace it with. I
>would really like an honest, valid one. There was a thread
>on this topic several months ago, but at that time I refused
>to change it since I wanted to leave his work untouched even
>if wrong. However, I have received email that is from Horst,
>and he did not give what I wanted for a change, since if I
>bother to change it I want it RIGHT.
> He came up with one number that he said was based on the
>work of "Sandy Harris" as an approximate number, which
>I think is low. Then he came up with
> log( (2**19)! ) and got a number I did not agree with.
>I read Shannon's paper and felt that it should be
>
> log<base2>( ((2**19)-1)! ) which is the same as
>
> log( ((2**19)-1)! )/( log(2) )
>
> which is 9,205,076.12815.... This is based on a uniformly
>random key. But if one uses a uniformly random remainder table
>(the keyraw.key file) it would be reduced by at most 1.00, so
>I would like to flatly state that it is above
>
>entropy is 8,000,000+ bits
>
>does anyone have heartburn or a different answer.
>
>Also I made the scott19u contest files and sent them to
>Joe P. He has stated he would look for obvious errors
>and write a user-friendly "read.me" file for it, since
>my English is not the best. But I have yet to hear from
>him, so hopefully soon the contest, in which one only needs
>to solve for 64 bits max, will be ready. I KNOW
>64 BITS IS LESS THAN 8,000,000+ BITS, but I want to
>show it is strong, and I hope it takes at least 3 months
>for someone to grind out this unsecure contest. The
>NSA will have the answer in an hour. I just hope they don't
>give it to one of their stooges to make the contest end
>quickly. Also, if the method is weak, someone may actually
>use their brain to vastly shortcut this so a full 64-bit
>search is not necessary. I am sure the problem is reduced
>in such a way that many such shortcuts exist. I have thought
>of several that reduce it to close to 56 bits.
> So people like Paul Onions may come up with something quite
>fast.
>
>David A. Scott
>
> P.S. as they say on the porno sites, "ENJOY"
>
>http://cryptography.org/cgi-bin/crypto.cgi/Misc/scott19u.zip
>http://members.xoom.com/ecil/index.htm
>
>-----------== Posted via Deja News, The Discussion Network ==----------
>http://www.dejanews.com/ Search, Read, Discuss, or Start Your Own
------------------------------
From: [EMAIL PROTECTED] (Kurt Wismer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Sun, 10 Jan 1999 17:42:46 GMT
R. Knauer ([EMAIL PROTECTED]) wrote:
: On Fri, 8 Jan 1999 22:51:22 +0000, [EMAIL PROTECTED] (Paul L.
: Allen) wrote:
: >> That, however, is not verifiable - not even "statistically". The best
: >> you can do is test a TRNG to see if it does something which violates
: >> its prime directive, like output all 0s in worst case.
: >Whoops! You can only check a finite length of output sequence. And it
: >is perfectly possible for a working TRNG to generate a series of zeroes
: >for whatever length of finite output sequence you care to postulate.
: We discussed this exact thing at length a year ago and concluded that
: what you say is theoretically correct, namely that 000...0 is a valid
: output, but it is so unlikely for any practical length sequence that
: it must be taken as a sign of malfunction.
strictly speaking, in an unbiased generator 000...0 is just as likely as
any other output...
it's just not a particularly useful key for an otp because in the end it
doesn't hide anything...
--
"some speak the sounds but speak in silent voices
like radio is silent though it fills the air with noises
its transmissions bring submission as ya mold to the unreal
mad boy grips the microphone wit' a fistful of steel"
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Sun, 10 Jan 1999 18:06:13 GMT
Reply-To: [EMAIL PROTECTED]
On Sat, 09 Jan 1999 16:09:42 -0500, "Trevor Jackson, III"
<[EMAIL PROTECTED]> wrote:
>Is it your position that it is impossible to come up with such a correlation
>removal technique? or that it is hard to do so and none is now known?
Actually I am not qualified to maintain a position in that regard - I
am only a messenger here. I am telling you what I was told by
self-proclaimed cryptanalysts, namely that correlation is
fundamentally impossible to remove from streams by algorithmic means.
If you can argue to the contrary, please do, because a procedure that
is provably secure for removing correlation would be valuable in the
construction of a TRNG that is not based on a physical process, such
as digit expansion of transcendental constants or maybe even least
significant bit streams from text or music.
As a side note, it is my understanding that the techniques for
removing bias are provably secure. Taking two consecutive bits and
filtering them according to a culling procedure as detailed in RFC 1750
is touted as being totally effective in removing bias.
The three enemies of streams are periodicity, bias and correlation.
Periodicity is presumably not present in digit expansions of
transcendental constants, and bias can be removed to the level of
provable security, so only correlation remains to be dealt with in a
provably secure manner.
Bob Knauer
"We hold that each man is the best judge of his own interest."
--John Adams
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************