Cryptography-Digest Digest #900, Volume #8       Wed, 13 Jan 99 16:13:05 EST

Contents:
  Re: On the Generation of Pseudo-OTP (Mok-Kong Shen)
  Re: On the Generation of Pseudo-OTP ("Trevor Jackson, III")
  Re: On the Generation of Pseudo-OTP (John Briggs)
  Re: On the Generation of Pseudo-OTP (R. Knauer)
  Re: On the Generation of Pseudo-OTP (R. Knauer)
  Re: On the Generation of Pseudo-OTP (R. Knauer)
  Re: On the Generation of Pseudo-OTP (R. Knauer)
  Re: Shannon's paper ([EMAIL PROTECTED])
  Re: On the Generation of Pseudo-OTP (R. Knauer)
  Re: On the Generation of Pseudo-OTP (R. Knauer)
  Sarah Flannery and the "Cayley-Purser" Algorithm ([EMAIL PROTECTED])
  Re: Cayley-Purser algorithm? ([EMAIL PROTECTED])
  Re: Metaphysics Of Randomness ("Douglas E.Denny.")
  Re: Metaphysics Of Randomness (R. Knauer)
  Re: Metaphysics Of Randomness (R. Knauer)
  Re: Cayley-Purser algorithm? (Tim Smith)
  Re: On the Generation of Pseudo-OTP (R. Knauer)
  Re: On the Generation of Pseudo-OTP (Terry Ritter)
  Re: On the Generation of Pseudo-OTP (John Briggs)

----------------------------------------------------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: On the Generation of Pseudo-OTP
Date: Wed, 13 Jan 1999 19:17:39 +0100

Patrick Juola wrote:
> 

> >This means if you find a true absolute assertion then it must have
> >been deduced from axioms by logical rules.
> 
> Doesn't follow, I'm afraid.  I can make lots of true absolute
> assertions -- "my coffee cup is full," for instance -- that are
> not the product of logical deduction.

You are not working here with paper and pencil, but you are still
doing deduction with your mind. Based on the sensory data, you infer
that the cup is full. The deduction may be wrong either if you apply
the deduction process wrongly (because, say, you are distracted) or
if the sensory data are wrong (meaning that the premises are wrong).
The logical system can also be inconsistent, as is the case with the
mentally ill. On the other hand, I am not sure whether logic is
omnipotent, though it appears likely to be so. There is, however, the
issue of efficiency of deduction. If a deduction is inefficient, then
it is not practical and people will opt for non-logical decisions.
Well, I am afraid my knowledge is not sufficient to go further into
this rather philosophical but certainly very interesting direction.

M. K. Shen

------------------------------

Date: Wed, 13 Jan 1999 13:20:15 -0500
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: On the Generation of Pseudo-OTP

R. Knauer wrote:

> On Tue, 12 Jan 1999 07:45:43 -0500, "Trevor Jackson, III"
> <[EMAIL PROTECTED]> wrote:
>
> >The commentary surrounding these announcements indicates that the results may
> >be compressible.  If so transcendental numbers are probably not appropriate
> >sources of entropy.
>
> Why is that a big problem? The outputs of a TRNG are also compressible
> - some of them anyway - and that does not stop the TRNG from being
> provably secure.

Not quite.  Individual output streams, instances of the potential outputs, may be
compressible by specific algorithms.  But you cannot design a general purpose
compressor that can reduce the size of all possible TRNG outputs.

Once you rule out the idea of special purpose compressors, which look at the data
and decide which subcompressor to use, you are stuck with the limits of
information theory.  Special purpose compressors reduce (ad absurdum) to a program
that can recognize all possible sequences and emit a unique code for each.  The
special purpose decompressor eats the code and produces the respective sequence.
The catch is that the size of the code has to be, on average, at least as large as
the size of the sequence it stands for.

Consider this from the perspective of information theory.  In a binary sequence
from a perfect TRNG the odds of the "next bit" being 0/1 are 50:50.  Thus each
data bit represents one bit of information; 100% density.  If you could compress
all such sequences, you would achieve a density greater than 100%.

Try a small example: we only care about 2 bits: "00", "01", "10", and "11".  The
four possible sequences are equally probable.  Can you compress them?
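
A minimal counting sketch of that pigeonhole argument (Python, purely
illustrative; the names are my own):

    # Pigeonhole sketch: no lossless code can shorten all four 2-bit strings.
    from itertools import product

    sequences = ["".join(bits) for bits in product("01", repeat=2)]  # 00 01 10 11
    shorter   = ["", "0", "1"]           # every binary string of length < 2

    # A compressor that shortened *every* sequence would need an injective
    # map from 4 sequences into only 3 shorter strings -- impossible.
    print(len(sequences), "sequences, but only", len(shorter), "shorter codewords")
    assert len(sequences) > len(shorter)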


------------------------------

From: [EMAIL PROTECTED] (John Briggs)
Subject: Re: On the Generation of Pseudo-OTP
Date: 12 Jan 1999 16:26:52 GMT
Reply-To: [EMAIL PROTECTED]

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] (R. Knauer) 
writes:
>On 11 Jan 1999 19:52:01 GMT, [EMAIL PROTECTED] (John Briggs)
>wrote:
>
>>But that ducks the question of what constitutes a maximum reduction.
>
>I am not trying to duck any question. 

But you are ducking it, nonetheless.

>The comment is no longer
>relevant since it is now clear that Chaitin's definition of randomness
>is not applicable to crypto. You are coming in on this a bit late.

Relevant or not, I'm asking the question.  And you're ducking it.

>His concept of randomness is that it represents the irreducibility of
>a number algorithmically. IOW, if there is no way to reduce the size
>of a number further, then it contains no further information and is
>therefore random, since the smallest algorithm needed to reproduce it
>is of the same complexity as the number itself.

I didn't ask about Chaitin.  I didn't ask about algorithmic complexity.

You made a statement about how to define the quality of a data
compression algorithm.  And I am trying to clarify the ground rules
that might make that definition meaningful.

I have noticed that you have a tendency to write loosely about
"random numbers" so that it is rarely clear whether you are speaking
about a particular number as being intrinsically "random" or about
an underlying distribution allowing a number to be "randomly selected".

The two notions are, of course, not equivalent.  And the question
is, in part, an attempt to determine which you typically mean.

>>Do you judge one particular plaintext?
>>Or the average across some distribution of plaintexts?
[snip]
>>I know my answer.  Unfortunately, I think yours is different.
>
>I don't have an answer. However, I would like to hear your comments.

My answer:  An optimal compression algorithm is one that produces the
smallest ciphertext on average when presented with plaintext selected
from a defined distribution.  You choose the distribution first,
then the compression algorithm, then the plaintext.

My guess at your answer:  An optimal compression algorithm is one that
produces the smallest ciphertext when presented with one particular
plaintext.  You choose the plaintext first, then the compression
algorithm.  (This obviously leads to a trivial optimum, but is the only
meaning I can think of that makes the name "Chaitin" applicable).
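
To make the first of those notions concrete, here is a toy sketch (the
distribution and the code are made up; Python only for illustration):

    # The "best" code depends on the distribution you fix first.  A prefix
    # code tuned to this (made-up) distribution matches the Shannon entropy;
    # no code can beat H(X) on average.
    from math import log2

    p    = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}   # assumed distribution
    code = {"a": "0", "b": "10", "c": "110", "d": "111"}    # prefix code tuned to p

    avg_len = sum(p[s] * len(code[s]) for s in p)           # expected codeword length
    entropy = -sum(q * log2(q) for q in p.values())         # Shannon lower bound
    print(avg_len, entropy)                                 # both 1.75 bits/symbol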

        John Briggs                     [EMAIL PROTECTED]

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Wed, 13 Jan 1999 19:23:04 GMT
Reply-To: [EMAIL PROTECTED]

On Wed, 13 Jan 1999 18:55:28 +0100, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:

>I mean it is only through deduction in a logical system that one can obtain
>absolute assertions  (in distinction to relative, fuzzy, assertions),

There are still absolute, unequivocal assertions that cannot be
decided by any formal axiomatic system (what you call a "logical
system").

Try to prove this assertion: "This statement is false".

Any attempt to decide the truth or falsity of that statement will lead
to an infinite number of recursive self-references, which cannot be
handled due to the limited finite size of the axiomatic system.

Chaitin touches on this in his algorithmic information theory when he
points out that one cannot deduce anything about a proposition of
complexity greater than the size of the axiomatic system. The
statement above apparently requires an infinite amount of information
to decide.
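
In symbols (my paraphrase, not a quotation from Chaitin), with H(x) the
program-size complexity:

    % Paraphrase of Chaitin's incompleteness result: an axiomatic system T
    % can certify high complexity only up to a constant c_T depending on T,
    % even though almost every string x has H(x) greater than that constant.
    \exists\, c_T \;\; \forall x \,\forall n:\quad
        \bigl( T \vdash ``H(x) > n" \bigr) \;\Longrightarrow\; n \le c_T .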

>Well, I suppose one of Goedel's theorems amounts to saying that there
>are always matters that can't be proved by a given axiom system.

There are always matters that cannot be proved by ANY formal axiomatic
system that is finite in size.
 
Bob Knauer

"Since the politician never believes what he says, he is surprised
when others believe him."
--Charles De Gaulle


------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Tue, 12 Jan 1999 21:46:55 GMT
Reply-To: [EMAIL PROTECTED]

On 12 Jan 1999 20:57:33 GMT, [EMAIL PROTECTED] (John Briggs)
wrote:

>And you duck the question again.

Very astute. Did it take special training to be able to see that?

>I've asked you a simple, direct question three times now.  And you've
>refused to answer three times.  That's not my problem.  That's yours.

I am waiting for you to read Chaitin's papers. I cannot discuss his
theory with you unless you have.

Bob Knauer

"Anyone that can get elected, is not qualified to serve."
--Lance Bledsoe


------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Wed, 13 Jan 1999 19:32:14 GMT
Reply-To: [EMAIL PROTECTED]

On Wed, 13 Jan 1999 16:38:06 +0100, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:

>> For one thing, making a key out of intelligible text like the strings
>> you propose is not a good idea - it is just a variation on the Book
>> Cipher, so it is weak.

>Never mind weak or strong. Tell me what do you decide if you are
>the analyst.

I would notice that there is a strong regularity in the ciphertext due
to the fact that you used a book for the pad. From there I would apply
the techniques for breaking book ciphers based on the frequencies of
occurrence of characters in the English language. The first
intelligible message that pops out has to be the intended message.
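
A minimal sketch of the first step of such an attack (the English
frequencies below are rough textbook values, included only to make the
idea concrete):

    # Count ciphertext letter frequencies and compare them with typical
    # English.  A running key drawn from a book is itself English text, so
    # the ciphertext retains statistical structure an analyst can exploit.
    from collections import Counter

    ENGLISH = {"E": 12.7, "T": 9.1, "A": 8.2, "O": 7.5, "I": 7.0, "N": 6.7}

    def letter_frequencies(ciphertext):
        letters = [c for c in ciphertext.upper() if c.isalpha()]
        counts = Counter(letters)
        total = len(letters)
        return {c: 100.0 * n / total for c, n in counts.most_common()}

    print(letter_frequencies("EXAMPLE CIPHERTEXT GOES HERE"))
    print(ENGLISH)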

I read somewhere that the NSA can crack such ciphers in no time at all
- like in the blink of an eye.

>Believe me, many of the so-called 'rigorous' proofs in cryptology
>are not rigorous in the absolute sense. Some depend on certain
>number-theoretical hypotheses that are 'believed' by most 
>mathematicians to be true but not yet proved. So these are ALSO 
>heuristics.

That is not what people ordinarily mean by the term "heuristics",
although I suppose it could be used that way.

Can you give an example of a heuristic from number theory?

>Extensive experimentation with care should resolve the issue.

What experimentation? I have not seen any particulars - only vague
references.

>> >If these
>> >are somewhat comparable to those of the (ideal) white noise, you are
>> >(in the statistical, not absolute, sense) sure that (to that degree)
>> >you have succeeded in removing the correlations.
 
>> Are you absolutely sure of that?

>That's the best you can do even with your hardware devices.

Not true. The radioactive decay TRNG is based on a fundamentally
random physical process, not some characterization of the bitstream
that it outputs.
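
One common way to read such a device out (a sketch only, not a
description of any particular hardware; the exponential intervals
merely stand in for a Geiger counter):

    # Turn decay inter-arrival times into bits by comparing successive
    # intervals, then strip residual bias von Neumann style.
    import random

    def intervals(n, rate=1.0):
        # Stand-in for detector timings: exponential inter-arrival times.
        return [random.expovariate(rate) for _ in range(n)]

    def raw_bits(ts):
        # One bit per pair of intervals: 1 if the first is longer, else 0.
        return [1 if ts[i] > ts[i + 1] else 0 for i in range(0, len(ts) - 1, 2)]

    def von_neumann(bits):
        # Keep only 01/10 pairs: 01 -> 0, 10 -> 1; discards any simple bias.
        return [a for a, b in zip(bits[0::2], bits[1::2]) if a != b]

    print(von_neumann(raw_bits(intervals(2000)))[:32])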

Bob Knauer

"Since the politician never believes what he says, he is surprised
when others believe him."
--Charles De Gaulle


------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Wed, 13 Jan 1999 19:35:44 GMT
Reply-To: [EMAIL PROTECTED]

On Wed, 13 Jan 1999 12:27:14 -0500, "Trevor Jackson, III"
<[EMAIL PROTECTED]> wrote:

>Shielding?  Against neutrinos?  You cannot be serious!  The 50% absorption
>value for lead is 6 trillion miles or so.

And therefore a neutrino would not disturb the output, unless you
put it next to 10 million tons of chlorine isotopes.

Bob Knauer

"Since the politician never believes what he says, he is surprised
when others believe him."
--Charles De Gaulle


------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Shannon's paper
Date: Wed, 13 Jan 1999 19:14:21 GMT

In article <[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] (Bodo Moeller) wrote:
> [EMAIL PROTECTED] (Jayant Shukla):
>
> >    I am looking for C.E.Shannon's classic paper "Communication
> > Theory of Secrecy Systems" in postscript of word format.
>
> It's available in Postscript and PDF (luckily not in Microsoft Weird
> format) at <URL:http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html>.
>

Hi!

No, the one Bodo mentioned above is: Shannon, C. A Mathematical Theory of
Communication. Bell Syst. Tech. J., vol. 27, pp. 379-423, July 1948.

The one you want has the following journal and web reference (gif images,
though):

        Shannon, C. Communication Theory of Secrecy Systems. Bell
        Syst. Tech. J., vol. 28, pp. 656-715, 1949.  See also
        http://www3.edgenet.net/dcowley/docs.html for readable
        scanned images of the complete original paper.

If you do find an HTML or PDF version, please let me know ;-)

Cheers,

Ed Gerck
_____________________________________________________________________
Dr.rer.nat. E. Gerck                                 [EMAIL PROTECTED]
http://novaware.com.br
 ---  Meta-Certificate Group member -- http://www.mcg.org.br  ---





============= Posted via Deja News, The Discussion Network ============
http://www.dejanews.com/       Search, Read, Discuss, or Start Your Own    

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Wed, 13 Jan 1999 19:39:01 GMT
Reply-To: [EMAIL PROTECTED]

On Wed, 13 Jan 1999 13:20:15 -0500, "Trevor Jackson, III"
<[EMAIL PROTECTED]> wrote:

>R. Knauer wrote:
>
>> On Tue, 12 Jan 1999 07:45:43 -0500, "Trevor Jackson, III"
>> <[EMAIL PROTECTED]> wrote:
>>
>> >The commentary surrounding these announcements indicates that the results may
>> >be compressible.  If so transcendental numbers are probably not appropriate
>> >sources of entropy.
>>
>> Why is that a big problem? The outputs of a TRNG are also compressible
>> - some of them anyway - and that does not stop the TRNG from being
>> provably secure.
>
>Not quite.  Individual output streams, instances of the potential outputs, may be
>compressible by specific algorithms.  But you cannot design a general purpose
>compressor that can reduce the size of all possible TRNG outputs.
>
>Once you rule out the idea of special purpose compressors, which look at the data
>and decide which subcompressor to use, you are stuck with the limits of
>information theory.  Special purpose compressors reduce (ad absurdum) to a program
>that can recognize all possible sequences and emit a unique code for each.  The
>special purpose decompressor eats the code and produces the respective sequence.
>The catch is that the size of the code has to be, on average, at least as large as
>the size of the sequence it stands for.
>
>Consider this from the perspective of information theory.  In a binary sequence
>from a perfect TRNG the odds of the "next bit" being 0/1 are 50:50.  Thus each
>data bit represents one bit of information; 100% density.  If you could compress
>all such sequences, you would achieve a density greater than 100%.
>
>Try a small example: we only care about 2 bits: "00", "01", "10", and "11".  The
>four possible sequences are equally probable.  Can you compress them?
>

"Since the politician never believes what he says, he is surprised
when others believe him."
--Charles De Gaulle


------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Wed, 13 Jan 1999 19:05:28 GMT
Reply-To: [EMAIL PROTECTED]

On 13 Jan 1999 10:14:57 -0500, [EMAIL PROTECTED] (Patrick Juola)
wrote:

>>Not if you must use that algorithm to reproduce the number, as is
>>required in Chaitin's algorithmic complexity theory. Reduction is one
>>criterion and reproducibility is another.

>Not if the size of the *algorithm* isn't included in the size of
>the ``compression.''

I do not understand what you just said - and I did read your earlier
comment in the same regard.

I am sure you have read Chaitin's works - in fact, it might have been
you who recommended him to me a year ago. Therefore I know your comments
are not based on ignorance of his theory.

Please elaborate on what you are saying. Chaitin claims that C is of
order 10^3 or smaller (one was around 400) for his programs in his
modified LISP, which is relatively small compared to numbers one might
encounter when discussing randomness.

BTW, with regard to your earlier comment - indeed he does include the
Turing Machine itself in the calculation of algorithmic complexity. He
built his own TM using his own modification of LISP, and IIRC the
whole shooting match was of order 10^3 in size including a general
program to demonstrate his theorems.
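
For reference, the quantity in question (in my notation, not a quote
from Chaitin) is the program-size complexity relative to a fixed
universal machine U, and the choice of machine only costs a constant:

    % H_U(x): length of the shortest program p for which U(p) outputs x.
    H_U(x) = \min \{\, |p| : U(p) = x \,\}, \qquad
    H_U(x) \le H_V(x) + c_{U,V} \ \text{for any other universal machine } V .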

Bob Knauer

"Since the politician never believes what he says, he is surprised
when others believe him."
--Charles De Gaulle


------------------------------

From: [EMAIL PROTECTED]
Subject: Sarah Flannery and the "Cayley-Purser" Algorithm
Date: Wed, 13 Jan 1999 19:22:30 GMT

Hi,

Here is a news report from The Times, England. Has anybody seen this
so-called Cayley-Purser Algorithm? Any comments on whether this is an
important development?

========================================================================
Teenager cracks e-mail code

     BY AUDREY MAGEE, IRELAND CORRESPONDENT
  AN Irish schoolgirl was yesterday hailed as a mathematical
  genius after devising a code for sending secret messages by
  computer.

  Sarah Flannery used the science of cryptography to design a
  code that is ten times faster than the one currently used to
  convert confidential information so that it can be sent via the
  Internet and e-mail. She has been inundated with offers of
  jobs and scholarships from international computer companies
  and universities.

  Miss Flannery, 16, from Blarney, Co Cork, used matrices to
  formulate an alternative to RSA, the current data protection
  code, devised by three students at the Massachusetts
  Institute of Technology in 1977. The result is an algorithm, a
  mathematical blueprint, that is far faster than the RSA and
  equally secure.

  Miss Flannery, whose father, David, lectures in mathematics
  at Cork Institute of Technology, devised her code to enter the
  Irish Young Scientists and Technology Exhibition. She won at
  the weekend and left the judges unable fully to comprehend
  her project. They described her work as "brilliant" and one
  judge advised her to patent it.

  Miss Flannery said she was thrilled. "I had to go through lots
  of stuff before I finalised the theory," she said. "I reached
  critical points where I would get stuck for three weeks or so.
  I just kept thinking about it and then the whole thing slipped
  into place." The oldest of five children, she earned eight As in
  her junior certificate, the Irish equivalent of GCSEs, with
  extra tuition from her father.

  Miss Flannery is now deciding what to do with her new code,
  the Cayley-Purser, named after Arthur Cayley, an eminent
  19th-century Cambridge mathematician, and Michael
  Purser, a cryptographer who inspired her. She is considering
  publishing her findings rather than patenting as she does not
  want people to pay for her discovery.

  She will represent Ireland at the EU Science Contest in
  Greece in September.

============= Posted via Deja News, The Discussion Network ============
http://www.dejanews.com/       Search, Read, Discuss, or Start Your Own    

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Cayley-Purser algorithm?
Date: Wed, 13 Jan 1999 19:46:53 GMT

I complained about the lack of specifics in the BBC online article and
received a response...

They say the Cayley-Purser algorithm *is* public-key and that the faster speed
is for "larger keys".

Still nothing specific about exactly what the algorithm is, so I doubt their
claim that it's more secure can be legitimate.  Who tested it?

============= Posted via Deja News, The Discussion Network ============
http://www.dejanews.com/       Search, Read, Discuss, or Start Your Own    

------------------------------

From: "Douglas E.Denny." <[EMAIL PROTECTED]>
Subject: Re: Metaphysics Of Randomness
Date: Wed, 13 Jan 1999 20:03:45 +0000

R. Knauer <[EMAIL PROTECTED]> writes

>One conclusion from these considerations is that a number that is
>generated algorithmically, no matter what the algorithm might be, is
>not a true random number - because such a number always has some
>reason for its existence, a reason that you could possibly discover if
>you worked at it long enough or hard enough. That is why only the
>TRNG-based OTP cryptosystem is provably secure - all other systems
>have an underlying calculable reason behind their encryption schemes.
>
>That means that almost all of classical cryptography (the OTP being
>the sole exception) is not about security but is about obscurity,

There is an inbuilt fallacy in the above statement.
If your proposition is that any algorithmically generated number is not
a true random number (for whatever reason), then the OTP _cannot_ be an
exception, as its (random) key is normally derived from an algorithm.

-- 
Douglas E.Denny.  Chichester. England.

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Metaphysics Of Randomness
Date: Wed, 13 Jan 1999 20:36:58 GMT
Reply-To: [EMAIL PROTECTED]

On Wed, 13 Jan 1999 18:29:24 +0100, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:

>That's why I guess the claim of Chaitin as described by Knauer
>is questionable, at least to some extent.

I told you that some of Chaitin's ideas are applicable to crypto and
that some are not. To discover that for yourself you need to read
about them on that web site I gave you.

Don't rely on me to do your thinking for you. I am not qualified to
interpret Chaitin's work authoritatively.

If you believe something I have said is questionable, you need to
offer a counter argument so we can debate it.

Bob Knauer

"Since the politician never believes what he says, he is surprised
when others believe him."
--Charles De Gaulle


------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Metaphysics Of Randomness
Date: Wed, 13 Jan 1999 20:39:57 GMT
Reply-To: [EMAIL PROTECTED]

On Wed, 13 Jan 1999 17:09:00 GMT, [EMAIL PROTECTED]
(John Savard) wrote:

>>That means that almost all of classical cryptography (the OTP being
>>the sole exception) is not about security but is about obscurity,
>>because if the cryptanalyst were able to find the "reason" behind a
>>given encryption scheme, either formally or experimentally, he could
>>break the cipher.

>I wouldn't dismiss the value of "work factor" so quickly.

I am not - it's probably the only thing that makes most crypto
practical nowadays.

But isn't the work factor part of obscurity - namely that if you did
have enough time, energy and other resources at your disposal, you
could discover the underlying scheme and break the cipher?

Bob Knauer

"Since the politician never believes what he says, he is surprised
when others believe him."
--Charles De Gaulle


------------------------------

From: [EMAIL PROTECTED] (Tim Smith)
Subject: Re: Cayley-Purser algorithm?
Date: 13 Jan 1999 11:34:24 -0800

In article <77igmo$t7o$[EMAIL PROTECTED]>,  <[EMAIL PROTECTED]> wrote:
>They never bothered to mention if it's a public-key algorithm.  They just said
>it's thirty times faster than RSA (which isn't very impressive for private-key
>algorithms).
...
>The page this was reported on was:
>
>  http://news.bbc.co.uk/hi/english/sci/tech/newsid_254000/254236.stm

Take another look at the third paragraph.  It says it's a public key
algorithm.

--Tim Smith

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Wed, 13 Jan 1999 20:29:40 GMT
Reply-To: [EMAIL PROTECTED]

On Wed, 13 Jan 1999 13:34:59 -0500, "Trevor Jackson, III"
<[EMAIL PROTECTED]> wrote:


>Point is that an OTP does not have to have a perfect pad to be useful or even
>properly called an OTP.  It needs to be intended for single use.

Not you too. <jeez>

Schneier defines the OTP cryptosystem:

1) It is a stream cipher;
2) The pad is used only once;
3) The pad is totally random;
4) Therefore the OTP system is provably secure.

IOW, Schneier includes in the specification that the pad is random
and the cipher is provably secure. That's how I have always
understood it, and how I thought the crypto community understood it.
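
A minimal sketch of points 1)-3) (os.urandom is only a stand-in here;
the provable-security argument assumes the pad comes from a true
random source and is destroyed after use):

    # XOR stream cipher with a pad as long as the message, used exactly once.
    import os

    def otp_xor(data: bytes, pad: bytes) -> bytes:
        assert len(pad) >= len(data), "pad must be at least as long as the message"
        return bytes(d ^ p for d, p in zip(data, pad))

    message    = b"ATTACK AT DAWN"
    pad        = os.urandom(len(message))   # stand-in for a TRNG; use once
    ciphertext = otp_xor(message, pad)
    assert otp_xor(ciphertext, pad) == message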

Maybe the genesis of the term OTP had it roots in any stream cipher
where the pad is used once, but I cannot see why anyone would think
that discarding a poorly constructed pad would do any good. Would it
good luck, like tossing a pinch of salt over your shoulder or smashing
your champagne glass in the fireplace?

"If I eat this pad, my cipher will be totally secure". Shazaam!

Bob Knauer

"Since the politician never believes what he says, he is surprised
when others believe him."
--Charles De Gaulle


------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: On the Generation of Pseudo-OTP
Date: Wed, 13 Jan 1999 20:59:42 GMT


On Wed, 13 Jan 1999 11:11:58 GMT, in
<[EMAIL PROTECTED]>, in sci.crypt
[EMAIL PROTECTED] (R. Knauer) wrote:

>[...]
>That is why a crucial part of the overall specification for this
>scheme is the requirement to decorrelate the sequence by some
>technique like those chronicled in FIPS 140-1, e.g. "strong mixing".

Are you reading the same FIPS 140-1 that I am?  That would be:
"SECURITY REQUIREMENTS FOR CRYPTOGRAPHIC MODULES," 1994 January 11.
I find no instance of "strong mixing" or even "mixing" in that text.


>But I am not aware of any general proof that such a technique will
>actually remove correlation sufficiently for the result to be a
>crypto-grade random number.
>
>I see no discussion of decorrelation here, yet I have asked at least a
>dozen times that posters pick up on it. Does that mean that
>decorrelation is a topic no one cares to offer an enlightened opinion
>on? Or is it that all crypto is about decorrelating something, whether
>it is a bitstream or actual plaintext.

Well, "correlation attacks" may be *the* major topic in stream cipher
design over the last two decades; see "The Story of Combiner
Correlation: A Literature Survey": 

   http://www.io.com/~ritter/RES/COMBCORR.HTM

These attacks address the production of stream cipher confusion
streams by *combining* multiple sources, and the combiners allow
information from the original sources to leak to the output.  We see
attack after attack, and subsequent attempts to avoid the attacks.
But I am unaware of a general theory of "decorrelation" that could
avoid any such attack, or transform a sequence with internal
correlations into a sequence with no correlations.  The lack of such
a theory might make the topic somewhat difficult to discuss.  
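
The classic illustration (a toy simulation, not any fielded design) is
a nonlinear combiner such as bitwise majority, whose output agrees
with each individual input stream about 75% of the time:

    # Majority of three keystreams: the output leaks information about each
    # input, which is exactly the opening that correlation attacks exploit.
    import random

    n = 100_000
    s1, s2, s3 = ([random.getrandbits(1) for _ in range(n)] for _ in range(3))
    out = [1 if a + b + c >= 2 else 0 for a, b, c in zip(s1, s2, s3)]

    agreement = sum(o == a for o, a in zip(out, s1)) / n
    print(f"output agrees with stream 1 on {agreement:.1%} of bits")  # ~75%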

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM



------------------------------

From: [EMAIL PROTECTED] (John Briggs)
Subject: Re: On the Generation of Pseudo-OTP
Date: 13 Jan 1999 20:53:08 GMT
Reply-To: [EMAIL PROTECTED]

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] (R. Knauer) 
writes:
>On 13 Jan 1999 13:41:37 GMT, [EMAIL PROTECTED] (John Briggs)
>wrote:
>
>>You made a claim that "the best compression algorithm is the one that
>>produces the smallest output".
>
>And, in the context of Chaitin's algorithmic complexity theory, that
>compression algorithm must reproduce the original number, or it is not
>relevant. That is made manifestly clear in Chaitin's papers.

If it produces the original number then the length of its output is
pretty clearly the length of the original number.  And optimizing
for output length seems a bit silly.  Are you sure that's what you
meant to say?

>>This claim leaves the input string
>>unspecified.  That's not good.
>
>Why is that not good? ...

Because it makes your claim ambiguous.  The best compression algorithm
is one that produces the smallest output from which input?

>...Chaitin's theory applies to any number.

Yes it does.

>
>>So I asked you to clarify.  Which
>>input string?
>
>Chaitin's theory applies to any input string - it does not matter.

Focus.  We're talking about optimum compression algorithms.  What
criteria make a compression algorithm optimal?  The size of the
output.  What else does the size of the output depend on?  The input.

You can't optimize for the one without specifying the other.

>But keep in mind that there are many numbers that cannot be
>algorithmically reduced, not even by one bit. Chaitin's Omega is one
>such number.

And yet, all such numbers can be compressed.  For instance, you just
compressed Chaitin's Omega to 15 characters.

Much depends on how you count.

>>When evaluating a compression algorithm, do you choose the compression
>>algorithm first and then the number to be compressed?  Or do you
>>first choose the number to be compressed and then the algorithm?
>
>In Chaitin's theory, the algorithm can be chosen after you have given
>the number. But whatever algorithm is chosen, it must reproduce the
>number.

Doesn't that mean that output size is fixed and is, therefore,
not a variable that can be optimized for?

>>It's a simple question.  The name "Chaitin" does not appear in
>>the question.  The term "algorithmic complexity" does not appear
>>in the question.  
>
>It surely did when I first brought this up. I am not talking about
>just any compression, I am talking about the specific theory of Greg
>Chaitin.

You do throw the name around a lot.

>>I'm not asking Chaitin.  I'm not asking me.  I'm asking you.
>
>And I have given my response. It would help if you would come into
>these discussions from the beginning.

The response you have (finally) given, "It doesn't matter," is wrong.

        John Briggs                     [EMAIL PROTECTED]

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
