Cryptography-Digest Digest #141, Volume #9       Thu, 25 Feb 99 18:13:03 EST

Contents:
  Re: Another extension to CipherSaber (wtshaw)
  My Book "The Unknowable" ([EMAIL PROTECTED])
  Re: Define Randomness (R. Knauer)
  Re: Define Randomness (R. Knauer)
  Re: Define Randomness (John Savard)
  Re: Define Randomness (Herman Rubin)
  Re: Pentium III Hardware Random Numbers (Terry Ritter)
  Not Quite Unbreakable... (John Savard)
  Re: Snake Oil (from the Feb 99 Crypto-Gram) (Jim Dunnett)
  Re: Define Randomness (R. Knauer)
  Re: Quantum Computation and Cryptography (R. Knauer)
  Re: Testing Algorithms (Withheld)
  Re: True Randomness - DOES NOT EXIST!!! (BRAD KRANE)
  Re: Pentium III Hardware Random Numbers ([EMAIL PROTECTED])

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: Another extension to CipherSaber
Date: Thu, 25 Feb 1999 13:52:16 -0600

In article <[EMAIL PROTECTED]>, Darren New
<[EMAIL PROTECTED]> wrote:

> > Trying to narrow design standards for crypto to meet ever more-restrictive
> > criteria results in the ultimate end of having only one crypto algorithm
> > that can meet them; this is exactly what some want to happen, and to be
> > sure that it is an appropriately crippled product to boot.
> 
> We're not talking about crypto algorithms here. We're talking about
> ASCII armoring algorithms. I see no possible reason why you'd want
> everyone inventing their own ASCII armoring algorithms just for
> arbitrary divergence.
> 
> There's a good reason for lots of different crypto algorithms, but none
> of those reasons apply to making binary go thru email.
> 
To consider that all useful crypto algorithms are binary is really a
mistake, and I always chuckle when I see the term *ASCII Armoring* since
there is nothing sacred about any particular character set or any
particular information unit.  To think otherwise is to highly bias and
restrict crypto in unneeded ways, which can include a natural pathway to
good emailability.

Two of the most important considerations in an encryption package are the
nature of the plaintext allowed and the nature of the ciphertext desired. 
There is no reason that the two sets of criteria cannot be merged into a composite.

The point that I am trying to make with my current series of simple,
perhaps dumb, block ciphers, is that the cosmetics of input and output
sets can be fully integrated into an algorithm.  The middle layer, the
actual encryption one is apt to recognize, can be of more substantial
qualities than those which I present.  If you favor bit manipulations,
fine; if you want to do something else, fine; if you want to mix building
blocks in a new method, fine again.  The possibilities are almost endless.

I wish to point out with these efforts that neo-classical methods can be
most useful, and extended in many ways; the choice is not simply between
*broken* classical ciphers, meant to be done entirely by hand, and a
limited few promising methods that many are working on toward narrow
criteria, as for example in the AES process. 

It may come as a shock that all good cryptography is not defined in a
closed manner, but is really more expansive than almost anyone can
consider.  I have my favorite areas, and am apt to stub my toe quickly in
those where I have not spent much time but others have; the reverse is also
true.  It is in the best interests of crypto that as many people go in as
many directions as possible.  It may be that we never hear again of some
of them, but it is better to report whatever one finds, as best one can.
-- 
A much too common philosophy: 
It's no fun to have power....unless you can abuse it.

------------------------------

From: [EMAIL PROTECTED]
Crossposted-To: sci.math,sci.physics,sci.logic
Subject: My Book "The Unknowable"
Date: Thu, 25 Feb 1999 21:14:36 GMT

Hi, just wanted to say that my book The Unknowable will
be published this spring by Springer-Verlag.  Meanwhile
you can still preview it at

http://www.umcs.maine.edu/~chaitin/unknowable

and

http://www.cs.auckland.ac.nz/CDMTCS/chaitin/unknowable

Rgds,
Greg Chaitin

============= Posted via Deja News, The Discussion Network ============
http://www.dejanews.com/       Search, Read, Discuss, or Start Your Own    

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Define Randomness
Date: Thu, 25 Feb 1999 21:17:22 GMT
Reply-To: [EMAIL PROTECTED]

On 25 Feb 1999 13:21:44 -0500, [EMAIL PROTECTED] (Patrick Juola)
wrote:

>It can't cause correlations to emerge unless they were already there.

But isn't there some small amount of correlation in any stream that is
produced by a physical device? Furthermore, if one has a RNG that
produces bias, would not that same RNG also produce correlations? IOW,
don't the two come together?

>If you're asking whether it can possibly make an already correlated stream
>worse, the answer is yes

Exactly.

If you choose a physical process that is inherently biased, with the
hope of removing the bias by processing it out algorithmically, then
you may be "introducing" correlations by strengthening those that are
present originally.

The original intent of my comments was to point out the possibility
that using algorithms to clean up the mistakes of poor TRNG design
could be self-defeating. Best to design the TRNG correctly at the outset
and not rely on ad hoc cleanups.

Bob Knauer

"Did you ever notice that when a politician does get an idea
he usually gets it all wrong?"


------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Define Randomness
Date: Thu, 25 Feb 1999 21:59:28 GMT
Reply-To: [EMAIL PROTECTED]

On Thu, 25 Feb 1999 19:51:11 GMT, [EMAIL PROTECTED] (Terry Ritter) wrote:

>>It is my understanding that Bayesian inference can be used to break
>>stream ciphers if they leak enough information. And that is not the
>>only kind of inferential procedure. There is also Solomonoff
>>inference.

>I would like to know more about this.  If you have references to
>papers, articles on the web, etc., I would read them.  

I can only refer you to that book I have mentioned at least 100 times
here in the last month: Li & Vitanyi "An Introduction to Kolmogorov
Complexity and Its Applications".

As to the applicability of Bayesian inference for breaking stream
ciphers, Patrick Juola is the one to talk to. I also would like to
find some references to that method.

As I understand it, each step of the Bayesian method yields a
quantitative measure of the correctness of the hypothesis. It is
called "probably approximately correct". I would imagine that if you
were using the Bayesian method to discover the message, you could use
the level of calculated certainty to quantify the security of the
cipher.
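
As a toy illustration only (not an actual cryptanalytic attack), here is how a Bayesian update assigns a quantitative confidence to competing hypotheses. The two candidate "keys" and their likelihood values below are invented numbers chosen to make the mechanism visible:

```python
def bayes_update(prior, likelihoods):
    """One step of Bayes' rule over a set of hypotheses.

    prior: dict mapping hypothesis -> P(h)
    likelihoods: dict mapping hypothesis -> P(observation | h)
    Returns the normalized posterior distribution.
    """
    unnorm = {h: prior[h] * likelihoods[h] for h in prior}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

# Invented example: two candidate keys; each observed ciphertext
# feature is twice as likely under the true key.
posterior = {"key_A": 0.5, "key_B": 0.5}
for _ in range(10):
    posterior = bayes_update(posterior, {"key_A": 0.6, "key_B": 0.3})
print(posterior["key_A"])  # confidence in key_A climbs above 0.99
```

The running posterior is exactly the kind of "level of calculated certainty" described above: after each observation you can read off how strongly the evidence favors one hypothesis.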

But I am just as much in the dark as you - that's why I keep bringing
this up.

>I find this whole idea that cryptographers can somehow quantify the
>strength of a cipher or TRNG as quite unlikely.

Yet we can certify the OTP cryptosystem as provably secure.

>If we had this, we
>could quantitatively select the best cipher, or the best TRNG, or the
>best PRNG, or even reliably distinguish between TRNG and PRNG.  This
>seems very much at odds with the cryptography I know.  

We *can* select the best cipher - the OTP. And we can select the best
TRNG - a radioactive decay TRNG. Both are provably secure when
implemented properly. It is true that they cannot be implemented
*perfectly*, but they can be implemented properly to an arbitrarily
small degree of imperfection.

It would seem that in principle we could quantify the security of a
stream cipher by subjecting it to systematic attempts at decryption
with carefully selected test messages.

Notice that such a characterization is not being performed on the
random keystream - we know there is no way to characterize the
security of a random keystream itself. Using the keystream to build
test ciphers and then try to decipher them is a fundamentally
different methodology.

BTW, when one talks about "leaking information" there is the
implication that a certain quantity of information has been leaked,
and that implies that the quantity of leakage is a measure of the
insecurity of the cipher. IOW, if half the information has leaked,
then the cipher is only 50% secure, etc.

Bob Knauer

"Did you ever notice that when a politician does get an idea
he usually gets it all wrong?"


------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Define Randomness
Date: Thu, 25 Feb 1999 22:05:22 GMT

Anthony Stephen Szopa <[EMAIL PROTECTED]> wrote, in part:

>We would all probably agree that the Keno games in Reno or Lake Tahoe or
>Las Vegas, Nevada, produce the twenty numbers used in the game at
>random, by encasing eighty uniquely numbered ping pong balls in that
>plastic sphere and driving them into a tumult with a continuous stream
>of modestly compressed air.

>We would agree to the randomness of the process in part because the
>outcomes are essentially non reproducible.  Furthermore, we would agree
>to the randomness of the process because the outcome generally over long
>but modest runs would certainly demonstrate that each numbered ball has
>the same probability of being among the twenty chosen.

>But we do not have a perfect Keno machine, or perfect information, or a
>sufficiently powerful computer to process this data and give us our
>predictable outcome.  But if we had, we could predict the outcome!

>We now use Big Blue and make our calculations based on a given assigned
>starting point or initial conditions with certain ping pong balls
>located in specific places in space with certain velocities and spins,
>etc.  After about five days and several games we determine a string of
>100,000 numbers with each number being from 1 to 80.
>So the argument is twofold:  first randomness is relative.  What is
>random for some is predictable and non random for others.  And secondly,
>computer programs can produce genuinely true random numbers.

>I rest my case.

>And may you rest in peace trying to refute what I have described and
>concluded because you will surely die trying.

Actually, you have put forth a good argument here.

Rolling dice, or picking lottery balls - those are classical physical
processes. If they're sufficient to produce what is accepted as "true
randomness", then, why wouldn't an _equally detailed_ computer
simulation be acceptable?

One important point - a large amount of randomness goes into the
starting conditions of the physical version. We don't measure the
positions and velocities of all the air molecules in the room. We
don't measure the size of the cage with the balls to twenty decimal
places.

But if we use a pure computer program, without a source of true
randomness to scatter the initial conditions in a way unknown to us,
then we don't have all the free initial entropy that a physical
randomizer gives.

Computer programs can produce cryptosecure pseudorandom sequences.
Mimicking a physical system would be an extraordinarily inefficient way
of doing so.
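
To make the distinction concrete, here is a deliberately toy pseudorandom generator: a counter hashed with SHA-256. This is an illustration of determinism, not a vetted DRBG construction; the point is that the same seed always reproduces the same stream, which is precisely what separates it from a physical source:

```python
import hashlib

def hash_prng(seed: bytes, nbytes: int) -> bytes:
    """Toy counter-mode generator over SHA-256 -- illustrative only.
    Hash (seed || counter) repeatedly and concatenate the digests."""
    out = b""
    counter = 0
    while len(out) < nbytes:
        out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:nbytes]

stream = hash_prng(b"example-seed", 64)
print(len(stream))  # 64 bytes, fully determined by the seed
```

Anyone who learns the seed can regenerate the entire stream; no statistical test on the output alone reveals that fact, which is the earlier point about the difference between pseudorandom and physically random sources.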

You may well claim that distinguishing that sort of random number
stream from those produced by dice by calling the one "pseudorandom"
and the other "truly random" is only quibbling.

However, look at it from the other point of view: while an algorithmic
method of generating pseudorandom numbers can be very, very good, it
doesn't have to be. So, if one person can call his algorithmically
generated numbers "truly random", what about the next person, with a
somewhat simpler generator, and the next person after him?

Far better to keep claims straightforward. One fellow generates random
numbers from noise in gas tubes or something - these are true random
numbers, subject to certain cautions and limitations. Those who
generate random numbers algorithmically - they explain what algorithm
they're using, and make claims about how good their techniques are -
how well they approach the ideal of true randomness (which physical
systems fall short of, but in a different way).

And people judge their offerings by the quality of the PRNG - its
level of cryptosecurity. But no one is charged with the task of
setting a boundary where more secure PRNGs are called "truly random"
while less secure ones are only pseudorandom.

Yes, one can design a cipher - and it doesn't have to be a stream
cipher - that allows itself a larger key, and a longer running time,
than necessary for 'just enough' security. But if I can already claim,
with perfect justice, that my method offers more security than anyone
could possibly need - why should I go further, and spoil everything,
by claiming that my method is _perfect_? When that is not *strictly*
true, even if the computer big enough to prove it in practice could
never be built.

John Savard (teenerf is spelled backwards)
http://members.xoom.com/quadibloc/index.html

------------------------------

From: [EMAIL PROTECTED] (Herman Rubin)
Subject: Re: Define Randomness
Date: 25 Feb 1999 15:37:46 -0500

In article <[EMAIL PROTECTED]>,
Trevor Jackson, III <[EMAIL PROTECTED]> wrote:
>R. Knauer wrote:

>> On 24 Feb 1999 11:02:38 -0500, [EMAIL PROTECTED] (Patrick Juola)
>> wrote:

>> >No, but it's "random."  The von Neumann/Knuth pairwise transform would
>> >remove the bias and make it acceptable.

>> I never said it was not random in some sense other than crypto-grade.

>> Kolmogorov Complex strings are Kolmogorov random, but they are not
>> crypto-grade random. A TRNG can output any number, including those
>> that fail the criterion of Kolmogorov prefix complexity.

>> >The point of the original poster being, of course, that a "random"
>> >stream need not be an "unbiased random" stream.

>> You need an unbiased generator for it to be crypto-grade random.

>Not quite.  Any biased generator that captures entropy can be transformed
>by the addition of a post filter into an unbiased generator.  The
>generation cost goes up as the inverse of the filtration ratio.  In essence
>we can retain the entropy of the output and discard the redundancy in the
>output.

There are assumptions needed.  If you know that the results of the 
various outcomes are independent with the same probability, or even
a stationary Markov chain, this can be done using appropriate 
algorithms.  But if all that is known is that the bits are produced
by a random process, this is not the case.

Bias can be removed, but only if one has a restricted class of 
joint distributions.  Given a physical device, the probability
that the outcomes fall into any "natural" class of distributions
is essentially zero.
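
The caveat can be made concrete. Below is a minimal Python sketch of the von Neumann pairwise transform discussed in this thread; note that the unbiased output shown here depends entirely on the simulated source being independent and identically distributed, which is exactly the restricted class of distributions at issue:

```python
import random

def von_neumann_extract(bits):
    """Map non-overlapping bit pairs: 01 -> 0, 10 -> 1, discard 00/11.

    The output is unbiased only when the input bits are independent
    and identically distributed.
    """
    out = []
    for i in range(0, len(bits) - 1, 2):
        if bits[i] != bits[i + 1]:
            out.append(bits[i])
    return out

# A biased but independent source: P(1) = 0.7.
random.seed(1)
raw = [1 if random.random() < 0.7 else 0 for _ in range(100000)]
filtered = von_neumann_extract(raw)
print(sum(raw) / len(raw))            # close to 0.70
print(sum(filtered) / len(filtered))  # close to 0.50
```

The filtration cost is visible too: with P(1) = 0.7, only about 42% of the pairs are kept, so the output is well under half the input length.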

-- 
This address is for information only.  I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
[EMAIL PROTECTED]         Phone: (765)494-6054   FAX: (765)494-0558

------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Pentium III Hardware Random Numbers
Date: Thu, 25 Feb 1999 19:52:50 GMT


On Thu, 25 Feb 1999 08:47:51 -1000, in <[EMAIL PROTECTED]>, in
sci.crypt Somniac <[EMAIL PROTECTED]> wrote:

>[...]
>It is impossible to eliminate thermal noise from integrated circuits. 
>Whatever RNG they use, it will have thermal noise as a contributor. It 
>may have several circuit techniques producing unpredictable bits, not 
>just thermal noise. A self-test routine can judge the randomness before 
>it is output. 

If we are just *testing* for sufficient randomness, we might as well
use a PRNG which will pass those tests.  And a PRNG would be far more
compatible with normal digital design than trying to use low-level
on-chip noise.

We cannot distinguish between PRNG and TRNG by general external tests.
But the whole point of the device is to produce unknowable numbers.
These are TRNG, and there is a difference.  That difference is the
unknown algorithm which can recover the internal PRNG state, and then
reproduce the sequence.  The reason for having a physically-random
device is that we can assume that there can be no such algorithm.


>It is not necessary for it to pass all statistical tests 
>that are possible, as long as it seldom repeats itself and seems 
>unpredictable to adversaries seeking practical profit from 
>predictability.

I guess the question is:  "To what extent is the design a
physically-random RNG?"  If it is only physically-random to a small
extent, it will become possible to predict the result better than the
ideal RNG.  This is weakness.  

Now, can The Opponents use this, with far less information available
to them?  I don't know, but then again, we *never* know what those
guys can do.  They are smarter than we are, and have far more
equipment, time, and experience.  So even the *possibility* of a
weakness makes me nervous.  

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Not Quite Unbreakable...
Date: Thu, 25 Feb 1999 22:16:10 GMT

In looking at how the Enigma was broken - and the importance of cribs
to the process - it occurred to me that that particular mode of attack
could be disabled by subjecting the ciphertext to a transposition
cipher before transmission.

In reducing things to the lowest common denominator, I then had the
idea to simplify things by replacing the Enigma by a Bazeries cylinder
(a Jefferson wheel cipher, an M-94).

Thinking of ways to ensure that each message was transposed
differently led me to some complications. But even if one doesn't do
that, surely the ciphertext from the first stage is good enough that
there is very little to use in multiple anagramming?

Let us suppose, though, that one uses a Bazeries cylinder, followed by
simple columnar transposition. And one's adversary has intercepted 26
messages of the same length - although more would be better, perhaps
as many as 600 would make things easier.

Rather than trying to do multiple anagramming - because there is
nothing immediately between columns to suggest which ones should touch
- I see that the kappa test is the place to start.

Essentially, with 26 messages, since there are 25 possible
displacements, there must be, for every stretch of N letters - where N
is the number of disks on the cylinder - two messages with the same
displacement. That will show up in the form of more coincidences.

With enough messages, one can create a map of the ciphertext, showing
which letters come from the first block of 25, which letters come from
the second, and so on...and that will be enough to tell a great deal
about the transposition.
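
The underlying statistic is simple to sketch. In the Python illustration below, the letter weights for the "same displacement" case are invented, roughly plaintext-like frequencies rather than measured English data; the point is only that a skewed distribution pushes the coincidence rate well above the uniform 1/26:

```python
import random
import string

def kappa(a, b):
    """Fraction of positions at which two equal-length texts agree."""
    assert len(a) == len(b)
    return sum(x == y for x, y in zip(a, b)) / len(a)

random.seed(2)
letters = string.ascii_uppercase

# Different displacements behave like independent uniform texts:
# expected coincidence rate is 1/26.
u1 = random.choices(letters, k=20000)
u2 = random.choices(letters, k=20000)

# Same displacement: a skewed, plaintext-like letter distribution
# survives any fixed substitution, so coincidences rise.
weights = [8, 1, 3, 4, 13, 2, 2, 6, 7, 1, 1, 4, 2, 7, 8, 2, 1,
           6, 6, 9, 3, 1, 2, 1, 2, 1]  # invented, illustrative
s1 = random.choices(letters, weights=weights, k=20000)
s2 = random.choices(letters, weights=weights, k=20000)

print(kappa(u1, u2))  # near 1/26, about 0.038
print(kappa(s1, s2))  # noticeably higher
```

It is this gap that lets the analyst spot which pairs of messages share a displacement, and from there start mapping the transposition.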

A very difficult problem - not one I would be enthused about
personally attempting - but not outside the reach of possibility.

John Savard (teenerf is spelled backwards)
http://members.xoom.com/quadibloc/index.html

------------------------------

From: [EMAIL PROTECTED] (Jim Dunnett)
Subject: Re: Snake Oil (from the Feb 99 Crypto-Gram)
Date: Thu, 25 Feb 1999 21:28:23 GMT
Reply-To: Jim Dunnett

On 25 Feb 1999 10:25:56 GMT, [EMAIL PROTECTED] (Peter Gutmann)
wrote:

>... which those for whom it's most important (nontechnical types) will have 
>absolutely no understanding of.  Although the term "military-grade security" 
>is meaningless, it seems to be one of the better ways to tell J.Random Luser 
>that this is the strongest level of security available in a program.  Having 
>said that, I'd never use it in anything I wrote.

Our police were complaining last week that a reactivated 
paedophile group was using 'military-grade code (!)',
by which I presume they mean PGP.

From which it is reasonable to assume they haven't access
to cryptanalysis, or if they have, the cryptanalysis is
ineffective.

-- 
Regards, Jim.                | An atheist is a man who has
olympus%jimdee.prestel.co.uk | no invisible means of support.
dynastic%cwcom.net           | 
nordland%aol.com             | - John Buchan  1875 - 1940.
marula%zdnetmail.com         |
Pgp key: pgpkeys.mit.edu:11371

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Define Randomness
Date: Thu, 25 Feb 1999 22:30:27 GMT
Reply-To: [EMAIL PROTECTED]

On Thu, 25 Feb 1999 15:02:34 -0500, "Trevor Jackson, III"
<[EMAIL PROTECTED]> wrote:

>> No.  Each pair is operated upon independently, so the actions of one
>> pair cannot affect the actions of any other pair.  The input sequence
>> to the von Neumann transformation is assumed correlation-free, and
>> the transformation is stateless so it cannot introduce any correlations.

>Case closed.  NEXT!

You are far too quick to jump to conclusions until the issue has been
completely explored. Is that because you do not want someone
challenging your pat notions about crypto?

For one thing, the assumption that the sequence is correlation-free is
unfounded for an actual sequence, especially one that has noticeable
bias. It is possible that the mechanism that is producing the bias
could also be producing some kind of long-range correlation that
becomes short-range correlation when bits are removed by the vN
procedure.
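
For what it is worth, the concern is easy to demonstrate in simulation. The "sticky" Markov source below is an invented example: its bits are individually unbiased but serially correlated, and the von Neumann output stays strongly correlated, because the transform only guarantees anything for independent inputs:

```python
import random

def von_neumann_extract(bits):
    """01 -> 0, 10 -> 1 on non-overlapping pairs; discard 00/11."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        if bits[i] != bits[i + 1]:
            out.append(bits[i])
    return out

random.seed(3)
# Sticky Markov source: each bit repeats the previous one with
# probability 0.9.  Marginally unbiased, but heavily correlated.
bits = [random.randint(0, 1)]
for _ in range(400000):
    bits.append(bits[-1] if random.random() < 0.9 else 1 - bits[-1])

out = von_neumann_extract(bits)

# Lag-1 autocorrelation of the "debiased" output.
n = len(out) - 1
pairs_equal = sum(out[i] == out[i + 1] for i in range(n))
corr = 2 * pairs_equal / n - 1
print(round(sum(out) / len(out), 3))  # still about 0.5
print(round(corr, 3))                 # clearly negative, not near 0
```

So a stateless pairwise filter leaves the source's memory intact in the output stream, which is the long-range-to-short-range concern stated above.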

The issue is not closed until the fat lady sings, and I have not heard
much singing around here lately.

Bob Knauer

"Did you ever notice that when a politician does get an idea
he usually gets it all wrong?"


------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Quantum Computation and Cryptography
Date: Thu, 25 Feb 1999 22:34:55 GMT
Reply-To: [EMAIL PROTECTED]

On Thu, 25 Feb 1999 12:58:59 -0600, Medical Electronics Lab
<[EMAIL PROTECTED]> wrote:

>> Are there any cryptographic algorithms designed to be secure against quantum
>> computers?

>Not designed for it certainly.  There may be people working on
>algorithms designed for use with QC's.

There has already been a demonstration of quantum encryption - and it
is provably secure in the sense that as soon as someone eavesdrops,
the correspondents can detect it and stop the transmission.

>Not exactly practical yet.  For the moment only a 2 bit QC has been
>built.  Getting above 10 bits is going to be very challenging.  It will
>be a few decades before you see a 128 bit QC.

Do you have proof for that statement?

Bob Knauer

"Did you ever notice that when a politician does get an idea
he usually gets it all wrong?"


------------------------------

From: Withheld <[EMAIL PROTECTED]>
Subject: Re: Testing Algorithms
Date: Wed, 24 Feb 1999 22:24:11 +0000
Reply-To: Withheld <[EMAIL PROTECTED]>


[cut]

>> > 56-bit DES was once considered unbreakable but was recently broken in
>>
>> Was it really? I thought the DES spec was published with a lifetime set
>> to expire in the mid 1970's or something?
>>
>> --
>> Darren New / Senior Software Architect / MessageMedia, Inc.
>
>It was originally to have a lifetime of 'about'  20 years.  It was NEVER
>considered unbreakable.  The prior post is yet another example of
>an uninformed person posting misinformation.

What I meant to say was that computing power at the time was
insufficient for a brute-force crack to be viable, rather than any
implication that DES was claimed to be unbreakable forever.

Pardon my poor use of English - points taken!


-- 
Withheld

------------------------------

From: BRAD KRANE <[EMAIL PROTECTED]>
Subject: Re: True Randomness - DOES NOT EXIST!!!
Date: Thu, 25 Feb 1999 20:32:44 GMT


    Yes, just the fact of measuring something changes it anyway. You would never
be able to exactly recreate something.

Coen Visser wrote:

> [EMAIL PROTECTED] (Matthias Meixner) writes:
> >BRAD KRANE ([EMAIL PROTECTED]) wrote:
>
> >> True randomness does not exist. It always depends on some variable
> >> at some *FIXED* time. FIXED times are not anywhere near random.
> >> **EVERY** thing that goes on in the universe is happening because of all
> >> the forces of **EVERY** thing else in the entire universe. If you were
> >> to take measurements at one place in time in one universe and recorded
> >> it.
>
> [...]
>
> >So you say everything is predetermined, since it totally depends on all
> >forces of every thing else in the universe.
> >So your post and my answer are predetermined.
> >If I kill you, its not my fault, since it is also predetermined.
> >You can make a religion out of it and nobody can prove it to be wrong of
> >course; it is just a matter of belief.
>
> Hmm, well chaotic behaviour clashes with determinism. To be able to predict
> the positions at time T of N objects (N > 2) that exert a gravitational force on
> each other you need to measure their initial values to a certain precision.
> With every step in time the precision you need increases exponentially. So
> I guess you'll need something better than a Turing Machine.
>
> Regards,
>
>         Coen Visser
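
The point about exponentially growing precision shows up even in the simplest chaotic systems. The Python sketch below uses the logistic map as a stand-in (far simpler than the gravitational N-body problem, but it shows the same sensitivity): two starting points differing by one part in 10^12 diverge to order one within about a hundred steps:

```python
def logistic_orbit(x0, steps, r=4.0):
    """Iterate x -> r*x*(1-x), the chaotic logistic map at r=4."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.3, 100)
b = logistic_orbit(0.3 + 1e-12, 100)  # perturb by one part in 10^12
divergence = [abs(x - y) for x, y in zip(a, b)]
first_big = next(i for i, d in enumerate(divergence) if d > 0.1)
print(first_big)  # the tiny initial error becomes macroscopic well within 100 steps
```

Each extra step of prediction demands roughly another bit of initial precision, which is why measured initial conditions run out so quickly.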


------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Pentium III Hardware Random Numbers
Date: Thu, 25 Feb 1999 22:44:24 GMT

[EMAIL PROTECTED] (Terry Ritter) wrote:

> >Is there some compelling reason why we should trust Intel?
>
> Nope.  No more than we would trust, say, Schneier.

Schneier is building CPU's with built-in random bit sources?

> The issue is not the source of the argument, but rather the argument
> itself.

Maybe.  But I think current events re Intel suggest otherwise.

>          There are technical objections (which have nothing to do with
> conspiracies) to what we think is the Intel design.

Who needs a conspiracy to (say) misimplement even a 'perfect' design?


------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
