Cryptography-Digest Digest #13, Volume #9        Sun, 31 Jan 99 21:13:02 EST

Contents:
  Re: Random numbers generator and Pentium III ("Trevor Jackson, III")
  Re: Sanity check take 2 (Edward Keyes)
  Re: RNG Product Feature Poll (Herman Rubin)
  Re: Press release - The Crypto Controversy: no problem (Jim Dunnett)
  Re: Random numbers generator and Pentium III ("Trevor Jackson, III")
  MS ActiveX Licensing Scheme (A. Bettik)
  Re: What is left to invent? (Toby Kelsey)
  Problems resistant to parallelization? (David A Molnar)
  Re: Sanity check take 2 (Edward Keyes)

----------------------------------------------------------------------------

Date: Sun, 31 Jan 1999 16:02:12 -0500
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Random numbers generator and Pentium III

R. Knauer wrote:

> On Sat, 30 Jan 1999 19:42:12 -0500, "Kazak, Boris" <[EMAIL PROTECTED]>
> wrote:
>
> >   The usable random sequence MUST be produced in a way unguessable
> >to the opponent and MUST pass statistical tests.
>
> What statistical test? There is none.

You keep repeating this same falsehood.  Please get it straight.  The fact
that there is no statistical test that is sufficient to qualify a sequence
as random DOES NOT mean that a sequence can be qualified as random without
passing the necessary statistical tests.

If you recall your basic statistic you'll remember that you have to work
with two sets of entities.  The full population and the population sample.
The full population may be infinite.  The sample population must be
finite.  Given a finite sample we can express to any desired degree of
confidence whether the full population has the characteristics we are
evaluating.

A population consisting of strset( sample, '0x55' ) is going to yield a
very low confidence that the full sequence is at all random.

Any worthy RNG has to pass this kind of test for it to be trusted as a real
RNG.  Note the word trust.  It is analogous to the word confidence, which
we evaluate with statistical tests!

It's just not that hard a concept.
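
As a concrete illustration (a hedged sketch of my own, not a standard test
suite; the function names and the z-score threshold are arbitrary), a single
monobit frequency test can DISqualify a grossly biased generator, while
saying nothing at all about a patterned 0x55-style stream:

```python
# Hedged sketch: a monobit frequency test that can DISqualify a generator,
# never qualify one.  Names and the threshold are illustrative; a real
# suite runs many independent tests, each with its own confidence level.
import math

def monobit_z_score(bits):
    """Z-score of the ones-count against the fair-coin expectation."""
    n = len(bits)
    ones = sum(bits)
    # Under the null hypothesis (fair bits), ones ~ Binomial(n, 0.5):
    # mean n/2, standard deviation sqrt(n)/2.
    return (ones - n / 2) / (math.sqrt(n) / 2)

def looks_disqualified(bits, z_limit=4.0):
    """True if the sample is so biased we reject the generator."""
    return abs(monobit_z_score(bits)) > z_limit

# A "population consisting of 0x55" bytes: alternating bits 01010101...
suspect = [0, 1] * 4096
print(looks_disqualified(suspect))   # False: monobit alone misses patterns

biased = [1] * 8000 + [0] * 192      # grossly biased stream
print(looks_disqualified(biased))    # True: generator rejected
```

Note the asymmetry: passing one test builds only a little confidence,
while failing it is grounds for rejecting the generator outright.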

>
>
> > Failure to comply
> >with any one of these requirements is (again IMHO) sufficient ground
> >for rejection.
>
> I just began Li and Vitanyi's book "An Introduction to Kolmogorov
> Complexity and its Applications" which was recommended by Patrick
> Juola a year ago and more recently by Coen Visser.
>
> I bring this up because after having read the introduction and the
> first chapter, I can highly recommend this book for the informed
> layman. The first chapter is a must read for crypto enthusiasts, not
> just for Kolmogorov Complexity, but for the discussions of the many
> kinds of randomness - and the fundamental problems in trying to
> characterize it.
>
> BTW, I especially enjoyed reading the part where the author says that
> a number loses its randomness once it has been selected from an
> ensemble. That was a point I was trying to make several days ago.
>
> If you look in the index under the entry "random", you will find more
> kinds of randomness than you ever imagined possible. And according to
> the authors, only Algorithmic Complexity permits one to decide
> randomness based on the string itself. All other forms of randomness
> require an analysis of the method of generation.
>
> But Algorithmic Complexity is unsuitable for crypto - even Greg
> Chaitin says that (private communication). The reason is because it
> rejects "regular strings", something that is not permitted in
> proveably secure crypto.
>
> The primary requirement for OTP crypto is total unpredictability, and
> that can only be achieved if pads can be generated in all possible
> ways equiprobably. If you start rejecting "regular strings" you will
> open the OTP up to attack.

In theory this is true.  However, there is no known attack that makes it
easier for an attacker to decipher a non-trivial message masked with a key
that has been filtered by 50%.  While the reduction in possible plaintexts
is 50%, the effective loss of information is one bit, out of as many bits
as are in the message.  Thus the loss in strength is of theoretical
importance only.  It has no practical effect.
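
The one-bit arithmetic can be checked directly (a back-of-envelope sketch;
the 128-bit pad length below is an arbitrary illustration, and I assume the
filter rejects exactly half of all pads):

```python
# Back-of-envelope check of the "one bit out of n" claim above.
# Assumptions (illustrative): filtering rejects exactly half of all
# pads, and the pad length of 128 bits is an arbitrary example.
import math

n = 128                          # pad length in bits
full_keys = 2 ** n               # all possible pads
filtered_keys = full_keys // 2   # half rejected as "too regular"

loss_in_bits = math.log2(full_keys) - math.log2(filtered_keys)
print(loss_in_bits)              # 1.0: one bit of keyspace, out of 128
```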

>
>
> Bob Knauer
>
> "I place economy among the first and most important virtues and
> public debt as the greatest dangers to be feared. We must not
> let our rulers load us with perpetual debt."
> --Thomas Jefferson




------------------------------

From: [EMAIL PROTECTED] (Edward Keyes)
Subject: Re: Sanity check take 2
Date: Sun, 31 Jan 1999 13:49:49 -0500

In article <[EMAIL PROTECTED]>,
[EMAIL PROTECTED] (Edward Keyes) wrote:

> In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] wrote:
> 
> However, a man in the middle could easily start to fake messages, since
> he would be able to replay old messages encrypted with an old session
> key.  Since no challenge-response is performed, a valid replayed message
> is indistinguishable from a new message that just happened to repeat
> an old session key.
> 
> >    Session keys are never reused, provided the S file is long enough.

Oops, I missed this sentence on my first pass.  Ignore the replay
attack scenario.  But now, of course, you need to arrange to periodically
transmit securely a new file of session keys between the parties, which
needs its own protocols... and if you can do that securely, why not
just send the messages the same way?  ;-)

+------------ Edward Keyes, [EMAIL PROTECTED] -------------+
|................ http://web.mit.edu/eakeyes/www/ ................|
|.... DaggerWare: "small, sharp, and with a heck of a point!" ....|
+- "A little inaccuracy saves a world of explanation." C.E.Ayres -+


------------------------------

From: [EMAIL PROTECTED] (Herman Rubin)
Subject: Re: RNG Product Feature Poll
Date: 31 Jan 1999 16:50:38 -0500

In article <[EMAIL PROTECTED]>,
R. Knauer <[EMAIL PROTECTED]> wrote:
>On Sun, 31 Jan 1999 14:22:38 GMT,
>[EMAIL PROTECTED] (Myself) wrote:

>>Also remember
>>that such an ionization device takes a moment to "recharge" and become
>>sensitive again after a decay, so even with a perfect sample you won't
>>get 100% of the decays you're interested in. (But does it matter?)

>You are not going to get 100% of the decays - some decays never get
>detected. And that does not make any difference if the TRNG is
>designed properly.

>Since any decay occurs at random, the interval between any two of them
>is random. Intervening events, like other decays, are totally
>irrelevant to the randomness of the two events that do get measured.

The intervals are random, but may or may not be independent.  

Also, how are you using the device to generate the random bits?
If you are waiting a length of time and using parity of the number
counted, the bias of the bit produced is not zero, and is improved
by having even a substantial amount of dead time.  I have a report
on this.

However, I suggest that several runs be taken, and the results
XOR'ed.  One cannot reasonably test randomness to more than one
part in 10^3, if that.  But if one combines 8 such bit streams,
and 5 pass the test, the result should be quite good.
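
A sketch of the XOR-combining suggestion above (the stream count and the
per-stream bias values are illustrative, and I assume the streams are
independent; the bias formula is the standard piling-up lemma):

```python
# Sketch of XOR'ing several independently generated bit streams to
# shrink bias, as suggested above.  Counts and bias values are
# illustrative; independence of the streams is assumed.
from functools import reduce
from operator import xor

def xor_streams(streams):
    """Bitwise XOR of equal-length bit streams (lists of 0/1)."""
    return [reduce(xor, bits) for bits in zip(*streams)]

def combined_bias(biases):
    """Piling-up lemma: if each independent bit has P(1) = 1/2 + e_i,
    the XOR's bias magnitude is 2^(k-1) * prod(e_i)."""
    eps = 1.0
    for e in biases:
        eps *= 2 * e
    return eps / 2

# Eight streams, each with a (large, illustrative) bias of 0.01:
print(combined_bias([0.01] * 8))   # ~1.3e-14, far below any testable level
```

So even if only some of the component streams pass the tests, the combined
stream's residual bias is driven well under the one-part-in-10^3 resolution
that testing can hope to achieve.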
-- 
This address is for information only.  I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
[EMAIL PROTECTED]         Phone: (765)494-6054   FAX: (765)494-0558

------------------------------

From: [EMAIL PROTECTED] (Jim Dunnett)
Crossposted-To: alt.security.pgp,misc.legal.computing,talk.politics.crypto
Subject: Re: Press release - The Crypto Controversy: no problem
Date: Sun, 31 Jan 1999 22:36:48 GMT
Reply-To: Jim Dunnett

On 29 Jan 1999 06:52:53 GMT, [EMAIL PROTECTED] (test) wrote:

>I guess that the only consolation for the authorities is that traffic analysis 
>can still be used. I would imagine that under certain circumstances, traffic 
>analysis can yield pretty useful information.

It's better than nothing, but successful cryptanalysis
is obviously very much preferable.

It helps a little to know who the target communicates with,
how often and under what circumstances: eventually leading
to more or less valid assumptions as to what he may be
up to.

-- 
Regards, Jim.                | I would be happy to see the devil's
olympus%jimdee.prestel.co.uk | buttermilk banned from Society.
dynastic%cwcom.net           | 
nordland%aol.com             | - Ian Paisley, discussing Guinness.
marula%zdnetmail.com         |
Pgp key: wwwkeys.uk.pgp.net:11371

------------------------------

Date: Sun, 31 Jan 1999 19:20:14 -0500
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Random numbers generator and Pentium III

This is getting tedious.  Please read more carefully.

R. Knauer wrote:

> On Sun, 31 Jan 1999 16:02:12 -0500, "Trevor Jackson, III"
> <[EMAIL PROTECTED]> wrote:
>
> >You keep repeating this same falsehood.
>
> Yeah, me and von Neumann.
>
> "Anyone who considers arithmetical methods of producing random digits
> is, of course, in a state of sin. For, as has been pointed out several
> times, there is no such thing as a random number - there are only
> methods to produce random numbers, and a strict arithmetic procedure
> is of course not such a method."
>
> Although von Neumann was focusing on the method of generation in that
> statement above, his comment: "There is no such thing as a random
> number" is what I am defending here.
>
> >Please get it straight.
>
> No, you get it straight. You are the one who is going against
> mathematical tradition, not me.
>
> >The fact
> >that there is no statistical test that is sufficient to qualify a sequence
> >as random DOES NOT mean that a sequence can be qualified as random without
> >passing the necessary statistical tests.
>
> You are overlooking two most fundamental problems with that assertion:
>
> 1) If you use statistical tests to qualify numbers as random, you will
> reject a TRNG, since some of its outputs are not stochastically
> random.

Irrelevant.  We are NOT trying to qualify numbers as random.  Got that?  The
premise of your paragraph above is false.

We are trying to DISqualify a number generator.  That is what statistical tests
are for.  They detect patterns.  Patterns disqualify GENERATORS not numbers!

>
>
> 2) If you use statistical tests to qualify numbers as random, you will
> accept a PRNG, since some of its outputs are stochastically random.
>
> The best you can hope to accomplish with statistical tests is to put
> some kind of confidence limits on the suitability of the generator.
> But that falls far short of the agenda for the OTP system. Confidence
> limits on suitability are not only of no value for a system that must
> be proveably secure, but they are dangerous in that they can give a
> false sense of confidence.
>
> >It's just not that hard a concept.
>
> Actually randomness is an extremely hard concept in many ways. Just
> the fact that there are so many different and contradictory kinds of
> randomness attests to that.

Well, on this particular subtopic your obtuseness can be considered an
existence proof that the concept of statistical evaluation of random
sequences is hard.  You can fix that.

>
>
> Li and Vitanyi in their book on Kolmogorov Complexity demonstrate that
> all probabilistic measures of randomness fail except for algorithmic
> complexity. They show many reasons why stochastic randomness is not
> suitable for finite strings. For example if p is the probability for
> the bit 1 to occur, then infinite strings will have an infinite number
> of 1s as expected only if p>=1/2. If p<1/2, then there are only a
> finite number of 1s in an infinite sequence. IOW, if p = 0.5-epsilon,
> then the infinite string is not random.
>
> They also show that there is a very close relationship between
> algorithmic complexity and Shannon entropy, your favorite concept.
> Unfortunately algorithmic complexity is unsuited for crypto-grade
> randomness.
>
> >In theory this is true.  However, there is no known attack that makes it
> >easier for an attacker to decipher a non-trivial message masked with a key
> >that has been filtered by 50%.
>
> Patrick Juola and others have disagreed with that so many times we
> could write a book on it. Once you start filtering strings you open
> the OTP up to a Bayesian inference attack.

No.  Juola et al correctly asserted a theoretical weakness.  When quantified
that weakness turns out to be irrelevant for any non-trivial implementation.


------------------------------

From: A. Bettik <[EMAIL PROTECTED]>
Subject: MS ActiveX Licensing Scheme
Date: Mon, 01 Feb 1999 00:05:53 GMT

I'm interested in finding out how MS implements its activex registry
licensing scheme and wonder if anyone knows more? From what I can gather, two
complementary techniques are available - a file key and a registry key. I
understand how the file key system works, but the registry I can't figure
out. From what I can see a registry key is created along with string data,
apparently encrypted somehow. Is this a known cipher and what does MS use as
the key? Anyone offer any pointers?

============= Posted via Deja News, The Discussion Network ============
http://www.dejanews.com/       Search, Read, Discuss, or Start Your Own    

------------------------------

From: Toby Kelsey <[EMAIL PROTECTED]>
Subject: Re: What is left to invent?
Date: Fri, 29 Jan 1999 00:54:28 +0000

In article <[EMAIL PROTECTED]>, R. Knauer
<[EMAIL PROTECTED]> writes
>On Tue, 26 Jan 1999 12:51:50 +0000, Toby Kelsey
><[EMAIL PROTECTED]> wrote:
>
>>>For example, if your key is 56 bits, like with DES, then any
>>>intelligible message you uncover over 8.2 bits will be the intended
>>>message with high confidence. If you uncover a message that is
>>>significantly longer than 8.2 bits, then it is almost certain to be
>>>the intended message.
>
>>This presumes that decipherments always fall randomly in the space of
>>all possible (intelligible or not) messages.
>
>Why do you say that?

Because if a system is designed such that decryptions are biased
towards plausible messages your purely statistical calculation of
unicity distance no longer works.

>I was told that the probability for an intelligible message, other
>than the intended message, is vanishingly small when the message
>length exceeds the unicity distance. Therefore only one possible
>plaintext is intelligible - your intended message.

Yes, this is true for current cryptosystems which do not attempt to
generate plausible messages with incorrect keys.

It is not true generally.

Toby
-- 
Toby Kelsey

------------------------------

From: David A Molnar <[EMAIL PROTECTED]>
Subject: Problems resistant to parallelization?
Date: 1 Feb 1999 00:26:09 GMT


Hey there,

In "Time-lock puzzles and Time-release Crypto", Rivest, Shamir, and Wagner 

[available at http://theory.lcs.mit.edu/~rivest/RivestShamirWagner-timelock.ps]

define a scheme using a variant of the Blum-Blum-Shub generator that 
provides resistance to parallelization. The claim is that repeated 
squaring is somehow not inherently parallelizable. 

What counts for this definition of "parallelizable"? I.e., which
of the following will not help me square any faster ?
        * distributed.net ?     
        * a Beowulf cluster ?
        * a Connection Machine?
        * an SMP machine?
        * ASCI Option Red?
        * Pentium IV?

Pointers to resources on how to evaluate this question are much, much
appreciated. 
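
For what it's worth, the construction itself is small enough to sketch
(toy-sized, illustrative parameters below; a real puzzle uses an RSA
modulus whose factorization only the puzzle setter knows):

```python
# Sketch of the Rivest-Shamir-Wagner sequential-squaring puzzle.
# Parameters are toy-sized and illustrative, not secure choices.
def solve_puzzle(a, t, n):
    """Compute a^(2^t) mod n by t sequential squarings.  Each squaring
    needs the previous result, which is the (conjectured) reason extra
    processors don't help: the chain cannot be split across machines."""
    x = a % n
    for _ in range(t):
        x = (x * x) % n
    return x

def setup_shortcut(a, t, p, q):
    """The setter, knowing n = p*q, reduces the exponent mod phi(n)
    and finishes in O(log) work instead of t squarings."""
    n = p * q
    phi = (p - 1) * (q - 1)
    e = pow(2, t, phi)
    return pow(a, e, n)

p, q = 104729, 1299709   # two known primes, purely for illustration
assert solve_puzzle(2, 10000, p * q) == setup_shortcut(2, 10000, p, q)
print("sequential solver and trapdoor shortcut agree")
```

The asymmetry is the point: the solver must do t squarings one after
another, while the setter's knowledge of phi(n) collapses the work.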

What other problems are similarly resistant to parallelization? I would
guess problems that require much interaction between subparts or are
not easily divisible into subparts. Anyone have a list of such
problems or suggestions for evaluating a problem for this 
criterion?

Thanks! 

-David Molnar

------------------------------

From: [EMAIL PROTECTED] (Edward Keyes)
Subject: Re: Sanity check take 2
Date: Sun, 31 Jan 1999 18:54:57 -0500

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] wrote:

>   So here you assume that _somebody_ keeps this key in memory...
>   But a helper file can be kept on a floppy and guarded in a safe...

True, but that's quite inconvenient, especially for use on a mobile
computing platform.  Personally, I'm more willing to trust secrets
to my head than to a safe, since safes are only secured with a
memorized combination anyway (or a physical key that's also subject
to a physical attack).

>   Why another session key? The number pair   "Use 236315, 25" 
>   IS your session key, you just extract it from the helper file.

That was just an example... to give another one, say you wanted to
send your friend an encrypted message, to see if he can crack it...
but there's no way he could tell that message from random bytes,
and therefore couldn't authenticate you without additional info.

>   However, the checksum at the end of the file is a must, this way the
> ciphering program can automatically check if the file is decrypted with
> the correct key.

Right.  I was just pointing out that such a checksum would be a
necessary addition to your protocol, since all the authentication
rests on it.

>   What would be the point of retransmitting the old message without
> knowing its content? It can only alert the correspondents...

I deposit $1000 in my bank, and message X immediately is sent to their
central vault computer.  Perhaps I don't know the code, and don't know
exactly what the message says.  But if I can send it to the vault
computer a thousand more times, I may just end up a millionaire.

It is a mistake to think that an intelligent human is always the
recipient of a message.  It could easily be a dumb computer, too,
that doesn't know that people rarely repeat messages a thousand times.

>   Here you are contradicting yourself. Before you said that the 256-bit
> key is "memorized", which clearly means *people*. Now you say that
> communication is between machines. Sorry, but I cannot figure out the
> actual environment of your problem.

It's quite common... a human being gives a password to a computer,
and then the computer communicates with another computer to do things
for the human.  Maybe the messages are eventually intended for the
human's eyes, maybe they are just administrative (to keep the two
computers' time clocks synchronized, for instance).  In any case, the
point is to have a protocol that doesn't *rely* on a human continually
monitoring the channel to make sure things work correctly.  I daresay
you wouldn't want to have to read all the bits and bytes flashing around
between your browser and the net.

> Also, many replay attacks can be
> rendered harmless just by incorporating a time stamp or serial number 
> inside the message. This can also help to alert the correspondents 
> to missing messages and out-of-order messages.

True, but features like that need to be specified in the protocol,
to be assured that they won't be left out of an implementation.
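
The serial-number defence can be sketched in a few lines (a hypothetical
receiver of my own devising; a real protocol would also authenticate the
counter with a MAC so it cannot be stripped or altered in transit):

```python
# Sketch of the serial-number defence against replays.  The class and
# message format are hypothetical; in a real protocol the counter must
# be covered by the message's MAC so an attacker cannot rewrite it.
class ReplayGuard:
    """Accepts a message only if its serial is strictly newer than the
    last one seen from that sender; replays and out-of-order messages
    are dropped (and could be flagged to alert the correspondents)."""
    def __init__(self):
        self.last_serial = {}

    def accept(self, sender, serial):
        if serial <= self.last_serial.get(sender, -1):
            return False            # replayed or out-of-order: drop/alert
        self.last_serial[sender] = serial
        return True

guard = ReplayGuard()
print(guard.accept("alice", 1))   # True:  fresh message
print(guard.accept("alice", 2))   # True:  counter advanced
print(guard.accept("alice", 2))   # False: the bank-deposit replay fails
```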

+------------ Edward Keyes, [EMAIL PROTECTED] -------------+
|................ http://web.mit.edu/eakeyes/www/ ................|
|.... DaggerWare: "small, sharp, and with a heck of a point!" ....|
+- "A little inaccuracy saves a world of explanation." C.E.Ayres -+


------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
