Cryptography-Digest Digest #132, Volume #13      Fri, 10 Nov 00 00:13:01 EST

Contents:
  Re: Q: Computations in a Galois Field (Paul Crowley)
  Re: Hardware RNGs (Terry Ritter)
  Re: Hardware RNGs (David Schwartz)
  Re: About blowfish... ([EMAIL PROTECTED])
  Re: MORE THAN FULLY BIJECTIVE ! ("Matt Timmermans")
  Re: Hardware RNGs (Terry Ritter)
  Re: On obtaining randomness ("Matt Timmermans")
  Re: Hardware RNGs (Steve Portly)
  Re: MORE THAN FULLY BIJECTIVE ! (John Savard)
  Re: Hardware RNGs (David Schwartz)
  MY BANANA REPUBLIC (SCOTT19U.ZIP_GUY)
  Re: RSA security (DJohn37050)
  Re: Announcement: One Time Pad Encryption - 0.9.3 - freeware ("Trevor L. Jackson, III")
  Re: Updated XOR Software Utility (freeware) Version 1.1 from Ciphile  ("Trevor L. Jackson, III")
  Re: Hardware RNGs (Terry Ritter)

----------------------------------------------------------------------------

From: Paul Crowley <[EMAIL PROTECTED]>
Subject: Re: Q: Computations in a Galois Field
Date: Thu, 09 Nov 2000 23:13:14 GMT

Mok-Kong Shen wrote:
> > GF(2)^m is the space of vectors of bits.  For example, Rijndael mostly
> > treats byte value as representing values from GF(2^8), but the affine
> > transformation in the S-box can (AFAIK) only be sensibly defined in
> > GF(2)^8 - ie treating the byte simply as a vector of bits and doing a
> > matrix multiply followed by a vector addition.
> 
> The diffusion property of Rijndael's substitution is, I
> suppose, mainly dependent on the 1/x transformation, which
> is done in GF(2^8) and which was the object of my original
> question. As noted by others in a previous thread,
> the affine transformation apparently can be replaced
> by similar ones without adverse effects. It would be
> fine if someone could say something definite about these
> points and give the corresponding explanations.

The purpose of the affine transformation is to make sure that the
algebraic representation of the whole S-box is complex, to defeat
interpolation attacks.  It has also been chosen so that the S-box has no
fixed points (S(a) = a) and no "opposite fixed points" (S(a) = ~a). 
Section 7.2 of the Rijndael paper goes into this.
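
These properties can be checked directly. The sketch below (an editor's illustration, not code from the thread) builds the Rijndael S-box as inversion in GF(2^8) followed by the GF(2)^8 affine map, then verifies the no-fixed-point claims:

```python
# Sketch: Rijndael S-box = 1/x in GF(2^8), then the affine transformation.

def gf_mul(a, b):
    """Multiply in GF(2^8) modulo x^8 + x^4 + x^3 + x + 1 (0x11B)."""
    p = 0
    while b:
        if b & 1:
            p ^= a
        a <<= 1
        if a & 0x100:
            a ^= 0x11B
        b >>= 1
    return p

def gf_inv(a):
    """a^254 = 1/a in GF(2^8); 0 maps to 0 by Rijndael's convention."""
    r = 1
    for _ in range(254):
        r = gf_mul(r, a)
    return r

def affine(x):
    """Bit-matrix multiply over GF(2), then add the constant 0x63."""
    y = 0
    for i in range(8):
        bit = ((x >> i) ^ (x >> ((i + 4) % 8)) ^ (x >> ((i + 5) % 8))
               ^ (x >> ((i + 6) % 8)) ^ (x >> ((i + 7) % 8))) & 1
        y |= bit << i
    return y ^ 0x63

sbox = [affine(gf_inv(a)) for a in range(256)]

assert sbox[0x00] == 0x63 and sbox[0x53] == 0xED       # known test values
assert all(sbox[a] != a for a in range(256))            # no fixed points
assert all(sbox[a] != (a ^ 0xFF) for a in range(256))   # no opposite fixed points
```

Without the affine step, S(0) = 0 would be a fixed point and the S-box would be a sparse algebraic function.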
-- 
  __
\/ o\ [EMAIL PROTECTED]
/\__/ http://www.cluefactory.org.uk/paul/

------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Hardware RNGs
Date: Thu, 09 Nov 2000 23:26:44 GMT


On Thu, 09 Nov 2000 15:00:34 -0500, in
<[EMAIL PROTECTED]>, in sci.crypt Steve Portly
<[EMAIL PROTECTED]> wrote:

>David Schwartz wrote:
>
>> Steve Portly wrote:
>>
>> > For applications that are network intensive, timing packets would be a better
>> > alternative than timing interrupts.  Network jitter is over 100 times greater than
>> > system jitter, so the laws of physics give you a natural firewall.  "One cycle count"
>> > is easily lost to signal rise times even inside your system case.  I doubt anyone would
>> > be able to monitor TS intervals from a distance of more than a few feet.  This is
>> > sci.crypt so a detailed explanation of system jitter would probably be off
>> > topic.
>>
>>         These are measuring the same thing. So it's not an alternative.
>>
>>         DS
>
>An assembly language call to int 13 takes a different amount of time than a packet arrival.
>The key is to find the minimum time period that will always produce at least one bit of
>entropy.
>Since 1995, CPU frequency wander and system jitter have become a source of entropy.
>
>http://www.ednmag.com/ednmag/reg/1995/070695/graphs/14dfcfga.htm
>
>With my crude analysis I found that it takes about 40 microseconds to get a bit of entropy.

I for one would like to see the details of that analysis.

Nobody denies that crystal oscillator noise-jitter occurs.  But I deny
that it can be detected in software on a conventional system.  

There is another form of "jitter" which is just the expected
relationship of a signal of one frequency sampling a signal of another
frequency.  That occurs independent of quantum events, and has no
continuing randomness at all.  


>My window of error could be anywhere from 10 to 100 microseconds depending on the speed,
>type of system, and entropy rollup you use.  I tested on Pentium 90, 233, and 350 MHz
>platforms with good results (a little slower on the 90).

First of all, of course, we have to measure things that our Opponents
cannot measure from outside the security shield.  If the network is
open, they get to measure packet times just like we do.

Next, a real measurement of absolute time generally depends upon
hardware timers which are set up to clear, enable, disable, and be
read.  Simply sampling the "current time" in software is not the same
thing at all, and is not enough.  

In particular, if we sit in a software loop, polling the indication of
interest, a whole lot of other things are going on.  The process we
are running is swapped in and out; interrupts are occurring; memory
refresh is occurring.  And, in general, when these things occur, we
are not really polling the state any more -- we are doing something
else.  And while these other things may complicate the numbers, they
are generally deterministic, not fundamentally random.  

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: David Schwartz <[EMAIL PROTECTED]>
Subject: Re: Hardware RNGs
Date: Thu, 09 Nov 2000 15:34:56 -0800


Terry Ritter wrote:

> There is another form of "jitter" which is just the expected
> relationship of a signal of one frequency sampling a signal of another
> frequency.  That occurs independent of quantum events, and has no
> continuing randomness at all.

        That is not true. Ideally, the two frequencies are real numbers and
their exact ratio contains an unlimited amount of randomness. Each time
you compare them digitally, it is an independent event that gives you a
better estimate of the real ratio. Thus each sample contains additional
entropy, but less and less.

        DS

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: About blowfish...
Date: Thu, 09 Nov 2000 23:52:11 GMT

In article <8udd29$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] (Cory C. Albrecht) wrote:
> Depending on what one defines BF_ROUNDS to, you can get either 16 or 20
> round encryption, and the struct for the keys is defined thusly:
>
>     typedef struct bf_key_st {
>         BF_LONG P[BF_ROUNDS+2];
>         BF_LONG S[4*256];
>     } BF_KEY;
>

My memory is foggy, but Schneier may have defined a
higher-round-count variant.  The 16-round version (which needs 18 values
in the P-table, since two extra P values are XORed in at the end) is the
common one; don't worry about more rounds.
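
The BF_ROUNDS+2 sizing can be seen structurally. This is an editor's sketch of a Blowfish-style Feistel network, not real Blowfish: the P values and the F function here are dummies, not the pi-derived subkeys or the real S-box construction.

```python
# Why a 16-round Blowfish-style Feistel consumes ROUNDS + 2 subkeys.
ROUNDS = 16
MASK = 0xFFFFFFFF
P = [(i * 0x9E3779B9) & MASK for i in range(ROUNDS + 2)]  # 18 dummy subkeys

def F(x):
    return (x * 2654435761 + 1) & MASK   # stand-in for the S-box mixing

def encrypt(L, R):
    for i in range(ROUNDS):
        L ^= P[i]              # one P entry consumed per round...
        R ^= F(L)
        L, R = R, L
    L, R = R, L                # undo the final swap
    R ^= P[ROUNDS]             # ...plus P[16]
    L ^= P[ROUNDS + 1]         # and P[17] as output whitening
    return L, R

def decrypt(L, R):
    # The same network run with the P entries in reverse order.
    for i in range(ROUNDS + 1, 1, -1):
        L ^= P[i]
        R ^= F(L)
        L, R = R, L
    L, R = R, L
    R ^= P[1]
    L ^= P[0]
    return L, R

pt = (0x01234567, 0x89ABCDEF)
assert decrypt(*encrypt(*pt)) == pt
```

The round-trip assertion is what fails if the P-table has only BF_ROUNDS entries: the two whitening XORs at the end have nowhere to come from.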


=====
"There are no ifdefs in hardware."


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: "Matt Timmermans" <[EMAIL PROTECTED]>
Subject: Re: MORE THAN FULLY BIJECTIVE !
Date: Fri, 10 Nov 2000 00:29:11 GMT

This is equivalent to whitening before encrypting with a constant IV.  An
attacker can still detect repeated messages.




------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Hardware RNGs
Date: Fri, 10 Nov 2000 00:36:16 GMT


On Thu, 09 Nov 2000 15:34:56 -0800, in
<[EMAIL PROTECTED]>, in sci.crypt David Schwartz
<[EMAIL PROTECTED]> wrote:

>Terry Ritter wrote:
>
>> There is another form of "jitter" which is just the expected
>> relationship of a signal of one frequency sampling a signal of another
>> frequency.  That occurs independent of quantum events, and has no
>> continuing randomness at all.
>
>       That is not true. 

Your statement is false. 


>Ideally, the two frequencies are real numbers and
>their exact ratio contains an unlimited amount of randomness. 

Conditionally true but impractical, since each additional bit of this
ratio takes twice as long to detect.  Moreover, the ratio is
approximately the same for every new use, so the same values will come
tumbling out, which is hardly fundamental randomness.  


>Each time
>you compare them digitally, it is an independent event that gives you a
>better estimate of the real ratio. Thus each sample contains additional
>entropy, but less and less.

Even "independent events" may well be correlated.  For example,
consider the situation with wrist watches:  Surely, every watch keeps
a different time.  Yet if we ask "when will Bob's watch show 4PM," our
best bet is that it will be very close to when our watch shows 4PM.
So even though watches are not -- and *can* not -- be synchronized in
an absolute sense, they are indeed *correlated* with other
time-keepers.  Which, of course, is the whole point.  

A very similar situation occurs with independent crystal oscillators,
with the exception that different frequencies will be involved.  But
approximately what those frequencies should be will be known, and
every bit we get out (in exponential time) further resolves the actual
exact relationship, a relationship fixed by the particular devices in
that equipment.  

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: "Matt Timmermans" <[EMAIL PROTECTED]>
Subject: Re: On obtaining randomness
Date: Fri, 10 Nov 2000 00:40:56 GMT


"Mok-Kong Shen" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> But I wanted simply to say that, on the assumption
> that keystrokes of monkeys are truly random, the writings
> of humans, since these could eventually be reproduced by
> monkeys, are not the 'exact' opposite of randomness (i.e.
> 'totally' deterministic).

On the assumption that the keystrokes of monkeys are random, there is _no_
finite sequence of letters that could not, eventually, be reproduced by
monkeys.  So what is special about books?

> Actually this fact is entirely trivial. One buys books because
> one wants to get something whose contents one doesn't
> know for sure.

This leads to a more interesting argument -- the randomness and entropy of
finite sequences is completely subjective.




------------------------------

From: Steve Portly <[EMAIL PROTECTED]>
Subject: Re: Hardware RNGs
Date: Thu, 09 Nov 2000 19:57:18 -0500



Terry Ritter wrote:

> On Thu, 09 Nov 2000 15:00:34 -0500, in
> <[EMAIL PROTECTED]>, in sci.crypt Steve Portly
> <[EMAIL PROTECTED]> wrote:
>
> >David Schwartz wrote:
> >
> >> Steve Portly wrote:
> >>
> >> > For applications that are network intensive, timing packets would be a better
> >> > alternative than timing interrupts.  Network jitter is over 100 times greater than
> >> > system jitter, so the laws of physics give you a natural firewall.  "One cycle count"
> >> > is easily lost to signal rise times even inside your system case.  I doubt anyone would
> >> > be able to monitor TS intervals from a distance of more than a few feet.  This is
> >> > sci.crypt so a detailed explanation of system jitter would probably be off
> >> > topic.
> >>
> >>         These are measuring the same thing. So it's not an alternative.
> >>
> >>         DS
> >
> >An assembly language call to int 13 takes a different amount of time than a packet arrival.
> >The key is to find the minimum time period that will always produce at least one bit of
> >entropy.
> >Since 1995, CPU frequency wander and system jitter have become a source of entropy.
> >
> >http://www.ednmag.com/ednmag/reg/1995/070695/graphs/14dfcfga.htm
> >
> >With my crude analysis I found that it takes about 40 microseconds to get a bit of entropy.
>
> I for one would like to see the details of that analysis.
>
> Nobody denies that crystal oscillator noise-jitter occurs.  But I deny
> that it can be detected in software on a conventional system.
>
> There is another form of "jitter" which is just the expected
> relationship of a signal of one frequency sampling a signal of another
> frequency.  That occurs independent of quantum events, and has no
> continuing randomness at all.
>
> >My window of error could be anywhere from 10 to 100 microseconds depending on the speed,
> >type of system, and entropy rollup you use.  I tested on Pentium 90, 233, and 350 MHz
> >platforms with good results (a little slower on the 90).
>
> First of all, of course, we have to measure things that our Opponents
> cannot measure from outside the security shield.  If the network is
> open, they get to measure packet times just like we do.
>
> Next, a real measurement of absolute time generally depends upon
> hardware timers which are set up to clear, enable, disable, and be
> read.  Simply sampling the "current time" in software is not the same
> thing at all, and is not enough.
>
> In particular, if we sit in a software loop, polling the indication of
> interest, a whole lot of other things are going on.  The process we
> are running is swapped in and out; interrupts are occurring; memory
> refresh is occurring.  And, in general, when these things occur, we
> are not really polling the state any more -- we are doing something
> else.  And while these other things may complicate the numbers, they
> are generally deterministic, not fundamentally random.
>
> ---
> Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
> Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM

It all boils down to how long it takes you to roll up enough entropy to satisfy your 
needs.


------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: MORE THAN FULLY BIJECTIVE !
Date: Fri, 10 Nov 2000 01:17:16 GMT

On Fri, 10 Nov 2000 00:29:11 GMT, "Matt Timmermans"
<[EMAIL PROTECTED]> wrote, in part:

>This is equivalent to whitening before encrypting with a constant IV.  An
>attacker can still detect repeated messages.

That is true; if an entire message is repeated, there will still be a
problem.

John Savard
http://home.ecn.ab.ca/~jsavard/crypto.htm

------------------------------

From: David Schwartz <[EMAIL PROTECTED]>
Subject: Re: Hardware RNGs
Date: Thu, 09 Nov 2000 17:28:37 -0800


Terry Ritter wrote:
> 
> On Thu, 09 Nov 2000 15:34:56 -0800, in
> <[EMAIL PROTECTED]>, in sci.crypt David Schwartz
> <[EMAIL PROTECTED]> wrote:
> 
> >Terry Ritter wrote:
> >
> >> There is another form of "jitter" which is just the expected
> >> relationship of a signal of one frequency sampling a signal of another
> >> frequency.  That occurs independent of quantum events, and has no
> >> continuing randomness at all.
> >
> >       That is not true.
> 
> Your statement is false.

        I stand by my statement, and I'll provide an example.

        You have two digital clocks, one at about 10 MHz and one at about 20 MHz.
They are uncorrelated, but we'll assume that their precise frequencies
never ever change.

        You let the slower clock run for exactly 100 cycles and count the
cycles of the faster clock. You get 203. Now you let the slower clock
run for 10,000 cycles and count the cycles of the faster clock. You get
20,301. You keep repeating using more and more cycles; when have you
exhausted the entropy?

        So how can you say it has "no continuing randomness at all"?
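
The counting in this example can be sketched numerically (an editor's illustration; the ratio below is a hypothetical value chosen to reproduce the counts above):

```python
from math import floor

# Hypothetical fixed frequency ratio (fast clock / slow clock),
# picked so the counts match the figures in the example.
RATIO = 2.03012345

def count_fast_cycles(slow_cycles):
    # Fast-clock edges observed while the slow clock ticks n times.
    return floor(slow_cycles * RATIO)

print(count_fast_cycles(100))     # 203
print(count_fast_cycles(10_000))  # 20301
```

Each tenfold-longer measurement pins down more digits of RATIO; note, though, that with a truly fixed ratio a repeated run from the start reproduces the same counts.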
 
> >Ideally, the two frequencies are real numbers and
> >their exact ratio contains an unlimited amount of randomness.
> 
> Conditionally true but impractical, since each additional bit of this
> ratio takes twice as long to detect.  Moreover, the ratio is
> approximately the same for every new use, so the same values will come
> tumbling out, which is hardly fundamental randomness.

        I accepted a very strong restriction to demonstrate that a particular
weak point could be made even with it. You then demonstrate to me what
that restriction restricts. I know that. I didn't accept the restriction
as true, I simply said that even with this incredibly strong
restriction, your point (absence of continuing randomness) is _still_
wrong.
 
> >Each time
> >you compare them digitally, it is an independent event that gives you a
> >better estimate of the real ratio. Thus each sample contains additional
> >entropy, but less and less.
> 
> Even "independent events" may well be correlated.  For example,
> consider the situation with wrist watches:  Surely, every watch keeps
> a different time.  Yet if we ask "when will Bob's watch show 4PM," our
> best bet is that it will be very close to when our watch shows 4PM.
> So even though watches are not -- and *can* not -- be synchronized in
> an absolute sense, they are indeed *correlated* with other
> time-keepers.  Which, of course, is the whole point.

        Irrelevant. Correlation decreases entropy but doesn't remove it. See
the example in my first paragraph.

> A very similar situation occurs with independent crystal oscillators,
> with the exception that different frequencies will be involved.  But
> approximately what those frequencies should be will be known, and
> every bit we get out (in exponential time) further resolves the actual
> exact relationship, a relationship fixed by the particular devices in
> that equipment.

        Right. So we keep getting more and more entropy out, even if the
frequencies _never_ change. Of course, in real life, the frequencies do
change, so we get _even_more_ entropy out.

        DS

------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Crossposted-To: talk.politics.crypto
Subject: MY BANANA REPUBLIC
Date: 10 Nov 2000 00:11:22 GMT


  For those of you not in the US, let me explain.
The Clinton machine has decided to make GORE president.
Many elections in my country are rigged. Chicago under
Daley is famous for having rigged elections. They used to have
a saying: in Chicago the dead not only vote, they vote
many times.
 Apparently they
didn't stuff enough ballots in Florida. They obviously
added or exchanged more ballots in the last recount. But
you can bet your sweet ass the longer it takes to recount,
the democrats will perfect the stuffing until Gore wins.
Cheating is a way of government in my country. But we have
the balls to tell everyone else how to run an election. By the
way, the democrats designed the ballot they're bitching about.
 The next recount, if necessary, will be designed to give it
to GORE.

David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
        http://www.jim.com/jamesd/Kong/scott19u.zip
Scott famous encryption website **now all allowed**
        http://members.xoom.com/ecil/index.htm
Scott LATEST UPDATED source for scott*u.zip
        http://radiusnet.net/crypto/  then look for
  sub directory scott after pressing CRYPTO
Scott famous Compression Page
        http://members.xoom.com/ecil/compress.htm
**NOTE EMAIL address is for SPAMERS***
I leave you with this final thought from President Bill Clinton:

------------------------------

From: [EMAIL PROTECTED] (DJohn37050)
Subject: Re: RSA security
Date: 10 Nov 2000 02:32:47 GMT

I noticed that a cluster of workstations broke RSA-512.  Does anyone know the
RAM used?  Some claim that RAM is the limit that is hit first when doing
GNFS.
Don Johnson

------------------------------

Date: Thu, 09 Nov 2000 23:14:08 -0500
From: "Trevor L. Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Announcement: One Time Pad Encryption - 0.9.3 - freeware

Paul Schlyter wrote:

> In article <[EMAIL PROTECTED]>,
> Trevor L. Jackson, III <[EMAIL PROTECTED]> wrote:
>
> > Paul Schlyter wrote:
> >
> >> In article <[EMAIL PROTECTED]>,
> >> Dido Sevilla  <[EMAIL PROTECTED]> wrote:
> >>>Tom St Denis wrote:
> >>>>
> >>>> Perhaps you missed the boat, OTP's are not practical solutions!
> >>>>
> >>>
> >>> There are applications for which OTP's can be useful.  The only reason
> >>> why OTP's are not practical is the extremely onerous key distribution
> >>> problem involved in their use.  But there are applications where, for
> >>> instance, data gets protected statically, e.g. files on your hard disk
> >>> with keys kept on removable media (kept in a secure location of course),
> >>> and applications where only short messages need to be infrequently
> >>> transmitted, so that the key distribution operation is not so hard, e.g.
> >>> by exchanging removable media physically.
> >>
> >> If you use a OTP, the key will be as large as the data you want to
> >> protect.  Now, if you want to protect data on your disk using OTP,
> >> storing the keys on "removable media kept in a secure location", then
> >> you might as well transfer your sensitive data to that removable media
> >> instead and store it in a safe location: the encrypted data will be
> >> inaccessible without the OTP key on the removable media anyway, and if
> >> you lose that key, you might as well have lost your data.  Right?
> >>
> >> The best use of a OTP is if you meet someone with whom you'll later need
> >> to exchange sensitive data: then exchange a OTP key, to later be used
> >> to encrypt that sensitive data.
> >>
> >> Using OTP to encrypt your local harddisk is quite useless.
> >
> > It depends upon the threat model.  If the threat model is someone snooping
> > the disk, then OTP either reduces the utility of the system to ~0 or it leaves
> > the system vulnerable to the threat.
> >
> > However, if the threat is one of _seizure_, then an OTP with the pad stored
> > in volatile memory can be quite useful.
>
> Not particularly, because when the OTP is gone from that volatile
> memory, your data will be incomprehensible unless you've backed up
> your OTP somewhere else.

Yes.

>  Then you might as well store the sensitive
> data itself in volatile memory, with an optional backup of the
> sensitive data if you also need to access it in the future.

No.

Such an arrangement implies that the data can be sacrificed in the event of a
seizure.  That implication may be false.  If you cannot afford to lose the data,
but also cannot afford that the data be exposed, then a volatile OTP secures the
data in both senses.

>  The OTP
> gives you neither more security nor more convenience here.

This conclusion is too general to be accurate.

>
>
> Since an OTP key is as large as the data it's supposed to protect,
> one might as well protect the data itself instead of the OTP key.

This conclusion is not valid.  Consider the backup copy of the OTP as a form
of temporal key splitting.
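
The splitting idea can be sketched as two-share XOR secret sharing (an editor's illustration, not code from the thread): the pad and the ciphertext are each individually independent of the data, so either one can be held in a different place or for a different lifetime.

```python
import secrets

def split(data: bytes):
    """Split data into a random pad and pad XOR data.

    Either share alone is a uniformly random string, statistically
    independent of the data; both are needed to recover it.
    """
    pad = secrets.token_bytes(len(data))
    ct = bytes(p ^ d for p, d in zip(pad, data))
    return pad, ct

def join(pad: bytes, ct: bytes) -> bytes:
    """Recombine the two shares."""
    return bytes(p ^ c for p, c in zip(pad, ct))

pad, ct = split(b"the sensitive data")
assert join(pad, ct) == b"the sensitive data"
```

In the seizure scenario above, the pad would live only in volatile memory (with a backup elsewhere) while the ciphertext sits on disk; seizing the disk alone yields nothing.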

>  As
> far as I can see, the only time an OTP is of practical value is if
> you have a secure communications channel channel now, but not in the
> future when you need to communicate sensitive data.  The typical
> scenario is a spy before leaving for his spying mission.

That is the classic situation that calls for an OTP.  Others include vehicles that
intermittently or permanently lose the secure communications channel.  A ship at
sea is an example of intermittent security while a satellite or space probe is an
example of permanent loss.


------------------------------

Date: Thu, 09 Nov 2000 23:23:58 -0500
From: "Trevor L. Jackson, III" <[EMAIL PROTECTED]>
Crossposted-To: talk.politics.misc,talk.politics.crypto
Subject: Re: Updated XOR Software Utility (freeware) Version 1.1 from Ciphile 

Scott Craver wrote:

> Tom St Denis  <[EMAIL PROTECTED]> wrote:
> >Richard Heathfield <[EMAIL PROTECTED]> wrote:
> >>
> >> Sorry? You thought of XOR? You invented the One Time Pad? Pure BS.
> >>
> >> > and engineered
> >> > it and I am not going to just give it to you all.
>
> >What does the OP mean by "give it all to us".  a program that xors
> >bytes is not particularly ingenius...
>
>         I was amused by the strong language of having "engineered"
>         his exclusive-or utility.  I can imagine that on a resume:
>         singlehandedly planned, engineered and implemented a piece
>         of computer software for XOR-ing two files together.

If it was good enough to patent in video memory it's probably good enough
to patent in persistent memory.



------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Hardware RNGs
Date: Fri, 10 Nov 2000 05:04:04 GMT


On Thu, 09 Nov 2000 17:28:37 -0800, in
<[EMAIL PROTECTED]>, in sci.crypt David Schwartz
<[EMAIL PROTECTED]> wrote:

>Terry Ritter wrote:
>> 
>> On Thu, 09 Nov 2000 15:34:56 -0800, in
>> <[EMAIL PROTECTED]>, in sci.crypt David Schwartz
>> <[EMAIL PROTECTED]> wrote:
>> 
>> >Terry Ritter wrote:
>> >
>> >> There is another form of "jitter" which is just the expected
>> >> relationship of a signal of one frequency sampling a signal of another
>> >> frequency.  That occurs independent of quantum events, and has no
>> >> continuing randomness at all.
>> >
>> >       That is not true.
>> 
>> Your statement is false.
>
>       I stand by my statement, and I'll provide an example.
>
>       You have two digital clocks, one at about 10 MHz and one at about 20 MHz.
>They are uncorrelated, but we'll assume that their precise frequencies
>never ever change.

Two signals of fixed frequency are inherently correlated, and we can
use that if we start measuring them at the same time:  Something which
happens at cycle 10,000,307 on the 10 MHz signal also happens at
cycle 20,000,614 on the 20 MHz signal.  Nor do the frequencies have
to be in an integer relationship.  


>       You let the slower clock run for exactly 100 cycles and count the
>cycles of the faster clock. You get 203. Now you let the slower clock
>run for 10,000 cycles and count the cycles of the faster clock. You get
>20,301. You keep repeating using more and more cycles; when have you
>exhausted the entropy?
>
>       So how can you say it has "no continuing randomness at all"?

For one thing, the next time you do it again from the start, you get
just about the same result.  That does not sound very entropic to me.

Clearly, your example is no example of randomness.  The statement you
stand by is wrong.  

 
>> >Ideally, the two frequencies are real numbers and
>> >their exact ratio contains an unlimited amount of randomness.
>> 
>> Conditionally true but impractical, since each additional bit of this
>> ratio takes twice as long to detect.  Moreover, the ratio is
>> approximately the same for every new use, so the same values will come
>> tumbling out, which is hardly fundamental randomness.
>
>       I accepted a very strong restriction to demonstrate that a particular
>weak point could be made even with it. You then demonstrate to me what
>that restriction restricts. I know that. I didn't accept the restriction
>as true, I simply said that even with this incredibly strong
>restriction, your point (absence of continuing randomness) is _still_
>wrong.

I have no idea what strong restriction you are talking about.
Harvesting the difference between oscillator frequencies by simple
comparison inherently requires exponential time.  If that is what you
have "accepted," you would seem to have little other choice, except of
course deception and dissembling.  

You cannot measure the variation that exists in crystal oscillators with
software in a multitasking operating system, or in the presence of
hardware interrupts, because the computer system itself will have far
more variation than the oscillators.  That variation is, however,
deterministic, and not fundamentally random.  

 
>> >Each time
>> >you compare them digitally, it is an independent event that gives you a
>> >better estimate of the real ratio. Thus each sample contains additional
>> >entropy, but less and less.
>> 
>> Even "independent events" may well be correlated.  For example,
>> consider the situation with wrist watches:  Surely, every watch keeps
>> a different time.  Yet if we ask "when will Bob's watch show 4PM," our
>> best bet is that it will be very close to when our watch shows 4PM.
>> So even though watches are not -- and *can* not -- be synchronized in
>> an absolute sense, they are indeed *correlated* with other
>> time-keepers.  Which, of course, is the whole point.
>
>       Irrelevant. Correlation decreases entropy but doesn't remove it. See
>the example in my first paragraph.

Your example was wrong there, and is wrong here.  

One unknown frequency can be tested against another, but harvesting
this requires exponential time.  We must continually invest twice as
much time as the previous bit to get the next one.  And then we find
the "random" sequence to be pretty much the same as the one we got the
last time we did this.  


>> A very similar situation occurs with independent crystal oscillators,
>> with the exception that different frequencies will be involved.  But
>> approximately what those frequencies should be will be known, and
>> every bit we get out (in exponential time) further resolves the actual
>> exact relationship, a relationship fixed by the particular devices in
>> that equipment.
>
>       Right. So we keep getting more and more entropy out, even if the
>frequencies _never_ change. Of course, in real life, the frequencies do
>change, so we get _even_more_ entropy out.

Nope.  That is impractical because there is a continual exponential
increase in the effort required to measure the difference ever more
finely.  

Oscillator frequencies may drift, but they may drift in pretty much
the same way they did the last time the equipment was turned on.  If
that is your idea of "entropy," I would say that your vision is rather
limited.  

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
