Cryptography-Digest Digest #135, Volume #13      Fri, 10 Nov 00 14:13:01 EST

Contents:
  Re: MY BANANA REPUBLIC (Mok-Kong Shen)
  Re: RSA security (Quisquater)
  Re: Hardware RNGs (Terry Ritter)
  Re: Hardware RNGs (Terry Ritter)
  voting through pgp ("binary digit")
  Re: Request for code ("binary digit")
  monoalphabetic cipher ("Keith Monahan")
  Re: Calculating the redundancy of english? (Bill Unruh)

----------------------------------------------------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Crossposted-To: talk.politics.crypto
Subject: Re: MY BANANA REPUBLIC
Date: Fri, 10 Nov 2000 18:17:58 +0100



Runu Knips wrote:
> 
> If the elections in Florida would have been rigged, the
> result would be CLEAR. Or the people which manipulated
> it are very incompetent losers.
> 
> And if the US would have a reasonable election system,
> Gore would be president now anyway.

I can't see how the current US election has anything to
do with crypto, except that the system is probably void
of any 'security', which as a general topic could be
claimed to concern our group. (In a previous thread
quite some time ago I learned that there is nothing to
prevent anyone in the US from casting more than one vote
by going to different voting locations, there being no
identity cards or registrations to rigorously control the
voters. If that is indeed true -- I don't know for certain --
then the election could just as well be replaced by casting
a die, as in a lottery.)

M. K. Shen

------------------------------

From: Quisquater <[EMAIL PROTECTED]>
Subject: Re: RSA security
Date: Fri, 10 Nov 2000 18:49:15 +0100

Bob Silverman wrote:
> 
> In article <8uh35j$21p$[EMAIL PROTECTED]>,
>   [EMAIL PROTECTED] wrote:
> 
> <snip>
> > > Peter Montgomery had already done some experiments on a tightly
> > > coupled SGI R10000 cluster; he reported only getting about 20%
> > > processor efficiency on 16 machines. Running block lanczos is
> > > easy on a *small* number of processors; the problem is that
> > >  communication bottlenecks prevent scaling.
> > Has he published these results? I'd heard that he was working on the
> > problem of parallelizing block lanczos, but I hadn't heard what the
> > results were.
> 
> He presented them at the 2000 RSA Conference.

And again, better, at ECC 2000; see the slides (PowerPoint: Peter is working 
for Microsoft :-)) and the nice paper at

http://www.exp-math.uni-essen.de/~galbra/eccslides/eccslides.html

Some of my students are working on that. More in a few months.

------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Hardware RNGs
Date: Fri, 10 Nov 2000 17:46:00 GMT


On Thu, 09 Nov 2000 21:40:40 -0800, in
<[EMAIL PROTECTED]>, in sci.crypt David Schwartz
<[EMAIL PROTECTED]> wrote:

>Terry Ritter wrote:
>> >Terry Ritter wrote:
>> >>
>> >> On Thu, 09 Nov 2000 15:34:56 -0800, in
>> >> <[EMAIL PROTECTED]>, in sci.crypt David Schwartz
>> >> <[EMAIL PROTECTED]> wrote:
>> >>
>> >> >Terry Ritter wrote:
>> >> >
>> >> >> There is another form of "jitter" which is just the expected
>> >> >> relationship of a signal of one frequency sampling a signal of another
>> >> >> frequency.  That occurs independent of quantum events, and has no
>> >> >> continuing randomness at all.
>> >> >
>> >> >       That is not true.
>> >>
>> >> Your statement is false.
>> >
>> >       I stand by my statement, and I'll provide an example.
>> >
>> >       You have two digital clocks, one at about 10Mhz and one at about 20Mhz.
>> >They are uncorrelated, but we'll assume that their precise frequency
>> >never ever changes.
>> 
>> Two signals of fixed frequency are inherently correlated, and we can
>> use that if we start measuring them at the same time:  Something which
>> happens at cycle 10,000,307 on the 10 MHz signal also happens at
>> cycle 20,000,614 on the 20 MHz signal.  Nor do the frequencies have
>> to be in an integer relationship.
>
>       Why 20,000,614? Why not 20,000,613 or 20,000,615? Remember, the
>frequencies are not precisely 10Mhz and 20Mhz. They are just perfectly
>constant.

Fine.  Knowing the exact time +/- 1 cycle out of 20,000,000 sounds
great to me.  

Any signals which mark absolute time are correlated.  All we need to
know about them are their frequencies or the ratio of their
frequencies, and we can get that as close as we want by sampling over
exponential time.  

Unfortunately, we can afford to go only so far down exponential time.

 
>> >       You let the slower clock run for exactly 100 cycles and count the
>> >cycles of the faster clock. You get 203. Now you let the slower clock
>> >run for 10,000 cycles and count the cycles of the faster clock. You get
>> >20,301. You keep repeating using more and more cycles; when have you
>> >exhausted the entropy?
>> >
>> >       So how can you say it has "no continuing randomness at all"?
>> 
>> For one thing, the next time you do it again from the start, you get
>> just about the same result.  That does not sound very entropic to me.
>
>       just about the same != the same

So?  It is pretty easy to complete a search when the object of
interest is known to be not far away.  


>       You are falling into the fallacious chain of reasoning that entropy
>requires all results be equally probable. A messed up room has tons of
>entropy, however the result where everything in its place is very
>improbable.

On the other hand, you repeatedly fall into the error of thinking that
any little change at all is just as good as a big change.  It is not.


>       Consider 70 particles each randomly placed in a room. The situation
>where 35 particles are on the left and 35 on the right is far more
>probable than the case where all 70 are on the left. That doesn't mean
>there isn't entropy in the positioning.

Oh, please.  We are not talking about 70 objects.  We are talking
about 2 frequencies, and the ability to measure one against the other
in exponential time.  At any particular level we may be off a cycle or
so, which is simply not much uncertainty.


>       Yes, repeat the process and you will get "just about the same" result.
>The entropy is in the "just about". If the two frequencies are
>independent, odds are they'll never line up exactly the same way twice.
>(In other words, if you start a trial at time T and you start another
>trial at any other time, the two trial results will eventually differ).

Clearly that description is unlike anything we have discussed.  If we
had such a situation -- if we had 70 objects which were randomly
shuffled upon power-up -- we might well be able to use that entropy.

But deterministic digital systems do not do that.


>> Clearly, your example is no example of randomness.  The statement you
>> stand by is wrong.
>
>       I'm not sure if you are being deliberately dense or really are dense. I
>can't imagine how I can be any clearer or more obvious than I've been.

Anyone can judge that for themselves from my web page:  One might
start with my crypto glossary, including statistical terms.  One could
read my ink-on-paper articles from IEEE Computer and Cryptologia.  One
could look at various formal articles I have published to sci.crypt,
including Boolean function nonlinearity, orthogonal Latin square
construction and measurement, and a massive 25-page characterization
of noise source data.  Or one might just look at literally thousands
of my sci.crypt postings over more than a decade.  

And -- while you are doing that -- what shall I look at that *you*
have done?  

I suggest re-thinking your position.  You are seriously mistaken in
many ways, abysmally ignorant in others, and quite offensive to boot.
It is time for you to accept your errors and learn from them.  


>> >> >Ideally, the two frequencies are real numbers and
>> >> >their exact ratio contains an unlimited amount of randomness.
>> >>
>> >> Conditionally true but impractical, since each additional bit of this
>> >> ratio takes twice as long to detect.  Moreover, the ratio is
>> >> approximately the same for every new use, so the same values will come
>> >> tumbling out, which is hardly fundamental randomness.
>> >
>> >       I accepted a very strong restriction to demonstrate that a particular
>> >weak point could be made even with it. You then demonstrate to me what
>> >that restriction restricts. I know that. I didn't accept the restriction
>> >as true, I simply said that even with this incredibly strong
>> >restriction, your point (absence of continuing randomness) is _still_
>> >wrong.
>> 
>> I have no idea what strong restriction you are talking about.
>
>       The restriction being that the two frequencies are perfectly constant.

In practice, nothing is perfect.  But knowing that imperfection exists
does not make measurement easier.


>> Harvesting the difference between oscillator frequencies by simple
>> comparison inherently requires exponential time.  If that is what you
>> have "accepted," you would seem to have little other choice, except of
>> course deception and dissembling.
>
>       So, it requires exponential time, so what? 

So that makes collecting even 64 valid bits way out of reach.  And
then what do we do if we want even another random bit?


>Are you finally admitting
>that the entropy is there and can be measured to any desired level of
>accuracy?

Eventually one runs out of measurement accuracy, but way before that
one runs out of time.  

 
>> You cannot measure the variation that exists in crystal oscillators with
>> software in a multitasking operating system, or in the presence of
>> hardware interrupts, because the computer system itself will have far
>> more variation than the oscillators.  That variation is, however,
>> deterministic, and not fundamentally random.
>
>       THAT MAKES NO DIFFERENCE. You can add any amount of deterministic data
>you want to random data, and the data is still random. 

Presumably you are introducing something else -- some part of the rest
of the system -- in an attempt to hide the fact that your current
argument is wrong.  The amount of "entropy" to be mined from crystal
oscillators is very small, and is largely repeatable from power-on
over the timeframe of interest.  Is repeatable entropy really entropy?

Where do you think randomness comes from, anyway?  What quantum effect
is being translated into a state you can measure?  And if there is no
such effect, how do you imagine that -- with the same equipment,
operated in the same way -- that deterministic digital systems will
somehow give you a random result?  The whole idea makes no sense at
all.  


>Suppose I have a
>stream of truly random integers and I add a stream of predictable
>integers to them, the final results are as unpredictable as the
>unpredictable integers. All the predictable stuff does is add
>predictable amounts of delay to the unpredictable amounts of delay.

None of which has anything to do with measuring oscillator frequency.


>       Now, I've said this same thing at least four times now. And it's so
>simple anyone can understand it. So your refusal no longer seems to be
>any sort of legitimate difference of opinion or understanding. So I'm
>starting to seriously question your motives.

By all means, question whatever you want.

On the other hand, I have no question at all about your competence,
knowledge or desire for truth.

 
>> >> >Each time
>> >> >you compare them digitally, it is an independent event that gives you a
>> >> >better estimate of the real ratio. Thus each sample contains additional
>> >> >entropy, but less and less.
>> >>
>> >> Even "independent events" may well be correlated.  For example,
>> >> consider the situation with wrist watches:  Surely, every watch keeps
>> >> a different time.  Yet if we ask "when will Bob's watch show 4PM," our
>> >> best bet is that it will be very close to when our watch shows 4PM.
>> >> So even though watches are not -- and *can* not -- be synchronized in
>> >> an absolute sense, they are indeed *correlated* with other
>> >> time-keepers.  Which, of course, is the whole point.
>> >
>> >       Irrelevant. Correlation decreases entropy but doesn't remove it. See
>> >the example in my first paragraph.
>> 
>> Your example was wrong there, and is wrong here.
>
>       You've yet to show how.

I have.

 
>> One unknown frequency can be tested against another, but harvesting
>> this requires exponential time.  We must continually invest twice as
>> much time as the previous bit to get the next one.  And then we find
>> the "random" sequence to be pretty much the same as the one we got the
>> last time we did this.
>
>       You are contradicting yourself. First you say that we do in fact keep
>getting entropy, then you say we don't.

Theoretically, in a handwave sort of first-level-approximation way, we
might naively imagine continuing the process forever.  

But since each bit takes twice as long as the one before, in practice
we must give up and move on before we get even 64 bits.  

If we can't do something in practice, we can't do it.  That is the
distinction between dreams and reality, and is no contradiction at
all.  

 
>> >> A very similar situation occurs with independent crystal oscillators,
>> >> with the exception that different frequencies will be involved.  But
>> >> approximately what those frequencies should be will be known, and
>> >> every bit we get out (in exponential time) further resolves the actual
>> >> exact relationship, a relationship fixed by the particular devices in
>> >> that equipment.
>> >
>> >       Right. So we keep getting more and more entropy out, even if the
>> >frequencies _never_ change. Of course, in real life, the frequencies do
>> >change, so we get _even_more_ entropy out.
>> 
>> Nope.  That is impractical because there is a continual exponential
>> increase in the effort required to measure the difference ever more
>> finely.
>
>       Of course this is impractical, that's because we started with
>impractical assumptions. I am not saying that you can design practical
>systems with unpractical assumptions. I'm saying even with your
>ridiculous assumption that the frequencies never change, you can still
>get out as much entropy as you need. In other words, there is tons of
>entropy there.

No.  That is the difference between theoretical delusions and actual
practice.  

 
>> Oscillator frequencies may drift, but they may drift in pretty much
>> the same way they did the last time the equipment was turned on.  If
>> that is your idea of "entropy," I would say that your vision is rather
>> limited.
>
>       You throw around words like "pretty much". What you don't see
>(deliberately, I know) is that anything that can only be described in
>statistical or approximate terms _is_random_ to the extent that it can
>only be described in statistical or approximate terms.

Nonsense.  "Pretty much" means that one does not have to do much
searching to get to the right answer.  And making the Opponent search
is exactly what cryptography is all about.

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Hardware RNGs
Date: Fri, 10 Nov 2000 17:46:09 GMT


On Thu, 09 Nov 2000 21:44:27 -0800, in
<[EMAIL PROTECTED]>, in sci.crypt David Schwartz
<[EMAIL PROTECTED]> wrote:

>Terry Ritter wrote:
>
>> First of all, an exponential wait will kill you.  To collect the next
>> bit of the difference between oscillator frequencies, you must wait
>> twice as long as you did for the last bit.  Just how long do you think
>> this can possibly be a reasonable source?
>
>       4096 bits of entropy is adequate for pretty much any purpose I can
>imagine. 

I would think it should be obvious to even the most naive observer
that an exponential wait per bit is not going to be satisfactory, but
apparently not, so here we go:

Let's say we can get the first bit in one cycle @ 100 MHz (10**8
cycles per second), or 10 nsec.  So the next bit takes 20 nsec, the
next 40, and so on.  

In other words, the time for bit n is tn = 2**n * 10 nsec.


t1 = 2 * 10**-8 sec = 20 nsec
t2 = 40 nsec
. . .
t10 = 1024 * 10**-8 sec = 10.24 usec
. . .
t20 = 1,048,576 * 10**-8 sec ~ 10**6 * 10**-8 sec ~ 10.5 msec
. . .
t30 = 1,073,741,824 * 10**-8 sec ~ 10.7 sec
...
t40 ~ 1.1 * 10**12 * 10**-8 sec ~ 1.1 * 10**4 sec ~ 3 hours
...
t50 ~ 1.1 * 10**15 * 10**-8 sec ~ 1.1 * 10**7 sec ~ 4 months


So, the 50th bit takes about 4 months of continuous measurement, and we
want bit 4096.  Hmmm.
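The doubling schedule, using the stated formula tn = 2**n * 10 nsec, takes only a few lines to tabulate:

```python
# Doubling schedule: bit n costs 2**n * 10 ns of continuous measurement.
def bit_time_seconds(n, t0=10e-9):
    """Time to resolve bit n when each bit takes twice as long as the last."""
    return (2 ** n) * t0

for n in (10, 20, 30, 40, 50):
    print(f"bit {n:2d}: {bit_time_seconds(n):10.4g} s")

# Bit 4096 would cost 2**4096 * 10 ns -- far beyond any physical timescale.
```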


>In any event, in most real world cases, you get entropy as you
>continue to work, so you really only have a problem with an entropy
>shortage during startup.

And from where, exactly, do you imagine this "entropy" coming?

 
>> And then, the sequence you get is pretty much the same as it was the
>> last time you did it from the start.  The equipment has not changed,
>> the oscillators have not changed, their heating has not changed;
>> exactly what different thing about this can you expect to make the
>> result different?
>
>       Their heating has changed. Two fans never spin up the same way twice.
>The temperature in a room is never the same twice. The distribution of
>heat over the surface of the crystal is never the same twice. As you
>admit, it's "pretty much the same". The fact that someone can only
>guess "pretty much" what it is is where the entropy is.

It is indeed unlikely that 2 real quantities will ever be the same,
just as it is unlikely that 2 clocks or 2 watches will ever measure
exactly the same time.  Nevertheless, we do manage to get along with
the approximation.  And if an Opponent can simply *approximate* a
value we imagine to be random, that is a pretty nice step up from
nothing at all.  

Crystal oscillators are deliberately designed and constructed to be
stable.  Most watches and clocks now use crystal oscillators to keep
time.  All cell phones use crystal oscillators, and they manage to
communicate on "the same" frequency.  

Many shortwave receivers use crystal oscillators, which allows us to
hear any short-term defect in oscillator stability by listening to
SSB, Morse code or beating against some fixed station.  Typically,
there may be a 70 MHz 1st IF, and a crystal oscillator mixes that down
to 10.7 MHz or even 0.455 MHz.  If that crystal oscillator were to
move randomly even 100 Hz (at 70,000,000 Hz), a 100 Hz difference
would be produced at audio, which would be extremely apparent.  That
does not happen.  Drift does occur, but it is slow, and consists of a
fairly predictable and constant frequency change over time until
thermal stability is reached.  

Crystal oscillators are an extremely poor source of entropy.

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: "binary digit" <[EMAIL PROTECTED]>
Subject: voting through pgp
Date: Fri, 10 Nov 2000 18:35:10 GMT

Imagine if everyone in the world had PGP and voted through PGP: every single
vote could be verified, everyone would be happy, and there wouldn't be
this problem that is going on now in Florida.



------------------------------

From: "binary digit" <[EMAIL PROTECTED]>
Subject: Re: Request for code
Date: Fri, 10 Nov 2000 18:46:41 GMT

ok asshole, how do you know he has to do this for homework? What if he just
wants to learn how to use the Crypto++ library? And just because he's using MSVC
doesn't mean he's developing an app that will have a GUI; he could be writing
a console app. Instead of being so arrogant, you could have simply not
responded to the post, because you had nothing good to say to him anyway.


"Tom St Denis" <[EMAIL PROTECTED]> wrote in message
news:8uh87o$6hl$[EMAIL PROTECTED]...
> In article <[EMAIL PROTECTED]>,
>   epiphone <[EMAIL PROTECTED]> wrote:
> > Dear All:
> >
> > I am trying to develop a simple app using Microsoft Visual C++ 6 and
> > the Crypto++ library written by Wei Dai.  The app is supposed to
> > perform encryption/decryption on a file/string using various of
> > today's algorithms.
> >
> > As I am a novice in encryption technologies and MSVC, I would like
> > to ask if there is anyone out there who has written any similar app
> > using MSVC 6.  I would most appreciate it if anyone can allow me to
> > look at his/her code.
> >
> > Thank you in advance!
>
> Ask this in a MSVC newsgroup not sci.crypt.
>
> Also I doubt anyone is about to do your homework.
>
> Also, why not start by learning C rather than C++, and avoid machine-specific
> C extensions... learn basic console C stuff first!!!
>
> Tom
>
>
> Sent via Deja.com http://www.deja.com/
> Before you buy.



------------------------------

Reply-To: "Keith Monahan" <[EMAIL PROTECTED]>
From: "Keith Monahan" <[EMAIL PROTECTED]>
Subject: monoalphabetic cipher
Date: Fri, 10 Nov 2000 18:45:03 GMT

I've been working on what I believe to be a monoalphabetic cipher for about
a week and a half, off and on.

I'm fairly sure that it is just simple substitution, because frequency
analysis shows that only 24 unique characters are being used.  The problem
is that the amount of ciphertext available is relatively small, which is
screwing me up trying to match letter frequency info to the correct
substituted letter.  I've got approximately 400 characters, which doesn't
seem to be a whole lot to work with.

What I'm doing is trying to swap out the ciphertext letter that occurs,
let's say, 14% of the time with the plaintext letter that occurs 14% of
the time.  My main problem is that there is not very much difference in
the letter frequency of a number of different letters.  For instance,
there are 7 letters that occur within a span of 2% in normal texts.  Of
course, these letters aren't going to show up exactly x percent of the
time, which means I have to guess which letters get substituted for
which letters -- which isn't easy, and there are a ton of permutations.
Is that 7^24?  Or is that 24^7?  To top it off, these letters that show
up within 2% of each other are all high-frequency letters, which HAVE
TO be correct in order to solve the rest of the cipher.
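The first step of that kind of attack -- ranking ciphertext letters by relative frequency so they can be lined up against a standard English table -- is a few lines of Python (the ciphertext string below is a made-up stand-in for the actual ~400-character sample):

```python
from collections import Counter

# Made-up stand-in for the actual ~400-character ciphertext sample.
ciphertext = "WKLV LV MXVW D VKRUW VDPSOH RI FLSKHUWHAW IRU LOOXVWUDWLRQ"

letters = [c for c in ciphertext if c.isalpha()]
counts = Counter(letters)
total = len(letters)

# Rank ciphertext letters by relative frequency, most common first,
# ready to compare against an English letter-frequency table.
for letter, n in counts.most_common():
    print(f"{letter}: {100 * n / total:4.1f}%")
```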

Character distribution ranges from less than 1% to 14%.  This makes me think
it's a relatively simple cipher, as the distribution more resembles a New
York City skyline with its ups and downs.

Another problem I have is that I'm not even sure whether they are using
spaces within the text, so I can't start guessing at little common words
like "the", "a", "an", etc.  I do have a character that shows up 14% of the
time, btw, and given typical frequency info, that is a little above the
most common letter, "E".  If I use this character for the space, the
words are still pretty long -- and I think definitely longer on average
than the average word length.

What I have done is write a program that exhaustively tries every
possibility (for only the 7 letters) and then uses a scoring algorithm
that counts the digrams found on a line of the plaintext (after applying
the substitution of the new guess); the score is based on the frequency
with which the digrams show up in normal text.  I copied the idea from
another program, but I can screw around and make changes live in my own
program.
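As a sanity check on that approach: exhausting 7 letters is only 7! = 5040 arrangements (a factorial, not 7^24 or 24^7), so brute force over a digram score is cheap. A minimal sketch of the idea -- the digram weights and the tiny example are assumed, illustrative values, not real English statistics:

```python
import itertools

# Assumed digram weights -- illustrative values only, not real statistics.
DIGRAM_WEIGHTS = {"TH": 27, "HE": 23, "IN": 20, "ER": 19, "AN": 18,
                  "RE": 13, "ON": 13, "AT": 12, "EN": 11, "ND": 11}

def score(text):
    """Sum digram weights over every adjacent character pair."""
    return sum(DIGRAM_WEIGHTS.get(text[i:i + 2], 0) for i in range(len(text) - 1))

def best_assignment(ciphertext, ct_letters, pt_letters):
    """Exhaust every permutation of pt_letters over ct_letters; keep the best."""
    best_score, best_map = -1, None
    for perm in itertools.permutations(pt_letters):
        mapping = dict(zip(ct_letters, perm))
        candidate = ciphertext.translate(str.maketrans(mapping))
        s = score(candidate)
        if s > best_score:
            best_score, best_map = s, mapping
    return best_map, best_score

# Tiny demo: 3 letters -> 3! = 6 candidates; 7 letters would be 7! = 5040.
mapping, s = best_assignment("XY XYZ", "XYZ", "THE")
print(mapping, s)   # {'X': 'T', 'Y': 'H', 'Z': 'E'} 77
```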

I also wrote a program that, given a pattern of letters and unknowns,
searches a one-meg dictionary of words, looking for words that match up.
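A pattern matcher like that can be a short regex loop (the word list here is a tiny hypothetical stand-in for the one-meg dictionary):

```python
import re

# Tiny hypothetical word list standing in for the one-meg dictionary.
WORDS = ["apple", "ample", "maple", "attic", "abbey"]

def matches(pattern, words):
    """Find words fitting a pattern where '?' marks an unknown letter."""
    regex = re.compile("^" + pattern.replace("?", "[a-z]") + "$")
    return [w for w in words if regex.match(w)]

print(matches("a??le", WORDS))   # ['apple', 'ample']
```

Note this simple version does not force two '?' positions to resolve to the same (or different) letters; a substitution cipher would want that extra constraint.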

Despite my efforts, I haven't really been able to come up with anything even
close to resembling normal text.

I'm trying to figure out what my next step is.  I've been coding all these
little utilities, which, although not hard to write, is time consuming --
and someone has to have already written better utilities than I have.  I
did find a program called "mono" that lets you switch ct/pt pairs around
and shows you the change in real time.  It's a nice program, but without
trying a million possibilities, I don't see how it's going to help me.

I have read a couple of crypto books lightly and have access to a number of
them.  I have AC and Dr. Dobbs Crypto CD (which includes Handbook of Applied
Crypto + 11 other books), etc.

Can someone please make some suggestions to help me narrow this down?

It would be appreciated.

Keith Monahan

------------------------------

From: [EMAIL PROTECTED] (Bill Unruh)
Subject: Re: Calculating the redundancy of english?
Date: 10 Nov 2000 18:50:20 GMT

In <8u7e4p$9m4$[EMAIL PROTECTED]> "David C. Barber" <[EMAIL PROTECTED]> writes:

]I was speaking of dictionary entries *and* their following definitions,
]where common words would likely be more used than uncommon words.

]    *David Barber*
]"Kristopher Johnson" <[EMAIL PROTECTED]> wrote in
]message news:5KFM5.16697$[EMAIL PROTECTED]...
]> Dictionary entries aren't representative of "real-world" letter or word
]> frequencies.

They may, however, be more representative of "password" use of words,
since passwords are single (often rare) words without all of the added
grammatical "chaff" of text (where, for example, words like "a" or "the"
or "of" do not occur).  I.e., what the redundancy of English is depends
on the situation.
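That situation-dependence is easy to quantify for the password case: a password drawn uniformly from an N-word dictionary carries only log2(N) bits, regardless of how long the word is in characters. A quick illustration (the dictionary sizes are arbitrary):

```python
import math

# A password drawn uniformly from an N-word dictionary carries log2(N) bits,
# however long the chosen word happens to be in characters.
for n_words in (1_000, 100_000, 1_000_000):
    print(f"{n_words:>9}-word dictionary: {math.log2(n_words):4.1f} bits")
```

Even a million-word dictionary yields under 20 bits, which is why single-word passwords are weak against dictionary attack.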
 
------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
