Cryptography-Digest Digest #910, Volume #8       Fri, 15 Jan 99 12:13:03 EST

Contents:
  Re: Cayley-Purser algorithm? (Bruce Schneier)
  Re: What is left to invent? ("Sam Simpson")
  Re: Cayley-Purser algorithm? ([EMAIL PROTECTED])
  Re: Cayley-Purser algorithm? ([EMAIL PROTECTED])
  Re: Cayley-Purser algorithm? ([EMAIL PROTECTED])
  Re: On the Generation of Pseudo-OTP (Terry Ritter)
  Re: On the Generation of Pseudo-OTP ("Mr. Brisket")
  Re: sci.crypt intelligence test. ("hapticz")
  Re: Sarah Flannery and the "Cayley-Purser" Algorithm (William Whyte)
  Re: Cayley-Purser algorithm? ("Sam Simpson")
  Re: SSL - How can it be safe? ("Joseph Suriol")
  Re: On the Generation of Pseudo-OTP (R. Knauer)
  Re: Encrypted WordPerfect 4.2-files (DOS) ("hapticz")

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (Bruce Schneier)
Subject: Re: Cayley-Purser algorithm?
Date: Fri, 15 Jan 1999 13:41:23 GMT

On Wed, 13 Jan 1999 17:53:28 +0000, William Whyte
<[EMAIL PROTECTED]> wrote:
>> Does anyone have any details on this...
>
>
>Yes, I do. It is a public key algorithm, and it's based on work that 
>Sarah did with us in Baltimore Technologies in Dublin when
>she was here on a student work placement last March. We've been
>looking at algorithms based on 2x2 matrices for a while and
>gave her the idea to see what she could do with it.
>
>The idea we were working on was to use 2x2 matrices with entries
>modulo n, n the product of 2 primes (i.e. an RSA number). The 
>security is therefore exactly the same as the security of an RSA key with
>the same modulus. However, the encryption and decryption processes 
>require only a small number of matrix multiplications rather than
>modular exponentiation, so both public-key operations (16 multiplications 
>over the ring Z/nZ) and private-key operations are as fast as a
>normal RSA private-key operation (17 multiplications). The downside
>is that both the key and the ciphertext are about eight times the
>length of the modulus, rather than more-or-less the length of the
>modulus as with RSA.

This sounds like it will be broken a few days after the details become
public, assuming some of the mathematicians working in this area are
not otherwise busy at the time.  (Sorry for the skepticism, but others
have worked in this area as well.)

Bruce
**********************************************************************
Bruce Schneier, President, Counterpane Systems     Phone: 612-823-1098
101 E Minnehaha Parkway, Minneapolis, MN  55419      Fax: 612-823-1590
           Free crypto newsletter.  See:  http://www.counterpane.com

------------------------------

From: "Sam Simpson" <[EMAIL PROTECTED]>
Subject: Re: What is left to invent?
Date: Fri, 15 Jan 1999 10:38:13 -0000

I think the previous poster's point was that some progress is being made in
proving DHP = DLP, while no such advances have been made in respect of
RSAP = IFP.

In fact, a paper by D. Boneh and R. Venkatesan, "Breaking RSA may not be
equivalent to factoring" (Eurocrypt '98), identifies some RSA problems
which cannot be equivalent to the underlying IFP.

Cheers,


Sam Simpson
Comms Analyst
-- http://www.hertreg.ac.uk/ss/ for ScramDisk hard-drive encryption & Delphi
Crypto Components.  PGP Keys available at the same site.

Jayant Shukla wrote in message <77l91j$2kd$[EMAIL PROTECTED]>...
>David A Molnar <[EMAIL PROTECTED]> writes:
>
>>Jayant Shukla <[EMAIL PROTECTED]> wrote:
>
>>> I personally would not consider them to be a major unconquered
>>> frontier in cryptography. A public key system, that is provably
>>> secure (unlike RSA or DH) would be at the top of my list.
>> ^^^^^^^^^^^^^^^^^^
>>are either of those proved one way or the other yet ? I thought "we"
>>were getting close on DH and becoming more pessimistic on RSA.
>
>No, neither of them has been proved to be secure or insecure.
>
>If you break RSA by factoring p, where p is the product of two primes,
>then you can break DH as well. An easy way to factor a product of
>primes can be exploited to solve the discrete log problem efficiently.
>
>I would not say we are getting close on DH vs. RSA, but I would agree
>that DH is much simpler and cleaner than RSA.
>
>regards,
>Jayant



------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Cayley-Purser algorithm?
Date: Fri, 15 Jan 1999 15:21:51 GMT

In article <[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] (John Savard) wrote:
> [EMAIL PROTECTED] wrote, in part:
>
> >Let us see the algorithm and judge for ourselves. Otherwise, this is all
> >overblown hype.
>
> As soon as the patents get sewed up.
>
> I remember that a math book, "Pi in the Sky", gave a method of
> encryption based on products of large primes that used multiplication
> instead of encryption. It was a version of the Shamir three-pass
> protocol. But because it used multiplication, not exponentiation, to
> make it easier to understand, it had a serious defect: dividing one of
> the messages by another one cracked the code.
>
> Is it possible to do public-key cryptography without exponentiation,
> or an operation with similar properties? I've tended to be inclined to
> the view that it isn't. This will be *very* interesting...

You need a one-way function.  Simple algorithms that involve multiplication
in a ring or field are generally useless because division usually breaks the
message or key.   For example,  let p be a large prime [or composite] and
consider a discrete log public-key method based on  the ADDITIVE cyclic
group  Z/pZ+.   Discrete logs in this cyclic group are trivially broken.
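
To make "trivially broken" concrete: in the additive group the role of
exponentiation is played by multiplication, h = x*g mod p, so the secret
x is recovered with one modular division via the extended Euclidean
algorithm.  A toy sketch in C (tiny illustrative numbers):

    #include <stdio.h>

    /* extended Euclid: returns x with (a*x) mod m == 1, gcd(a,m) == 1 */
    static long long modinv(long long a, long long m)
    {
        long long t = 0, newt = 1, r = m, newr = a % m;
        while (newr != 0) {
            long long q = r / newr, tmp;
            tmp = t - q * newt; t = newt; newt = tmp;
            tmp = r - q * newr; r = newr; newr = tmp;
        }
        return (t % m + m) % m;
    }

    int main(void)
    {
        long long p = 1000003;                  /* public "modulus"   */
        long long g = 12345;                    /* public "generator" */
        long long x = 987654 % p;               /* the secret         */
        long long h = (x * g) % p;              /* the public value   */
        long long rec = (h * modinv(g, p)) % p; /* broken in one step */
        printf("secret %lld, recovered %lld\n", x, rec);
        return 0;
    }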


------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Cayley-Purser algorithm?
Date: Fri, 15 Jan 1999 15:16:45 GMT

In article <[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] (David P Jablon) wrote:
> For what it's worth, here's an unsolicited open letter to Sarah:


Your advice shows gross ignorance of patent law.

> It's important for you to get as much private expert review
> as you can from people you trust.

This is correct.

> When you're pretty sure
> of what you've got in light of that feedback, publish it.
> If you've got something good, that's the only way to prove it.
>
> If all goes well, you can then consider applying for a US patent.

Sorry.  Once the method has been published it generally cannot be patented
outside the US, and in the US the inventor has only a one-year grace period
after publication in which to file.




------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: On the Generation of Pseudo-OTP
Date: Fri, 15 Jan 1999 14:54:43 GMT


On Fri, 15 Jan 1999 12:56:48 GMT, in
<[EMAIL PROTECTED]>, in sci.crypt
[EMAIL PROTECTED] (R. Knauer) wrote:

>On Thu, 14 Jan 1999 22:32:11 GMT, [EMAIL PROTECTED] (Terry Ritter) wrote:
>
>>I often use 31-bit CRC's in crypto.  
>
>Is there a lookup table published for the 31-bit CRC, preferably on
>the Web?

I don't always use tables for CRC, but when I do, I never use the
tables as constants; instead I compute the table as an
initialization.  I've never seen anybody else use 31-bit CRC's.  
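
Roughly the sort of thing I mean, as a sketch (the polynomial below is
just a placeholder for illustration -- a real design wants a carefully
chosen primitive polynomial of degree 31):

    #include <stddef.h>
    #include <stdint.h>

    #define CRC31_MASK 0x7fffffffUL    /* 31-bit register */
    #define CRC31_POLY 0x4c11db7UL     /* placeholder polynomial */

    static uint32_t crc31_table[256];

    /* compute the table at startup instead of storing constants */
    static void crc31_init(void)
    {
        int i, bit;
        for (i = 0; i < 256; i++) {
            uint32_t r = (uint32_t)i << 23;   /* byte into top of reg */
            for (bit = 0; bit < 8; bit++)
                r = (r & 0x40000000UL)
                  ? ((r << 1) ^ CRC31_POLY) & CRC31_MASK
                  : (r << 1) & CRC31_MASK;
            crc31_table[i] = r;
        }
    }

    /* table-driven update, one byte per step */
    static uint32_t crc31(uint32_t crc, const unsigned char *buf,
                          size_t len)
    {
        while (len--)
            crc = ((crc << 8) & CRC31_MASK)
                ^ crc31_table[((crc >> 23) ^ *buf++) & 0xff];
        return crc;
    }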


>>So CRC mixing is really only appropriate for sources which have some
>>amount of "true" randomness which must be accumulated.  CRC cannot
>>provide "a fine mixing" for LFSR's.  
>
>So you are not recommending a CRC for removing correlations from text
>streams (even pre-processed ones) or digit expansions of pi. 

On the contrary, I think CRC hashing *could* be used to process text
into "randomness"; I suppose it *must* work if there is any uniqueness
in the text.  And it should similarly work for digits obtained from
Pi.  I would think that it is just a matter of getting sufficient
input material.


>What
>about Juola's suggestion of LZ77?

Compression is a far more complex process than CRC.  While we would
like to think that compression "improves" the distribution, that is
neither its main goal, nor how compression is judged.  It seems to me
that we cannot hope for reversible compression to produce "totally"
random output, while for lossy methods (hashing) we can at least hope.


>>It may be that in order to know if one has removed a correlation, one
>>must first know what the correlation is.  I am not sure that *any*
>>technique would remove *all* correlations under *all* circumstances.
>
>I suspect you are right. I can imagine some sort of indeterminacy in
>all of this - like the only true random number is one that is
>non-computable in the Turing sense.

There is no random number.  Certainly there is no specific number
which we can point at as "random."  And if there were, that would not
be cryptographic, because everybody would know it.  Instead we expect
to be able to compute generally predictable statistical results from a
large ensemble of values.    


>It may be that once you can start computing with a number, it is not
>random. But that analysis falls back on the generator, since Turing
>non-computable numbers are such because they cannot be generated by an
>algorithmic procedure.
>
>Propositions:
>
>1) If you can generate a number algorithmically, then it is
>non-random.

There is no number that cannot be generated algorithmically:  

Start with 1.  Describe it in memory.  Print it out.  Step to the next
number.

And if we make some numbers preferable to others, we have just
decreased the uncertainty an Opponent has in selecting our numbers.


>2) If you operate algorithmically on a random number generated by a
>TRNG, it is no longer random.

I'm not willing to concede this.  It seems to me that the result of
any reversible operation (1:1 with the same number of unique values in
the domain and range) will still possess the randomness properties.
And an operation which reduces the range may also have those
properties.  


>Here the term "random" means totally unpredictable, undecidable,
>indeterminate - a number generated in such a way that all possible
>numbers of a given length are equiprobable.

If we start out unknown mod 256 and add a known constant such as 4
(mod 256), the result is still completely unknown mod 256, despite the
fact that we know both the constant and the operation.  So the linear
operation has not affected randomness.  
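
This is easy to check exhaustively; a trivial sketch:

    #include <stdio.h>

    int main(void)
    {
        int count[256] = {0};
        int x, v, min, max;
        for (x = 0; x < 256; x++)      /* each equiprobable input...  */
            count[(x + 4) & 0xff]++;   /* ...through the known op     */
        min = max = count[0];
        for (v = 1; v < 256; v++) {
            if (count[v] < min) min = count[v];
            if (count[v] > max) max = count[v];
        }
        printf("min %d, max %d\n", min, max);  /* prints: min 1, max 1 */
        return 0;
    }

The known operation is a bijection on 0..255, so a uniform input stays
uniform.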


>>But if we mix a lot of almost-random material into a small result, it
>>seems like we have a good chance of getting pretty random stuff out.
>
>Is there anything to back that up besides intuition?

It seems to me that whatever uncertainty we have in the domain can be
retained by an appropriate range-reducing operation.  If we see a hash
as essentially a range-reduction computation, that should apply.  


>[...]
>>At least with a Latin square we can say we get a balanced (unbiased)
>>result from each mixing port, for any value on the other port, in a
>>guaranteed absolute sense which does not depend upon infinite
>>sequences.  
>
>How do you produce the actual square? If you start out with a square
>where every row is the same and the digits are in ascending order - a
>sort of "starting square" - how do you do the shuffle that square such
>that every row is a different permutation? What algorithm do you use
>for the shuffle?

   http://www.io.com/~ritter/ARTS/PRACTLAT.HTM
   http://www.io.com/~ritter/KEYSHUF.HTM
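
In outline, one simple construction (not the only one, and not one that
reaches all Latin squares uniformly -- the pages above discuss more
thorough shuffles): start from the cyclic "starting square"
L[i][j] = (i + j) mod N and apply independent random permutations to
the rows, the columns, and the symbols.  Each step preserves the Latin
property.  A toy sketch:

    #include <stdio.h>
    #include <stdlib.h>

    #define N 16

    static void shuffle(int *p, int n)       /* Fisher-Yates shuffle */
    {
        int i, j, t;
        for (i = n - 1; i > 0; i--) {
            j = rand() % (i + 1);   /* toy; key this with a CSPRNG */
            t = p[i]; p[i] = p[j]; p[j] = t;
        }
    }

    int main(void)
    {
        static int L[N][N];
        int row[N], col[N], sym[N];
        int i, j;

        for (i = 0; i < N; i++) row[i] = col[i] = sym[i] = i;
        shuffle(row, N); shuffle(col, N); shuffle(sym, N);

        /* every row and every column is a permutation of 0..N-1 */
        for (i = 0; i < N; i++)
            for (j = 0; j < N; j++)
                L[i][j] = sym[(row[i] + col[j]) % N];

        for (i = 0; i < N; i++) {
            for (j = 0; j < N; j++) printf("%3d", L[i][j]);
            printf("\n");
        }
        return 0;
    }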


>BTW, what makes the Latin square method significantly different from a
>CRC lookup table method?

CRC is a linear hash; a Latin square combiner is probably nonlinear,
which means it must be keyed.  Both are very basic, understandable,
and balanced.  The hash takes input of arbitrary size; the combiner is
limited to 2 input values.  The combiner takes 2 inputs to 1 output,
but is reversible if we know one of the inputs; similarly, the CRC
takes n inputs to 1 output and is reversible only if we know n-1
inputs.  

Other hashes claim "strength" but are complex and so could hide a
backdoor in their computation; in contrast, CRC is simple, does what
it must, and clearly nothing else.  


>>Even the comparison of two pulse periods is "algorithmic."
>
>Yes, it is - and very primitive at that. But does it introduce any
>correlation?

I don't know.  Are you sure it does not?  

And if you are *not* sure -- to cryptographic levels of assurance --
we would seem to have lost the only randomness machine construction
that you seemed willing to accept.  

(Another machine possibility is to have two distinct sources, then
open gates and capture "the first" pulse.  While almost certain to be
biased, it should be simple and fast.  And if we post-process, the
bias may not be important.)


>>http://www.io.com/~ritter/GLOSSARY.HTM#LatinSquareCombiner
>>http://www.io.com/~ritter/ARTS/PRACTLAT.HTM
>>http://www.io.com/~ritter/GLOSSARY.HTM#MixingCipher
>>http://www.io.com/~ritter/#MixTech
>
>You need a search engine on your site. :-)

I will look into it. 

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM



------------------------------

From: "Mr. Brisket" <[EMAIL PROTECTED]>
Subject: Re: On the Generation of Pseudo-OTP
Date: Fri, 15 Jan 1999 07:29:24 -1000

> >>So CRC mixing is really only appropriate for sources which have some
> >>amount of "true" randomness which must be accumulated.  CRC cannot
> >>provide "a fine mixing" for LFSR's.

You should say "true" fine mixing.

> >
> >So you are not recommending a CRC for removing correlations from text
> >streams (even pre-processed ones) or digit expansions of pi.

You should say "true" correlations.

> It seems to me
> that we cannot hope for reversible compression to produce "totally"
> random output, while for lossy methods (hashing) we can at least hope.

Is that "true" hope, "total" hope, or just common hope?

> >I suspect you are right. I can imagine some sort of indeterminacy in
> >all of this - like the only true random number is one that is
> >non-computable in the Turing sense.

"True" Turing sense is better than Turing sense.

> >2) If you operate algorithmically on a random number generated by a
> >TRNG, it is no longer random.

You need to use a "true" operator in this case.

> >Here the term "random" means totally unpredictable, undecidable,
> >indeterminate - a number generated in such a way that all possible
> >numbers of a given length are equiprobable.

Yes, "totally" unpredictable is better than unpredictable. Totally, dude, 
like wow, you know?

------------------------------

From: "hapticz" <[EMAIL PROTECTED]>
Subject: Re: sci.crypt intelligence test.
Date: Fri, 15 Jan 1999 09:55:12 -0500
Crossposted-To: talk.politics.crypto

i love it !!!

--
best regards
[EMAIL PROTECTED]





------------------------------

From: William Whyte <[EMAIL PROTECTED]>
Subject: Re: Sarah Flannery and the "Cayley-Purser" Algorithm
Date: Fri, 15 Jan 1999 11:41:47 +0000

> Since we don't know the exact nature of the algorithm yet, does the fact
> that it uses matrix math (even if the security is compared to finding the
> square root mod a number in another article) mean that we can attack it as
> such?
> There have certainly been public key algorithms based on matrices before.  I
> have immediate access to two: a program called ATBASH and an algorithm called
> TTM (http://www.usdsi.com/).  I have not heard of either being broken (TTM is
> relatively new, to my knowledge), but nor have I heard of either being used
> or discussed.

TTM suffers from the problem that the public key is 200K long. It's a nice
idea, though.

I don't know anything about ATBASH. Could you give us some more details?

Cheers,

William

------------------------------

From: "Sam Simpson" <[EMAIL PROTECTED]>
Subject: Re: Cayley-Purser algorithm?
Date: Fri, 15 Jan 1999 10:45:25 -0000


Mike Andrews wrote in message <77lorq$qkn$[EMAIL PROTECTED]>...
>Tim Smith ([EMAIL PROTECTED]) wrote in article
><77lla7$m13$[EMAIL PROTECTED]>:
>: In article <77l3nr$55n$[EMAIL PROTECTED]>,  <[EMAIL PROTECTED]> wrote:
>: >  William Whyte <[EMAIL PROTECTED]> wrote:
>: >> modulo n, n the product of 2 primes (i.e. an RSA number). The
>: >> security is therefore exactly the same as the security of an RSA
>: >> key with the same modulus.
>: >
>: >This is "proof by assertion".  The fact that the algorithm uses
>: >SL(2, Z/NZ) where N = pq is not equivalent to proving that its
>: >security relies on factoring N.
>
>: When was it proved that the security of RSA depends on factoring?  I
>: thought that this was still conjecture?  Has someone proved that
>: breaking RSA requires factoring?
>
>The way I remember reading it, the conjecture (and it was a conjecture)
>was that breaking RSA was *_no_harder_* than factoring. The (unspoken)
>assumption had to do with factoring being a "hard" problem.


Breaking RSA is certainly no harder than factoring - it can be shown that an
efficient general-purpose factoring method would also break RSA.

The conjecture is that breaking RSA (more precisely, finding e-th roots
modulo a composite integer) is equivalent to factoring - and that remains
unproven (it may indeed be false).
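
To make the first direction concrete, here is a toy sketch (tiny
numbers, utterly insecure) of how the private exponent falls out once
n is factored:

    #include <stdio.h>

    static long modpow(long b, long e, long m)
    {
        long r = 1;
        b %= m;
        while (e > 0) {
            if (e & 1) r = (r * b) % m;
            b = (b * b) % m;
            e >>= 1;
        }
        return r;
    }

    /* extended Euclid: x with (a*x) mod m == 1, gcd(a,m) == 1 */
    static long modinv(long a, long m)
    {
        long t = 0, newt = 1, r = m, newr = a % m;
        while (newr != 0) {
            long q = r / newr, tmp;
            tmp = t - q * newt; t = newt; newt = tmp;
            tmp = r - q * newr; r = newr; newr = tmp;
        }
        return (t % m + m) % m;
    }

    int main(void)
    {
        long p = 61, q = 53;         /* output of the factoring oracle */
        long n = p * q, e = 17;
        long phi = (p - 1) * (q - 1);
        long d = modinv(e, phi);     /* private exponent recovered     */
        long m = 1234;
        long c = modpow(m, e, n);    /* intercepted ciphertext         */
        printf("recovered plaintext: %ld\n", modpow(c, d, n));
        return 0;
    }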

<SNIP>


Sam Simpson
Comms Analyst
-- http://www.hertreg.ac.uk/ss/ for ScramDisk hard-drive encryption & Delphi
Crypto Components.  PGP Keys available at the same site.




------------------------------

From: "Joseph Suriol" <[EMAIL PROTECTED]>
Subject: Re: SSL - How can it be safe?
Date: Fri, 15 Jan 1999 10:39:54 -0500


Doug Stell <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
>On Thu, 14 Jan 1999 17:07:44 -0500, "Joseph Suriol" <[EMAIL PROTECTED]>
>wrote:
>

Thank you kindly for your reply.

>
>>As I understand it, the Secure Sockets Layer (SSL) has optional encryption,
>>and when installed all network traffic will be encrypted.  Where does it
>>get the key to encrypt?
>
>The session key is either randomly generated or the result of a
>public key agreement algorithm with random components.
>

If there is a public key system at work, why does it need a random
component?  Why can't the system just use the target's public key, obtained
from a key bank?  Each node must then keep its private key secure; is there
a way to do that other than file permissions?

>
>For the moment, I'll assume that you are not too familiar with public
>key techniques. If that assumption is correct, I suggest studying up
>on the use of such techniques, especially for key exchange/agreement
>protocols.
>

I use and know PGP (I read the O'Reilly book) and have read a good part of
Schneier's book, but I am no expert.  My company cannot afford to buy the
Oracle SSL product and wants me to write an encryption module to insert
suitably between Oracle-based application clients and servers.  I am
starting from first principles.  My weakness in cryptology is somewhat
compensated by long UNIX and C experience.  With some help from here I hope
to develop something serviceable.  I think I can find free existing C
libraries for most of the functions I will need.

>The answer to this question depends on which key you are talking about
>and which key exchange/agreement protocol is used.
>
>1. If RSA key exchange is used, the session key (the one used to
>encrypt data) is randomly generated by the client and sent to the
>server, encrypted in the server's public RSA key.
>
>2. If a key agreement exchange, e.g., Diffie-Hellman, is used, the two
>copies of the session key are the result of the D-H computations done
>privately by the client and server.
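
To check my understanding of case 2, here is the D-H computation in toy
form (tiny illustrative numbers; real parameters are of course far
larger):

    #include <stdio.h>

    static long modpow(long b, long e, long m)
    {
        long r = 1;
        b %= m;
        while (e > 0) {
            if (e & 1) r = (r * b) % m;
            b = (b * b) % m;
            e >>= 1;
        }
        return r;
    }

    int main(void)
    {
        long p = 2579, g = 2;      /* public prime and base         */
        long a = 765, b = 853;     /* private values, never sent    */
        long A = modpow(g, a, p);  /* client -> server              */
        long B = modpow(g, b, p);  /* server -> client              */
        printf("client key %ld, server key %ld\n",
               modpow(B, a, p), modpow(A, b, p));  /* both g^(a*b)  */
        return 0;
    }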

If two independent systems can compute the same session key, doesn't the
secrecy of the key then reside in keeping the algorithm and its seeds
secret?  I thought this was a bad thing to do.


Many thanks.




------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Fri, 15 Jan 1999 13:48:20 GMT
Reply-To: [EMAIL PROTECTED]

On Thu, 14 Jan 1999 22:32:32 GMT, [EMAIL PROTECTED] (Terry Ritter) wrote:

>When many random and independent events occur they produce
>a collective distribution which is not arbitrary, but instead follows
>the rules of large numbers.  If the randomness detector ignores a
>particular subset of events, the results may be statistically
>incorrect and biased.  

That is correct. But that is not what the radioactive TRNG is doing.

It is not averaging over many events. It is measuring the difference
between discrete, completely independent events. That is why it is
random - there is no interdependence on when a nucleus decays.

If the detector "selects" *any* 4 decays out of a pool of many decays,
then the times for the decays are random and the intervals formed from
them are random because each of those decays is completely independent
of one another.

Recall where the radioactive decay law comes from, namely the
experimental observation that the decay of the population is a simple
exponential in time. That means that radioactive decay obeys a first
order rate law, which in turn requires that each decay event be
unrelated to any other decay event in time:

dN/dt = - k * N, where k is a constant independent of the time t;
integrating gives the observed exponential, N(t) = N(0) * exp(-k*t).

That is, the probability that a decay will occur in the interval t ->
t + dt is independent of the time t. There is no averaging being done
here, nor is there any interdependence of one decay on another.
Therefore "missed" decay events are not an issue in principle.

Now if there is some pathological behavior of the detector going on
which causes these missing events, and that behavior itself introduces
an interdependence, that is another matter. But one can look for that
possibility and make design corrections to remove it. That is one of
the skills you learn in experimental nuclear physics.

>If we are doing post processing, this may not be much of an issue.
>But if we expect to build a physical machine which, on its own and
>without post processing, produces "random" values, the loss of an
>expected segment of events may be serious.

Not if each of the events is completely unrelated to the others.

When one nucleus decays, it does not have to consult the others to
decide when or when not to do it. The decay of any nucleus is
completely independent of the other nuclei present. That is why it is
called "spontaneous decay".

>It means that we must
>innovate a design which provably ignores the effect if we are to have
>a provably random machine.    

And that design is the radioactive TRNG, properly constructed.

>I think one of the best examples of a good design is the HotBits site
>
>   http://www.fourmilab.ch/hotbits/

Yes, I am very familiar with that site. In fact that is where I first
got the information for the discussions here.

>by John Walker, who acknowledges John Nagle.  One of the issues with
>this sort of design is a difficulty in precisely measuring the
>interval between asynchronous pulses.

It is the "asynchronous" nature of those pulses that produces the
randomness to begin with.

But the difficulty you allude to does not destroy the randomness of
the output unless there is some inherent interdependence present.
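
As I recall from that page, HotBits times successive pairs of decays
and compares the two intervals, reversing the sense of the comparison
on alternate pairs to cancel slow detector drift.  A quick software
simulation of that extraction (exponential variates standing in for
the real detector):

    #include <math.h>
    #include <stdio.h>
    #include <stdlib.h>

    static double interval(double k)    /* exponential variate, rate k */
    {
        double u = (rand() + 1.0) / (RAND_MAX + 2.0);   /* u in (0,1) */
        return -log(u) / k;
    }

    int main(void)
    {
        int i, flip = 0;
        for (i = 0; i < 64; i++) {
            double t1 = interval(1.0);
            double t2 = interval(1.0);
            if (t1 == t2) { i--; continue; }   /* discard ties */
            putchar('0' + ((t1 < t2) ^ flip));
            flip ^= 1;     /* alternate sense to cancel drift bias */
        }
        putchar('\n');
        return 0;
    }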

>And the design is necessarily slow.

Only because of the detector they chose. There are scientific-grade
detectors and associated electronics that can handle faster count
rates. In fact, high count rate, short time coincidence measurements
were standard in nuclear physics even 30 years ago when I was active.

But you are right - even a megabit of data per second seems slow by
today's standard.

>The results should be good.  But even though I can't *see* a
>correlation or bias here, I would be hard put to say this was
>*provably* cryptographically random, which seems to be the discussion.

I realize that there is no such thing as a Perfect Circle in reality
(possibly not even in mathematics, since it would necessarily contain
non-computable numbers, and what would that mean when it comes time to
work with it analytically?). But there are circles to an arbitrary
level of precision. I believe that a properly designed radioactive
TRNG can produce crypto-grade random numbers for unbreakable OTP
ciphers.

Even if such ciphers would "leak" a small amount of information about
the TRNG,  I do not believe that such leakage would give a
cryptanalyst enough to break the cipher unless he had a very large
number of ciphers to analyze. IOW, the work effort would be impossibly
large.

For example, the Venona cipher was not broken until pads were reused,
yet that particular cipher was built on very shaky ground. True, the
cryptanalysts did not have the awesome computing power they have today
to mount experimental attacks, but still they did have some equipment
and a ton of traffic to analyze.

>A lot of talented people have spent a great deal of expensive time
>trying to design and build good really random generators with high
>data-rate outputs.  

But they are used for real time streams. We are talking about OTPs.

>My pessimism is based on the engineering interpretation of building a
>real device which must inherently produce *conceptual* randomness.

But can you justify the same degree of pessimism for a practical
device - one that produces random numbers that are practically secure
in that the work effort to break the system would be impossibly large?

In the world of experimental physics we had to address that same
problem all the time. Our experiments never came close to the level of
accuracy demanded by theory. But that did not stop us from assessing
the results of the experiments on a practical basis, by assigning a
figure of merit to the results.

Greg Chaitin argues that there is a fundamental limit to mathematics,
even number theory. Just as we know that it is impossible to make any
round physical object that is a Perfect Circle, it is impossible to
make some mathematical object that is a perfect representation of a
theorem in mathematics. Chaitin goes on to state that one must in
those cases rely on experimentation - Experimental Mathematics - to
provide proofs for propositions, just like physicists have been doing
for centuries.

I believe the same is true in crypto. You first try to perfect a real
system based on theory. Knowing that it is not perfect you certify its
level of precision by experiment. After all, that's how it works in
the real world - you invent a cryptosystem which you cannot formally
prove is secure, and you submit it for experimental analysis by seeing
if actual ciphers can be broken.

>As far as I can see, the HotBits idea is probably about as close as
>one can come.  

I have never come across any other radioactive decay TRNG posted to
the Web.

>We could probably do the same thing with any shot noise source (e.g.,
>reverse-biased semiconductor junction or resistor) by using a fast
>comparitor and setting the compare level relatively high, so that it
>is triggered rarely.  Then we have pulses and can measure those
>periods.

As long as any one pulse is not dependent on any other pulse, then it
would seem that each pulse is random.

The thing I suspect about such a device is that there is a dependence
because the shot noise is emanating from a bulk device. The solid
state properties of a physical device are bulk properties - for
example, superconductivity is a bulk phenomenon and so is band
conduction. By contrast radioactive decay is not a bulk phenomenon -
each decay event is completely unrelated to any other decay event.

>The advantage would be the ability to use high-bandwidth
>amplification with a fast compare device and clock, and thus increase
>the data rate.  A big disadvantage would be the need to set the
>compare level, but of course this is in some sense what we do by
>selecting a particular radioactive source and position for the Geiger
>tube anyway.  

See the SG100 at

http://www.protego.se/sg100_en.htm

Bob Knauer

"Liberty lies in the hearts of men and women.  When it dies there,
no constitution, no law, no court can save it."
--Justice Learned Hand


------------------------------

From: "hapticz" <[EMAIL PROTECTED]>
Subject: Re: Encrypted WordPerfect 4.2-files (DOS)
Date: Fri, 15 Jan 1999 10:04:42 -0500

seems i've seen some crackers for this kinda thing on some of the russian
sites in the last year or two.  they're pretty good at that kinda stuff.

--
best regards
[EMAIL PROTECTED]





------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
