Cryptography-Digest Digest #911, Volume #8       Fri, 15 Jan 99 15:13:03 EST

Contents:
  Re: On the Generation of Pseudo-OTP (R. Knauer)
  Re: SSL - How can it be safe? ("Joseph Suriol")
  Re: Cayley-Purser algorithm? (Kent Briggs)
  Re: Encrypted WordPerfect 4.2-files (DOS) (JPeschel)
  stuff (Jeremy Smith)
  Re: SSL - How can it be safe? (David P Jablon)
  Question on current status of some block ciphers in AC2 (David Hamilton)
  Re: On leaving the 56-bit key length limitation ([EMAIL PROTECTED])
  Re: Practical True Random Number Generator ([EMAIL PROTECTED])
  SHA-0 attack (David Crick)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Fri, 15 Jan 1999 16:27:49 GMT
Reply-To: [EMAIL PROTECTED]

On Fri, 15 Jan 1999 14:54:43 GMT, [EMAIL PROTECTED] (Terry Ritter) wrote:

>I don't always use tables for CRC, but when I do, I never use the
>tables as constants, but instead compute the table as an
>initialization.

Have you published that calculation, and if so, where can it be found?
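
(For anyone who hasn't seen the technique: below is a minimal sketch
in C of computing a CRC-32 table at initialization rather than storing
it as constants.  The reflected polynomial 0xEDB88320 is the common
CRC-32 choice and is an assumption here - not necessarily what Terry
uses, and not his 31-bit variant.)

  #include <stddef.h>
  #include <stdint.h>

  static uint32_t crc_table[256];

  /* Fill the 256-entry lookup table at startup instead of
     embedding 256 magic constants in the source. */
  void crc32_init(void)
  {
      for (uint32_t n = 0; n < 256; n++) {
          uint32_t c = n;
          for (int k = 0; k < 8; k++)
              c = (c & 1) ? (c >> 1) ^ 0xEDB88320UL : c >> 1;
          crc_table[n] = c;
      }
  }

  /* Standard table-driven CRC-32 of a buffer. */
  uint32_t crc32(const uint8_t *buf, size_t len)
  {
      uint32_t c = 0xFFFFFFFFUL;
      while (len--)
          c = crc_table[(c ^ *buf++) & 0xFF] ^ (c >> 8);
      return c ^ 0xFFFFFFFFUL;
  }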

>I've never seen anybody else use 31-bit CRC's.  

What reason do you give for 31-bit? I realize 31 is prime - is that
the only reason?

>On the contrary, I think CRC hashing *could* be used to process text
>into "randomness"; I suppose it *must* work if there is any uniqueness
>in the text.  And it should similarly work for digits obtained from
>Pi.  I would think that it is just a matter of getting sufficient
>input material.

So - you do recommend a CRC for hashing out such correlations.

>Compression is a far more complex process than CRC.  While we would
>like to think that compression "improves" the distribution, that is
>neither its main goal, nor how compression is judged.  It seems to me
>that we cannot hope for reversible compression to produce "totally"
>random output, while for lossy methods (hashing) we can at least hope.

So now it's 2 votes for the CRC and 1 for the LZ77. Any more
contenders out there?

>There is no random number.  Certainly there is no specific number
>which we can point at as "random."  And if there were, that would not
>be cryptographic, because everybody would know it.  Instead we expect
>to be able to compute generally predictable statistical results from a
>large ensemble of values.    

I agree fully. As stated many times, crypto-grade randomness comes
from the generator itself.

What I was trying to say was that numbers that are non-computable are
random because they cannot be generated by an algorithmic procedure -
they can only be generated by a TRNG.

Whatever the case, I was just musing philosophically.

>There is no number that cannot be generated algorithmically:  

I disagree. See Turing's non-computable numbers. Also Chaitin's Omega.

>Start with 1.  Describe it in memory.  Print it out.  Step to the next
>number.

Yes, that is how you calculate computable numbers, even pseudo-random
numbers. But how do you compute the next bit from a TRNG?

>>2) If you operate algorithmically on a random number generated by a
>>TRNG, it is no longer random.

>I'm not willing to concede this.  It seems to me that the result of
>any reversible operation (1:1 with the same number of unique values in
>the domain and range) will still possess the randomness properties.
>And an operation which reduces the range may also have those
>properties.  

If you can compute a number or manipulate it by algorithmic means,
then you have introduced some degree of non-randomness because of the
procedural nature of the computation.

When someone says pi, for example, is "random" I point out that it
cannot be random because each bit is perfectly known through its
algorithmic expansion. That means that every sequence of bits is
predictable, determinate, non-random.

A seeming problem with that analysis is that random offsets into the
digit expansion of pi produce sequences that are not predictable
unless you know the offset. But that is not true randomness - that is
just obscurity. If a cryptanalyst had the time and energy to try out
every possible offset, he could break the pad based on that system.

But no matter how one tries, one will never be able to break an OTP
system using pads generated from an ideal TRNG. That, and only that,
is provably secure in principle. And even if the TRNG is not
"perfect" it can still be made strong enough to force the cryptanalyst
into a hopeless situation in terms of work effort.

Of course, once quantum computers become a reality, all bets are off
in terms of work effort as we measure it today.

>If we start out unknown mod 256 and add a known constant such as 4
>(mod 256), the result is still completely unknown mod 256, despite the
>fact that we know both the constant and the operation.  So the linear
>operation has not affected randomness.  
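
(The point being made is that adding a constant mod 256 is a bijection
on the 256 byte values, so a uniform input stays uniform.  A trivial C
check of the bijection, purely illustrative:)

  #include <stdio.h>

  int main(void)
  {
      int hits[256] = {0};

      /* Send every byte value x to (x + 4) mod 256 and count how
         often each output occurs: exactly once each, so a uniform
         input distribution maps to a uniform output distribution. */
      for (int x = 0; x < 256; x++)
          hits[(x + 4) & 0xFF]++;
      for (int y = 0; y < 256; y++)
          if (hits[y] != 1)
              printf("not a bijection at %d\n", y); /* never prints */
      return 0;
  }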

Yes, but is that sufficient? For a number to be crypto-grade random it
must be one of all possible numbers of a given length and it must be
generated equiprobably. Can you say that for your generator?

If your generator is limited in terms of not being capable of
outputting all of the 2^N possible numbers or in terms of outputting
some numbers in a manner that is not equiprobable, then it fails to
meet the specification for crypto-grade random numbers required for
the provably secure OTP cryptosystem.

Just because one number from your generator is unpredictable does not
make it an acceptable generator of crypto-grade random numbers.

>It seems to me that whatever uncertainty we have in the domain can be
>retained by an appropriate range-reducing operation.  If we see a hash
>as essentially a range-reduction computation, that should apply.  

Interesting analysis. I believe it is tied to the concept of
"distilling entropy" or "concentrating entropy".

But I remind you that you still have to certify that such processing
is capable of outputting all possible sequences of a given length
equiprobably - or else it is unsuitable for the OTP.


>   http://www.io.com/~ritter/ARTS/PRACTLAT.HTM

I must have missed something when I scanned that page. I plan to read
it carefully when I get the chance.

>   http://www.io.com/~ritter/KEYSHUF.HTM

As they say down Mexico way - grassyass senor.

>CRC is a linear hash; a Latin square combiner is probably nonlinear,
>which means it must be keyed.  Both are very basic, understandable,
>and balanced.
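
(For concreteness, here is a toy keyed Latin square combiner in C.
This is the textbook construction - a key-shuffled addition table -
and not necessarily Terry's own; the key schedule is a deliberately
simple stand-in, not cryptographic.)

  #include <stddef.h>
  #include <stdint.h>

  /* Start from the addition table of Z/256, which is a Latin
     square, and permute its rows and columns with key-derived
     permutations; the result is still a Latin square, hence
     balanced: every output value appears exactly once in each
     row and each column. */

  static uint8_t rowp[256], colp[256], sq[256][256];

  void ls_init(const uint8_t *key, size_t keylen)  /* keylen > 0 */
  {
      uint32_t s = 1;
      for (int i = 0; i < 256; i++)
          rowp[i] = colp[i] = (uint8_t)i;
      for (int i = 255; i > 0; i--) {          /* Fisher-Yates */
          s = s * 1103515245u + 12345u + key[i % keylen];
          int j = s % (i + 1);
          uint8_t t = rowp[i]; rowp[i] = rowp[j]; rowp[j] = t;
          s = s * 1103515245u + 12345u + key[(i * 7) % keylen];
          j = s % (i + 1);
          t = colp[i]; colp[i] = colp[j]; colp[j] = t;
      }
      for (int r = 0; r < 256; r++)
          for (int c = 0; c < 256; c++)
              sq[r][c] = (uint8_t)((rowp[r] + colp[c]) & 0xFF);
  }

  /* Combine two bytes; given the output and either input, the
     other input is uniquely determined, as in any Latin square. */
  uint8_t ls_combine(uint8_t a, uint8_t b)
  {
      return sq[a][b];
  }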

When you say "balanced" do you mean that all sequences for a given
length are not only possible but equiprobable?

>The hash takes input of arbitrary size; the combiner is
>limited to 2 input values.  The combiner takes 2 inputs to 1 output,
>but is reversible if we know one of the inputs; similarly, the CRC
>takes n inputs to 1 output and is reversible only if we know n-1
>inputs.  

Because the CRC can hash any size input, one would suspect that its
performance depends on that size. Is there any quantitative measure
regarding the level of decorrelation for the CRC and input size? I
would assume from what you just said above that the extent of
"distillation" would play a significant part.

>>Yes, it is - and very primitive at that. But does it introduce any
>>correlation?

>I don't know.  Are you sure it does not?  

As a one-time practicing experimental physicist who made his own
instrumentation for nuclear measurements, I would offer the opinion
that it depends crucially on the exact design and components selected.
We spent a lot of time certifying the performance of our equipment,
doing calibrations and adjustments constantly - and that was with
state-of-the-art designs and components.

It would take careful testing of both to ensure that one is truly
measuring ONLY the random decay of radioisotopes. If so, then the
interval method would be certifiably secure for crypto purposes, to
within some level of precision. As I said earlier, measuring
coincidence intervals for discrete events was stock-in-trade for
nuclear physics.

>And if you are *not* sure -- to cryptographic levels of assurance --
>we would seem to have lost the only randomness machine construction
>that you seemed willing to accept.  

I am willing to accept any TRNG design that meets the prime directive.
I like the radioactive method because it is based on the known
randomness of a physical process, which I am quite familiar with.

>(Another machine possibility is to have two distinct sources, then
>open gates and capture "the first" pulse.  While almost certain to be
>biased, it should be simple and fast.  And if we post-process, the
>bias may not be important.)

That is how coincidence measurements are sometimes done in nuclear
physics.

>>You need a search engine on your site. :-)

>I will look into it. 

Good Deal. 

Bob Knauer

"Liberty lies in the hearts of men and women.  When it dies there,
no constitution, no law, no court can save it."
--Justice Learned Hand


------------------------------

From: "Joseph Suriol" <[EMAIL PROTECTED]>
Subject: Re: SSL - How can it be safe?
Date: Fri, 15 Jan 1999 11:40:53 -0500


David P Jablon <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
>On Thu, 14 Jan 1999 17:07:44 -0500, "Joseph Suriol" <[EMAIL PROTECTED]>
>wrote:

Thank you kindly for your reply.

>
>The lack of proper involvement of people in most *deployed*
>cryptography, and unrealistic presumptions for human behaviour,
>are a principal weakness of many protocols, including SSL
>as commonly used.
>

Doug brought me up and you bring me down, but thanks - I'd rather know
the actual circumstances than believe comforting fantasies. Most
cryptosystems are never put to the test, so their security can remain
illusory. I am so glad this group and its participants exist.

>
>As for the original question, there are plenty of good
>uses for non-password-based cryptography in machine-to-machine
>transactions.  But verifying a human presence is a tougher problem,
>which usually requires passwords, tokens, or biometrics,
>and often more than one of the above to cover for
>the quirky limitations of human beings.
>

This is my point really.  When I use PGP I have to enter a pass phrase
and the system takes over from then on.  But without this step I
cannot see how the ultimate goal of cryptography - confidential
communications between humans, not machines - can be achieved.
The trade-off of convenience vs. security can't be done away with.
Encryption that happens under your feet without entering a password is
very convenient, but it may not be what you need or what you want.

The only way I can see for SSL and other fully automated cryptosystems
to offer some security is to ensure that they operate in conjunction
with a trusted login mechanism. But this still leaves the cryptosystem
relying on external components that may or may not be there.

Thank you



------------------------------

From: Kent Briggs <[EMAIL PROTECTED]>
Subject: Re: Cayley-Purser algorithm?
Date: Fri, 15 Jan 1999 16:42:37 GMT

[EMAIL PROTECTED] wrote:

> > If all goes well, you can then consider applying for a US patent.
>
> Sorry.  Once the method has been published, it CAN NOT be patented in
> the US.

You have up to one year to file after initial publication.  There was a big
stink made about the Diffie-Hellman patent because it was filed over a year
past initial publication.

--
Kent Briggs, [EMAIL PROTECTED]
Briggs Softworks, http://www.briggsoft.com



------------------------------

From: [EMAIL PROTECTED] (JPeschel)
Subject: Re: Encrypted WordPerfect 4.2-files (DOS)
Date: 15 Jan 1999 17:41:43 GMT

><[EMAIL PROTECTED]>writes:

>seems ive seen some crackers for this kinda thing on some of the russian
>sites in the last year or two.  they're pretty good at that kinda stuff.

The site below, not in Russia, may be faster, and we're pretty good
at that kinda stuff, too.

Joe
__________________________________________

Joe Peschel 
D.O.E. SysWorks                                 
http://members.aol.com/jpeschel/index.htm
__________________________________________


------------------------------

From: Jeremy Smith <[EMAIL PROTECTED]>
Subject: stuff
Date: Fri, 15 Jan 1999 13:52:13 +0000

I thought that it would be best to have a quiet
night tonight as I have to get up at 4.30 in the
A.M.!!!! So maybe a swift pint or two and then to
bed.
    I may see you before you go on your hols as I
get back next Friday afternoon, but if I don't then
have a good one and happy birthday and stuff.

Happy birthday to you, happy birthday to
you, etc, etc.

Don't do anything I wouldn't.

P.S. France is always a good place to buy much
delayed and overdue Christmas pressies, not that I
am going on about it or anything :-))


------------------------------

From: [EMAIL PROTECTED] (David P Jablon)
Subject: Re: SSL - How can it be safe?
Date: Fri, 15 Jan 1999 19:20:08 GMT

Doug,

I am now unsure what mode of SSL you were trying to describe.
On the other hand, I *am* pretty sure you've misunderstood
the significance of the stronger alternatives.

A point-by-point discussion is included here for posterity:

In article <[EMAIL PROTECTED]>,
Doug Stell <[EMAIL PROTECTED]> wrote:
>>> A password is only used when a human needs to remember something or
>>> give it, by another channel, to another human, to be entered at the
>>> remote end.  This would be the typical case with file encryption.[...]

On Fri, 15 Jan 1999 04:07:58 GMT, I, David Jablon replied:
>> Not "only".  There are lot's of other uses too, like
>> password-authenticated key exchange, and other
>> human-to-machine transactions.

Doug:
> Of course, there are other uses for passwords. I never intended to write an
> exhaustive list.

[David:]
Ok.  But you present some strong, and in my view unjustified,
opinions about password authentication.

David:
>> The lack of proper involvement of people in most *deployed*
>> cryptography, and unrealistic presumptions for human behaviour,
>> are a principal weakness of many protocols, including SSL
>> as commonly used.

Doug:
> I believe that most people would disagree with this statement.

Why?  If it's not in humans, where do you see the weak link?
I occasionally make provocative comments to stir interesting
debate, but that wasn't intended to be one of them, and
your rebuttal isn't interesting.

David:
>> Doug gave a fair description of how a password is sent
>> across an SSL channel, secured only by the server's private key.

Doug:
> A "password" is not sent "secured only by the server's private key." It is a
> master key from which session keys are created that is sent in a secure
> manner.

Are we still talking about the same use of SSL?  The same Internet?
Today, a common use of passwords with SSL is clear-text
through a server-authenticated-and-encrypted channel.
Sure, there are also various forms of weak hashed challenge-response
password methods built into the browser too,
but in these cases the password is not at all used as a
"master key".

What you've just described, using a "master" password to
negotiate a secure one-time session key, sounds a lot more
like EKE, SPEKE, and the other alternatives I was referring to.

>> [A password sent through SSL] also requires that the user take action
>> to ensure he's talking to the correct server, perhaps by
>> verifying the name-to-server-public-key binding after
>> clicking on the tiny key logo on the corner of the
>> browser window.  Without such action, the user may
>> only be guaranteed that he's giving his password to someone
>> (perhaps anyone) who's spent a couple bucks to get a PK cert.

> The software checks the binding and certificate chain. The user should check
> the authenticated name to determine if this is someone he should be willing
> to talk to. So, you are right in that the user does have to take this
> minimal action.

Exactly.  My point is that even this minimal action
is often neglected.  Humans are the weak link here.

> A CA should not give credentials to just anyone who shows up
> with a couple of bucks, but reserve certificates for those with a legitimate
> need.

Surely, you can't be serious.  Will Verisign refuse money
just because a potential customer hasn't proved a "legitimate"
need?  I think not.

>> To be complete, there are stronger ways to verify a
>> memorized key, and use it to secure a channel.
> 
> This is not the purpose of SSL, although client authentication is optionally
> available. [...]

So?  I was addressing the original question of whether SSL is
safe for human authentication, and how it might be made safer.
Furthermore, client authentication does *absolutely nothing*
to solve the server-name-binding problem.  You can still
send your password through a doubly-authenticated channel
to a complete stranger.

> [...] SSL is designed under the assumption that the server MUST always
> be authenticated to the client and the client MAY optionally be
> authenticated to the server. In most uses, the client will not have
> credentials to authenticate.

What?  I get the sense that instead of replying with
thoughtful comment on how a user *can* authenticate
securely, you're just cutting and pasting from some RFC.

>> One alternative is to use a password-authenticated
>> key exchange, where the password helps negotiate a session key,
>> and yet isn't revealed to anyone who doesn't already know it.
>> So far, such methods (as listed at the web site below)
>> aren't built-in to today's browsers.
> 
> Bad concept, which is why they are not in today's browsers. The problem with
> passwords is that somebody else must know it. Public key techniques
> eliminate this problem [...]

Fine.  Condemn what you don't understand.
Password-authenticated key exchanges *are* public key
techniques.  They're just not the ones you think you know.

> [...] and allow a key to be negotiated without anyone
> having to already know the private key. One can demonstrate that you know
> the private key without anyone else having or being able to know it.

There is no "one" problem with passwords, and certainly no
"one" problem in human authentication that is solved
just by getting rid of passwords and using only unmemorizably
huge stored private keys.

>> As for the original question, there are plenty of good
>> uses for non-password-based cryptography in machine-to-machine
>> transactions.  But verifying a human presence is a tougher problem,
>> which usually requires passwords, tokens, or biometrics,
>> and often more than one of the above to cover for
>> the quirky limitations of human beings.
>
> True, but passwords are often used in hybrid systems to accomplish this. A
> hybrid system is better than a single approach, because it brings the
> advantages of each component together. I gave an example where a password
> would unlock a local secret, the private key. Then public key techniques
> would prove that you have unlocked that private key with a password known only
> to the human. [...]

Not bad, but you can do even more by incorporating password
key exchange into hybrid schemes.

> [... The password] need not be known to anyone else, including the local
> terminal.

Wrong.  When local password-encrypted data can be brute-forced,
it essentially *is* known to the local terminal.

> Passwords are needed only because of the limitations of humans,
> but are considered weak for any use other than the one just mentioned.

Again, you pontificate without reason.  Password-authenticated
key exchange is another strong use.  It removes the need for
password-derived data to be stored at the local terminal.

> I suggest that you look at Kerberos, a symmetric key scheme, or SPX, its
> public key equivalent, as examples of how this can be done.

No, you should look again at those, more closely.  Then read
about EKE, SPEKE and other modern password key exchanges.
Initial Kerberos authentication, and all purely *symmetric key*
schemes for password verification are completely open to
network dictionary attack.  Some form of public-key scheme
must be used to prevent this.  The worthy debate is in *how* to
use PK in the best possible way.

Take a look at <http://world.std.com/~dpj/links.html>
for papers that describe all this in full detail.
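
(To make the idea concrete, here is a toy sketch of a SPEKE-style
exchange in C.  Everything is scaled down for illustration: the 31-bit
prime and the stand-in hash are assumptions for the demo, whereas real
SPEKE needs a large safe prime, a real hash function, and validity
checks omitted here.)

  #include <stdint.h>
  #include <stdio.h>

  #define P 2147483647ULL  /* 2^31 - 1; far too small for real use */

  /* Square-and-multiply modular exponentiation. */
  static uint64_t modexp(uint64_t b, uint64_t e, uint64_t m)
  {
      uint64_t r = 1;
      b %= m;
      while (e) {
          if (e & 1) r = r * b % m;
          b = b * b % m;
          e >>= 1;
      }
      return r;
  }

  /* Stand-in for a cryptographic hash of the password. */
  static uint64_t toy_hash(const char *pw)
  {
      uint64_t h = 5381;
      while (*pw) h = h * 33 + (uint8_t)*pw++;
      return h % P;
  }

  int main(void)
  {
      const char *password = "red boat 42";

      /* Both sides derive the DH generator from the password;
         squaring makes g a quadratic residue, as SPEKE specifies. */
      uint64_t g = modexp(toy_hash(password), 2, P);

      uint64_t x = 123456789, y = 987654321;  /* secret exponents */
      uint64_t A = modexp(g, x, P);           /* Alice -> Bob     */
      uint64_t B = modexp(g, y, P);           /* Bob -> Alice     */

      /* Both sides arrive at the same session key; with proper
         parameters an eavesdropper seeing A and B cannot test
         password guesses offline. */
      printf("Alice's key: %llu\n", (unsigned long long)modexp(B, x, P));
      printf("Bob's key:   %llu\n", (unsigned long long)modexp(A, y, P));
      return 0;
  }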

=========================
David P. Jablon
Integrity Sciences, Inc.
[EMAIL PROTECTED]
<http://world.std.com/~dpj/>


------------------------------

From: [EMAIL PROTECTED] (David Hamilton)
Subject: Question on current status of some block ciphers in AC2
Date: Fri, 15 Jan 1999 19:28:32 GMT

=====BEGIN PGP SIGNED MESSAGE=====

I'm reading Applied Cryptography 2nd edition by Bruce Schneier. In chapters
13 and 14 he gives views on a number of block ciphers. I'm wondering if
anything 'ever became' of the following half a dozen:-
Madryga;
REDOC II;
LOKI91 (LOKI97 is an AES candidate now though);
Khufu;
CA-1.1;
GOST.

There is nothing specific behind my question, it's just general interest.
I've checked my usual first ports of call (Bruce Schneier/Counterpane site,
Terry Ritter's site, RSA FAQ, Bill Unruh's site, John Savard's site, Peter
Gutmann's site, and now Sam Simpson's FAQ) but all I can see is Peter Gutmann
saying that Gost is incompletely specified and Sam referring to Gost in a
non-internet reference.

Presumably interest has dwindled in non-AES ciphers? 

Any current opinions or up to date internet references welcome. Thanks.  


David Hamilton.  Only I give the right to read what I write and PGP allows me
                           to make that choice. Use PGP now.
I have revoked 2048 bit RSA key ID 0x40F703B9. Please do not use. Do use:-
2048bit rsa ID=0xFA412179  Fp=08DE A9CB D8D8 B282 FA14 58F6 69CE D32D
4096bit dh ID=0xA07AEA5E Fp=28BA 9E4C CA47 09C3 7B8A CE14 36F3 3560 A07A EA5E
Both keys dated 1998/04/08 with sole UserID=<[EMAIL PROTECTED]>
=====BEGIN PGP SIGNATURE=====
Version: PGPfreeware 5.5.3i for non-commercial use <http://www.pgpi.com>
Comment: Signed with RSA 2048 bit key

iQEVAwUBNp+WSco1RmX6QSF5AQEGyQgApRojDdGEo78QcyXEMQpiqOEW1wQaQ+Od
igaWBYfiu9dg3FHXYJ0toWlSFfdU8t8gNpJ+zr5zifkLAQAwGidFfiPbxHs5PMvP
/pUzTQaJVyZYABlP28rN7jgXl9OBfaFeVe5L3rPKmhr1XvdgXOSqyZuDFylSgN/Y
72Cy7aNgd4q5QDa/B7MKotFsf0fjDolc9I1nRdriCgDN4yhPK5pRZqTofY6BONef
vIIpWJtNGziiIZo+lOp1VZGaxO3bBfzt74lvMhvBZTd5gy0miaDKgCsnjzvgB2iQ
pz/OPD6LQnEGLxMkuRdhwBo6SED7GDyEhe/wYL3pK7uiZc4O1bAGGA==
=3mXu
=====END PGP SIGNATURE=====

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: On leaving the 56-bit key length limitation
Date: Fri, 15 Jan 1999 18:33:31 GMT

In article <77lq7n$ooc$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] wrote:
> Ed Gerk wrote:
> >   [EMAIL PROTECTED] wrote:
> > >
> > > The unicity distance is the least number of characters for
> > > which we do not expect any false alarms.
> > >
> >
> > Unfortunately still, what you define as "unicity" is not even slightly
> > equivalent to the fundamental concept of unicity. And, note. You cannot just
> > define things at will -- there is a reason why definitions are as they are:
> > because they work as they are written. They correspond to some reality.
>
> Over and over I've used Shannon's definitions, and Ed has made
> his up.

I want to call your attention to the fact that if your understanding
of the concept of "unicity" were correct, then the "example" you cited
for it two e-mails ago would not be absurd - such as in having the
unicity condition be lost *after* it was reached. To recall, your
"example" was:

|Consider the situation in which the message space has several
|plausible messages, but the conditional probabilities, given the
|ciphertext, show that "Attack at dawn with 3000 men." has a
|probability of 0.599999, and "Attack at dawn with 3006 men"
|has a probability of 0.399999.  Using Shannon's formula for
|entropy, I calculate the equivocation of the plaintext is
|0.97 bits.  Shannon defined the unicity distance as the number
|of intercepted characters for which the equivocation in the
|plaintext is very close to 0.  0.97 bits is not very close to
|zero, therefore unicity has not been reached.
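
(As an arithmetic aside: rounding the quoted probabilities to 0.6 and
0.4, Shannon's formula gives H = -(0.6 lg 0.6 + 0.4 lg 0.4) =
0.6(0.737) + 0.4(1.322), or about 0.971 bits, so the quoted figure of
0.97 bits is at least internally consistent.)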

So, I considered that "example" impossible in regard to DES (which was being
discussed) but gave you the opportunity to present the cipher system for
which it would apply -- in other words, how did you calculate percentages
such as "0.599999" and "0.399999", and with so many digits of precision? How
could unicity be lost after it was reached and then regained after it was
lost, so that one could unambiguously initially read "Attack at dawn with
300" *before* unicity was lost between a "6" or a "0"  and then recuperated
afterwards to unambiguously read " men" at the end -- and all that well past
the limit of 20 characters currently cited for DES (that is, if I would be
wrong)?. To which you have not answered.

This lack of proper dialogue in your presentations has IMO confused
the issues even more. Perhaps this has to do with the impression that
some of your phrases are simply ambiguous. Funnily enough, it is even
possible that we are speaking about the same issues, at least on some
points - i.e., not regarding your "example".

So, with this wisp of a doubt I think I must wait either for your
qualification or withdrawal of your "example" - unless I want to "note
that you merely repeat your position that I'm wrong" - if I may repeat
your words ;-)

Cheers,

Ed Gerck
______________________________________________________________________
Dr.rer.nat. E. Gerck                                 [EMAIL PROTECTED]
http://novaware.com.br
 ---  Meta-Certificate Group member -- http://www.mcg.org.br  ---

============= Posted via Deja News, The Discussion Network ============
http://www.dejanews.com/       Search, Read, Discuss, or Start Your Own    

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Practical True Random Number Generator
Date: Fri, 15 Jan 1999 17:28:21 GMT

In article <[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] wrote:

> Hmm... I thought for low count rates it was a Poisson distribution.

Whatever.  The point is it's not flat.
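
(For a Poisson process at rate r, the waiting time between successive
events is exponentially distributed, P(interval > t) = exp(-r*t), so
short gaps are more common than long ones - decidedly not flat.)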

(Aside: The halogen-quenched geiger tube has a refractory period after each
discharge, which limits the count rate to something like 1e5/sec.  This no doubt
affects the statistics.)

> >You have to distill the entropy to get a uniform distribution.
>
> Do you have a preferred technique for doing that which does not
> introduce non-randomness algorithmically?

I've used simple parity-of-N in some earlier experiments,
and that "worked" for a reasonable N.  This is straight out of RFC 1750.
I make no claim that this is optimal in any sense, just simple, and
apparently works.

"Works" means the cooked data looked random to Diehard and to Maurer's
universal statistical test.  You must use tools like this to see what
compression factor "N" is necessary for a given raw sample and given
conditioner.  (E.g., if you keep increasing N, Maurer's metric asymptotes
at the maximum possible entropy.)
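
(Here is a minimal sketch of parity-of-N conditioning in C, assuming
one raw bit per input byte; the bias figure in the comment follows
from the piling-up lemma:)

  #include <stddef.h>
  #include <stdint.h>

  /* Collapse each group of N raw (possibly biased) bits into one
     output bit by XOR.  If each raw bit has bias e, i.e.
     P(1) = 1/2 + e, the output bit's bias falls to (2e)^N / 2,
     so increasing N trades throughput for flatness.  Returns the
     number of conditioned bits produced. */
  size_t parity_condition(const uint8_t *raw_bits, size_t nbits,
                          int N, uint8_t *out_bits)
  {
      size_t nout = 0;
      for (size_t i = 0; i + N <= nbits; i += N) {
          uint8_t b = 0;
          for (int j = 0; j < N; j++)
              b ^= raw_bits[i + j] & 1;
          out_bits[nout++] = b;
      }
      return nout;
  }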

> The one advantage of the radioactive decay TRNG is you can prove that
> it satisfies the TRNG specification. I am not so sure about electronic
> noise devices - it appears that one would need to make some
> assumptions which have no fundamental physics behind them.
> Chaos does not imply randomness, for a TRNG anyway. The weather is
> chaotic, but hardly random. I suspect the same for electronic noise in
> general, although there must be a few instances where one could use
> first principles to prove randomness due to quantum effects.

I don't think what matters is that the physics behind the RNG is
"truly random" in the quantum sense, so much as that the internal
state trajectory is unpredictable *outside* the box.  For instance, in
a purely classical resistor whose lattice is disturbed by brownian
(thermal) motion, no one outside the resistor can predict its
instantaneous resistance, even though someone who knew all the atoms'
positions and velocities could.  What matters is that Eve can't
predict what comes out of your classical resistor, because the
knowledge she would need is knowledge she doesn't have and can't
obtain.

Personally I'm stuck in 1900 and don't believe in quantum randomness,
just in ignorance written off as randomness.  But if you want to harvest
entropy from rotting atoms, that's fine; I've had good results
with conditioned digitized hiss too.

Since chaos amplifies initial uncertainty, it can be used to generate
entropy.  But this path inherits the complexities of understanding
dynamical systems.  Much cleaner to use simpler phenomena.

In any case, conditioning is always required.  Measuring the health of
your raw RNG source and the effect of conditioning is strongly recommended.

Cheers









============= Posted via Deja News, The Discussion Network ============
http://www.dejanews.com/       Search, Read, Discuss, or Start Your Own    

------------------------------

Date: Fri, 15 Jan 1999 19:46:33 +0000
From: David Crick <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: SHA-0 attack

From Bruce Schneier's latest CRYPTO-GRAM newsletter:

> In August, two French cryptographers described an attack against
> SHA-0. For those who don't remember, SHA is a NIST-standard hash
> function.  It was invented by the NSA in 1993, and is largely
> inspired by MD4.  In 1995, the NSA modified the standard (the new
> version is called SHA-1; the old version is now called SHA-0).
> The agency claimed that the modification was designed to correct a
> weakness, although no justification was given.  Well, we now
> understand the attack against SHA-0 and how the modification
> prevents it.

Does anyone have any further information on this? It's the first I've
heard about it and I'd like to have more of a look.

  Cheers,

     David.

-- 
+---------------------------------------------------------------------+
| David Crick  [EMAIL PROTECTED]  http://members.tripod.com/~vidcad/ |
| Damon Hill WC '96 Tribute: http://www.geocities.com/MotorCity/4236/ |
| Brundle Quotes Page: http://members.tripod.com/~vidcad/martin_b.htm |
| PGP Public Key: (RSA) 0x22D5C7A9  00252D3E4FDECAB3 F9842264F64303EC |
+---------------------------------------------------------------------+

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
