Cryptography-Digest Digest #499, Volume #10       Wed, 3 Nov 99 03:13:05 EST

Contents:
  Re: Your Opinions on Quantum Cryptography (David A Molnar)
  Re: Proposal: Inexpensive Method of "True Random Data" Generation ("John E. Kuslich")
  Re: Compression: A ? for David Scott (Tom)
  Re: Compression: A ? for David Scott (Tom)
  Re: Kerberos Question
  Re: Your Opinions on Quantum Cryptography
  Re: Scientific Progress and the NSA (was: Bruce Schneier's Crypto (SCOTT19U.ZIP_GUY)

----------------------------------------------------------------------------

From: David A Molnar <[EMAIL PROTECTED]>
Subject: Re: Your Opinions on Quantum Cryptography
Date: 3 Nov 1999 04:37:42 GMT

[EMAIL PROTECTED] wrote:
> Dear All,

> I am preparing a short paper on Quantum Cryptography. I would be most
> grateful if you could give your opinion/thought/knowledge on the
> following points:

> 1. Is there a need for Quantum Cryptography?

The only quantum crypto I am familiar with is "quantum key
distribution" via privacy amplification. So that's to what my comments
refer. 

I think the answer to this depends in part on how hard you think certain
computational problems "really" are, and what resources you assume on
the part of your adversary. The security of quantum key distribution
requires no computational assumptions; this means that even if P = NP,
or you are under attack by large and well-funded organizations, you have
some hope that your communication is secure. 

So if you believe that the computational strength of adversaries can't
be measured, or if you think that they have much better algorithms than
you do, you need quantum key distribution, or something else
information-theoretically secure, like a one-time pad.
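
(For concreteness, here is a minimal sketch of that alternative in
Python; the message and pad source are of course placeholders.  The
whole point is that every plaintext of the right length is equally
consistent with the ciphertext, no matter the adversary's computing
power.)

  import os

  def otp(data: bytes, pad: bytes) -> bytes:
      # XOR is its own inverse, so this both encrypts and decrypts.
      assert len(pad) >= len(data), "pad must cover the whole message"
      return bytes(d ^ p for d, p in zip(data, pad))

  message = b"meet at dawn"
  pad = os.urandom(len(message))   # stand-in for true physical randomness
  ciphertext = otp(message, pad)
  assert otp(ciphertext, pad) == message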

On the other hand, as you can see in a separate thread here in
sci.crypt, understanding just what you get from quantum key distribution
can be tricky. It's probably best to go read the thread (and the
original papers) instead of trying to summarize here.  
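
(For the bare mechanics, though, a toy classical simulation of the
BB84 sifting step may help.  This is an illustration only, not any
group's implementation: the quantum channel is replaced by the rule
that a mismatched basis gives a random result.)

  import secrets

  n = 1024
  alice_bits  = [secrets.randbelow(2) for _ in range(n)]
  alice_bases = [secrets.randbelow(2) for _ in range(n)]  # 0 rectilinear, 1 diagonal
  bob_bases   = [secrets.randbelow(2) for _ in range(n)]

  # With no eavesdropper, Bob's measurement matches Alice's bit whenever
  # their bases agree; otherwise his result is a coin flip.
  bob_bits = [a if ab == bb else secrets.randbelow(2)
              for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

  # Sifting: compare bases over the public channel, keep matching slots.
  key = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)
         if ab == bb]   # ~n/2 bits survive; a sample is then sacrificed
                        # to estimate the error rate, and privacy
                        # amplification shrinks what remains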

> 2. Will Quantum Cryptography reach a phase where it can be implemented
> over long distances successfully?

Aren't we there already, at least in the lab? Applied Cryptography
mentions that quantum key distribution has been experimentally performed
over a distance of 10km via fibre-optics. I'm almost positive I've heard
of tests using lasers across ordinary space ("Plug and Play Quantum
Crypto" is a paper title which comes to mind, but I can't find the
reference...).

I'm sorry for not citing references at the moment, but you may be able
to find experimental implementations via a web search. 

> 3. Will Quantum Cryptography become a necessity against increasingly
> advanced crypto attacks?

Uh, speculation about what will happen? I understand this as asking if I
think that some computational problems are hard, and if we know how best
to exploit that "hardness". Well, I do believe the first, and think
that we'll get there on the second. Not everyone agrees with me.

-David 



------------------------------

From: "John E. Kuslich" <[EMAIL PROTECTED]>
Crossposted-To: sci.math,sci.misc,sci.physics
Subject: Re: Proposal: Inexpensive Method of "True Random Data" Generation
Date: Tue, 02 Nov 1999 22:18:31 -0700

NAHHHHH!

Here is what you do...

You take three or four of these CD's that AOL keeps sending you in the
mail and you suspend them from strings.  The longer the strings, the
better.  Mount them from near the center so they swing and sway and loll
back and forth.

Now you get a little "personal" fan and direct the airflow on to the
suspended CD's from slightly underneath the CD's.  This will cause them
to move in a random and chaotic way.  Now focus a couple of desk lamps
on to the CD's (you can use different colors if you like).  Whip out
your ultracheap web-cam that plugs right into your USB port (you have a
USB port, don't you??)  You set the web cam software to snap a photo of
the CD's every couple of seconds.

Now you write some software to take the images you have saved over a few
days and you whiten the data by hashing, mixing, grinding (use your
favorite whitening software, maybe Yarrow...).
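
(Something like the following, say; the directory name is made up, and
SHA-256 is just one convenient whitener.  Neighbouring frames are
correlated, so only a fraction of the output should really be credited
as entropy.)

  import glob, hashlib

  pool = bytearray()
  for name in sorted(glob.glob("frames/*.jpg")):  # the saved web-cam shots
      with open(name, "rb") as f:
          pool += hashlib.sha256(f.read()).digest()  # 32 whitened bytes/frame

  with open("random.bin", "wb") as out:
      out.write(bytes(pool))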

VOILA!!  Reams and reams (GIGABYTES??) of cheap random data for the rest
of your life!!! :--)

It sure beats Lava Lamps with all their temperature sensitivities and
need to rest etc.

Finally, a real life use for those AOL CD's !

John E. Kuslich  http://www.crak.com




DSM wrote:
> 
> If this is off-topic, please forgive me;
> I am thinking that the groups this message is directed to
> are frequented by those who may be interested in the method.
> 
> ***
> 
> Currently, any experiment (or other procedure) for which "true"
> random data is required must be conducted on a computer equipped
> with a special-purpose peripheral device (usually quite expensive.)
> Applications for "true random data" include statistical research
> and strong encryption.
> 
> PROPOSAL: Make use of minute electronic inaccuracies in existing
>           computer systems. Every computer, no matter how skillfully
>           designed, contains some component(s) which can be "tortured"
>           to extract entropic data.
> 
> Examples of Implementation
> 
> Basic premise: ALL MODERN DIGITAL COMPUTERS CONTAIN BUILT-IN TRUE-RND
>                GENERATORS, WHETHER PART OF THE DESIGN OR NOT.
> 
> (1)
> The machine you are most likely now using contains a number of quartz
> crystals used for timing various processes, including the operation of
> the CPU. The crystals are accurate to ~0.02%, as I recall. No two of
> even the best-made and most accurate quartz crystal timing devices will
> err in exactly the same way at any given point in time. Use various
> system timers to extract data from relative inaccuracies. Put it
> through some entropizing algorithm, mix well.
> 
> (2)
> Study in detail a design for a certain computer, no matter which. Some
> element of its digital circuits will no doubt produce errors when
> "flexed". Track down a weak point in the motherboard, for example. Have
> the system send data through said point, test for "quality" (entropy),
> and test for consistency across systems (test it on 100 machines of the
> same make). Distribute as a software library, or incorporate into the
> operating system.
> 
> If anything in this post is new, it is now in the public domain.
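
[As a rough illustration of idea (1), here is a sketch that times
nominally identical delays against a high-resolution counter and keeps
the low bit of the jitter.  How much true entropy this yields is
entirely hardware-dependent; a real design would measure the yield,
debias, and whiten.]

  import time

  def jitter_bits(n: int) -> int:
      bits = 0
      for _ in range(n):
          t0 = time.perf_counter_ns()
          time.sleep(0.001)                 # nominally identical delays...
          dt = time.perf_counter_ns() - t0  # ...never take exactly the same time
          bits = (bits << 1) | (dt & 1)     # low bit is only meaningful if the
      return bits                           # counter's resolution is fine enough

  print(f"{jitter_bits(32):08x}")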

-- 
John E. Kuslich
Password Recovery Software
CRAK Software
http://www.crak.com

------------------------------

From: [EMAIL PROTECTED] (Tom)
Subject: Re: Compression: A ? for David Scott
Date: Wed, 03 Nov 1999 11:34:55 GMT
Reply-To: [EMAIL PROTECTED]

On Tue, 2 Nov 1999 22:27:26 GMT, Tim Tyler <[EMAIL PROTECTED]> wrote:

>Tom <[EMAIL PROTECTED]> wrote:
>: On Sun, 31 Oct 1999 11:34:34 GMT, Tim Tyler <[EMAIL PROTECTED]> wrote:
>:>Tom <[EMAIL PROTECTED]> wrote:
>:>: On Sat, 30 Oct 1999 17:29:55 GMT, Tim Tyler <[EMAIL PROTECTED]> wrote:
>:>:>Tom <[EMAIL PROTECTED]> wrote:
>
>:>:>: Measuring by compression ratio is objective.  Your definition of
>:>:>: adding information sounds subjective. [...]
>:>:>
>:>:>No - it could be measured precisely using entropic measures on both
>:>:>pre- and post- compressed texts.
>:>
>:>: Has this been done?
>:>
>:>I doubt it.  Measuring entropy accurately is very difficult in general.
>:>
>:>One-on-one compressors have only just been invented.  I don't know if
>:>people have been very interested in quantifying how bad non-one-on-one
>:>compressors are before now.
>
>: If "one-on-one" (symmetric compression) was better at reducing
>: patterning, the files would be smaller
>
>Indeed.  When I said "bad" I was referring to giving away information to
>attackers, not *just* file size.

But what's the difference?  If you don't compress as well, you have
more patterning, thus more information to potential attackers.

>
>:  The objective, quantitative measurement of compression algorithms
>: is compression ratio.
>
>Not for compression before encryption it's not.
>
>Compression ratio is *one* factor.  Whether the compressor systematically
>adds information to the file, is another, largely independent factor.
>
>[snip]
>
>:>Certainly I'm more interested in building compressors which avoid adding
>:>clues of their own to files than I am in measuring /exactly/ how shafted
>:>ordinary compressors are on this front ;-)
>
>: Or if they are at all!
>
>They are.  They add information.  It has been demonstrated for a number of
>compressors.  In fact, no other one-on-one compressor has been found yet.

Plaintext is not random.  Plaintext has patterns.  Compression reduces
patterning, making decryption more difficult.  If you don't add
information, but don't reduce patterning, you're not accomplishing
much.  The overall amount of information present shouldn't be any more
of an issue than a minor change in the size of the file.

Or in another form:  Say an uncompressed file contains X amount of
information in Y bytes, yet can be represented in Z bytes in
compressed form.  This means that the uncompressed file contains Y-Z
bytes of redundant information, which is effectively known
information.  In general, if the information added by compression is
less than the redundant information removed, you have an improvement.
I say "in general" because clearly the location and type of
information would affect its usefulness for decryption.
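
(A back-of-the-envelope version of this, using zlib as a stand-in
compressor; the filename is hypothetical.  The compressed size Z bounds
the file's information content from above, so roughly Y - Z bytes were
redundant.)

  import zlib

  data = open("plain.txt", "rb").read()
  Y = len(data)
  Z = len(zlib.compress(data, 9))
  print(f"Y = {Y} bytes, Z = {Z} bytes, ~{Y - Z} bytes redundant")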

To say that symmetrical (one-on-one) compression is advantageous because
it guarantees that any file is a possible output, even files with
patterns, is to say that a traditionally compressed file is a weakness
because it has no patterns.  I contend this is not so, as the set of
non-patterned files is large.  (More specifically, for any size X there
is an extremely large number of possible output files of that size from
a standard compression algorithm.)


>
>How severe the problem is, and how easy it is for analysts to make use of
>the information has not been so widely studied, AFAIK.
>
>: If the point is to reduce patterning, why not address that [...]
>
>The point is to increase the entropy per bit.  You *can't* increase the
>entropy of the whole file, without injecting "real" randomness.
>
No, but you can increase the entropy per bit by compressing the file.
Without any measure of overall entropy discussed or suggested, the
only sure way to maximize the result is to minimize the file size.
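
(A crude way to see this: a first-order entropy estimate only counts
single-byte frequencies, so it understates redundancy, but it does show
the bits-per-byte figure rising toward 8 after compression.  Filename
again hypothetical.)

  import math, zlib
  from collections import Counter

  def bits_per_byte(data: bytes) -> float:
      counts = Counter(data)
      n = len(data)
      return -sum((c / n) * math.log2(c / n) for c in counts.values())

  data = open("plain.txt", "rb").read()
  print(f"raw:        {bits_per_byte(data):.3f} bits/byte")
  print(f"compressed: {bits_per_byte(zlib.compress(data, 9)):.3f} bits/byte")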

>: I can certainly understand the rationale for looking at compression
>: without added patterning, but what does symmetry have to do with it?
>
>It's a condition which prevents the compressor from adding information of
>its own making to the file, when the one-on-one property is present.
>
In doing so it *must* be weaker in compression, as in some cases it
will make the file larger!  So we come back to the information per
byte question, or even the overall information question.  Again, if
overall information were the issue, file size would be more related to
decryption ease than it is.

>:>: Everything I've seen has been subjective, based on viewing the file, or
>:>: anecdotal; and even these examples weren't specific as to whether the
>:>: patterning was confined to the header and footer or not.
>:>
>:>Perhaps you need to look again at the earlier posts.  David has discussed
>:>ordinary Huffman compression on a number of occasions.  This is now the
>:>third time I have mentioned that LZ compression schemes typically map both
>:>"3X + 5X" and "4X + 4X" and "8X" to "XXXXXXXX".
>
>: But that's an example of a pattern in the input, not the output.
>: We're not talking about the input.
>
>?
>
>These patterns are in the input to the decompressor.

But that's not an example of patterning on the output of the
compressor, which for use with encryption is the only part that
matters.  If the output from the decompressor weren't patterned, it
wouldn't be decompressing!
>
>They illustrate clearly that the non-one-on-one property can be
>distributed throughout the file - and not confined to "headers".
>
But this example does not show that standard compression adds
information.

>:>:>: If we try to compress a file compressed by pkzip - a "bad" compressor
>:>:>: by your definition, we'll find it doesn't compress much, if any.
>:>:>: This could be taken as an objective measurement of a lack of
>:>:>: information added, overall.
>:>:>
>:>:>No it could not.  The same incompressibility would result if you appended
>:>:>32MB of hardware-generated random numbers to the file.
>:>
>:>: Actually that wouldn't, because although the random data appended
>:>: wouldn't compress, the first part of the file would, and it would
>:>: reduce the size of the file.
>:>
>:>...but you've *already* said the first part of the file has been
>:>compressed by PKZIP and - in your own words - "we'll find it doesn't
>:>compress much, if any"!
>:
>: Random data isn't compressible, nor is pkzip output generally
>: compressible.  Both inputs are an example of low patterning.  I'm not
>: suggesting that ALL patterns are eliminated by compression, only that
>: the compressor with the highest compression ratio should have the
>: largest reduction of patterning.  This should be true even if counting
>: the information added by compression.
>
>Information added by the compressor may be qualitatively different from
>redundant information in the plaintext that has not been squeezed out.

I'd agree.  But is this information more useful to the attacker than
the patterning left in the file by less than optimal compression?

>
>In particular information added may be essentially the same in every file,
>independent of the plaintext of the messages.
>
This is certainly a problem, and I'm calling it "housekeeping
information" & will describe what I mean below.

>You should be able to see why we are emphasising "added information" over
>"redundancy that has failed to be squeezed from the messages" from this.
>
I can see you're emphasizing it, but I don't believe that one type of
information is more valuable than the other to an attacker.

>:>:>: A few bytes in the header?  Sure.  Patterns throughout the file?  Doesn't
>:>:>: seem likely.
>:>:>
>:>:>You don't seem to grasp how severely compression preceding encryption
>:>:>differs from ordinary compression:
>:>
>:>: As far as the compression goes, it doesn't differ at all.  
>:>
>:>: What this sounds like is a presentation of a new form of encryption,
>:>: presented in a way to side step analysis of the strength of the cipher.
>:>
>:>Analysis of the strength of the cypher would be fine.  It's *cracking* the
>:>cypher that might cause problems.
>:
>: I'm wondering if this scheme is being proposed so that someone
>: couldn't tell if a decrypted file was, in fact, decrypted correctly,
>: as the compressed file would always decrypt to "something".
>
>This is one of a number of plus points of the scheme.  It is hardly the
>primary motivation, though.
>
>: If so, there are a handful reasons why this won't work.
>
>?
>
>: First, if the compressed file has patterns, especially if the patterns
>: are specific to the compressor, it'll be recognizable as probably plaintext.
>
>?
>
>: Second, decompressing the file isn't difficult at all, and would
>: quickly establish if the decryption were correct or not.
>
>?
>
>If the compression is good enough, /all/ decompressed messages will
>look plausible.

That can't be true.  One message may decompress to a spreadsheet
format file, while another may decompress to a seemingly random
collection of bytes.  Even with a large key, the set of possible
plaintext files is typically going to be astronomically larger than
the set of all possible key values.  (English text, for instance,
carries only a bit over one bit of information per character, so only
about 2^1300 of the 2^8000 possible 1000-byte files read as English;
even 2^128 wrong keys will essentially never land on one.)  The chance
of someone decrypting a file to the wrong compressed value, then having
it decompress to an English text file, for typical file and key sizes,
seems too unlikely to fathom.

>We may not have the technology to do this for text, but for some
>formats, compression more closely approaches this ideal.
>
If you're talking about some type of seemingly random data file, then
I'd agree, but this doesn't seem typical, and it isn't practical unless
the compression is specifically designed for that particular format.

>: Finally, it seems of interest only for a brute force attack, and
>: this shouldn't be a concern with a decent algorithm and key size.
>
>I agree that brute-force attacks should not get too much attention.
>
>However, there will be cases where the attack can be used.
>
>Imagine you have extracted 102 bits of the key from agent orange before he
>commits suicide.
>
>Suddenly a brute-force attack on the remainder becomes plausible.
>
>There are other circumstances in analysis where the ability to rule out
>particular keys is useful.

But you could still do this with one-on-one, by simply
decompressing.  If brute force is practical in some application,
decompressing as part of it shouldn't be much trouble.

Additionally, brute force seems a lot less likely than chosen plaintext,
which has been discussed in other posts and is a weakness of one-on-one.

>
>:>: I'll grant you that poor compression can increase patterning, and provide
>:>: for known plaintext attacks, but again, unless random data or a keyed
>:>: system is added, the compression will still result in some form of
>:>: known plaintext attack being possible.
>:>
>:>This appears to be a questionable notion.  In the case where compression
>:>reduces the size of the message to the size of the key, the entire system
>:>reduces to a OTP.
>
>: Even if it's a OTP, it still allows a known plaintext attack.
>
>Yes, /if/ the keys have been generated by a less-than perfectly random
>process, such an attack might work.
>
An OTP is always vulnerable to known plaintext.  That's why an OTP has
to be one-time.

>:>Perhaps.  However it depends on the volume of information concerned
>:>in each case, and various other factors.
>:>
>:>Note that the compressor may add the /same/ type of regularity to all the
>:>files it treats, even random files where an analytic attack based on
>:>patterns in the data would not normally be feasible.
>:>
>:>You need to quantify your problems before claiming one is smaller than
>:>another.  With qualitatively different security problems - such as the
>:>ones faced here - this can be difficult.
>:>
>:>Fortunately - in principle - both problems can be eliminated.
>:>
>:>However "perfect" compressors are rare beasts - but fortunately there
>:>is at least one compressor that has completely eliminated the other
>:>problem.
>:
>: Having a compressor that doesn't have a defined header format or CRC's
>: is certainly an advantage, but I don't see how that has anything to do
>: with the concept of "one-on-one".
>
>It doesn't necessarily.  Did anyone claim otherwise?
>
The added information of pkzip has been used as an argument for
one-on-one.

>: That the compression is non-symmetrical or non-unique doesn't equate
>: with added data.
>
>Yes it does.  Sorry.

In that non-perfect, non-one-on-one compression reduces the number of
possible values at the output of the compressor, you're right.  But I'd
contend that what we're talking about is mapping the input file to a
smaller output file which is a subset of all possible output files of
that size.  In other words, the added information exists specifically
because the compression isn't perfect.  If the compression were
perfect, then the set of possible output files would be the same as
the set of all possible files of the same size.  Even if you can prove
that a perfect compressor must be one-on-one, it doesn't prove that the
property of not being one-on-one adds information.
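
(The "subset" point is easy to check empirically with an off-the-shelf
compressor: almost no random byte strings are valid zlib streams, which
is exactly the sense in which a non-one-on-one format carries
information of its own.)

  import os, zlib

  trials, valid = 10000, 0
  for _ in range(trials):
      try:
          zlib.decompress(os.urandom(64))  # random guess at a "compressed file"
          valid += 1
      except zlib.error:
          pass
  print(f"{valid}/{trials} random 64-byte strings were valid streams")  # ~0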

>
>:>There are probably patterns in the compressed file remaining from
>:>inadequate compression - but that has nothing to do with whether or not
>:>there are additional patterns generated by the compressor there as well.
>:
>: I would agree - there are two issues.  I'd contend that the patterns
>: resulting from inadequate compression would tend to be lower with
>: compression algorithms of higher compression ratio, and I again
>: haven't seen examples of patterns added by compression, again
>: excepting housekeeping information, which of course is a problem.
>
>What counts as "housekeeping information"?

Used the term loosely.  Apologies.  Was referring to the typically
well-defined header left by compression programs, as well as error
checks and formatting information left at the end of a file.
>
>/Any/ systematically-added information represents a potential security
>problem.
>
>:>:>No.  Most compression programs are specifically designed not to do this.
>:>
>:>: I'll agree with programs, but not algorithms, as the programs add the
>:>: CRC information you've mentioned.  This I'll also agree is a problem,
>:>: especially if there are checksums scattered within the file.
>:>
>:>Algorithms with no concern for error recovery or detection may not be
>:>"designed to do this".  A number of them do /still/ scatter their own data
>:>carelessly through the file, though.  See the LZW family, for example.
>
>: But how much is added patterning, and is this more than they gain by
>: having a higher compression ratio?
>
>Added patterning is different from failure to compress as well as is
>possible.  Different types of attack result.
>
>You can't very usefully say one is "more" than the other - unless
>one of them is zero.

I'd say that sounds true even if one didn't add information.  We're
trying to figure out which is better to leave/add, without having
quantitative information about either the value of the types of
information or the amount.  Seems pointless.  

I'm beginning to see both have weaknesses, and am wondering if the
"added" information of standard compression is just the byproduct of
non-optimal compression, but that's possibly just a semantic
question.


>
>[snip rest]


------------------------------

From: [EMAIL PROTECTED] (Tom)
Subject: Re: Compression: A ? for David Scott
Date: Wed, 03 Nov 1999 10:04:41 GMT
Reply-To: [EMAIL PROTECTED]

On Tue, 02 Nov 1999 05:22:01 GMT, [EMAIL PROTECTED]
(SCOTT19U.ZIP_GUY) wrote:

>
>  This is getting too long, so I chopped the bottom half.  We
>are starting to go over and over old stuff.  You obviously think
>it is not worth it; then you don't have to use it.  If you're happy
>with a high compression ratio using a method where the only
>possible valid key is the key to the encryption you used, fine.
>
Agree about the thread length.  I posted the chosen plaintext comments
before my newsfeed delivered your posts. 

If you mean the only valid output from the decryption process is the
one that was created by the compressor, that's clearly not true.  The
set of valid outputs of decryption is the whole set of valid compressor
outputs, which is an extremely large number of files.

I'll admit that the possibility of chosen plaintext does depend
significantly on the implementation.

I'm less sure about patterning.  I completely agree with you about the
desire to have a compressor not add information.  That one-on-one
always decompresses to "something" still doesn't seem useful, as it
should still leave repetitive patterns in the file, while a high-ratio
compression system may not.  The decompression-to-anything feature
would seem useful only if the entirety of the file were decrypted, or
at the very least the file were decrypted from the beginning, as would
be the case with a brute force attack.  Still doesn't seem
significant, because the file could still be easily recognized after
decompression.
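
(Sketch of that recognizer step in a brute-force loop; decrypt() is a
placeholder for whatever cipher is under attack, and the 95% printable
test is an arbitrary threshold.)

  import string, zlib

  PRINTABLE = set(string.printable.encode())

  def looks_like_text(data: bytes) -> bool:
      if not data:
          return False
      return sum(b in PRINTABLE for b in data) / len(data) > 0.95

  def try_key(key, ciphertext, decrypt) -> bool:
      try:
          candidate = zlib.decompress(decrypt(ciphertext, key))
      except zlib.error:
          return False                   # not even a valid compressed stream
      return looks_like_text(candidate)  # plausible plaintext?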

If any compressor leads to repetitive patterns (and I don't know if
one-on-one does), then that would seem to aid decryption, whether the
patterns were the result of the compression method, or left
uncompressed by the algorithm.  



------------------------------

From: [EMAIL PROTECTED] ()
Subject: Re: Kerberos Question
Date: 3 Nov 99 05:58:14 GMT

Craig Inglis ([EMAIL PROTECTED]) wrote:
: John Myre wrote in message <[EMAIL PROTECTED]>...
: >
: >Regarding Kerberos, EKE, and SPEKE, what about SRP?  It's in
: >the same general vein.
: >
: >It ought to be at http://srp.stanford.edu/srp/, although I
: >can't seem to connect any more.


: Or SNAKE, which isn't subject to the patent restrictions which
: encumber the others.

: see http://www.smdp.freeserve.co.uk/snake.html

: an example application and source is there for the curious.

: SNAKE is completely public domain... not open source or
: GPL'd etc.  :-)

Well, I've given up and added a section to my web page which includes a
description of EKE - and descriptions of SPEKE, SNAKE, and SRP, which I
had to learn about myself.

http://www.ecn.ab.ca/~jsavard/mi060703.htm

I'll be adding a description of coin flipping to it and some other types
of protocol as well. I don't know if I'll go as far as to discuss digital
cash, though...

Anyhow, since the 2nd edition of AC only had EKE in it, I suppose
describing all the techniques in one place is useful.

John Savard

------------------------------

From: [EMAIL PROTECTED] ()
Subject: Re: Your Opinions on Quantum Cryptography
Date: 3 Nov 99 06:10:21 GMT

[EMAIL PROTECTED] wrote:
: 1. Is there a need for Quantum Cryptography?
: 2. Will Quantum Cryptography reach a phase where it can be implemented
: over long distances successfully?
: 3. Will Quantum Cryptography become a necessity against increasingly
: advanced crypto attacks?

Quantum cryptography is an elaborate physical method for distributing a
one-time pad, and as such provides maximum theoretical security.

However, for most users of cryptography, exchanging a CD-ROM filled with
true physical random numbers in advance - if quantum computers or other
such techniques made all existing cryptographic methods obsolete - is
easier than maintaining a quantum link with their correspondents.

Thus:

while I feel that (2) will be very difficult (although impressive
results have already been achieved), in the sense that even if a quantum
conduit could girdle the Earth, a public switched quantum network is not
on,

(3) is the most telling problem: in the unlikely event that something like
quantum computing turned out to be more powerful than is generally
believed possible, there is still a simple alternative, the one-time pad.

Perhaps quantum cryptography will find some special niche uses, such as
initiating the security of the first permanent Earth-Mars
telecommunications link. (Line-of-sight is good for Quantum Cryptography,
even if the distance is long...)  Or it might be used simply to allow
two separate machines to verify each other in the generation of physical
random numbers... perhaps for a future version of the Clipper chip (let's
hope not) or for the winning number in a future lottery.

But it is entirely possible, despite the fact that I am disposed to
pessimism about its prospects, that quantum cryptography could surprise
everyone, and actually become the "wave of the future". There is a lack of
obvious indications that it will be really needed, but it is also true
that this field is full of surprises.

John Savard

------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Scientific Progress and the NSA (was: Bruce Schneier's Crypto
Date: Wed, 03 Nov 1999 07:23:00 GMT

In article <[EMAIL PROTECTED]>, "Douglas A. Gwyn" <[EMAIL PROTECTED]> wrote:
>"SCOTT19U.ZIP_GUY" wrote:
>>   But what are these important and difficult tasks.
>
>Gee, an easy question!  Basically, to obtain information
>useful to the US executive branch (including defense) by
>the analysis of foreign signals, and to protect the
>security of official US communications.  There have been
>extensions of this fundamental mission, but that's the
>gist of it.
 
   That's interesting, since I was under the impression that parts of
their charter are classified, and you seem unaware of the spying they
do on US citizens.  Since the spying is for the executive branch, does
Clinton get to spy on the legislators, while the results of that spying
are not available to them?  Where the HELL was the NSA when our nuclear
secrets were given to CHINA, or did Clinton order them not to look?




David A. Scott
--

SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
                    
Scott famous encryption website NOT FOR WIMPS
http://members.xoom.com/ecil/index.htm

Scott rejected paper for the ACM
http://members.xoom.com/ecil/dspaper.htm

Scott famous Compression Page WIMPS allowed
http://members.xoom.com/ecil/compress.htm

**NOTE EMAIL address is for SPAMERS***

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
