Cryptography-Digest Digest #248, Volume #10      Thu, 16 Sep 99 16:13:04 EDT

Contents:
  Re: Exclusive Or (XOR) Knapsacks ("Douglas A. Gwyn")
  Re: Comments on ECC (Robert Harley)
  Re: Can you believe this?? (John)
  Re: Can you believe this?? (John)
  Re: Encryptor 4.1 reviews please. (Paul Koning)
  Re: Mystery inc. (Beale cyphers) (sha99y00000)
  Re: some information theory (SCOTT19U.ZIP_GUY)
  Re: Mystery inc. (Beale cyphers) ([EMAIL PROTECTED])
  Re: SCOTT19U.ZIP_GUY/Questions Please (D Wells)
  Re: Comments on ECC (John Myre)
  Re: The good things about "bad" cryptography (John Savard)
  Re: Second "_NSAKey" ([EMAIL PROTECTED])
  crypto export rules changing (Paul Rubin)
  Re: The good things about "bad" cryptography (Eric Lee Green)
  Re: Example of a one way function? (Anton Stiglic)
  Re: Can you believe this?? (Paul Koning)
  FPGAs (Arthur Dardia)
  Re: RC4-40 Cracking (Tom St Denis)

----------------------------------------------------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Exclusive Or (XOR) Knapsacks
Date: Thu, 16 Sep 1999 18:28:27 GMT

Gary wrote:
> Problem:
> Given an n bit number X and a set {B1,B2,...,Bn} of n bit numbers;
> is there a subset whose elements collectively XORed give X?

There is iff X lies in the space spanned by the {Bi}.
So if the {Bi} constitute a basis for GF(2)^n,
the answer would be "yes" for any X.

> Can the general problem be solved easily?

It's just linear algebra in GF(2)^n.  I'm sure there are efficient
algorithms for this, but it's been a while since I've worked with
this and my memory is fuzzy.  Very likely an adaptation of some
classical algorithm for R^n would work.
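One such adaptation is Gaussian elimination over GF(2), where row operations become XORs of machine words. Here is a rough sketch in Python (my own illustration, not from the post) that also recovers which of the Bi were combined:

```python
# Sketch: solve X = B_i1 ^ B_i2 ^ ... by Gaussian elimination over
# GF(2).  Integers are treated as bit vectors; XOR is vector addition.

def xor_solve(basis_nums, x):
    """Return indices i with XOR of basis_nums[i] == x,
    or None if x is outside the span of the basis_nums."""
    # Each row carries (value, mask of original indices) so we can
    # report which inputs were combined.
    pivots = []  # (pivot bit, value, index mask)
    for i, b in enumerate(basis_nums):
        val, mask = b, 1 << i
        # Reduce the new row against all existing pivots.
        for pbit, pval, pmask in pivots:
            if val >> pbit & 1:
                val ^= pval
                mask ^= pmask
        if val:
            pivots.append((val.bit_length() - 1, val, mask))
            pivots.sort(reverse=True)
    # Express x in terms of the pivots, top bit first.
    sel = 0
    for pbit, pval, pmask in pivots:
        if x >> pbit & 1:
            x ^= pval
            sel ^= pmask
    if x:
        return None  # x is not in the span
    return [i for i in range(len(basis_nums)) if sel >> i & 1]
```

This runs in O(n^2) word operations for n numbers, so the general problem is indeed easy.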

------------------------------

From: Robert Harley <[EMAIL PROTECTED]>
Subject: Re: Comments on ECC
Date: 16 Sep 1999 20:18:41 +0200


Mike Rosing writes:
> In his latest "Crypto-Gram", Bruce Schneier wrote:
> >Certicom used the event to tout the benefits of elliptic curve public-key
> >cryptography.  [...]
> >It's tiring when people don't listen to                           
> >cryptographers when they say that something is insecure, waiting instead
> >for someone to actually demonstrate the insecurity.

Did Bruce Schneier really write this in reference to ECC?!?!

If so he is out of his depth and apparently has a bit of a fat head to boot.

Rob.

------------------------------

From: John <[EMAIL PROTECTED]>
Subject: Re: Can you believe this??
Date: Thu, 16 Sep 1999 11:22:36 -0700

Your point is well taken. I will point out, though, that as
an employer you likely have most people under an NDA, so you
are free to do more with the source code and have more rights
to it.

I don't presume to be better, and I would not ask the same of
anyone else. I wouldn't invent from scratch, either. I would
ask this. If I put a lot of work into an algorithm and, for
the sake of argument, say it turns out to be a pretty good
one, I then patent or copyright the source. My question is:
how can I be sure someone won't use my work to improve on it
or make a better one?  Sure, there's the legal protection,
and "the code of ethics," but is that enough?
Yeah, it should be.

http://www.aasp.net/~speechfb

* Sent from RemarQ http://www.remarq.com The Internet's Discussion Network *
The fastest and easiest way to search and participate in Usenet - Free!


------------------------------

From: John <[EMAIL PROTECTED]>
Subject: Re: Can you believe this??
Date: Thu, 16 Sep 1999 11:08:02 -0700

Yeah, true, but without getting into politics, I wonder if
a copyright is just as good.  I know it's a lot cheaper and
protects you for longer.

http://www.aasp.net/~speechfb



------------------------------

From: Paul Koning <[EMAIL PROTECTED]>
Subject: Re: Encryptor 4.1 reviews please.
Date: Thu, 16 Sep 1999 13:11:00 -0400

Rebus777 wrote:
> 
> This program seems to be implemented pretty well, except after testing all the
> algos and combinations of algos I found that they  are all using ECB mode.

You're contradicting yourself... :-)

A program that uses ECB to do data encryption is, by definition,
implemented poorly.  (ECB is utterly and completely unfit for
that purpose...)
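The problem is easy to see in code. Below is a toy demonstration (a hash-based stand-in, not a real block cipher): because ECB encrypts each block independently, equal plaintext blocks always produce equal ciphertext blocks, so the structure of the message leaks straight through.

```python
# Toy illustration of the ECB weakness (NOT a real cipher): the
# "block cipher" here is hash(key || block) truncated to the block
# size, which is enough to expose the pattern-preserving behavior.
import hashlib

def toy_ecb_encrypt(key: bytes, plaintext: bytes, bs: int = 8) -> bytes:
    out = b""
    for i in range(0, len(plaintext), bs):
        block = plaintext[i:i + bs]
        # Each block is processed with no dependence on its neighbors.
        out += hashlib.sha256(key + block).digest()[:bs]
    return out

msg = b"ATTACK!!" * 3 + b"RETREAT!"   # three identical blocks, one different
ct = toy_ecb_encrypt(b"secret key", msg)
blocks = [ct[i:i + 8] for i in range(0, len(ct), 8)]
# The repetition in the plaintext is plainly visible in the ciphertext.
```

A chaining mode such as CBC avoids this by mixing each block with the previous ciphertext block before encryption.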

        paul

------------------------------

Date: Thu, 16 Sep 1999 19:31:51 +0100
From: sha99y00000 <[EMAIL PROTECTED]>
Subject: Re: Mystery inc. (Beale cyphers)

I made one big balls-up when drawing conclusions about the reason for
the lack of duplicate pairs in codes #1 and #3, which closely resembled
the output of the pseudorandom list on a computer. I assumed the encoder
must have used a similar kind of list. Wrong! I looked in my Spectrum
(old home computer) manual and found that the pseudorandom list on the
computer is created by a formula:

" The next pseudorandom number in a sequence generated by taking the  
powers of 75 modulo 65537, subtracting 1 and dividing by 65536     "

I conclude then that codes #1 and #3 have been encoded the same way,
with some sort of formula. 
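The quoted formula is a classic Lehmer generator (65537 is prime), and it is short enough to write out directly; a minimal sketch of the Spectrum's sequence, as I read the manual text:

```python
# The n-th value of the Spectrum's pseudorandom sequence: take
# 75^n mod 65537, subtract 1, and divide by 65536 to land in [0, 1).

def spectrum_rnd(n: int) -> float:
    return (pow(75, n, 65537) - 1) / 65536

# First few values of the sequence:
seq = [spectrum_rnd(n) for n in range(1, 4)]
```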

I tested this theory by systematically increasing the numbers in code #2
in step with the alphabet:

a = 0
DO
 INPUT #1, b
 c = b + a
 WRITE #2, c
 a = a + 1
 IF a > 25 THEN a = 0
LOOP UNTIL EOF(1)

results:

1st #  2nd #   freq.
40     21      2  
42     58      2  
55     127     2  
56     25      2  
65     70      2  
67     145     2  
3      115     1  
4      31      1  
6      46      1  
9      27      1  
... etc.

6 duplicate pairs

That's at least looking better in matching codes #1 and #3. Ed Rupp's
theory didn't quite work for me because I would have expected an
outcome more like
AAAAAAAAAAAAAAAAAAAAAAAAAAAABBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBCCCCCCCCCCCCCCCCCCCCCCCCCCCCDDDDDDDDDDDDDDDDD
...etc.

To get ABCDEF etc. you have to work diagonally, adding 1. Thus the
theory of formulas again. 

sha99y

------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: some information theory
Date: Thu, 16 Sep 1999 17:54:17 GMT

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] wrote:
>Tom St Denis <[EMAIL PROTECTED]> wrote:
>:   [EMAIL PROTECTED] wrote:
>:> Anti-Spam <[EMAIL PROTECTED]> wrote:
>
>:> : First, Compressed data is NOT necessarily random data.
>:>
>:> If your compressed data is distinguishable from randomness, you're using
>:> a sub-optimal compression scheme.
>
>: If your compressed file is random it can't expand into anything real.  Note
>: that the compressed stream is as a random as the input message.  It can't be
>: any more/less random.
>
>You and I appear to be talking at cross-purposes.  The compressed stream
>will be more random in the sense that it will exhibit higher entropy, and
>more nearly pass tests for randomness.
>
>I don't see what you mean by "if your compressed file is random it can't
>expand into anything real".  When I described the compressed file as
>"random" essentially I meant that it would pass or more nearly pass
>tests for randomness.  There's no reason why such a file should /not/
>expand into a real message on decompression.
>
>:> : Many of us assume the compressed form of a file is "equivalent" in some
>:> : form to true random data.  It is not.
>:>
>:> It certainly /should/ be - or your compression algorithm is likely to
>:> be behaving sub-optimally.
>
>: Try finding the average spacing for symbols (order 0) and you will see it's
>: rarely even (for byte symbols it should be around 256).  That's one way to
>: 'detect' compressed files (this works with LHA and PKZIP).
>
>Did anyone ever claim PKZIP was an optimal compression system for any
>class of file?  For most things, even ARJ is better ;-)
>
>:> : Compressed files will not pass statistical tests for random bit streams.
>:> : A compressed file is non-random.
>:>
>:> Speak for your own compressed files ;-)
>
>: True the entropy 'per byte' is higher but the entropy 'per message' is not.
>
>I thought entropy was commonly taken to be a property of a source, not
>a property of a string.  Entropy "per byte" conforms to this common usage,
>while entropy "per message" does not.
    I agree with you. I think Tommy lacks the understanding of this
fine point.
>
>If your compressed file is distinguishable from a random stream of data
>then it is likely to contain pattern which a better compression algorithm
>would have eliminated.  Maximally compressed files should approach the
>ideal of being statistically random.
>
>One definition of what constitutes "randomness" mentions that random
>data is generally incompressible.  Conversely, incompressible data should
>look random - if there's any order in it it will be fodder for a better
>algorithm that identifies that order and squeezes it out.
   One thing many fail to consider is that, as you stated, the entropy is a
function of the probability of various sources. So one test that many fail
to conduct is to take several input files and play games with them,
like XORing all the compressed files with the first compressed file,
then seeing how much these resulting files compress (you can drop the
excess if the lengths don't match).  For most Huffman-like routines the
files will compress if the source files match for long stretches at the
front of the file.  That is why I like to do an extra Huffman pass
through the file in the reverse direction; the resulting files then
resist this last form of XOR-and-compress.
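A sketch of this XOR-and-recompress test, with zlib standing in for the compressor (my illustration, not Scott's own tool): XOR two compressed streams, drop the excess if the lengths don't match, and see whether the result still compresses.

```python
# XOR-and-recompress test: residual compressibility in the XOR of two
# compressed streams reveals shared structure between the sources.
import os
import zlib

def xor_bytes(a: bytes, b: bytes) -> bytes:
    # zip() truncates to the shorter stream, dropping the excess.
    return bytes(x ^ y for x, y in zip(a, b))

def residual_len(f1: bytes, f2: bytes) -> int:
    """Length after re-compressing the XOR of the two compressed files."""
    return len(zlib.compress(xor_bytes(zlib.compress(f1), zlib.compress(f2))))

same = b"abcdefgh" * 4096          # two identical "files"
rnd1 = os.urandom(32768)           # two unrelated random "files"
rnd2 = os.urandom(32768)
# Identical sources: the XOR is all zeros and collapses when compressed.
# Unrelated random sources: the XOR stays essentially incompressible.
```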

See my web page at http://members.xoom.com/ecil/compress.htm
David A. Scott
--
                    SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
                    http://www.jim.com/jamesd/Kong/scott19u.zip
                    http://members.xoom.com/ecil/index.htm
                    NOTE EMAIL address is for SPAMERS

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Mystery inc. (Beale cyphers)
Date: Thu, 16 Sep 1999 19:08:32 GMT

In article <7rr2bf$foj$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] wrote:

> I believe that cipher.doc contains the ciphers exactly as
> they appear in Viemeister's book.

  That is, except for the fact that I corrected the seven typos in B2
as I described in doi.doc.  You can easily uncorrect them if you need
the exact cipher that appears in Viemeister's book.

  -- Jeff Hill




Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.

------------------------------

From: D Wells <[EMAIL PROTECTED]>
Subject: Re: SCOTT19U.ZIP_GUY/Questions Please
Date: Thu, 16 Sep 1999 13:04:00 +0100

I think this may be referring to the fact that some of the AES submissions
were resubmitted after requests for modification from NIST.  I'm not
sure what these modifications are... perhaps they should be documented
somewhere.

"Douglas A. Gwyn" wrote:

> tunafish wrote:
> > What they seem to have done is deliberaltely weeken these algorithms
> > by asking those who submitted to make certain modification to the
> > code...
>
> Oh, good grief!  That's the old conspiracy theory resurrected
> from the early DES debate.  What EVIDENCE do you have that this
> has occurred?


------------------------------

From: John Myre <[EMAIL PROTECTED]>
Subject: Re: Comments on ECC
Date: Thu, 16 Sep 1999 12:59:07 -0600

Robert Harley wrote:
> 
> Mike Rosing writes:
> > In his latest "Crypto-Gram", Bruce Schneier wrote:
> > >Certicom used the event to tout the benefits of elliptic curve public-key
> > >cryptography.  [...]
> > >It's tiring when people don't listen to
> > >cryptographers when they say that something is insecure, waiting instead
> > >for someone to actually demonstrate the insecurity.
> 
> Did Bruce Schneier really write this in reference to ECC?!?!


No.


> 
> If so he is out of his depth and apparently has a bit of a fat head to boot.
> 
> Rob.

Mike removed some of Bruce's text.  The "tiring" comment comes
after an assertion that 512 bit RSA keys have been considered
too short by experts for quite a while.  In fact, the immediately
preceeding sentence is:

> Anyone implementing RSA should have moved to 1028-bit keys
> years ago, and should be thinking about 2048-bit keys today.

Meanwhile, I'd say your elisions from Mike's post make it
more misleading still.

John M.

P.S.
As to Mike's comments regarding Bruce's mathematical competence,
I'd reserve judgement; Alex's query about whether ECC's properties
are really proven (whatever *that* means...) is relevant.

------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: The good things about "bad" cryptography
Date: Thu, 16 Sep 1999 18:46:20 GMT

[EMAIL PROTECTED] (Bill Unruh) wrote, in part:

>]- if an attacker doesn't know the algorithm being used, he will have a
>]harder time of even beginning an attack;

>True. But the question is how do you KNOW that your attacker is
>ignorant. After all you have to distribute something which implements
>the algorithm to others for them to be able to use it. How do you know
>it has not leaked?

I guess the same way I know that the key I'm using hasn't leaked.

Although a secret algorithm _can_ be of benefit, I agree that it is
not at all sensible to rely on somebody else's secret algorithm.
Whatever extravagant claims have been made about it.

But for an organization to use its own algorithm internally does have
a benefit, *although* that benefit is indeed outweighed by the fact
that such an algorithm won't have had the kind of scrutiny the major
public algorithms get.

>]- most well-known algorithms have key sizes that are just enough to resist
>]a brute-force search, even though it's not difficult to increase the key
>]size for a symmetric algorithm by an order of magnitude;

>Hardly. 128 bits for example is well, well beyond the ability to resist
>brute force attacks. That stands it seems at around 56 bits right now,
>and 2^72 times harder is not "just enough".

But the trouble is that I can't *prove*, for example, that the NSA or
someone else doesn't have some kind of attack on most block ciphers
that the academic community is 10 years away from discovering, that
allows a block cipher to be cracked in the same amount of time as
would be taken for brute-force search...of a cipher with half as many
key bits.

Making the key bigger can be cheap, and it is clearly of _some_ value.

>]- no amount of study can prove that the crack for an algorithm isn't just
>]around the corner, and such a crack seems likelier to be both found and
>]publicized for a well-known algorithm if it exists.

>No, it is much likelier to be found for a weak algorithm.

Oh, yes, that is very true. But other things being equal, a well-known
algorithm receives more study. Which is primarily an asset, but it
_also_ has drawbacks.

I don't say that the second school of thought is superior to the first
school of thought. Far from it.

But, particularly in the case where a message must remain secret for
years after it was transmitted, the second school of thought also has
its points.

Do I know that a 128-bit key is long enough to be secure 50 years from
now? What about quantum computers?

Do I know that a flaw won't be discovered in Twofish or MARS sometime
in the next 50 years?

No, I don't. Relying on a cipher with a jillion-bit key whipped up by
an amateur, however, is obviously not a valid alternative.

And I am not trying to argue otherwise.

What I am trying to claim, though, is that although these points are
often raised by annoying amateurs with weak cipher proposals, who
neglect the more important factors while focusing on these relatively
minor ones...

these minor ones are actually valid considerations which sometimes we
can't afford to neglect.

Either some seriously scaled-up designs need to be brought to the
point where they can get some true scrutiny, or some other method has
to be found to obtain an encryption method that satisfies both kinds
of criteria.

John Savard ( teneerf<- )
http://www.ecn.ab.ca/~jsavard/crypto.htm

------------------------------

From: [EMAIL PROTECTED]
Crossposted-To: talk.politics.crypto
Subject: Re: Second "_NSAKey"
Date: Thu, 16 Sep 1999 18:35:44 GMT

In article <7roj4o$71h$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] (David Wagner) wrote:

> It just doesn't seem awfully believable to me unless there's some
> clear advantage to be had by using the "_NSAKEY" instead of NOPing,
> and as far as I can tell, there's apparently no such clear advantage
> to be had.  But maybe I'm missing something.  Do you disagree?

Well, I do find Pearson's arguments rather convincing.

All applications should do integrity checking on themselves, so that
inserting a few NOPs will not work. Integrity checking can be done in
such a way that an attacker would have to disassemble and study the
whole application before being able to modify it.

More importantly, it seems to me that the number of security
applications is becoming intractably large. In the near future, even
small organizations will be able to produce in-house security software.
The cost of analyzing and finding a way to disable every security
application is much higher than the cost of disabling the security
tools they build on, the number of which will certainly be much smaller.

The moral of the story here is that a security application should not
use external tools in object form - whether signed or not. At the very
least application designers should use security primitives in source
code.



------------------------------

From: [EMAIL PROTECTED] (Paul Rubin)
Crossposted-To: talk.politics.crypto
Subject: crypto export rules changing
Date: 16 Sep 1999 17:58:44 GMT

A big liberalization of export rules is supposed to be announced
today, but apparently there will also be some key escrow provisions.

http://www.sjmercury.com/breaking/headline1/024676.htm

------------------------------

From: Eric Lee Green <[EMAIL PROTECTED]>
Subject: Re: The good things about "bad" cryptography
Date: Thu, 16 Sep 1999 11:01:30 -0700

"SCOTT19U.ZIP_GUY" wrote:
> >One school of thought notes that many new cipher designs have turned out,
> >after brief examination, to be seriously flawed. Hence, because of this
> >high risk, it is not advisable to rely on any cipher that hasn't been
> >subjected to extensive study by the foremost experts in the open academic
> >world.
>    One point: it is entirely possible that many of the so-called foremost
> experts in the field have to worry about jobs and may have been
> influenced by agencies that do not want the public to have safe
> encryption. So it is possible that the government could raise people
> to the status of crypto god to influence the direction of the open
> research. I just learned a few days ago that David Wagner is listed
> as an employee of Mr B.S. No wonder they spend so much time
> patting each other on the back.

Huh? Go to David Wagner's site and you'll see that he and Bruce have
been issuing papers together for quite some time. It's no secret. You
mean you only learned of this "a few days ago"? 

Regarding government influence: There are individuals and companies
which get extensive government business, and one could assume that these
companies would be most likely to be influenced. RC5/RC6 are bloody
brilliant work by one of the great cryptographic minds of our time, for
example, but the question of government influence remains.

But I find it hard to believe that U.S. government influence would guide
the development of algorithms like Rijndael or Serpent, which have zero
U.S. content. And for a certain person with the initials B.S. (rather
unfortunate, that), I find it hard to believe that someone who
apparently receives the vast majority (i.e., close to 100%) of his
business from large corporations seeking security analysis of the
protection of their vital business interests would be all that amenable
to government influence. 

In short: Paranoia is a nice trait, but assuming that all of the "crypto
gods" are in the pay of the NSA is probably going overboard. (Not that
there probably isn't a few in the pay of the NSA... just that there are
enough not in the pay of the NSA to keep everybody honest). 

> has many merits. But the problem is many of the amateur methods
> leave tell-tale signs of what methods are used. So it is very hard to
> get methods for the common man that leave no hooks. You should
> check to see if a method can encrypt without changing the file length

Good point. Any tell-tale information should probably be stored within
the file if you're doing this, which will mean having to decrypt the
file with multiple algorithms at the other end since you don't know
which cipher is being used, but that is still preferable to hanging the
cipher indicator out there in plain text. And similarly, a block cipher
may have a "fingerprint" if you prepend each file with various header
information detailing the checksum of the file, the cipher used to
encode the file, a "decrypted okay" indicator, etc... i.e. if one block
was encrypted with Twofish, and another with the same contents was
encrypted using RC6, most likely an attacker can figure out by the
fingerprint which one was used to encrypt that file. 
   The file length itself doesn't matter (unless the file length is sent
"in the clear", meaning that the difference in lenghts is a clue to what
algorithm was used, or if we have a known plaintext attack, in which
case we know what algorithm was used here but not anything else). The
"fingerprint", on the other hand, is a different story. That would tend
to indicate that we need a very large block size, or a different
approach altogether.  

> the main experts in the field except for Ritter are on a narrow
> game-playing path and really don't care about secure encryption. They
> are playing the AES game of making one method for all which
> is a very stupid idea.

Which has been stated by many of the AES game players, i.e., that they
think it's stupid to make one algorithm be the be-all and end-all. 

One thing to note: adding algorithms basically just means that you're
adding a couple of bits to the amount of effort needed to brute-force a
key. If someone discovers that, e.g., RC6 can be cracked with minimal
effort due to leaked key information if you just know how to recover the
leaked key information, you'll basically be sending 1/6th of your
messages "in the clear" (assuming that you have six different algorithms
that you shuffle between).  If you want something really useful, NEST
the algorithms. E.g., run the output of RC6 into the input of Twofish,
and upon receiving/decrypting run the output of Twofish into the input
of RC6. Thus even if some attack is found that reduces RC6 to shreds,
the Twofish would presumably be protecting your data still. Furthermore,
use DIFFERENT KEYS for the two algorithms. That way if some way is found
to somehow recover the key for one of the algorithms, the other key
still remains secure.

Of course, the above basically doubles CPU time while doubling your
security, meaning that for ordinary transactions it's probably not
practical. Do that only if you're in Extreme Paranoia mode, or the data
involved would destroy you or your company if somehow compromised. YMMV
etc...
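The nesting idea above is simple to sketch. The following uses two stand-in stream ciphers (keystreams derived from SHA-256 in counter mode, toys standing in for RC6 and Twofish, which aren't in the standard library); each layer gets its own independent key, so recovering one key reveals nothing about the other layer.

```python
# Sketch of a two-cipher cascade with independent keys.  The toy
# keystream here is NOT secure -- it only illustrates the plumbing.
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: XOR the data with SHA-256(key || counter)
    # blocks.  XORing twice with the same key undoes the operation.
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out += bytes(c ^ k for c, k in zip(data[i:i + 32], block))
    return bytes(out)

def cascade_encrypt(key_a: bytes, key_b: bytes, pt: bytes) -> bytes:
    # Run the output of cipher A into the input of cipher B.
    return keystream_xor(key_b, keystream_xor(key_a, pt))

def cascade_decrypt(key_a: bytes, key_b: bytes, ct: bytes) -> bytes:
    # Peel the layers off in reverse order.
    return keystream_xor(key_a, keystream_xor(key_b, ct))

pt = b"the treasure is buried under the old oak"
ct = cascade_encrypt(b"key one", b"key two", pt)
```

Even if one layer is reduced to shreds, an attacker still faces the other layer under a key that was never shared with the broken one.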

-- 
Eric Lee Green    http://members.tripod.com/e_l_green
  mail: [EMAIL PROTECTED]
                    ^^^^^^^    Burdening Microsoft with SPAM!

------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: Example of a one way function?
Date: Thu, 16 Sep 1999 13:43:42 -0400



> f(x)  = x^2 mod N,   where N = pq and p, q are primes.
>
> is believed to be one way.

when p, q are unknown.....
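In code, the asymmetry is stark: the forward direction is one modular multiplication, while inverting it is believed to be as hard as factoring N. A minimal sketch (tiny primes chosen purely for illustration; real use needs primes hundreds of digits long):

```python
# Rabin-style candidate one-way function: squaring modulo N = p*q.
p, q = 10007, 10009        # the trapdoor, kept secret in practice
N = p * q

def f(x: int) -> int:
    # Cheap in the forward direction: one modular multiplication.
    return pow(x, 2, N)

y = f(12345)
# Without p and q, finding a square root of y mod N is believed to be
# as hard as factoring N; with them, roots mod p and mod q can be
# combined via the Chinese Remainder Theorem.
```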





------------------------------

From: Paul Koning <[EMAIL PROTECTED]>
Subject: Re: Can you believe this??
Date: Thu, 16 Sep 1999 14:35:36 -0400

Paul Crowley wrote:
> 
> Paul Koning <[EMAIL PROTECTED]> writes:
> > The ONLY difference between the two generators is that /dev/random
> > limits the number of output bits to <= the estimated amount of
> > input entropy, while /dev/urandom does not.  Bruce Schneier et al. have
> > argued (in the Yarrow paper) that this is unnecessary, given that
> > the PRNG is using a cryptographically strong mixing function (as
> > /dev/urandom does).
> 
> Which suggests that /dev/urandom should make sure to produce *no
> output at all* until the input entropy crosses some threshold (like,
> say, 128 bits).  Does anyone know if that's what it does?

That's not what it does.  But that exact proposal has been made
(along with one to do "block reseeding", applying new entropy in
clumps of at least 128 or so).  Last I heard that discussion had
quieted down; it seems that the next step is in the hands of 
the author.
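For concreteness, here is a minimal sketch of the two proposals being discussed (my own illustration, not the actual Linux driver code): refuse all output until the estimated input entropy first crosses a threshold, and apply reseeds only in clumps of at least 128 bits.

```python
# Toy entropy pool with an output gate and block reseeding.  The
# SHA-256 state stands in for a cryptographically strong mixer.
import hashlib

class GatedPool:
    THRESHOLD = 128                      # bits, per the proposal

    def __init__(self):
        self.pool = hashlib.sha256()
        self.seeded = False
        self.pending = hashlib.sha256()  # clump being accumulated
        self.pending_bits = 0

    def add_entropy(self, data: bytes, est_bits: int):
        # Block reseeding: hold new entropy back until a full clump
        # of THRESHOLD bits has accumulated, then mix it in at once.
        self.pending.update(data)
        self.pending_bits += est_bits
        if self.pending_bits >= self.THRESHOLD:
            self.pool.update(self.pending.digest())
            self.seeded = True
            self.pending = hashlib.sha256()
            self.pending_bits = 0

    def read(self, n: int) -> bytes:
        # The gate: no output at all before the first full reseed.
        if not self.seeded:
            raise BlockingIOError("pool not yet seeded")
        out = b""
        while len(out) < n:
            self.pool.update(b"output")
            out += self.pool.copy().digest()
        return out[:n]
```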

        paul

------------------------------

From: Arthur Dardia <[EMAIL PROTECTED]>
Subject: FPGAs
Date: Thu, 16 Sep 1999 14:06:03 -0400
Reply-To: [EMAIL PROTECTED]

I've recently acquired 5 Xilinx FPGAs.  The numbers on the chip look like this:

XC4005E
PC84CKJ9637
A64196A
3C

Anyone know how fast these things are?  What if I ran them in parallel?  Can
anyone point me to any resources on how to program these?  They were given to me
by a friend who used to work at National Semiconductor.  I have 2-3 test boards
for other Xilinx FPGAs, but none of the chips in the sockets on the test boards
are the same that I have in cases.  Would I be able to write code for these in
C++ or Perl?  Or must I write it in ASM?  The other thread on RC4-40 cracking
has some relevance to this thread, because FPGAs are mentioned.

Where could I obtain sockets for these chips, and how do I make the PCBs for the
cards?  Any information whatsoever would be a great help.

                                        Art Dardia

------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: RC4-40 Cracking
Date: Thu, 16 Sep 1999 19:39:05 GMT

In article <[EMAIL PROTECTED]>,
  yoni <[EMAIL PROTECTED]> wrote:
> Can you help me clarify something ?
>
> When you refer to Cracking the RC4 you mean a "brute force" attack ?

Most likely.  I haven't found any open attacks on RC4 outside of the weak key
class found by Wagner and someone else (anybody know his name?).

> RC4-40 is RC4 initialized with 40 Bits key (5 bytes)?

Yes, it's the short name, just like RC5-128 is 128-bit keys and RSA-155 is a
155-digit modulus ...

Tom
--
damn windows... new PGP key!!!
http://people.goplay.com/tomstdenis/key.pgp
(this time I have a backup of the secret key)



------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
