Cryptography-Digest Digest #453, Volume #10      Tue, 26 Oct 99 18:13:03 EDT

Contents:
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column (David 
Wagner)
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column (David 
Wagner)
  Re: multiple signature scheme (Paul Koning)
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column (David 
Wagner)
  Re: some information theory (very long plus 72K attchmt) (Tim Tyler)
  Re: some information theory (very long plus 72K attchmt) (Tim Tyler)
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column 
(SCOTT19U.ZIP_GUY)
  Re: some information theory (very long plus 72K attchmt) (Anton Stiglic)
  Re: This compression argument must end now (SCOTT19U.ZIP_GUY)
  scramdisk 2.02h-fr ("Dr Fantastik")
  Re: This compression argument must end now (Tom St Denis)
  Re: Unbiased One to One Compression (Tom St Denis)
  Re: Slide this.... (Tom St Denis)
  Eurocrypt 2000 submission deadline: November 3, 1999 (Eurocrypt 2000)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (David Wagner)
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: 26 Oct 1999 11:44:08 -0700

In article <[EMAIL PROTECTED]>,
Trevor Jackson, III <[EMAIL PROTECTED]> wrote:
> In response to the above referenced paper, my only comment is that the model
> you've proposed lacks any characterization of the dilution effect of multiple
> ciphers upon adversarial resource allocation.  In particular, it assumes that
> the probability of a break is monotonically related to the effort spent on
> designing and testing a cipher and independent of the effort spent on cracking it.

Yes, you're absolutely right.  I had in mind the implicit assumption that
the adversary's analysis resources would be so much larger than ours that
they could be considered essentially infinite, or at least large enough that
dilution effects were negligible -- so that, if a cipher had a hole,
the adversary would find it.  I suppose this is in line with the tradition
of making conservative assumptions about the adversary's capabilities, although
it might cause inaccuracies for modelling purposes.

Thanks for pointing that out.  At the very least, it is an assumption which
should have been made explicit; moreover, it seems to be a partial limitation
of the model.  I appreciate the feedback.

------------------------------

From: [EMAIL PROTECTED] (David Wagner)
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: 26 Oct 1999 11:39:30 -0700

In article <[EMAIL PROTECTED]>, Terry Ritter <[EMAIL PROTECTED]> wrote:
> I see no reason why the negotiation protocol must be "cryptographic,"
> other than in the sense that it will handle cryptographic objects
> (ciphers).  [...] The
> negotiation protocol can afford to have some weaknesses.  

If you have a non-cryptographic negotiation protocol, you might end
up with the weakest of the ciphers supported by both ends, rather than
the strongest of them.

The SSL 2.0 ciphersuite negotiation protocol is a good case in point.
The client (Alice) sent the list of ciphersuites supported in her
implementation; the server (Bob) picked the first one from that list
that was also supported in his implementation, and both ends started
using the cipher picked by Bob.  Thus, an example protocol run might
look something like this:
  Alice -> Bob:  RC4, DES, IDEA, Triple-DES, RC4-40-export, ...
  Bob -> Alice:  IDEA

The problem is that there are severe attacks on the SSL 2.0 protocol,
due to the non-cryptographic nature of the negotiation protocol.  For
instance, an active attacker (a "man-in-the-middle") can edit Alice's
list of supported ciphers, removing all but the weakest one supported
by both endpoints (e.g., RC4-40-export); then Bob will be forced to
pick that weak one, and both ends will use the weak cipher.  Even if
both endpoints support a stronger cipher, they won't know that they've
been silently forced down to "least-common-denominator security".

Specifically, in SSL 2.0, even if both the client and the server
support strong non-exportable cryptography, they can be undetectably
forced to use weak exportable (40-bit) crypto for their communications.
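
To make this concrete, here is a toy model of the downgrade in Python
(the pick-the-first rule matches the description above; the suite names
and everything else are illustrative assumptions, not the real SSL 2.0
wire format):

    # Toy model of an unauthenticated SSL 2.0-style negotiation, with a
    # man-in-the-middle filtering Alice's ciphersuite list.
    def bob_picks(offered, supported):
        """Bob takes the first offered suite that he also supports."""
        for suite in offered:
            if suite in supported:
                return suite
        raise ValueError("no common ciphersuite")

    alice_offer = ["RC4", "DES", "IDEA", "Triple-DES", "RC4-40-export"]
    bob_supports = {"IDEA", "Triple-DES", "RC4-40-export"}

    # Honest run: Bob picks IDEA, as in the trace above.
    print(bob_picks(alice_offer, bob_supports))        # IDEA

    # Attacked run: the attacker strips all but the weakest common suite.
    # Nothing in the unauthenticated messages lets either end detect this.
    tampered = [s for s in alice_offer if s == "RC4-40-export"]
    print(bob_picks(tampered, bob_supports))           # RC4-40-export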

In general, it is quite a challenge to design a general, secure, and
practical protocol for cipher negotiation that resists these attacks.

This illustrates the risk of cipher negotiation: you might end up with
the _weakest_ of the available ciphers, and thus you'd better be very
sure that every cipher you support is strong enough for all purposes.

------------------------------

From: Paul Koning <[EMAIL PROTECTED]>
Subject: Re: multiple signature scheme
Date: Tue, 26 Oct 1999 14:58:23 -0400

dino wrote:
> 
> Hi
> Is there some standard for multiple signatures on the same file?
> In fact, when two people sign a paper contract, they sign the same copy,
> or maybe two identical copies.
> What happens when they are using digital signatures?

Just have each sign the file, keeping the signature separate
from the file itself (in PGP that's called "detached signature").

A digital signature (unlike a paper one) specifically goes with
a particular data file, so each single signature can be checked
as being a signature FOR that file.
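
For example, here is a minimal sketch in Python, using Ed25519 from the
pyca/cryptography package purely to illustrate the detached-signature
idea (PGP's actual detached-signature format is different):

    # Two signers each produce a detached signature over the SAME file
    # bytes; the file itself is never modified, and each signature can
    # be verified independently.  verify() raises InvalidSignature if
    # the data or the signature has been altered.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    contract = b"We, Alice and Bob, agree to the terms above."

    alice_key = Ed25519PrivateKey.generate()
    bob_key = Ed25519PrivateKey.generate()

    alice_sig = alice_key.sign(contract)  # kept apart from the file
    bob_sig = bob_key.sign(contract)

    alice_key.public_key().verify(alice_sig, contract)
    bob_key.public_key().verify(bob_sig, contract)
    print("both detached signatures verify against the same file")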

        paul

------------------------------

From: [EMAIL PROTECTED] (David Wagner)
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: 26 Oct 1999 11:59:14 -0700

In article <[EMAIL PROTECTED]>, Terry Ritter <[EMAIL PROTECTED]> wrote:
> This appears to be the same old stuff to which I have already replied
> in detail.  Oddly, you do not seem to give the opposing arguments
> equal space on your web page.

Yes, you're right.  You (and several others) have given some excellent
feedback which has definitely helped me, and which was not directly
incorporated into that web page.

But the goal wasn't to provide a comprehensive survey of the strengths
and weaknesses of the model.  (I have to leave that for later, due to
lack of time.)  It was just intended as a quick "5-minute" hack to provide
a relatively brief introduction to the proposed model, with a pointer to
more extensive discussion on it on Dejanews.  I don't have time for more.

It's not clear to me how this demonstrates some sort of bias on my part,
but I apologize if I caused any confusion.

As to the rest of your comments in this post, we've already discussed
them in detail, as you say, so I'm not convinced that replying will really
advance the discussion much.  Still, if I get a chance, I'll try to reply.

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: some information theory (very long plus 72K attchmt)
Reply-To: [EMAIL PROTECTED]
Date: Tue, 26 Oct 1999 19:04:23 GMT

Anton Stiglic <[EMAIL PROTECTED]> wrote:
: Tim Tyler wrote:
:> Anton Stiglic <[EMAIL PROTECTED]> wrote:
:> : John Savard wrote:

:> :> Compression, as it gets better and better, produces a result that
:> :> approaches closer and closer to being random.
:>
:> : This is absolutely wrong!
:>
:> No, it's absolutely right. [snip]

:> : If the messages picked in P are not random (but english text),
:> : Comp(P) will never be random.
:>
:> They will if Comp(P) is a good compression function.  You can see this
:> by considering sets of finite strings of finite size and mapping them onto
:> the integers from 1 to N.  The compressed results are random from a simple
:> counting argument.

: No it doesn't; here is a little example:
:    Let's say we have P = {"I agree with Alice",  "I agree with Bob"},
: and that my source that generates plaintext (which I will also call P)
: is P =   "I agree with Alice"  with prob 5/6
:               "I agree with Bob" with prob 1/6

: Now let's say Comp is such that "I agree with Alice" goes to 0
: and "I agree with Bob" --> 1.

: Now, give me six outputs of Comp(P); call them c1, c2, c3, c4, c5, c6.
: What value, do you think, will dominate among the cX's?  On average, 5 out
: of the 6 cX's will be 0.  So by observing this, I will say that I'm not
: getting uniformly random stuff.

Notice that the discussion up to this point related to  "Comp(P)"
(singular).

I was attempting to discuss the statistical properties of bits in an
*individual* compressed message, given a message chosen from the set of
potential inputs at random.  It was my claim that such compressed files
approach being indistinguishable from random bitstrings, as the
compression gets better and better.

If you choose the same message and compress it several times, and
concatenate the results, I completely agree that the results need no
longer appear random.
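
A short simulation of your own example makes the distinction plain
(Python; the probabilities are the ones from your post):

    # Anton's source: "I agree with Alice" -> 0 with prob 5/6,
    # "I agree with Bob" -> 1 with prob 1/6.  Six concatenated outputs
    # are visibly biased toward 0; a SINGLE output is one bit and
    # carries no observable statistics on its own.
    import random

    draws = [0 if random.random() < 5/6 else 1 for _ in range(6)]
    print(draws, "zeros:", draws.count(0))   # typically about 5 of 6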
-- 
__________
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

Slit your wrists to lower your blood pressure.

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: some information theory (very long plus 72K attchmt)
Reply-To: [EMAIL PROTECTED]
Date: Tue, 26 Oct 1999 19:13:28 GMT

Anton Stiglic <[EMAIL PROTECTED]> wrote:

:> Where P is the set of all *possible* plaintexts?  Who's talking about the
:> set of all conceivable strings of data? as far as I'm concerned we're
:> talking about compressing messages that are actually compressible - for
:> example, English text messages.

: Well, it's obvious that your compression function is not gonna have, as
: its domain, just the set of elements that are english text messages.

But it's /far/ from obvious that the compression program can't be
targetted at such messages.  It might make the majority of such messages
shrink, and make any binary files fed to it grow slightly.

We're not talking about a domain limited to just text messages, merely a
compression program targetted at such messages, so that it produces
sensible compression when faced with such inputs.

It doesn't have to do this /perfectly/.  If a few binary files wind up
getting compressed as well, there's little harm in that.

: [If...] in some way you are saying that you can create the subset of 
: english text messages, from a set {0,1}^n, in some polynomial time????

But nobody is saying any such thing.  It's simply not necessary for
compression programs that target text messages.

Such compressors have been around for donkey's years: look at the use of
homophones to replace common words like "the" in alphabetic substitution
cyphers.  These compress the message, while removing clues that would aid
the attacker's frequency analysis - that's the whole point of them.

Such homophone compression is targetted specifically at English text.
The existence of such compressors should be completely uncontroversial.
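
A toy version of such a homophone coder, in Python (the word list and
code symbols are made up for illustration; a real scheme would reserve
its code alphabet properly):

    # Homophone compression targetted at English text: each common word
    # maps to SEVERAL one-character codes, chosen at random, so output
    # is shorter AND its symbol frequencies are flatter.  Assumes the
    # digit codes never occur in the plaintext.
    import random

    HOMOPHONES = {"the": ["0", "1", "2"], "and": ["3", "4"],
                  "of": ["5", "6"]}
    DECODE = {c: w for w, codes in HOMOPHONES.items() for c in codes}

    def compress(text):
        return " ".join(random.choice(HOMOPHONES[w]) if w in HOMOPHONES
                        else w for w in text.split())

    def decompress(text):
        return " ".join(DECODE.get(w, w) for w in text.split())

    msg = "the cat and the dog sat on top of the mat"
    packed = compress(msg)
    assert decompress(packed) == msg   # lossless, frequencies flattened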
-- 
__________
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

Some things have to be believed to be seen.

------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: Tue, 26 Oct 1999 20:10:45 GMT

In article <7v4sh2$ps2$[EMAIL PROTECTED]>, 
[EMAIL PROTECTED] (David Wagner) wrote:
>In article <[EMAIL PROTECTED]>, Terry Ritter <[EMAIL PROTECTED]> wrote:
>> I see no reason why the negotiation protocol must be "cryptographic,"
>> other than in the sense that it will handle cryptographic objects
>> (ciphers).  [...] The
>> negotiation protocol can afford to have some weaknesses.  
>
>If you have a non-cryptographic negotiation protocol, you might end
>up with the weakest of the ciphers supported by both ends, rather than
>the strongest of them.
>
>The SSL 2.0 ciphersuite negotiation protocol is a good case in point.
>The client (Alice) sent the list of ciphersuites supported in her
>implementation; the server (Bob) picked the first one from that list
>that was also supported in his implementation, and both ends started
>using the cipher picked by Bob.  Thus, an example protocol run might
>look something like this:
>  Alice -> Bob:  RC4, DES, IDEA, Triple-DES, RC4-40-export, ...
>  Bob -> Alice:  IDEA
>
>The problem is that there are severe attacks on the SSL 2.0 protocol,
>due to the non-cryptographic nature of the negotiation protocol.  For
>instance, an active attacker (a "man-in-the-middle") can edit Alice's
>list of supported ciphers, removing all but the weakest one supported
>by both endpoints (e.g., RC4-40-export); then Bob will be forced to
>pick that weak one, and both ends will use the weak cipher.  Even if
>both endpoints support a stronger cipher, they won't know that they've
>been silently forced down to "least-common-denominator security".
>
>Specifically, in SSL 2.0, even if both the client and the server
>support strong non-exportable cryptography, they can be undetectably
>forced to use weak exportable (40-bit) crypto for their communications.
>
>In general, it is quite a challenge to design a general, secure, and
>practical protocol for cipher negotiation that resists these attacks.
>
>This illustrates the risk of cipher negotiation: you might end up with
>the _weakest_ of the available ciphers, and thus you'd better be very
>sure that every cipher you support is strong enough for all purposes.

   This is why organizations should not trust browsers to automatically
encrypt their data before it is sent out on the net. If possible, people
should encrypt and decrypt their data in files offline, before sending it
over the web, so that no automatic handshaking can occur that would allow
for a weak encryption method.




David A. Scott
--
                    SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
                    http://www.jim.com/jamesd/Kong/scott19u.zip
                    http://members.xoom.com/ecil/index.htm
                    NOTE EMAIL address is for SPAMERS

------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: some information theory (very long plus 72K attchmt)
Date: Tue, 26 Oct 1999 15:24:13 -0400

Tim Tyler wrote:

>
>
> Notice that the discussion up to this point related to  "Comp(P)"
> (singular).
>
> I was attempting to discuss the statistical properties of bits in an
> *individual* compressed message, given a message chosen from the set of
> potential inputs at random.  It was my claim that such compressed files
> approach being indistinguishable from random bitstrings, as the
> compression gets better and better.
>
> If you choose the same message and compress it several times, and
> concatenate the results I completely agree that the resuls need no
> longer be apparently random.
>

Yes, I believe this was the source of confusion on both of our sides.  You
were considering the source as spitting out bits, such as a stream cipher,
in which case compression does give less ciphertext to an enemy AND
it does increase the entropy.
I was thinking more of a mathematical model such as RSA, where
one considers a message as an integer and encrypts that.  Compression, here,
does not give less ciphertext to an enemy and does not modify its entropy.
We were in fact giving valid arguments, but in different models.  This has
in fact enlightened me, and I now know how to explain what I wanted to say
in a much better way.  The point about an attacker getting several chunks
of encryption of a single message m, or one single ciphertext c for a message
m, is an important one to distinguish when talking about entropy.

Anton


------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: This compression argument must end now
Date: Tue, 26 Oct 1999 20:49:25 GMT

In article <c1mR3.7280$[EMAIL PROTECTED]>, gtf[@]cirp.org (Geoffrey T. 
Falk) wrote:
>In article <[EMAIL PROTECTED]>, Tim Tyler  <[EMAIL PROTECTED]> wrote:
>>Geoffrey T. Falk <gtf[@]cirp.org> wrote:
>>: For example, here is my favourite compression algorithm: Copy the input
>>: file to the output file. This algorithm is "one-on-one." But it is
>>: worthless for cryptography.
>>
>>In what sense is this a "compression" program.  It totally fails to
>>compress all possible target sets.
>
>It is a compression program. It is even the best possible compression
>program under some circumstances. What more do you want? :-)

   I agree that it is better than the other methods, which are non 1-1
and add information which helps break a system. But this does nothing
to show that if one wants real compression one should not use a
one-to-one compressor.  If you don't like my current version, the concept
is not that difficult; you can easily come up with some useful one-to-one
compression method for other techniques, if you have any brains that is.
And since it does not add any information, as Tim has repeatedly tried to
tell you, it is what most compression methods should have been striving
for in the first place but for some very strange reason have not.
  What you fail to see, either because of your upbringing or lack of
intelligence, is that one should never add something to a crypto system
that helps to break it. Compression is commonly added, as suggested in
even MR BS's book (which I don't recommend any one buying), but the
added compression is not one-to-one and actually weakens the overall
system. That is what this thread is about. And as I have repeatedly
asked: are there any other common compression systems in use that
actually concern themselves with the cryptographic considerations of
their use, and why does PGP not use a one-to-one compression routine,
other than to make it easier for the NSA to break?
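   For anyone unsure what property is being argued over, here is a small
Python check of it, using Geoffrey's identity codec (which passes
trivially; most real compressors fail the second assert, because not
every byte string is a legal compressed file):

    # One-to-one (bijective) compression means BOTH round trips hold:
    # decompress(compress(x)) == x and compress(decompress(y)) == y,
    # so no byte string betrays "this was never a compressor output".
    import itertools

    def compress(data: bytes) -> bytes:      # the identity "compressor"
        return data

    def decompress(data: bytes) -> bytes:
        return data

    for n in range(3):                       # all strings of length < 3
        for s in itertools.product(range(256), repeat=n):
            b = bytes(s)
            assert decompress(compress(b)) == b
            assert compress(decompress(b)) == b
    print("identity codec is one-to-one on all tested strings")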




David A. Scott
--
                    SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
                    http://www.jim.com/jamesd/Kong/scott19u.zip
                    http://members.xoom.com/ecil/index.htm
                    NOTE EMAIL address is for SPAMERS

------------------------------

From: "Dr Fantastik" <[EMAIL PROTECTED]>
Subject: scramdisk 2.02h-fr
Date: Tue, 26 Oct 1999 21:58:18 -0000

How do I reverse the process?

I encrypted (Triple DES) a logical drive E: of about 700 MB (renamed Z: in
Scramdisk, Windows 98), and cannot find the function or the procedure to
"decrypt" my volume...!  I skimmed the manual: nothing on removing an
encrypted Scramdisk volume, nor any "Decrypt partitions/volumes" function.

The problem: the Z: volume does not appear under DOS via FDisk, so those
700 MB are not recoverable, even when I recreate and then format new
logical drives in my DOS extended partition (containing D: E: F:, but with
the 700 MB of Z: inaccessible).

Help...

Thank you





------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: This compression argument must end now
Date: Tue, 26 Oct 1999 19:57:55 GMT

In article <[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] wrote:
>
> Well, tautologically.  If there were known breaks, the cyphers
> would probably lose their popularity ;-)

Not exactly.  Some ciphers which are easy to break (Vigenère for
example) are teaching tools.  They remain popular.

I should have been more precise however.

> I thought you were saying folks should stop discussing compression
> before encryption.

No I said stop arguing.  Just use a high compression ratio or suitable
compression algorithm...


> So what?  From this you cannot conclude that they're infinitely
> secure, and that adding security is useless.  I predict *all* the
> above cyphers will eventually fall to cryptanalysis.

Yeah, but prove that you need to keep your info infinitely secure with
a symmetric cipher.

> Well, we have, in fact - I cited one only yesterday.

You cited a system broken by the 'weak' compression?  Which one is this?

> Besides, it's pretty blinking obvious.  Non-one-on-one encryption
> systematically adds data to the message that was not present in
> the plaintext.  Unless this data is *really* random, it doesn't take
> much in the way of brainpower to see that this might be at the root
> of cryptanalytic attacks.

But even adaptive huffman coding is not asymptotically optimal for the
first series of symbols.  You have a learning curve (for adaptive
methods anyway), which means the first occurrences of the bias are not
compressed.
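
You can see the learning curve with any adaptive-style compressor.
Here deflate (via Python's zlib) stands in for adaptive huffman, which
it is not, but the effect is the same:

    # The first symbols of even a highly biased source compress poorly;
    # the ratio only approaches its asymptote once the coder has
    # accumulated enough history.
    import zlib

    biased = b"abracadabra " * 500
    for n in (12, 60, 600, 6000):
        ratio = len(zlib.compress(biased[:n])) / n
        print("first %5d bytes -> %.2f output bytes per input byte"
              % (n, ratio))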


> Indeed - the security depends on other factors in addition to this.
> All we claim is that systems *without* this property (and which do not
> employ hardware RNGs or similar) all have unnecessary security
> deficits.  With a /very/ strong cypher, a few security deficits may
> perhaps be shrugged off, but so far we know of *no* practical,
> demonstrably strong cyphers, where the key is much shorter than the
> message.

Again I think chaining modes thwart your analysis.  Take the ASCII
model: when I use a random IV and CBC, I am not guaranteed to be
encrypting ASCII text blocks.
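
A sketch of that point, using AES-CBC from the pyca/cryptography
package (the key and IV handling here is illustrative only):

    # Encrypting pure-ASCII plaintext under CBC with a random IV yields
    # ciphertext blocks with no ASCII structure at all, and the same
    # plaintext encrypts differently on every run.
    import os
    from cryptography.hazmat.primitives.ciphers import (
        Cipher, algorithms, modes,
    )

    key = os.urandom(16)
    iv = os.urandom(16)                   # fresh random IV per message
    plaintext = b"ATTACK AT DAWN!!" * 4   # 64 bytes, printable ASCII

    encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    ciphertext = encryptor.update(plaintext) + encryptor.finalize()

    print(all(32 <= b < 127 for b in plaintext))    # True
    print(all(32 <= b < 127 for b in ciphertext))   # almost surely False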

> : It might not be possible to do so with PKZIP, but just use deflate
> : (with all error checking removed) and it's just as easy.
>
> You are asserting PKZIP is one-on-one?  Apologies - but are you on
> drugs?

Sorry to burst your bubble, but PKZIP is not a compression algorithm.

> You have Faith.  Praise the Lord!

Well I have faith in other intelligent humans (which I would say you
are one of).  That much is true.

> Most *recognise* that they don't know how strong their cyphers are.

You should phrase that as 'they cannot prove their ciphers are strong,
but can conjecture, with a body of evidence, about their strength'.

> Those that don't are mostly snake-oil vendors, wearing their marketing
> hats.

Most good research papers have nothing to do with marketing.

> Virtually /any/ cypher may be broken tomorrow.  Heck, factorising
> into prime pairs may have some nifty shortcut.  In fact it looks
> very much like special machines will be able to do it in a jiffy!

This is true, even for your ideas as well.  That's why I would rather
move away from speculation toward analysis and research.  DES for example
is still strong today [minus the short key] because people who thought
about what they were doing designed it.

> If this happens (assuming it has not happened already behind closed
> doors) swathes of modern encryption, including your beloved PGP, may
> simply collapse overnight.
>
> I'm sorry Tom, but I see your "secure" world as a dream - or perhaps a
> fantasy.  History is *littered* with broken cyphers.
>
> One day the cryptographers /may/ completely leave behind the
> cryptanalysts: that may be the nature of the cryptography game.
> However that day is *not* yet at hand.

So why do you even post here?  If all ciphers will be broken, why
invent them?  Why research them?  I would rather be more realistic.
People haven't broken Blowfish, for example, because we lack good metrics
to attack the sboxes; RC5, because of the non-isomorphism between
xor-rotate-add operations; etc.  It's true that they could be broken,
but a body of evidence suggests otherwise.

> The poster stated flat out that they used CRC regularities in the
> compressed files to guide the cryptanalytic attack.
>
> If I am denied such examples, your request for cyphers that have been
> cracked by inadequate compression becomes meaningless.

Actually the attack on PKZIP's encryption can be done with 13 known
plaintext bytes, whether compressed or not.  It's computationally hard
to get anything useful out of a CRC32 of a 1-Mbit message.  Sorry,
them's the facts.  Meanwhile, finding collisions in a CRC32 is easy
(thus it is not a secure hash).
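
If you want to see how easy, a naive birthday search in Python finds a
CRC32 collision in a few seconds:

    # CRC32 has a 32-bit output, so random inputs collide after roughly
    # 2**16 tries -- exactly why it is not a secure hash.
    import os, zlib

    seen = {}
    while True:
        s = os.urandom(8)
        c = zlib.crc32(s)
        if c in seen and seen[c] != s:
            print("collision: %s %s crc32=%08x"
                  % (seen[c].hex(), s.hex(), c))
            break
        seen[c] = s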

Tom


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: Unbiased One to One Compression
Date: Tue, 26 Oct 1999 20:01:53 GMT

In article <[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] wrote:
>
> You're saying you think that any attempt on my part to explain my
> understanding of the workings or security benefits of a certain type
> of compression designed specifically for compression before
> encryption would be OFF TOPIC in this forum?!??
>
> Compression *is* a component of cryptography.  Get with it.

No, actually you are wrong.  Compression is part of the cryptosystem, not
the cipher or the use of the cipher.  This is sci.crypt, not alt.security.
You get with it.

If you have metrics by which you can say 'this compression' makes 'this
system' more secure, then by all means share them.  No more speculation,
please.

Tom


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: Slide this....
Date: Tue, 26 Oct 1999 20:05:14 GMT

In article <oghR3.195$[EMAIL PROTECTED]>,
  "J. Byron" <[EMAIL PROTECTED]> wrote:
>
> Tom St Denis wrote in message:
> >
> ><snip insanely stupid post>
>
> Name calling?? That seems rather simple.

Technically I did not call him anything ^s^

> Can someone not make a free contribution to the field without comment?
> Where is the analysis of your XTEA????

No, he posts his code snippet at least twice a month and never discusses
it further.  It's basically a reminder that stupid people are out there.

If by XTEA you mean the cipher in peekboo, that's a toy.  It's
basically an RC5 clone.  I wouldn't use it in any serious manner (in
fact I normally use one of the first seven ciphers).

Tom


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: Eurocrypt 2000 <[EMAIL PROTECTED]>
Subject: Eurocrypt 2000 submission deadline: November 3, 1999
Date: Tue, 26 Oct 1999 22:25:27 +0200

=============================================================
                  E U R O C R Y P T    2 0 0 0
                         May 14-18, 2000
                    Bruges (Brugge), Belgium

     http://www.esat.kuleuven.ac.be/cosic/eurocrypt2000/
=============================================================

This is a reminder that the submission deadline for 
Eurocrypt 2000 is November 3, 1999. 

Visit http://www.esat.kuleuven.ac.be/cosic/eurocrypt2000/ 
for more information.

=============================================================


------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
