Cryptography-Digest Digest #413, Volume #10      Fri, 15 Oct 99 00:13:03 EDT

Contents:
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column ("Brian 
Gladman")
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column (Mok-Kong 
Shen)
  Re: He is back...new "improved" code (Dan Day)
  Re: Should RC4 be free? (Darren New)
  Six out of six for Kerckhoffs
  Re: where to put the trust ([EMAIL PROTECTED])
  Re: RC-5 breaking, $19 per letter (Eric Lehman)
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column 
(DJohn37050)
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column ("Trevor 
Jackson, III")
  Re: HELP on Kerberos and SSH ("Richard Parker")
  Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column ("Trevor 
Jackson, III")
  Re: Factoring public keys attack? ("Trevor Jackson, III")
  Re: where to put the trust ("Trevor Jackson, III")

----------------------------------------------------------------------------

From: "Brian Gladman" <[EMAIL PROTECTED]>
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: Thu, 14 Oct 1999 21:00:15 +0100

Bruce Schneier <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> On Thu, 14 Oct 1999 12:33:11 +0100, "Brian Gladman"
> <[EMAIL PROTECTED]> wrote:
>
[snip]
> >The arguments for multiple AES winners cannot be dismissed so easily.
> >There are at least three reasons for wanting more than one winner: (1) to
> >provide a degree of choice in dedicated, closed applications, (2) to
> >provide a degree of diversity in open applications (a well established
> >practice), and (3) to meet requirements that benefit from the sequential
> >application of different encryption algorithms (again an established
> >practice).
>
> (3) doesn't make sense to me.  Whether there is one winner or several,
> those that want to cascade multiple algorithms will do so.  They do so
> now, and there's no problem.

I am inclined to believe that five algorithms would provide too much
diversity, and this may well be the rationale for choosing just one winner.
I have less of a concern if we eliminate two and choose a winner from among
the remaining three, but we would need to ensure that all three that remain
are free of all IPR constraints.

> I disagree with (2) also, primarily from a hardware perspective.  I've
> gone to Ascend, Cisco, and other companies to discuss AES.  The first
> thing they are concerned about is how they will be able to fit the
> algorithms into their hardware.  Multiple algorithms means more chip
> real estate; they hate that.  In software it's no problem adding
> another algorithm, but it is in hardware.

AFAIK no-one is saying that an application must offer multiple algorithms.
And I don't see hardware suppliers going out of business in a multiple
algorithm environment - if this is a requirement I am completely confident
that they will meet it.

While we should, of course, take the concerns of hardware suppliers into
account, we should also recognise that other suppliers will be able to
provide algorithm diversity without great difficulty and will see market
advantages in doing so (as they do now).

> As to (1), why does it make sense to give people without the expertise
> to make a choice a choice?  If we, as the world's non-military
> cryptographers, cannot choose an algorithm, why should we expect
> others to?

One reason is that they may have more information than we have when they
make their choice.  It is quite possible that after the single AES winner
has been selected more information will come to light that will cast doubt
on its security performance.  The provision of limited algorithm diversity
(on the lines proposed by Don Johnson) provides a degree of protection
against this possibility.  In simple terms, it's not good security practice to
put all our eggs in one basket, especially when we cannot be certain that
the basket does not have holes in it.

> My primary worry is that a system that has a choice of algorithms
> will, operationally, be as secure as the weakest.  If that is the
> case, diversity is necessarily bad.  And we will always have a backup
> algorithm: triple-DES.  I see no benefit from NIST passing the buck
> and not making a decision.

I agree that there are engineering issues involved in using multiple
algorithms that can cause problems, but these are not necessarily difficult
to overcome.  In any event, we already have a great deal of practical
experience here since many internet protocols already employ such features.
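The kind of algorithm negotiation such protocols perform can be sketched as
follows (a minimal illustration only; the cipher names and the `negotiate`
function are invented for this sketch, not taken from any real protocol):

```python
# Each side advertises the ciphers it supports; the first entry in the
# client's preference order that the server also supports is selected.
# If there is no overlap, the connection is refused or falls back.

def negotiate(client_prefs, server_supported):
    """Return the first client-preferred cipher the server supports."""
    for cipher in client_prefs:
        if cipher in server_supported:
            return cipher
    return None  # no common algorithm

client = ["cipher-A", "cipher-B", "cipher-C"]
server = ["cipher-D", "cipher-B", "cipher-C"]
print(negotiate(client, server))  # -> cipher-B
```

This is why a multiple-winner standard need not break interoperability: two
conforming endpoints simply agree on a common algorithm at session setup.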

And I don't see a very convincing reason to use an interim algorithm with a
64-bit block length as a backup when some of the world's best cryptographers
have provided five carefully designed algorithms from which to make our
choice.

Nor do I see multiple algorithm choice by NIST as passing the buck - it's
about ensuring that the AES programme delivers what is needed instead of
assuming that what was right last time is still right now.

    Brian Gladman




------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: Thu, 14 Oct 1999 22:57:23 +0200

Brian Gladman wrote:
> 

> AFAIK no-one is saying that an application must offer multiple algorithms.
> And I don't see hardware suppliers going out of business in a multiple
> algorithm environment - if this is a requirement I am completely confident
> that they will meet it.
................

> One reason is that they may have more information than we have when they
> make their choice.  It is quite possible that after the single AES winner
> has been selected more information will come to light that will cast doubt
> on its security performance.  The provision of limited algorithm diversity
> (on the lines proposed by Don Johnson) provides a degree of protection
> against this possibility.  In simple terms, it's not good security practice to
> put all our eggs in one basket, especially when we cannot be certain that
> the basket does not have holes in it.

> I agree that there are engineering issues involved in using multiple
> algorithms that can cause problems, but these are not necessarily difficult to
> overcome.  In any event we already have a great deal of practical experience
> here since many internet protocols already employ such features.
.....................
> 
> Nor do I see multiple algorithm choice by NIST as passing the buck - it's
> about ensuring that the AES programme delivers what is needed instead of
> assuming that what was right last time is still right now.


I guess it could perhaps also be useful to know the raison d'être
of 3DES after DES came out. If we are to have 3AES, then in my humble
opinion it certainly wouldn't be a big step to instead employ the top
three in the final round of the AES contest for multiple encryption.
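The multiple-encryption idea above can be sketched with a toy cascade: three
independently keyed, invertible layers applied in sequence, standing in for
three AES finalists. The byte-level layers here are invented illustrations,
not real ciphers; with independent keys, a cascade is generally at least as
hard to break as its strongest layer.

```python
def add_layer(k, data):            # keyed byte addition mod 256
    return bytes((b + k) % 256 for b in data)

def sub_layer(k, data):            # inverse of add_layer
    return bytes((b - k) % 256 for b in data)

def xor_layer(k, data):            # keyed XOR; its own inverse
    return bytes(b ^ k for b in data)

def rot_layer(k, data):            # keyed left bit-rotation
    k %= 8
    return bytes(((b << k) | (b >> (8 - k))) & 0xFF for b in data)

def cascade_encrypt(keys, data):
    k1, k2, k3 = keys              # apply the three layers in order
    return rot_layer(k3, xor_layer(k2, add_layer(k1, data)))

def cascade_decrypt(keys, data):
    k1, k2, k3 = keys              # undo the layers in reverse order
    return sub_layer(k1, xor_layer(k2, rot_layer(8 - (k3 % 8), data)))

keys = (37, 129, 3)
msg = b"triple encryption demo"
assert cascade_decrypt(keys, cascade_encrypt(keys, msg)) == msg
```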

M. K. Shen

------------------------------

From: [EMAIL PROTECTED] (Dan Day)
Subject: Re: He is back...new "improved" code
Date: Thu, 14 Oct 1999 23:15:58 GMT

On Thu, 14 Oct 1999 14:16:20 -0500, "Dan Fogelberg" <[EMAIL PROTECTED]>
wrote:
>Let me put it this way.  He is illogical, but beyond that, it is the joy of
>proving him wrong.  I went over it with him about the algorithm etc. but he
>says it is uncrackable because nobody will know what it is blah blah blah.

Ask him how he intends to keep the algorithm secret when it's actually in
use.  

The problem with real-world encryption is that you have NO way of
protecting the cryptographic machines/programs.  If it's put into
wide use, it's easy for an enemy to just get a copy and analyze it.
Even in limited use, an adversary can usually get his hands on
one of the coders/decoders, by the usual beg/borrow/steal methods.

Sooner or later, your adversary *will* get his hands on your
equipment or algorithm.  So the idea is that to be actually safe,
an encryption scheme has to be secure EVEN WHEN the adversary knows
everything there is to know about your system.  And the way you
do that is to make a system where even having the system is no good
unless the adversary also has the key that was used to encrypt a
given message.  Keys can be kept secure, because they exist only
in your own head, and in the head of the person you're communicating
with.  Procedures and algorithms, however, have to exist in the
real world, where they can be stolen, and to be of any use have to
be put into many hands, where they can be "leaked".
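The principle above (security must rest in the key, not the algorithm) can be
made concrete with a toy sketch. The keystream construction below is an
invented illustration, not a vetted cipher: the algorithm is entirely public,
so an adversary who "steals" it still gets nothing without the key.

```python
import hashlib

def keystream(key, n):
    # public algorithm: iterate SHA-256 over key || counter to make a pad
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key, plaintext):
    # XOR the plaintext with the keyed pad (do not reuse a key in practice)
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

msg = b"attack at dawn"
ct = encrypt(b"shared secret", msg)
assert decrypt(b"shared secret", ct) == msg  # the key recovers the message
assert decrypt(b"wrong guess", ct) != msg    # knowing the algorithm alone fails
```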

Hey, I've got an idea -- go sneak on to your buddy's computer when
he's not looking, and snarf his program.  This should be an excellent
education for him on the above subjects.  When he accuses you of
"cheating", tell him that you're just using the exact same methods
that an adversary in the real world would use, and that it's too bad
he didn't protect his system against such real-world weaknesses,
as you had urged him to do.


--
   "How strangely will the Tools of a Tyrant pervert the 
plain Meaning of Words!"
   --Samuel Adams (1722-1803), letter to John Pitts, January 21, 1776

------------------------------

From: Darren New <[EMAIL PROTECTED]>
Subject: Re: Should RC4 be free?
Date: Fri, 15 Oct 1999 00:00:38 GMT

Gabriel Belingueres wrote:
> I disagree. Copyright laws are not intented to encourage people to
> publish. They are intented to protect the investment and effort of
> somebody for bringing your invention to the world. 

Ummm... No. Sorry. Read Article I, Section 8, Clause 8 of the US
Constitution. In the US, the point of patents and copyright is to encourage
the progress of science, not to help the authors.

Clause 8: To promote the Progress of Science and useful Arts, by securing
for limited Times to Authors and Inventors the exclusive Right to
their respective Writings and Discoveries; 

RSA is in the US.

-- 
Darren New / Senior Software Architect / MessageMedia, Inc.
     San Diego, CA, USA (PST).  Cryptokeys on demand.
     Do not try to bend the spoon. That is impossible.

------------------------------

From: [EMAIL PROTECTED] ()
Subject: Six out of six for Kerckhoffs
Date: 15 Oct 99 00:12:42 GMT

Well, I've been continuing to update my web page.

On

http://www.ecn.ab.ca/~jsavard/mi060101.htm

there is a short description of SIGSALY, and a long ramble about different
floating-point representations, in a page concerned with the digital
representation of speech signals. (I didn't discuss analogue telephone
scramblers; I may add a mention of them later.)

And on

http://www.ecn.ab.ca/~jsavard/mi0611.htm

there is now a section, entitled "The Ideal Cipher", where I discuss
Kerckhoffs' six desiderata for a cipher system.

I make a case that even the last two (the key should be short, the cipher
should not be complicated) are not totally obsolete, as they might seem,
if the reasons behind them are understood.

One may need longer keys these days, but the key should still be of such a
nature as not to make itself more vulnerable to compromise, or more
difficult to replace if compromised.

Computers can carry out complex algorithms without effort; but, as one
reason for the original requirement was to reduce the problems caused by
errors, I put the desirability of using a cipher without poor
error-propagation characteristics under the heading of this desideratum.

John Savard

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: where to put the trust
Date: Fri, 15 Oct 1999 00:23:26 GMT



In article <7u28e9$7rr$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] (Patrick Juola) wrote:
> In article <7u26ki$pp0$[EMAIL PROTECTED]>,  <[EMAIL PROTECTED]> wrote:
> >In article <7u12b6$vsg$[EMAIL PROTECTED]>,
> >  [EMAIL PROTECTED] wrote:
> >
> >>[...]
> >> Still, the question remains: if we don't trust the experts then what
> >> is the better alternative?
> >
> >I am tempted to say that in any case where the "experts" are
> >not bound to us by contract or compensation, the expert we
> >should first trust is ourself.
>
> I take it that if you need your cat fixed, you do the surgery
> yourself?
>
> Why does someone magically know more because he signs a contract?
> All it means is that you have someone to sue -- it doesn't make
> the system he designs any more secure.

Giving yourself someone to sue tends to get more attention
from that particular someone.  I recommend it.

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM




------------------------------

From: Eric Lehman <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: RC-5 breaking, $19 per letter
Date: Thu, 14 Oct 1999 20:52:00 -0700

Don't try humor here, bub!  It'll go over like a lead balloon.  A lead balloon,
I warns ya!

/Eric

------------------------------

From: [EMAIL PROTECTED] (DJohn37050)
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: 15 Oct 1999 00:56:14 GMT

BTW, when I made my presentation to ANSI X9F1, the consensus by a large margin
was to ask NIST to include "future resiliency" as an AES criterion.  And if you
read the paper by Jim N. et al. on the rationale of their reduction to 5, you
can see that they did so.
Don Johnson

------------------------------

Date: Thu, 14 Oct 1999 21:33:18 -0400
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column

Bob Silverman wrote:

> In article <[EMAIL PROTECTED]>,
>   "Brian Gladman" <[EMAIL PROTECTED]> wrote:
> >
> > The arguments for multiple AES winners cannot be dismissed so easily.
>
> Yes they can.  By one word.  The word is:
>
> interoperability.
>
> By allowing multiple algorithms you are certain to guarantee that there
> will be some users who can't talk to others.

No.  This is an invalid conclusion.

Any users desiring to communicate will be able to select a mechanism to do
so.  Any analysis that dismisses the users' active participation in the
dynamic selection of their channel properties from among the telephone, fax,
and email is trivially flawed.  The same flaw applies to the choice of
ciphers.

Further, a serious vendor will be able to use the degree of interoperability
of its products as a competitive advantage, thus driving inferior products
and vendors from the market.  This is a Good Thing.

Also, if interoperability were the be-all and end-all of the selection
constraints it would not eliminate 14/15 of the candidates, but 15/15 of the
candidates because they will not interoperate with existing systems such as
DES.


>
>
>  There
> > are at least three reasons for wanting more than one winner: (1) to
> provide
> > a degree of choice in dedicated, closed applications, (2) to provide a
> > degree of diversity in open applications
>
> Why does everyone use TCP/IP?

They don't.

>
>
> Repeat after me:
>
> A Standard allows interoperability.

Multiple standards do not disallow interoperability.  These two statements
do not conflict.


------------------------------

From: "Richard Parker" <[EMAIL PROTECTED]>
Subject: Re: HELP on Kerberos and SSH
Date: Fri, 15 Oct 1999 01:36:34 GMT

"Richard Parker" <[EMAIL PROTECTED]> wrote:
> I know of four papers that discuss weaknesses in Kerberos:
>
>   J.T. Kohl, "The Use of Encryption in Kerberos for Network
>   Authentication," Advances in Cryptology - Crypto'89,
>   Springer-Verlag, 1990, pp. 35-43.
>
>   S.M. Bellovin and M. Merritt, "Limitations of the Kerberos
>   Authentication System," Winter 1991 USENIX Conference Proceedings,
>   1991, pp. 253-267.
>   <http://www.research.att.com/~smb/papers/kerblimit.usenix.pdf>
>
>   B. Dole, S. Lodin, and E. Spafford, "Misplaced trust: Kerberos 4
>   session keys," Proceedings of the 1997 Network and Distributed
>   System Security Symposium, 1997, pp 60-70.
>
>   T. Wu, "A Real-World Analysis of Kerberos Password Security,"
>   Proceedings of the 1999 Network and Distributed System Security
>   Symposium, 1999.
>   <http://www.isoc.org/ndss99/proceedings/papers/wu.pdf>

Oops, I just noticed that my brief summary of these papers was
accidentally omitted.

Kohl describes an attack on the non-standard PCBC mode of DES used by
Kerberos 4. The attack allows the adversary to undetectably exchange
certain blocks in a Kerberos message.

Bellovin and Merritt describe several attacks on Kerberos 4, including a
replay attack and a dictionary attack.  They offer several
recommendations for improving Kerberos, many of which were included in
Kerberos 5.

Dole, Lodin, and Spafford found an implementation problem in Kerberos
4's generation of random session keys.  Due to a bad random number
generation process the random session keys had only 20 bits of
entropy, which allowed the system to be broken in seconds.
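The fatality of 20-bit entropy can be made concrete with a toy (the key
derivation and keystream below are invented stand-ins, not Kerberos 4's
actual code): if the session key is determined by only 20 unknown bits, an
attacker simply tries all 2^20 = 1,048,576 candidates.

```python
import hashlib

def derive_key(seed20):
    # the whole "random" session key depends on only 20 bits of seed
    return hashlib.sha256(seed20.to_bytes(4, "big")).digest()[:8]

def keystream(key, n):
    return hashlib.sha256(key + b"pad").digest()[:n]

def crack(known_plain, ciphertext):
    # brute force every possible 20-bit seed; this takes seconds even
    # in interpreted Python, let alone on 1997 hardware in C
    for seed in range(1 << 20):
        pad = keystream(derive_key(seed), len(known_plain))
        if bytes(p ^ k for p, k in zip(known_plain, pad)) == ciphertext:
            return seed
    return None

secret_seed = 123456                  # the attacker does not know this
pad = keystream(derive_key(secret_seed), 10)
ciphertext = bytes(p ^ k for p, k in zip(b"KRB_TICKET", pad))
assert crack(b"KRB_TICKET", ciphertext) == secret_seed
```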

Wu performed an experiment with a real-world Kerberos installation and
was able to recover thousands of passwords with a dictionary attack.
He suggests that a protocol like SRP be added to Kerberos 5 to improve
resistance against dictionary attacks.

-Richard

------------------------------

Date: Thu, 14 Oct 1999 21:40:00 -0400
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column

Roger Schlafly wrote:

> Brian Gladman <[EMAIL PROTECTED]> wrote in message
> news:[EMAIL PROTECTED]...
> > The arguments for multiple AES winners cannot be dismissed so easily.
> > There are at least three reasons for wanting more than one winner: (1)
> > to provide a degree of choice in dedicated, closed applications, (2) to
> > provide a degree of diversity in open applications (a well established
> > practice), and (3) to meet requirements that benefit from the sequential
> > application of different encryption algorithms (again an established
> > practice).
>
> You are just saying that one algorithm won't be good enough for
> some reason. But what is the reason? Why is diversity/choice
> good?
>
> If NIST decides that algorithm A is best, but algorithm B is
> also very good,

For what?

There is no single dominating axis in the feature space of the candidate
ciphers, i.e., no ranking that is Pareto-optimal in all environments.  Thus
for any selection by NIST there will be an environment in which there is a
better candidate.  So NIST should not be trying to build a cryptographic
blivet, but trying to cover the application space with enough good ciphers
that the best can be chosen by an implementor.  I.e., build a bigger bag.

It would be really stupid to require an implementor to use an inappropriate
cipher just because it was a "standard".



> then you want NIST to give its blessing to
> people who disagree with NIST and prefer to use algorithm
> B. This doesn't make any sense to me. If someone thinks
> he is smarter than NIST, and NIST's analysis was wrong,
> and he wants to use algorithm B, then he can do it no
> matter what NIST says.

By this reasoning we have no need of a selection by NIST.  We could, and
perhaps should, let the market decide.  Perhaps we should persuade NIST to
select "None of the Above".

>
>
> There is diversity in practice because there is disagreement
> about what is best. If it turns out that the five finalists are all
> more or less equally good, then there may never be a consensus
> about what to use unless someone like NIST declares one to
> be the standard. A lot of people like standards. Those who
> don't can do whatever they want anyway.

Let those who like standards pick from the final five.  If you like
standards, more standards are better, aren't they?


------------------------------

Date: Thu, 14 Oct 1999 21:53:42 -0400
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Factoring public keys attack?

Bob Silverman wrote:

> In article <[EMAIL PROTECTED]>,
>   [EMAIL PROTECTED] (Jerry Coffin) wrote:
> > In article <19991013151854.334$[EMAIL PROTECTED]>, [EMAIL PROTECTED]
> > says...
> >
> > [ ... ]
> >
> > > At most, wouldn't that only double the number of primes to examine?
> > >
> > > So 10^74 would become 2*10^74 or 10^74.3? And the combinations would
> > > be 10^148.6?
> >
> > If you add one more bit in each direction, that should (unless I'm
> > missing something) roughly double the number of possible factors.
>
> You need to be careful with language here. You add one bit to one
> prime,  but substract one bit from the other.
>
> If the modulus length is fixed at (say) 1024 bits,  then
> the number of moduli which are the product of 512-bit primes is
> about [(sqrt(2) * 2^511)/log( sqrt(2) * 2^511)]^2  ~ 7.1e302
>
> The number which are the product of a 511 and 513 bit prime is about
> (sqrt(2)*2^510) * (sqrt(2) * 2^512)/log(...)log(...) ~ 5.1e302
>
> You don't quite double the keyspace by allowing 511*513  in addition
> to 512*512
>
> But I ask:
>
> Why does anyone think that 7.1e302 keys are not enough???

Because some people still think the size of the key space is related to the
strength of the cipher, which premise is of course false.
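Silverman's 7.1e302 figure can be reproduced with a quick prime-number-theorem
heuristic (a sketch only: the PNT density 1/ln(x) is evaluated at the
geometric midpoint of the 512-bit range, exactly as in his formula):

```python
import math

# Count of 512-bit primes, estimated as x/ln(x) at the geometric midpoint
# x = sqrt(2) * 2^511 of [2^511, 2^512); squaring counts ordered pairs of
# such primes, i.e. 1024-bit moduli built from two 512-bit primes.

x = math.sqrt(2.0) * 2.0 ** 511
primes_512 = x / math.log(x)     # roughly 2.7e151 such primes
moduli = primes_512 ** 2
print(f"{moduli:.1e}")           # about 7.1e302, matching the post above
```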


------------------------------

Date: Thu, 14 Oct 1999 22:09:07 -0400
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: where to put the trust

Patrick Juola wrote:

> In article <7u3nmd$u6e$[EMAIL PROTECTED]>,  <[EMAIL PROTECTED]> wrote:
> >In article <7u216r$7ke$[EMAIL PROTECTED]>,
> >  [EMAIL PROTECTED] (Patrick Juola) wrote:
> >...
> >>The bridge, however, has to stay up in all sorts of conditions,
> >>including high winds, flooding, and so forth.   Unless you can
> >>create all possible situations of wind, water, load, &c, then
> >>you can't test and confirm that the bridge will always stay up.
> >
> >Well, you are certainly right that there is no way to test a bridge for
> >all possible conditions. The important point is that you can test it
> >for its *basic* requirement, which is to carry weight (for a long time
> >under normal environmental conditions).
> >
> >Dianelos wrote:
> >>>No such test is known for ciphers. Cryptography is the only
> >>>engineering field I know of, where you cannot actually test to see if
> >>>what you build fulfils its design requirements.
> >
> >>Actually, I suspect that most engineering fields are like that --
> >>it's a dictum in CS that "testing can never show the absense of bugs,
> >>only their presence."  I've already discussed civil engineering.
> >>Chip testing is known to be only partially reliable in e.e.   I don't
> >>think we need to discuss aerospace engineering after recent events.
> >>What type of engineering *were* you thinking of?
> >
> >Again, you can check to see if a plane flies most of the time, or
> >whether a chip correctly computes most of the time. A cipher can suffer
> >a catastrophic failure of a global scale and we cannot really test
> >against that.
>
> A "catastrophic failure of a global scale?"  I'm not sure I understand
> what you mean by this.  Cryptographic systems don't fail all by themselves;
> it's a difficult task, requiring a fair amount of expertise and skill,
> to make a (properly used) cryptosystem fail.

> So, yes, you can check whether a plane flies "most of the time" as long
> as by "most of the time" you mean "under idealized, non-hostile situations."
> But for a fighter plane, this isn't the interesting bit of time nor is it
> the time that you are interested in.  I can similarly prove that an
> airbag "works" the 99+% of the time it's sitting quietly in the dashboard.
> The only way we know to test whether or not an airbag works is by
> simulating various sorts of events and hoping that it deploys properly.
> Similarly, to test whether a cryptosystem works, we simulate various
> attacks and see whether or not the system stands up to them.  But just
> as an automotive engineer can't test all the events in the world, neither
> can a cryptographer.

True, but the most important tests cannot be performed: inspection &
maintenance.  We can tell if the bridge is still up.  Look at it.  Walk/drive on
it.  We can confirm its state and functionality.

We can tell if a plane flies.  Try it.

We can also tell if the bridge is down.  Or the plane does not function.

We cannot tell if a cipher is indeed protecting our information because an
adversary who has penetrated the cipher does not leave traces behind.

>
>
> >I think there is a big difference in degree. It is certainly imaginable
> >that in the next 50 years somebody will publish a result that renders
> >the AES, or RSA, or KEA, or whatever other critical standard there may
> >exist then, useless. As a result of this a big crisis in the world
> >financial and commercial system might develop. Now, it is not really
> >imaginable that something will happen in the next 50 years that will
> >make most bridges of the world crash down at the same time, or make
> >most planes stop flying, or most computer chips stop working.
>
> Obviously, you've never heard of the EMP bombs?  Or, heck, the Y2K bug?
> Bridges, being a little bit lower tech, are more likely to withstand
> EMP bombing, but they're really sensitive to earthquakes.

EMP bombs and earthquakes are noisy.  One tends to notice their effects.  On
noticing such an event one can inspect and repair as necessary.

What noise is made when a computer in Moscow/Beijing/Fort Meade translates our
ciphertext into plaintext?  None.

How will we notice?  We won't.



------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
