Cryptography-Digest Digest #270, Volume #9       Tue, 23 Mar 99 05:13:04 EST

Contents:
  Re: Live from the Second AES Conference

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] ()
Subject: Re: Live from the Second AES Conference
Date: 23 Mar 99 09:32:56 GMT

[EMAIL PROTECTED] wrote:
:     Almost all candidate
:     ciphers were stained one way or the other - and a surprisingly
:     strong showing of Rijndael (in the negative sense that very little
:     negative was said against it).

This reminds me of something I've noticed. Although Magenta had serious
problems, the cryptanalysis presented was based on a weakness in its key
schedule. The only other apparent problem is that its S-box, although
nonlinear, seems a bit too simple.

If both of these problems were corrected - and neither fix would change
the speed at which it enciphers blocks - the design seems potentially a
very good one. (I had thought, though, that Magenta was slow; if that
isn't the case, the basic design seems sound.)
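
To make "simple" concrete: the Magenta S-box is built, as I understand
it, by discrete exponentiation in GF(2^8). The C sketch below shows the
general flavor only - the reduction polynomial 0x11B and generator 0x03
are borrowed for concreteness and are not Magenta's actual parameters -
but it illustrates how the whole table follows from one short algebraic
formula:

    #include <stdio.h>
    #include <stdint.h>

    /* Sketch of an exponentiation S-box: S[x] = g^x in GF(2^8).
     * The polynomial 0x11B and generator 0x03 are placeholders,
     * not Magenta's actual parameters.                           */
    static uint8_t gf_mul(uint8_t a, uint8_t b)
    {
        uint8_t r = 0;
        while (b) {
            if (b & 1)
                r ^= a;
            a = (uint8_t)((a << 1) ^ ((a & 0x80) ? 0x1B : 0));
            b >>= 1;
        }
        return r;
    }

    int main(void)
    {
        uint8_t sbox[256];
        uint8_t p = 1;
        sbox[0] = p;                     /* g^0 = 1 */
        for (int i = 1; i < 256; i++) {
            p = gf_mul(p, 0x03);         /* 0x03 generates this field */
            sbox[i] = p;
        }
        /* g^255 = 1, so sbox[255] == sbox[0]; a real design has to
         * patch this one collision to make the table a bijection.  */
        printf("S[0]=0x%02X S[1]=0x%02X S[255]=0x%02X\n",
               sbox[0], sbox[1], sbox[255]);
        return 0;
    }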

:     Thus, Eli Biham proposed a normalization of the
:     candidate algorithms depending on the number of rounds that are
:     known not to be secure. This view also has its detractors.

That is a way to compare the designs for one type of merit. I'd consider
it a valid thing to examine, but I'd also note that it is more relevant
to determining which of the ciphers contain ideas that would be useful
in a new cipher than to choosing between the existing designs as they
stand: if that point were missed, I'd be a detractor myself.
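
As a toy illustration of what I take the normalization to mean (the
numbers below are made up, not Biham's): rescale each candidate's
measured speed from its specified round count to a uniform margin over
the rounds already broken, and compare those figures instead:

    #include <stdio.h>

    /* Toy round-normalization in the spirit of Biham's proposal.
     * All figures are hypothetical.  If attacks reach `broken`
     * rounds, take 2 * broken as a "minimal secure" round count
     * and rescale the measured speed to that many rounds.        */
    struct candidate {
        const char *name;
        int    spec_rounds;      /* rounds specified by designers  */
        int    broken_rounds;    /* rounds reached by best attack  */
        double cycles_per_byte;  /* speed at spec_rounds           */
    };

    int main(void)
    {
        struct candidate c[] = {       /* made-up example numbers */
            { "CipherA", 32, 9, 40.0 },
            { "CipherB", 16, 7, 20.0 },
        };
        for (int i = 0; i < 2; i++) {
            int norm = 2 * c[i].broken_rounds;   /* safety factor 2 */
            double speed = c[i].cycles_per_byte * norm
                         / c[i].spec_rounds;
            printf("%s: %.1f cycles/byte at %d rounds\n",
                   c[i].name, speed, norm);
        }
        return 0;
    }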

:     IBM's Pankaj Rohatgi explained how he got all 128 bits of
:     a Twofish key after only 50 (that is 50 not 2^50) uses of a smart
:     card!

I wonder how secure some of the other ciphers would be if the kind of
optimizations Bruce suggested for fitting Twofish on a smart card were
applied to them - where applying them is even possible.

This result may well mean that putting a cipher on a smart card will
require the design of special ciphers optimized for that particular
application.

:     He made a good point that
:     only assembly code speeds should be compared because otherwise you
:     measure a particular compiler's efficiency rather than the
:     cipher's.

If you compare all the ciphers on the same compiler ... and I remember
that at a computer chess competition, someone suggested it wouldn't be a
bad idea to require all the entrants to use C, so that it wouldn't be a
competition of assembly-language coding skills.

Possibly one should really be comparing assembly code speeds for an
idealized machine, with an architecture free of ... peculiarities. But
comparing everything on one computer running Java was a good approach to
take if there were very limited resources to set the thing up.

:     Less
:     convincingly, he claimed that 32 bit processors will also stay
:     with us for a long time.

I think that 32-bit integer arithmetic will stay with us for a long time,
even when few microprocessors still have a 32-bit data bus. I'll
definitely admit I don't think that there even *are* many 386s and 486s
in new products today. But I also don't think that it makes any sense for
a computer to force people to do integer arithmetic exclusively on 64-bit
quantities. The DEC Alpha, whose native integer registers are 64 bits
wide, is, and I think will remain, very unusual in that respect.

Memory will always cost money, and people will always want more of it, so
they're unlikely to want to store quantities with useless precision.

:     The technique explained is slightly more complex, but
:     the point made was that DES could easily (actually especially
:     easily) be broken in this way. On the other hand, Skipjack, a
:     modern cipher produced by the US government, would be especially
:     hard to attack in this way.

This is a significant result that will add to our knowledge.

:     Another possibility is "balancing", an expensive
:     proposition where, for example, you use 4 bit arithmetic on an 8
:     bit smart card, taking care that the other 4 bits are always the
:     complement of the "true" 4 bits, i.e. maintaining a constant
:     Hamming weight.

Ideally, a specialized microprocessor designed to do its calculations in a
balanced way should be used for encryption...
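
A minimal sketch of the balancing idea (the encoding and helper names
here are my own, not from the presentation): keep each 4-bit value in a
byte whose high nibble is its complement, so every stored byte has
Hamming weight exactly 4 and, ideally, a data-independent power
signature:

    #include <stdint.h>
    #include <assert.h>

    /* Sketch of "balanced" 4-bit arithmetic on an 8-bit card: a
     * 4-bit value v is stored as (~v << 4) | v, so every encoded
     * byte has Hamming weight 4 no matter what v is.             */
    typedef uint8_t balanced_t;

    static balanced_t bal_encode(uint8_t v)   /* v in 0..15 */
    {
        v &= 0x0F;
        return (uint8_t)((~v << 4) | v);      /* high nibble = ~v */
    }

    static uint8_t bal_decode(balanced_t b)
    {
        return b & 0x0F;
    }

    /* XOR two balanced values, re-encoding the result: XORing the
     * raw bytes would put ~a ^ ~b = a ^ b in the high nibble and
     * destroy the constant-weight property.                      */
    static balanced_t bal_xor(balanced_t x, balanced_t y)
    {
        return bal_encode((uint8_t)((x ^ y) & 0x0F));
    }

    int main(void)
    {
        balanced_t a = bal_encode(0x3), b = bal_encode(0xC);
        assert(bal_decode(bal_xor(a, b)) == (0x3 ^ 0xC));
        return 0;
    }

The factor-of-two cost in both storage and arithmetic width is,
presumably, why this is called an expensive proposition.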

:     Bruce Schneier made a case against the idea of choosing not one
:     but several "approved" AES algorithms. He used the argument that
:     cryptanalytic efforts should not be divided between several
:     ciphers because then each individual cipher would be less trusted.
:     I didn't quite understand this argument: all ciphers that make it
:     to the second round will be strongly analyzed anyway no matter
:     whether only one or more than one is finally declared the winner.

Well, I don't think that particular point demolishes his argument. He is
thinking of the kind of analysis DES received - over decades.

But that is not to say I necessarily agree with his conclusion, because
it bears on how well the AES will be trusted ten years from now - *if*
it isn't broken by then. (Suppose they choose the wrong one.)

Ten years from now, people may be using other ciphers. This doesn't seem
relevant to what is most secure for AES users in the near term, after
adoption of the standard.

:     The possibility
:     of a catastrophic failure is really what should keep the NIST
:     people awake at night.

I agree with you there, but I don't think choosing two AES candidates
(although it at least provides people with alternatives) really prevents
that danger. Maybe we should conclude that so much has been learned from
the AES process that all the entrants - having been designed before it -
are obsolete. And so what is needed is to take the best ideas from all the
entrants, and create a new block cipher based on this new knowledge.

:     One wonders what is so wrong in declaring several good algorithms
:     for specialized environments. Actually, after today's presentation
:     about attacks against smart cards, somebody mentioned that a smart
:     card is a sufficiently different environment to justify a
:     specialized cipher.

Yes, but I think that a special cipher for smart cards wouldn't look like
an AES candidate. Instead, it would be more like RC4. Block ciphers don't
offer maximum security for minimal resources.

The AES effort sought a block cipher because, in certain ways, a block
cipher is more flexible than a stream cipher. This is fine; but block
ciphers have fundamental limitations too, and so while I think an
effort to find a good cipher for smart cards is a good idea, an effort to
find a good block cipher with a 128-bit block and 128-, 192-, and 256-bit
key sizes for smart cards is *not* necessarily such a good idea.

John Savard

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
