Cryptography-Digest Digest #427, Volume #9       Tue, 20 Apr 99 05:13:04 EDT

Contents:
  Re: Thought question:  why do public ciphers use only simple ops like shift and XOR? 
(Boris Kazak)
  BEST ADAPTIVE HUFFMAN COMPRESSION FOR CRYPTO (SCOTT19U.ZIP_GUY)
  Re: Adequacy of FIPS-140 (kurt wismer)
  Re: FSE-6 Report: Slide Attack (James Frey)
  Re: Question on confidence derived from cryptanalysis. ("Trevor Jackson, III")
  Re: RC6 new key standard from AES conference? (Paul Rubin)
  Re: Question on confidence derived from cryptanalysis. (Terry Ritter)

----------------------------------------------------------------------------

From: Boris Kazak <[EMAIL PROTECTED]>
Subject: Re: Thought question:  why do public ciphers use only simple ops like shift 
and XOR?
Date: Mon, 19 Apr 1999 19:45:17 -0400
Reply-To: [EMAIL PROTECTED]

Jerry Coffin wrote:
> ...or they might not be.  2^32-1 happens to be a prime number.  In
> many cases, the smallest factor of your modulus has a large effect on
> the security of encryption using that modulus.
=====================
Sorry, 2^32-1 = 3*5*17*257*65537, but I have found nice ways to set up
key-derived multipliers in this field. The maximum length of the 
multiplicative cycle is 65536, so you can select an appropriate SEED
and raise it to any power < 2^16. In fact, both the modular multiplier
and its inverse are computed in the same subroutine.
> 
> By contrast, 2^64-1 is what you might call extremely composite -- its
> prime factorization is (3 5 17 257 641 65537 6700417).  This large
> number of relatively small factors will often make this a particularly
> bad choice of modulus.
==========================
Also not necessarily. The important thing is the multiplicative cycle 
length which can be achieved; this gives you an idea of how many 
multipliers you can produce from an appropriately chosen SEED.
BTW, the only practical requirement on the SEED is that it should
produce the maximum-length cycle of its powers, i.e. be a generator.
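The maximal-cycle claim is easy to check numerically. A short sketch (in Python for illustration; the original program was in FORTH, and the seed 7 is my own example, not from the post): since lambda(2^32-1) = lcm(2, 4, 16, 256, 65536) = 65536 = 2^16, every multiplicative order mod 2^32-1 is a power of two, and a "generator" in the above sense is any seed of order exactly 65536.

```python
from math import gcd

M = 2**32 - 1   # = 3 * 5 * 17 * 257 * 65537

def order_mod(seed, m=M):
    # lambda(m) = lcm(2, 4, 16, 256, 65536) = 2**16, so every order is a
    # power of two: find the smallest k with seed^(2^k) = 1 (mod m)
    assert gcd(seed, m) == 1
    k = 0
    while pow(seed, 1 << k, m) != 1:
        k += 1
    return 1 << k

# 7 is coprime to M; its order is a power of two no larger than 65536,
# and any seed whose order comes out as exactly 65536 is a usable generator
cycle_length = order_mod(7)
```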
> 
> Depending on what you're doing, 2^64+1 is likely to be a MUCH better
> choice -- it's still not a prime, but its prime factorization is
> (274177 67280421310721).  In many cases, the largest prime factor is
> what matters, and in this case, it's MUCH larger -- 14 digits instead
> of 7  (which is also considerably larger than 2^32-1). Unfortunately,
> using 2^64+1 as a modulus is likely to be fairly difficult even if you
> have a 64-bit type available.
=======================
As a matter of fact, very easy. The hex number c720a6486e45a6e2 
produces in the 2^64+1 field a cycle of its own powers which is 
72057331223781120 long (just under 2^56). This number is simply the
first 16 hex digits of sqrt(3), and I am sure that it will take me
not more than 15 minutes to find 5-6 numbers more like this.
(Please don't ask me for the source code of the program; I've
written it in FORTH.) So I can generate random 32-bit subkeys, raise
my SEED to these powers and I am in business... Go guess the linear
and differential properties of these multipliers, especially if they
will be chosen for encryption in a plaintext-dependent way!
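The "very easy" part deserves spelling out: since 2^64 = -1 (mod 2^64+1), a 128-bit product hi*2^64 + lo reduces to just lo - hi. A sketch in Python (the original was in FORTH; the SEED constant below is simply the multiplier quoted in the post, and the square-and-multiply routine is my own illustration):

```python
M = 2**64 + 1
MASK = 2**64 - 1

def mulmod(a, b, m=M):
    # 2^64 = -1 (mod 2^64+1), so hi*2^64 + lo reduces to lo - hi
    p = a * b
    return ((p & MASK) - (p >> 64)) % m

SEED = 0xC720A6486E45A6E2   # the multiplier quoted in the post

def multiplier(subkey):
    # key-derived multiplier: raise SEED to a 32-bit subkey power
    # by square-and-multiply, using only mulmod
    r, s = 1, SEED
    while subkey:
        if subkey & 1:
            r = mulmod(r, s)
        s = mulmod(s, s)
        subkey >>= 1
    return r
```

On a 64-bit machine the same reduction needs only the high and low halves of the product, which is why the modulus is practical even without big-integer arithmetic.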
> 
> I obviously haven't studied your encryption method in detail (or at
> all) so I don't _know_ that this will make a difference in your
> particular case, but it's definitely something to keep in mind.  Many,
> many forms of encryption that work quite well in 32-bit arithmetic
> basically fall to pieces when converted to use 64-bit arithmetic
> instead.
====================
I do not intend to keep it secret. If you are interested (just for fun),
I am ready to discuss with you the method of file transfer 
(unfortunately, I don't have a Web page).

  Thanks for your courtesy  Best wishes        BNK

------------------------------

From: SCOTT19U.ZIP_GUY <[EMAIL PROTECTED]>
Subject: BEST ADAPTIVE HUFFMAN COMPRESSION FOR CRYPTO
Date: Tue, 20 Apr 1999 05:33:43 GMT

If one can find a better adaptive huffman compression to be
used with encryption I would love to hear about it.

 For those of you interested in the best in "Encryption" and
"Compression for use with Encryption" take a look at my site.
There are links to download all the software used. However, until
someone posts the latest version of scott19u.zip it will be
available only in the US or to those tricky enough to use
their brains to download it.

 Please check the site out.
It is free. Programming is a hobby; I am presently retired
and love C and assembly. Any comments on the actual code or
improvements would be welcome. I am also looking for work in
Juarez, Mexico, or even El Paso.

--
http://cryptography.org/cgi-bin/crypto.cgi/Misc/scott19u.zip
http://members.xoom.com/ecil/index.htm
NOTE: the EMAIL address above is for SPAMMERS;
to email me use the address on my WEB PAGE

============= Posted via Deja News, The Discussion Network ============
http://www.dejanews.com/       Search, Read, Discuss, or Start Your Own    

------------------------------

From: kurt wismer <[EMAIL PROTECTED]>
Subject: Re: Adequacy of FIPS-140
Date: Tue, 20 Apr 1999 01:05:13 GMT

Douglas A. Gwyn wrote:
> 
> Kurt Wismer wrote:
> > why can't you take the minimum cost of all the attacks...
> 
> The cheapest possible cryptanalytic attack against a complex
> system is the "lucky guess", and yes, it has occasionally worked.
> However, it doesn't work against a "typical" instance of the
> system.  So expectations have to enter into the metric.

umm... correct me if i'm wrong - ok, don't, i just realized that when i
said "attacks on the algorithm" it was later in the article than my
first line... 

the lucky guess can't really be considered an attack on the algorithm,
can it? and if not, i would think that it wouldn't enter into the
equation... for the same reason i figured rubber hosery could be ignored
as well...
 
> The real problem with your suggestion is that few people know
> all relevant attacks that your system might be exposed to.
> (Except the enemy, who isn't going to help you analyze your
> system.)

to that end i should think it would then be in the cryptographic
community's best interests to work on developing that body of knowledge,
though i suspect that by now it's a somewhat daunting task... 
 
> Shannon made a start at a *general* method of analysis,
> not dependent on considering any specific forms of
> cryptanalysis, with his "unicity distance" metric.
> But surely we could do better than that, now that information
> statistics is out of its infancy.

well, i'll have to plead ignorance here... while i know of unicity
distance (i have done *some* reading) i don't know what information
statistics promises in the way of developing more concrete measures of
the cost of breaking ciphers...
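For concreteness, Shannon's unicity distance is U = H(K)/D, where H(K) is the key entropy and D is the per-character redundancy of the plaintext language. A back-of-the-envelope sketch in Python (the 1.5 bits/char entropy figure for English is a commonly cited rough estimate, not a measured value):

```python
import math

key_bits = 128                       # H(K): entropy of a 128-bit key
raw_rate = math.log2(26)             # ~4.70 bits/char for A-Z text
true_entropy = 1.5                   # rough estimate for English prose
redundancy = raw_rate - true_entropy # D, about 3.2 bits/char

# ciphertext length (in characters) beyond which the key is,
# in principle, uniquely determined -- roughly 40 characters here
unicity = key_bits / redundancy
```

Note this is a theoretical bound on when a unique solution exists, not a statement about the cost of actually finding it, which is exactly the gap the poster is pointing at.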

-- 
"when the truth walks away everybody stays
 cause the truth about the world is that crime does pay
 so if you walk away who is gonna stay
 cause i'd like to make the world be a better place"



------------------------------

From: James Frey <[EMAIL PROTECTED]>
Subject: Re: FSE-6 Report: Slide Attack
Date: Mon, 19 Apr 1999 22:23:33 -1000

[EMAIL PROTECTED] wrote:
> 
> > Well, known-plaintext attacks are sometimes possible 'in the field', and if
> > one has enough known plaintext, the plaintext one might wish to choose may
> > already happen to be among the known plaintexts.
> >
> > And against some devices, such as smartcards, or against a utility where
> > you are one of the legitimate users, but you want to eavesdrop on other
> > users, chosen-plaintext attacks are possible.
> >
> 
> So I encode a file with a program, and send the file, where does this attack
> come in?
> 
> I am a secure modem, I encrypt data the user gives me, I send it, I also
> received and decrypt with another key.  Where does this attack come in?
> 
> Thanks,
> Tom
> 

The Slide Attack

Summarized from the FSE-6 Conference in March, 1999, Rome

The authors were Alex Biryukov and David Wagner at the Fast 
Software Encryption Workshop #6. The Slide Attack is done 
using known plaintexts and known ciphertexts. Below, two 
encryptions are put side by side, with a one round offset slide. 
F is the round function:

P0
F
P1  P0'
F   F
P2  P1'
F   F
P3  P2'
F   F
P4  P3'
F   F
C   P4'
    F
    C'

Pj is the intermediate value after j rounds (P0 is the plaintext); C is the ciphertext.

The analyst does not have access to the inner workings
of the hardware or software holding the key, but does
know the cipher's design specification.
For an n-bit block, the analyst has roughly 2^(n/2)
matching plaintext/ciphertext pairs. The cryptanalyst
can check whether it is possible that F(P0)=P0'. This
checking requires that a round be "weak" enough that
logic will rule out most candidate P, C pairs. The check is
made feasible because F(P0)=P0'  AND  F(C)=C' must be
simultaneously consistent for the same key.

Example: you are cracking DES and you have 5 billion 
pairs of ciphertext blocks and the matching plaintext 
blocks. Take the first plaintext and the second plaintext.
Using digital logic, decide whether some key could take the first
plaintext through one round to the second plaintext, and whether
that same key is consistent with the two matching slid ciphertexts.
If logic shows that it cannot be done, move on to the third
plaintext. After a candidate slid pair is found, calculate any key
that makes it logically consistent, and try that key on other
pairs to confirm. One success yields some key bits; more can be
obtained by further work on inner rounds, depending on the key
schedule.
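As a toy illustration of the procedure (not DES: a hypothetical 16-bit cipher whose rounds all use the identical round function, which is exactly the self-similarity the slide attack needs), the key can be recovered from known plaintext/ciphertext pairs in a few lines of Python:

```python
import random

BITS, MASK, R = 16, (1 << 16) - 1, 3

def rotl(x, r):
    return ((x << r) | (x >> (BITS - r))) & MASK

def rotr(x, r):
    return ((x >> r) | (x << (BITS - r))) & MASK

def F(x, k):
    # one round: add the key, rotate -- identical in every round,
    # with no key schedule, so the cipher "slides" against itself
    return rotl((x + k) & MASK, R)

def encrypt(p, k, rounds=32):
    for _ in range(rounds):
        p = F(p, k)
    return p

def slide_attack(pairs):
    # a slid pair (P, C), (P', C') satisfies F(P) = P' and F(C) = C';
    # each condition solves uniquely for k, and the two must agree
    for (p, c) in pairs:
        for (p2, c2) in pairs:
            k1 = (rotr(p2, R) - p) & MASK   # from F(p) = p2
            k2 = (rotr(c2, R) - c) & MASK   # from F(c) = c2
            if k1 == k2 and encrypt(pairs[0][0], k1) == pairs[0][1]:
                return k1                   # confirmed on another pair
    return None

random.seed(1)
key = random.randrange(1 << BITS)
# ~2^(n/2) pairs suffice; 1200 of 65536 gives ~20 expected slid pairs
pairs = [(p, encrypt(p, key)) for p in random.sample(range(1 << BITS), 1200)]
recovered = slide_attack(pairs)
```

The attack never looks at the number of rounds, which is the striking point of the FSE-6 paper: sliding defeats repetition itself, not any particular round count.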

------------------------------

Date: Tue, 20 Apr 1999 16:20:31 -0400
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Question on confidence derived from cryptanalysis.

Geoff Thorpe wrote:
> 
> Now, statement (1) is wrong. Maybe you cannot make estimates, and maybe
> you do not like the estimates others may employ. But there are ways to
> make estimates whose rationalisation is acceptable to those involved.
> That includes me. You also referred in that post to a complete lack of
> evidence but I think you yourself would be well positioned to refute
> that. Take every damned cipher you ever heard of (with any degree of
> cryptanalysis against it), exercise some personal judgement as to some
> measure of time+effort that the cipher was subjected to (by publishing
> authors - obviously not the spooks) before it became widely regarded as
> unacceptable, and take a look at the resulting distribution. That may
> not be a precise science, and of course it involves warm-fuzzy personal
> interpretations (time+effort) but it is not unacceptable for many
> people, particularly those who would otherwise be rendered with NO
> effective way to evaluate. I dare say that your distribution, if you've
> made semi-reasonable interpretations along the way, will show that a
> cipher that lasted 10 years had a much better chance of lasting another
> year than the average "expected life". It's a very basic and common
> mathematical model/argument, and it's common sense.

An interesting concept.  But in the absence of evidence I believe it to
be wrong.  Other than the obvious infant mortality due to negligence on
the part of many cipher designers, I'd bet predictions described above
are not very strong.

My reasoning is that as a cipher gains the confidence of the community
by surviving the gauntlet of previously known and/or straightforward
attacks, it will emerge from the background of many such ciphers and be
attacked in a more serious way.  The degree of eminence attracting more
and more attention indicates that the share of offensive effort aimed at
the cipher will continue to grow.

Given that there are few ciphers that have survived "the gauntlet" for a
respectable period of time compared to the many ciphers without that
maturity, the odds look to me much as Ritter described them.  If I pick
a young cipher, it may be broken tomorrow.  If I pick an elderly cipher
it may be broken tomorrow.

The appropriate metric for this kind of confidence is the expected wait
for failure.  Your claim amounts to the statement that the expected wait
is longer for an elderly cipher than for a young one.  I'm not
comfortable with that.

It is an interesting thesis for study.  Perhaps one of the frequent
crypto students might pick it up, define the criteria for judgement, and
produce the results of the historical survey you suggested.  Then we
could discuss the merits of a carefully defined (tailored?) set of
criteria.

> 
> I've already explained why I think that (2) is wrong - nobody knows any
> of this stuff FOR SURE, but you make a call when you don't have perfect
> information. Our Opponents are just well-paid versions of us, most of
> whom probably grew up around us, and who find their occupations not too
> unfathomably unethical to suffer day by day. I still maintain that what
> we can do and achieve is a statistical, probabilistic, and "confidence"
> variable that does not run along independently of theirs. Depends how
> much George Orwell you read though ...
> 
> > > like to play and write papers. "They" just want information, and I'm
> > > guessing just do whatever they got to do to get it - and searching
> > > endlessly for little theoretical weaknesses is probably not their top
> > > priority. That's not to say they don't do it and do it very well, but I
> > > doubt their considerable advantages in resources are put so much to this
> > > task as to make our abilities so incomparable or unrelated as some might
> > > believe.
> >
> > A good point.  However, we cannot deal with their (secret) intentions,
> > but must anticipate their possible (even more secret) capabilities.
> > Thus amplifying the threat model is a sensible thing to do.  It
> > eliminates some of the risk of catastrophically underestimating them by
> > enhancing the risk of expensively overestimating them.
> 
> Sure thing - but the whole system does not collapse down to a binary
> system of "broken" and "not-broken-yet" ... as you say, you put together
> a threat model ... consistent with your requirements and using a chosen
> method for judging a components "worth", and amplify it here and there
> as appropriate. A lot like putting together a cost-proposal I guess ...
> add in your known prices, choose an acceptable value for the "unknowns",
> amplify the costs of all the "risky" bits, add x% profit on top - and
> then bang another 30% on top for good measure, and generally covering
> your butt some more.

A good model for planning the construction of a bridge or any major
project.  But I can't apply this model to the construction of a security
system, because I have no clue how to make something 30% stronger.  Do
you?

> 
> > It appears to me that he *does* agree (tho he can certainly speak for
> > himself), which is why he has repeatedly proposed the use of multiple
> > ciphers both to spread eggs across baskets, and to provide layered
> > security where warranted.
> 
> 3 ciphers strung in a line is, to me, a cipher. You need all three in
> the same place and in the same order to have anything other than a
> "noise generator". Breaking 3 ciphers should be no more difficult than
> breaking one well designed one using 3 different stages

Here we part company.  Your statement assumes that we will choose three
ciphers whose strength adds up to the strength of one cipher that is well
designed and uses several stages.  But this is not the situation we
face.

The situation we face is that we have dozens of reasonably strong
ciphers, whose relative strengths are immeasurable.  We may theorize
about their relative merits, but we can't measure their strength in any
fundamental sense.  Given this I believe it makes sense to reduce risk
of catastrophic failure by using a composite of the strongest ciphers of
which we are aware.

In the limit (reductio...), we'd use all of the ciphers that we believe
to be independent.  The resulting monster would have severe
implementation problems, but these are engineering issues that *can* be
measured and shown solved.  The creation of such a monster would be a
never ending task because new ciphers worthy of inclusion are
continually developed.  This approach would limit the risk of
catastrophic failure to the absolute minimum.

Now the monster described above is a silly example, but it is not at all
absurd.  This is why I believe Ritter's concept of composite ciphers has
real value.
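For what it is worth, the composite ("cascade") construction described above is mechanically trivial; the hard part is only key management and interoperability engineering. A toy sketch (the three layers here are deliberately simple placeholders, not real ciphers; with independent keys, the intent is that the cascade stays strong while any single layer does):

```python
# Each layer is an (encrypt, decrypt) pair with its own independent key.
def xor_e(data, k):
    # XOR is its own inverse, so it serves as both encrypt and decrypt
    return bytes(b ^ k[i % len(k)] for i, b in enumerate(data))

def add_e(data, k):
    return bytes((b + k[i % len(k)]) & 0xFF for i, b in enumerate(data))

def add_d(data, k):
    return bytes((b - k[i % len(k)]) & 0xFF for i, b in enumerate(data))

LAYERS = [(xor_e, xor_e), (add_e, add_d), (xor_e, xor_e)]

def cascade_encrypt(data, keys):
    for (enc, _), k in zip(LAYERS, keys):
        data = enc(data, k)
    return data

def cascade_decrypt(data, keys):
    # undo the layers in reverse order with the matching keys
    for (_, dec), k in zip(reversed(LAYERS), reversed(keys)):
        data = dec(data, k)
    return data
```

Swapping any layer for a stronger cipher changes nothing structural, which is the engineering point being made: the composition itself is not where the difficulty lies.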

 (if a cipher is
> based on one "idea", "primitive", or whatever then your vulnerability
> must surely be higher than distinct ideas employed serially?). It seems
> the argument put forth was more one of splitting the traffic
> (conceptually across time and application, not packet by packet I
> assume) across ciphers, and rotating the old out and the new in on a
> regular basis. I see this as unacceptable in a real-world scenario for
> reasons of interoperability & standardisation, as well as security.

No.  Absolutely not.  We already have tools that support multiple
ciphers.  We know how to extend and manage that process.  It is not
perfect, but it is "merely" engineering effort.  Thus there are
existence proofs that the interoperability and standardization issues
to which you refer are minor compared to the catastrophic failure
that is the anti-objective of all our security efforts.

The only addition required to implement dynamic substitution of ciphers
is a minor refinement of the existing cipher management capabilities. 
Automating what users can already do manually does not create the sort
of problems that will defeat modern engineering, and does not compromise
security in the least.  This issue is a red herring.


> 
> Cheers,
> Geoff

------------------------------

From: [EMAIL PROTECTED] (Paul Rubin)
Subject: Re: RC6 new key standard from AES conference?
Date: Tue, 20 Apr 1999 05:40:05 GMT

In article <7fgf1v$9pn$[EMAIL PROTECTED]>,  <[EMAIL PROTECTED]> wrote:
>> What debacle was that?  I missed something.
>> I think it is important that the AES perform reasonably well on smartcards.
>>
>See the "Live from the Second AES conference" thread. In general the idea is
>this: you only need AES on a smartcard if you need high levels of security;
>but then the recently discovered practical attacks against smartcard
>implementations of ciphers (particularly DPA) make today's smartcard
>technology of little use for high security applications.

I don't think that makes sense.  The smartcard hardware attacks
assumed a hostile card reader.  Maybe you are using smartcards for key
management where you control the reader.  Or perhaps you're not
literally using a smartcard electrical interface, but you want to use
a smartcard processor in some other type of device to hold encryption
keys.  If the AES can't run acceptably on a smartcard, then the
smartcard will have to use a different algorithm, and the implementer
is left having to choose the alternate algorithm, precisely what the
AES hoped to avoid.  Even so, it's cumbersome to have to use a
separate algorithm on the smartcard.

Anyway, for most applications of AES on workstation processors, I
suspect the keys will be stored on disk.  That is almost certainly
less secure than any smartcard implementation.  That isn't necessarily
bad, since not all applications require really high security.  What
it means is that AES is intended to be used where the security is high
and also where it is not so high.  So it should work on smartcards
despite smartcards' vulnerability to hardware attacks.  Workstations
after all are also vulnerable to such attacks.

------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Question on confidence derived from cryptanalysis.
Date: Tue, 20 Apr 1999 05:52:19 GMT


On Tue, 20 Apr 1999 00:28:14 -0400, in <[EMAIL PROTECTED]>,
in sci.crypt Geoff Thorpe <[EMAIL PROTECTED]> wrote:

>[...]
>Terry Ritter said:
>> In summary: 1) We cannot estimate the probability that an effective
>> attack exists which we did not find; and  2) We cannot estimate the
>> probability that even if such an attack does exist, our Opponents can
>> find it and use it.  I thus claim that we CAN know nothing of the
>> probability of future cipher failure, and cannot even reason that this
>> probability is "small."  The practical consequence of this is that we
>> cannot trust any cipher.  
>
>I disagree - and I disagree with every sentence moreover. I may not
>design ciphers but I can definitely slug it out with most people
>regarding probability theory, statistics, and logic. 

You may be willing to "duke it out," as though this were some sort of
winner-take-all contest, but if you believe your logic is compelling,
you will have to think again.  Not only am I not compelled, I am
appalled to see you repeating things over and over, in the apparent
illusion that this has some relation to logic or scientific argument.


>I also have to
>assist with various API designs and have been on the (l)using end of
>quite a few if we want to talk standards, picking algorithms, and
>covering butts (oh yeah, I've done quite a bit of Risk Management
>related stuff too).

What a guy you are, I'm sure.  Let's get on with it:

Recall that my position does not rest upon an estimation of someone
else's capabilities.  It is not my *opinion* that any cipher we have
*might* possibly break -- that is fact.  I assume the worst case, and
propose systems to provide strength even then.  

Your position, dare I state it, is that you *can* estimate the
capabilities of your Opponents.  You also say you can estimate the
future strength of a cipher from past tests.  But for all this
claiming, we see no similar statements in the scientific literature.
So these are simply your opinions, and I see no supporting facts.  


>Now, statement (1) is wrong. 

Which was:  "1) We cannot estimate the probability that an effective
attack exists which we did not find."

Since you think this is wrong, you must believe we *can* make an
estimate.  Fine.  Do it.  Show me.


>Maybe you cannot make estimates, and maybe
>you do not like the estimates others may employ. But there are ways to
>make estimates whose rationalisation is acceptable to those involved.
>That includes me. 

Alas, what people believe is not science.


>You also referred in that post to a complete lack of
>evidence but I think you yourself would be well positioned to refute
>that. Take every damned cipher you ever heard of (with any degree of
>cryptanalysis against it), exercise some personal judgement as to some
>measure of time+effort that the cipher was subjected to (by publishing
>authors - obviously not the spooks) before it became widely regarded as
>unacceptable, and take a look at the resulting distribution. That may
>not be a precise science, and of course it involves warm-fuzzy personal
>interpretations (time+effort) but it is not unacceptable for many
>people, particularly those who would otherwise be rendered with NO
>effective way to evaluate. I dare say that your distribution, if you've
>made semi-reasonable interpretations along the way, will show that a
>cipher that lasted 10 years had a much better chance of lasting another
>year than the average "expected life". It's a very basic and common
>mathematical model/argument, and it's common sense.

Oddly, no such study has appeared in the literature.  That seems
somewhat strange, since you say it is very basic common sense.
Perhaps everyone else in cryptography has simply been blinded to this
fundamental truth.  When will you write it up for us?

You are arguing your opinion about cipher strength.  (Recall that I do
not argue an *opinion* about cipher strength, but instead the *fact*
that any cipher may be weak.)  If you have compelling factual
evidence, I will support it.  Show me the correlation you say exists.
Prove it.  Then you can use it.  


>I've already explained why I think that (2) is wrong - nobody knows any
>of this stuff FOR SURE, but you make a call when you don't have perfect
>information. 

Nobody has any problem with you making a call for yourself and risking
only yourself.  But if this "call" is intended to formulate what
"should" happen for much of society, you may need to revise your
estimate as to the consequences of failure.  Just how much disaster
are you willing for us to have?  

Will it be OK for everyone to use the single standard cipher which you
predict is strong, if you turn out to be wrong?  Will it be OK when
communications grind to a halt and incompatible low-security temporary
measures are instituted everywhere while a new cipher is integrated
into all the programs which must be replaced throughout society?  Is
that OK with you?

>Our Opponents are just well-paid versions of us, most of
>whom probably grew up around us, and who find their occupations not too
>unfathomably unethical to suffer day by day. 

This is simply breathtaking:  It is appallingly, plainly, immediately
false even to the most casual observer.  People do differ, we do have
limitations others do not have, and others often do take advantage of
knowledge to which we have no access.  


>[...]
>Sure thing - but the whole system does not collapse down to a binary
>system of "broken" and "not-broken-yet" ... as you say, you put together
>a threat model ... consistent with your requirements and using a chosen
>method for judging a components "worth", and amplify it here and there
>as appropriate. A lot like putting together a cost-proposal I guess ...
>add in your known prices, choose an acceptable value for the "unknowns",
>amplify the costs of all the "risky" bits, add x% profit on top - and
>then bang another 30% on top for good measure, and generally covering
>your butt some more.

Write whatever numbers you want: you cannot support them.


>> It appears to me that he *does* agree (tho he can certainly speak for
>> himself), which is why he has repeatedly proposed the use of multiple
>> ciphers both to spread eggs across baskets, and to provide layered
>> security where warranted.
>
>3 ciphers strung in a line is, to me, a cipher. 

The distinction is that each cipher is an independent and separable
element which can be "mixed and matched" with any other.  Each cipher
is tested as an independent unit, and brings whatever strength it has
independent of internal ciphering requirements.  Dynamic mixing and
matching prevents having any fixed target to attack.  


>You need all three in
>the same place and in the same order to have anything other than a
>"noise generator". Breaking 3 ciphers should be no more difficult than
>breaking one well designed one using 3 different stages 

Really?  Tell me more about how ciphers are designed.  How long did
you say you have been doing this?  How many ciphers have you designed?
Have you measured them?  Where can we see your work?

>(if a cipher is
>based on one "idea", "primitive", or whatever then your vulnerability
>must surely be higher than distinct ideas employed serially?). It seems
>the argument put forth was more one of splitting the traffic
>(conceptually across time and application, not packet by packet I
>assume) across ciphers, and rotating the old out and the new in on a
>regular basis. I see this as unacceptable in a real-world scenario for
>reasons of interoperability & standardisation, as well as security.

What you mean is that *you* do not know how such a system would be
standardized and made interoperable.  Then you imply this means
*nobody* *else* could do it either.  

I note that this is precisely how you reason about the cryptanalytic
capabilities of others:  It is false here, there, and everywhere you
present it.  You are not the best example, not everyone is like you,
and it is invalid to assume others have your limitations.  

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
