Cryptography-Digest Digest #33, Volume #11        Tue, 1 Feb 00 20:13:02 EST

Contents:
  Re: NIST, AES at RSA conference (Terry Ritter)
  Re: Block chaining (zapzing)
  Re: Is the following system acceptable for "casual" encryption? (Guy Macon)
  Re: NIST, AES at RSA conference (Bryan Olson)
  See my message at alt.politics.org.cia (Markku J. Saarelainen)
  Re: Is the following system acceptable for "casual" encryption? (John Savard)
  Re: Available Algorithms (John Savard)
  Re: NIST, AES at RSA conference (John Savard)
  Re: See my message at alt.politics.org.cia (John Savard)
  Re: NIST, AES at RSA conference (Shawn Willden)
  Re: Jaws Technologies' L5 Data Encryption Algorithm? (Keith A Monahan)
  Re: Does the NSA have ALL Possible PGP keys? (Ed Pugh)
  Re: The Best Books (Uri Blumenthal)
  Re: Biggest keys needed (was Re: Does the NSA have ALL Possible PGP  ("Trevor Jackson, III")

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: NIST, AES at RSA conference
Date: Tue, 01 Feb 2000 22:02:18 GMT


On Tue, 01 Feb 2000 11:04:07 -0700, in
<[EMAIL PROTECTED]>, in sci.crypt Shawn Willden
<[EMAIL PROTECTED]> wrote:

>Terry Ritter wrote:
>
>> If cryptanalysis would be more effective by adding a cipher to the
>> cipher being attacked, we would see that proposed as an attack
>> technique.  We do not see that.
>
>We don't see that, but is it because it's not effective or because it's not
>explored?  It seems to me that to some degree this leads us back into the same
>quandary.  How do we know that layered ciphers are strong?  Because no one has
>published an attack that exploits layering.

But cryptanalysis is neither magic nor random.  We don't just throw
things into a pot and see what we get.  There is a purpose behind
attacks, and that is to exploit some analyzed (found) weakness of a
cipher.  Adding another cipher layer does not add weakness to the
previous layer(s).  

Of course, if the new cipher is actually the *inverse* of the previous
layer, that could -- with the right mode and key -- actually un-do
what has been done.  But that is almost unthinkable if the ciphering
layers are structurally different, and unbelievable when layers use
independent keys.  Waiting for the ciphering system to use the same
key for two ciphering layers is not an attack strategy.  That would be
basically the same as hoping a random key will unlock the cipher,
which is a risk inherent in all ciphering as we know it.  

But even when there is a known weakness in a cipher, that weakness may
not be exploitable when the cipher is used in a multi-cipher stack.
Normally, the best attacks on block ciphers are of the known-plaintext
or defined-plaintext varieties, because those have the most
information with which to work.  So if we can avoid exposing both
plaintext and ciphertext for any cipher, the best attacks cannot be
used, which means the opponent has to do something else.  


>It's possible (I think unlikely, but then I'm a clueless newbie) that the only
>reason such attacks haven't been published is because no one has been trying,
>and that the reason no one has been trying is because layering is not common.
>Suppose someone did publish an attack that showed some considered-to-be-secure
>algorithm to be weak if the ciphertext were further enciphered using some
>other algorithm, but with the same key.  Wouldn't the response be: "Why would
>you do that?".

*I* think the response to such a result would be fall-over
astonishment -- and then a careful line-by-line analysis of the paper.
The implication would be that random-like functions are not behaving
as random functions.  It would seem to imply some sort of related
structure in ciphers which should not be related at all.  

Multi-ciphering is not new.  What is new is that we have the cycles
available to do it as the expected mode.  


>To summarize the point I'm trying (probably badly) to make:  The claim of
>added strength that may arise from layering ciphers seems indistinguishable
>from the claim of added strength that may arise from creating more
>sophisticated ciphers (with more layers, rounds, etc.).  

There is some truth to that:  If we have 3 layers of cipher, we can
consider that to be a different cipher.  But then we can't test the
layers independently, and do not reap any benefit from the ability to
change ciphers and thus have n**3 cipherings with n different ciphers.
Considering multiple layers as one cipher also obscures the role of
independent keys for the layers.  
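The n**3 count is easy to check directly; a toy enumeration (the cipher names below are placeholders for illustration, not an endorsement of any particular stack):

```python
from itertools import product

# Three candidate ciphers (placeholder names); with independent keys,
# every ordering of three layers is a distinct composite ciphering.
ciphers = ["Blowfish", "SAFER", "IDEA"]          # n = 3
stacks = list(product(ciphers, repeat=3))        # all 3-layer stacks
print(len(stacks))                               # n**3 = 27
```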


>In both cases, the
>*only* recourse we have to evaluate the security of the result is
>cryptanalysis, which we know to be ultimately ineffective.  

No.  I think we know that a cipher which cannot be attacked in the
most favorable ways is, in some sense, "stronger" than if it could be
so attacked.  I think we know that it is going to be a great deal more
trouble to force an attack on a cipher in a stack unless the other
ciphers have already been solved -- but each of the other ciphers must
be solved first, and they will have had full protection.  


>Further, it seems
>that even that ineffective evaluation has not been applied to layered cipher
>systems, because common wisdom says that it's unnecessary.

That is more than common wisdom, that is the basis for understanding
cipher operation:  We expect ciphers to behave as key-selected
permutations from data to ciphertext, or random 1:1 functions.  It is
hard to even imagine any latent structure in these functions which
would be more exploitable in a multi-cipher stack than if it stood
alone.  But we do want all the analysis we can get.  


>OTOH, I believe (without any justification whatsoever) that in nearly every
>case a stack of ciphers is at least as strong as the strongest cipher in the
>stack, and a dynamically changing stack would be stronger yet.  It appears
>that Mr. Ritter believes this as well.

Yes.


>However, I also believe (again without real justification) that our key
>management practices are far weaker than our ciphers.

To agree that key management practices are weaker than ciphers, one
would seem to need knowledge of the weakness of ciphers -- and that is
knowledge one cannot have.  So this is an inherently insupportable
belief, but -- strangely -- one that academics seem more than willing
to embrace, and that is part of the problem.  

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: zapzing <[EMAIL PROTECTED]>
Subject: Re: Block chaining
Date: Tue, 01 Feb 2000 22:03:19 GMT

In article <[EMAIL PROTECTED]>,
  "Douglas A. Gwyn" <[EMAIL PROTECTED]> wrote:
> zapzing wrote:
> > You know I , personally, have
> > always wondered why people bother to do
> > block chaining at all. If you had a
> > good block size, say 256 bits, you
> > could make half the block be random
> > numbers and half be plaintext.
> > Then no chaining would be necessary,
> > and an error in one block would not
> > propagate through the whole
> > subsequent message. Also you could
> > use parallel processing to encrypt
> > the message, something to think
> > about for the future.
>
> Chaining modes aren't intended for error recovery,
> although many of them do provide a measure of that
> as a side effect.  The main purpose of chaining is
> to thwart various cryptanalytic attacks.  If every
> block is encrypted with exactly the same key, some
> attacks can exploit that.
>

Thank you. They are not intended for
error recovery. In other words,
"that's not a bug, it's a feature."
The random padding would thwart a
ciphertext-only attack, if that's
what you are talking about.

> Also, padding to such an extent with random data
> is a waste of bandwidth.
>

No, it is a use of bandwidth. Last time
I checked storage capacities and
bandwidth were growing by leaps and
bounds.

--
Do as thou thinkest best.


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: [EMAIL PROTECTED] (Guy Macon)
Subject: Re: Is the following system acceptable for "casual" encryption?
Date: 01 Feb 2000 17:20:58 EST

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] (David 
Goodenough) wrote:
>
>First of all, what do I mean by casual encryption?  To my somewhat
>naive way of thinking, I split the world of people who want to decrypt
>my data into two groups: the NSA and everyone else.  Casual encryption
>should keep out the second group, but is not expected to keep out the
>first.
>
>Having said that, this is the system I have in mind:
>
>Have the user enter a password.
>
>Hash the password using a hashing algorithm, e.g. SHA1.
>
>Take the first 128 bits of the hash, and use as a block key for a
>symmetric cypher, e.g. SAFER.
>
>Are there any obvious flaws in this system?  Will the two examples
>above provide reasonable security, or should I look elsewhere?

No. Yes. Yes.

If you want a simple yet strong scheme and wish to write the code,
I suggest ciphersaber.  If you just want good security, use PGP.
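For concreteness, the scheme from the question is only a few lines (sketched here in Python with hashlib; the passphrase is a placeholder, and whatever cipher consumes the key, the security is bounded by the entropy of the passphrase itself):

```python
import hashlib

def derive_key(passphrase: str) -> bytes:
    """Hash the passphrase with SHA-1 and keep the first 128 bits."""
    digest = hashlib.sha1(passphrase.encode("utf-8")).digest()  # 160 bits
    return digest[:16]                                          # first 128 bits

key = derive_key("correct horse battery staple")  # example passphrase
print(len(key) * 8)   # 128
```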


------------------------------

From: Bryan Olson <[EMAIL PROTECTED]>
Subject: Re: NIST, AES at RSA conference
Date: Tue, 01 Feb 2000 22:29:02 GMT

Terry Ritter wrote:
>
> David Wagner wrote:
>
> >Your post still doesn't address Bryan Olson's elegant
> >one-line refutation:
> >   ``If all unprovable ciphers must be tripled, then we
> >     must triple our triples of triples with no end.''
>
> The statement is neither elegant nor refutation:  Since I have never
> claimed that "all unprovable ciphers must be tripled," the apparent
> response is no response at all;

It was presented as a response to the two alternatives John
Savard presented, not to you.  No one cited what you put in
quotes as something you claimed.  The parts that refer to
your position are the ones I flagged in the text as my
reading of your position.

> it is instead the introduction of a
> "red herring."  It is a deliberate attempt to mislead the reader about
> my position and somehow "win" the argument.

Actually what I'm saying is that I see your stated position
as inconsistent.  The point here is that the arguments you
gave for tripling ciphers still apply given ciphers that are
themselves triple-ciphers. A cipher made by tripling three
ciphers is itself one cipher and I did not see in the
argument for tripling any consideration for what the
component ciphers may already do to thwart attacks or for
their efficiency.

Is there a reason not to re-triple other than efficiency?
Ultimately we have to make a judgment trading off unprovable
assessments of safety against practicality.  As Dave Wagner
pointed out, it is not clear that triple-ciphers are a good
deal.


--Bryan
--
email: bolson at certicom dot com



------------------------------

From: Markku J. Saarelainen <[EMAIL PROTECTED]>
Crossposted-To: soc.culture.israel
Subject: See my message at alt.politics.org.cia
Date: Tue, 01 Feb 2000 22:34:31 GMT



See my message at alt.politics.org.cia .. you shall be delighted !

My regards also to all those cryptography researchers and
cryptoanalysts with whom I have communicated.

Best regards,

Markku



------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Is the following system acceptable for "casual" encryption?
Date: Tue, 01 Feb 2000 15:44:31 GMT

David Goodenough <[EMAIL PROTECTED]> wrote, in part:

>Are there any obvious flaws in this system?  Will the two examples
>above provide reasonable security, or should I look elsewhere?

That is the proper basic way to do it. Others have noted that you
should encourage a pass _phrase_ rather than a password. I'll also
note that using CBC mode instead of ECB mode is a good idea, as it
means that identical blocks are no longer visible.
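The point about identical blocks is easy to demonstrate with a toy 16-byte "block cipher" (a keyed XOR pad plus a byte rotation; deliberately insecure, it only stands in for a real cipher to show the difference between the modes):

```python
import hashlib

def toy_block_encrypt(block: bytes, key: bytes) -> bytes:
    # Toy 16-byte "block cipher": XOR with a key-derived pad, then
    # rotate one byte.  Invertible but utterly insecure; used here
    # only to illustrate ECB vs. CBC behavior.
    pad = hashlib.sha256(key).digest()[:16]
    x = bytes(a ^ b for a, b in zip(block, pad))
    return x[1:] + x[:1]

def ecb(blocks, key):
    return [toy_block_encrypt(b, key) for b in blocks]

def cbc(blocks, key, iv):
    out, prev = [], iv
    for b in blocks:
        # XOR each plaintext block with the previous ciphertext block
        x = bytes(a ^ c for a, c in zip(b, prev))
        prev = toy_block_encrypt(x, key)
        out.append(prev)
    return out

key = b"example key"
iv = b"\x00" * 16                     # fixed IV, for demonstration only
blocks = [b"ATTACK AT DAWN!!"] * 2    # two identical 16-byte blocks

e = ecb(blocks, key)
c = cbc(blocks, key, iv)
print(e[0] == e[1])   # True: ECB leaks the repetition
print(c[0] == c[1])   # False: CBC hides it
```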

John Savard (jsavard<at>ecn<dot>ab<dot>ca)
http://www.ecn.ab.ca/~jsavard/crypto.htm

------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Available Algorithms
Date: Tue, 01 Feb 2000 15:42:09 GMT

"Simon R. Love" <[EMAIL PROTECTED]> wrote, in part:

>My basic question, of all the well known algorithms, which ones are
>available for use without patent, copyright, trade secret etc which means
>its going to cost me ?

As someone else noted, Twofish and Blowfish are available freely. So
are some of the other AES candidate algorithms, but not all of them.
My web site explicitly mentions the AES candidates which are
available, which include LOKI 97, Rijndael, FROG, SAFER+, and SERPENT
in addition to Twofish.

As well, SAFER is freely available, and so are DES and Triple-DES.

If you really want to be adventurous, there are the Quadibloc family
of algorithms, but they're not "well known", so they have not received
much cryptanalytic attention. Nor have they been implemented
(with one exception, Quadibloc S, which I did in BASIC to produce some
test vectors).

John Savard (jsavard<at>ecn<dot>ab<dot>ca)
http://www.ecn.ab.ca/~jsavard/crypto.htm

------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: NIST, AES at RSA conference
Date: Tue, 01 Feb 2000 16:31:45 GMT

Shawn Willden <[EMAIL PROTECTED]> wrote, in part:

>Given independent keys.  What if the same key is used for all three?

That would be very silly, and would hardly say anything about the
value of using three different ciphers.

However, if one took a passphrase with sufficient entropy to key all
three, and took three different secure hash functions of it, that
would be the equivalent of independent keys.
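One reading of "three different secure hash functions" in code (the algorithm choices here are my illustration, not Savard's prescription, and the domain-separation variant at the end is an added assumption on my part):

```python
import hashlib

passphrase = b"a long pass phrase with plenty of entropy"

# Literally three different hash functions, truncated to 128-bit keys:
k1 = hashlib.sha1(passphrase).digest()[:16]
k2 = hashlib.md5(passphrase).digest()          # already 128 bits
k3 = hashlib.sha256(passphrase).digest()[:16]

# Similar effect with one hash and a per-layer label (domain separation):
keys = [hashlib.sha256(bytes([i]) + passphrase).digest()[:16]
        for i in range(3)]

print(len({k1, k2, k3}))   # 3 distinct keys
```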

John Savard (jsavard<at>ecn<dot>ab<dot>ca)
http://www.ecn.ab.ca/~jsavard/crypto.htm

------------------------------

From: [EMAIL PROTECTED] (John Savard)
Crossposted-To: soc.culture.israel
Subject: Re: See my message at alt.politics.org.cia
Date: Tue, 01 Feb 2000 16:38:31 GMT

Markku J. Saarelainen <[EMAIL PROTECTED]> wrote, in part:

>See my message at alt.politics.org.cia .. you shall be delighted !

A message telling me that cook books are good encryption manuals may
have an encryption algorithm steganographically concealed within it.

What a concept.

John Savard (jsavard<at>ecn<dot>ab<dot>ca)
http://www.ecn.ab.ca/~jsavard/crypto.htm

------------------------------

Date: Tue, 01 Feb 2000 16:56:03 -0700
From: Shawn Willden <[EMAIL PROTECTED]>
Subject: Re: NIST, AES at RSA conference

Terry Ritter wrote:
[stuff elided, to which I'll respond later, after some more thought]

> >However, I also believe (again without real justification) that our key
> >management practices are far weaker than our ciphers.
>
> To agree that key management practices are weaker than ciphers, one
> would seem to need knowledge of the weakness of ciphers -- and that is
> knowledge one cannot have.  So this is an inherently insupportable
> belief, but -- strangely -- one that academics seem more than willing
> to embrace, and that is part of the problem.

The fact that one would need knowledge of the weakness of our ciphers in order to
say that our key management practices are weaker is precisely why I added the
parenthetical comment about my lack of real justification.

However, given the key management practices I see on a daily basis, I think my
statement is supportable.  When a key that protects data worth millions of dollars
is stored in cleartext in a file on an internet-connected computer in a room
that is locked only occasionally...

I suspect that Twofish is stronger than that, if for no other reason than that I
could, with very little thought, planning or effort, steal that key.  I think I'd
have to scratch my head a little to crack Twofish.

Even though my example above is a bit extreme, I rarely see key management
processes and infrastructures that I would consider secure -- even at major banks,
which frightens me.

Shawn.


------------------------------

From: [EMAIL PROTECTED] (Keith A Monahan)
Subject: Re: Jaws Technologies' L5 Data Encryption Algorithm?
Date: 1 Feb 2000 23:59:50 GMT

Bob,

I'm not familiar with the product or their supposed encryption algorithm,
but I *AM* familiar with this type of snake oil marketing.  Maybe it's
a good algorithm, maybe it's not.  Check the snake oil FAQ for further details.

The two paragraphs that set off the snake oil detector were:

Jaws Technologies has applied for patents for their technology, which
requires more permutations to crack than the scientific community
currently has a number for, said Robert Kubbernus, CEO of Jaws.

The U.S. government restricts software for export to 56-bit encryption
in most uses, and 128-bit for specific circumstances, but does not
prohibit imported software using higher encryption. The 'L5' in the
product name refers to level 5, or the U.S. government's highest rating
for security, said Kubbernus.

Read for yourself at http://www.winmag.com/news/1999/0201/0203a.htm.

Oh and their web site is pretty funny too.

http://www.jawstech.com

"Thanks to the statistically unbreakable nature of the JAWS 4096 bit
encryption algorithm, the confidentiality of files created and
maintained in your "

"The L5 methodology is claimed to be able to withstand the publication
of its methodology. Possession of this documentation and the source
code, by any party, would not reduce the strength of the encryption,
for any other party. "

Funny, I didn't see any source posted.

Keith

[EMAIL PROTECTED] wrote:
: Can anyone tell me whether or not the claims from Jaws Technologies
: respecting their "Jaws Technologies' L5 Data Encryption Algorithm" are valid?

: Has it been broken, and if so, what was the time required to perform the
: break, and what equipment was used.   Whenever I see claims for a proprietary
: algorithm I tend to view them with a very jaundiced eye, and this is one of
: those cases...

: Thanks in advance,

: Bob

:  
: R.S. (Bob) Heuman      -     Willowdale, ON, Canada
: ===================================================
: <[EMAIL PROTECTED]>      or       <[EMAIL PROTECTED]>
:                   Copyright retained.
:              My opinions - no one elses...
:  If this is illegal where you are, do not read it!

------------------------------

From: [EMAIL PROTECTED] (Ed Pugh)
Crossposted-To: misc.survivalism
Subject: Re: Does the NSA have ALL Possible PGP keys?
Date: 2 Feb 2000 00:01:48 GMT
Reply-To: [EMAIL PROTECTED] (Ed Pugh)

cfm ([EMAIL PROTECTED]) wrote:
> What's the big deal, any one of us who wishes to spend the time can
> generate all possible PGP keys.

No.  That's not true.

No one person could ever possibly hope to generate all possible PGP
keys, even if (s)he *were* willing to spend the time.  A little thing
called "death" gets in the way.

It would take too many gazillions of googols of lifetimes (or whatever
the ridiculously huge number would be; someone else can do the math) to
generate all possible PGP keys.

Then, there is the small problem of data storage.  There just isn't
enough material in the whole universe (let alone, Fort Meade) to store
all of those big numbers.

So, you see, it really *is* (quite literally) a *BIG* deal.  It is
impossible for any person or organisation to possess all possible
PGP keys.
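The storage point survives a rough back-of-the-envelope check (the atom count is a common order-of-magnitude estimate, and 1024-bit moduli stand in here for PGP key sizes of the day):

```python
import math

ATOMS_IN_UNIVERSE = 1e80            # common order-of-magnitude estimate
log10_keys = 1024 * math.log10(2)   # ~decimal digits in 2**1024

# Even storing one key per atom falls short by hundreds of orders
# of magnitude:
shortfall = log10_keys - math.log10(ATOMS_IN_UNIVERSE)
print(round(shortfall))   # ~228
```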
--
Ed Pugh, <[EMAIL PROTECTED]>
Richmond, ON, Canada (near Ottawa)
"Bum gall unwaith-hynny oedd, llefain pan ym ganed."
(I was wise once, when I was born I cried - Welsh proverb)

------------------------------

From: Uri Blumenthal <[EMAIL PROTECTED]>
Subject: Re: The Best Books
Date: Tue, 01 Feb 2000 19:26:35 -0500
Reply-To: [EMAIL PROTECTED]

William Stallings wrote:
> You might consider my book,
> 
> Cryptography and Network Security: Principles and Practice, 2nd Edition
> (1999, ISBN 0-13-869017-0)
> 
> Winner of the 1999 Texty Award for the best Computer Science and
> Engineering textbook, awarded by the Text and Academic Authors
> Association, Inc.

I vouch for this book. A great addition to my library.
-- 
Regards,
Uri           [EMAIL PROTECTED]
-=-=-==-=-=-
<Disclaimer>

------------------------------

Date: Tue, 01 Feb 2000 19:41:14 -0500
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Biggest keys needed (was Re: Does the NSA have ALL Possible PGP 

Darren New wrote:

> Newsgroups trimmed...
>
> > much that it would exceed the Chandrasekhar limit and collapse into
> > a black hole... so you couldn't retrieve the data anyway".
>
> I suspect this has come up before, but is there an upper limit on how big
> keys need to be to be unbreakable via brute force? I.e., if I have a cypher
> that can't be attacked mathematically (yeah, I know, how do I know?) and
> brute-force key search is the only attack, is there a key size that due to
> physics will prevent someone from searching all the keys? For example, if
> the fastest operation consists of the time for one photon to cross the Planck
> length, and you assume you have computers that are the size of photons and
> pack the entire observable universe with them, perhaps you couldn't count to
> 2^800 or something?

This issue came up a few months ago.  If every possible position in the
observable universe is a computer that tests a key in the Fermi time and they
all run until the breakdown of protons (1e31 years by a stale theory), then you
need a key of ~870 bits to prevent it being found.
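The figure is easy to re-derive roughly (all constants below are order-of-magnitude assumptions, and "one key test per Planck time" is the generous limit from the question):

```python
import math

R_UNIVERSE = 4.4e26          # radius of observable universe, metres (assumed)
L_PLANCK   = 1.6e-35         # Planck length, metres
T_PLANCK   = 5.4e-44         # Planck time, seconds
T_PROTON   = 1e31 * 3.15e7   # ~1e31 years of running, in seconds

# Planck-volume "computers" packing the universe, each testing one key
# per Planck time until proton decay:
log2_computers = 3 * math.log2(R_UNIVERSE / L_PLANCK)
log2_tests     = math.log2(T_PROTON / T_PLANCK)
bits = log2_computers + log2_tests
print(round(bits))   # in the high 800s, near the ~870 quoted
```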

N.B., this is as close as I can envision to Ritter's "Cryptanalyst's Stone".

>
>
> Is that still big enough given quantum computing advances? Can it be? (The
> only quantum computers I've been able to understand are Feynman's
> description, which focuses more on reversibility than parallelism.)

QC gives you around a sqrt() advantage (Grover's search), so doubling the
key length yields about the same strength.


------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
