Cryptography-Digest Digest #112, Volume #12      Mon, 26 Jun 00 18:13:01 EDT

Contents:
  Re: Compression & Encryption in FISHYLAND (James Felling)
  Re: breaking encryption - help! ("Joseph Smith")
  Re: How Uncertain? (Future Beacon)
  Re: Key agreement in GSM phones (Paul Rubin)
  Re: Surrendering Keys, I think not. (S. T. L.)
  Re: Algo's with no easy attacks? ("Joseph Ashwood")
  Re: Compression and known plaintext in brute force analysis (restatements caused by 
the missing info .... thread) ("Joseph Ashwood")
  Re: breaking encryption - help! (Steve Basford)
  Re: Surrendering Keys, I think not. ("Douglas A. Gwyn")
  Re: Weight of Digital Signatures (Shawn Willden)
  Re: On a notation issue of Feistel ciphers (Mok-Kong Shen)
  Re: Key agreement in GSM phones (David A. Wagner)
  Re: RPK (David A. Wagner)
  SSL CipherSuites in Netscape & Internet Explorer (Peter Kwangjun Suk)
  Re: Encryption on missing hard-drives ("Douglas A. Gwyn")
  Re: How Uncertain? ("Douglas A. Gwyn")
  Re: Surrendering Keys, I think not. (Simon Johnson)

----------------------------------------------------------------------------

From: James Felling <[EMAIL PROTECTED]>
Subject: Re: Compression & Encryption in FISHYLAND
Date: Mon, 26 Jun 2000 14:40:59 -0500



"SCOTT19U.ZIP_GUY" wrote:

> [EMAIL PROTECTED] (James Felling) wrote in
> <[EMAIL PROTECTED]>:
>
> >> <snip>
> >
> >> .
> >>
> >> I don't follow ya.  I never intended to suggest that compression
> >> will make it weaker, I just wanted to point out an interesting
> >> observation about how compression can make a cipher stronger
> >> (which it can't).
> >
> >It CAN make a cipher stronger, but it is not guaranteed to.  There are
> >advantages regarding unicity distance, and also regarding the amount of
> >ciphertext available for analysis, but there are always cases where it
> >will actually make your situation worse (such as encrypting highly
> >random data -- it will probably worsen the statistical character of your
> >data while actually lengthening the plaintext).
> >
> >I am of the opinion that compression before encryption is almost always
> >a good idea,  with special emphasis on the almost.
> >
>
>   I may be wrong but it seems your position is starting to crack.

No.  I will state that 99.99999% of the time compression is a good thing.
However, every so often there will be situations in which it is less than
optimal, and such circumstances can be detected with a reasonable degree of
accuracy, so it is not really an issue.  I would use a headerless compression
of some sort, and if such compression were 1-1 that would be a good thing, but I
would have to consider a number of characteristics of the compressor before
deciding upon ANY compression scheme for a specific application.

>
> I will go so far as to say that most compression methods in use may be of
> help to the attacker. The main reason is that they give the attacker
> a black-and-white answer as to whether or not the guessed key leads
> to a possible solution.

How is that? They may allow one to discard possibles more easily (i.e.
decompression fails), but one still needs to make the attempt to decompress.
A much more important concern is how well the compressor fills the space of
possible plaintexts.
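A minimal sketch (not from the posts; zlib stands in for any compressor with internal structure such as headers and checksums) of the "decompression fails" filter being discussed: most wrong trial decryptions are rejected by the decompressor itself, with no further analysis of the contents.

```python
# Sketch: how a non-bijective compressor lets an attacker cheaply
# reject wrong trial keys. zlib is only a stand-in for any compressor
# whose output has internal structure (headers, checksums).
import os
import zlib

def looks_like_valid_decryption(candidate_plaintext: bytes) -> bool:
    """True if the candidate is something the compressor will accept;
    almost all wrong-key decryptions fail immediately."""
    try:
        zlib.decompress(candidate_plaintext)
        return True
    except zlib.error:
        return False

# A genuinely compressed message passes the filter...
message = zlib.compress(b"attack at dawn")
assert looks_like_valid_decryption(message)

# ...while nearly every random "decryption" is rejected outright.
rejected = sum(
    1 for _ in range(1000)
    if not looks_like_valid_decryption(os.urandom(len(message)))
)
print(f"{rejected}/1000 random candidates rejected")
```

A bijective (1-1) compressor, by construction, gives the attacker no such filter: every byte string is a valid compressed message.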

I concede that the theoretical optimum for compressors is a bijective
compressor, but I would still rather have a highly efficient compressor
that is non-bijective than a poor bijective one.  I feel that your
compressor is an OK compressor, but before setting up any actual application
of that technology, I would have to test it against others on how it performs
with the likely traffic.

> Due to the fact most random files will not
> decompress and then compress back to the stated random file. By using
> most compression schemes you give the attacker, who may know nothing
> about the message being sent, a way to eliminate most files immediately
> without having to check if the guessed file is a valid file for some
> word, video, or audio application. If one chooses a bijective (1-1)
> compressor this problem does not exist. In the compression methods
> that have been modified to make them bijective, namely the Huffman, RLE,
> and arithmetic ones, they also compress better after the methods were
> modified to make them bijective. There is no reason to believe this effect
> would not occur in other compression methods. It is just that people don't
> give it much thought yet. The NSA and phony crypto gods have let people
> put these improvements on the back burner for obvious reasons.
>
> David A. Scott
> --
> SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
>         http://www.jim.com/jamesd/Kong/scott19u.zip
> Scott famous encryption website NOT FOR WIMPS **no JavaScript allowed**
>         http://members.xoom.com/ecil/index.htm
> Scott rejected paper for the ACM
>         http://members.xoom.com/ecil/dspaper.htm
> Scott famous Compression Page WIMPS allowed ** JavaScript OK**
>         http://members.xoom.com/ecil/compress.htm
> **NOTE EMAIL address is for SPAMERS***
> I leave you with this final thought from President Bill Clinton:
>    "The road to tyranny, we must never forget, begins with the destruction
> of the truth."


------------------------------

From: "Joseph Smith" <[EMAIL PROTECTED]>
Subject: Re: breaking encryption - help!
Date: Mon, 26 Jun 2000 19:39:01 GMT

Steve,
    The data you provided rules out various simple schemes
like C = P +/-/xor K, C = K - P, C = (P xor K1) +/- K2,
for P being a plaintext character, C being ciphertext, and
K being one or more key bytes.
Can you try out the following values?

ddd
eee
fff

AAA
BBB
CCC
DDD
EEE
FFF

aba
aca
ada
aea
afa
aBa
aCa
aDa

abb
acc
add
aee
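A sketch (names and toy data hypothetical, not Steve's actual cipher) of how the simple per-byte schemes listed above can be ruled out mechanically: for each candidate scheme, collect the set of single key bytes consistent with every known (plaintext, ciphertext) pair; an empty set rules the scheme out.

```python
# Sketch: mechanically testing whether a simple per-byte scheme such as
# C = P xor K or C = P + K explains known (plaintext, ciphertext) byte
# pairs with one repeating key byte K.
def consistent_key_bytes(pairs, scheme):
    """Key bytes K for which scheme(P, K) == C holds for every pair;
    an empty set rules the scheme out."""
    return {
        k for k in range(256)
        if all(scheme(p, k) == c for p, c in pairs)
    }

schemes = {
    "C = P xor K": lambda p, k: p ^ k,
    "C = P + K":   lambda p, k: (p + k) % 256,
    "C = K - P":   lambda p, k: (k - p) % 256,
}

# Toy pairs generated with C = P xor 0x5A; only the xor scheme survives.
pairs = [(p, p ^ 0x5A) for p in b"abc"]
for name, fn in schemes.items():
    print(name, consistent_key_bytes(pairs, fn))
```

The same loop extends to multi-byte keys or position-dependent schemes by enlarging the key search space.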





------------------------------

From: Future Beacon <[EMAIL PROTECTED]>
Subject: Re: How Uncertain?
Date: Mon, 26 Jun 2000 16:12:09 -0400




On Mon, 26 Jun 2000, Mok-Kong Shen wrote:

> I can't answer your question. But if you want to utilize the entropy in
> the stuffs of news groups, I recommend feeding these to a good cipher.
> See my  article in this group entitled 'On utilizing entropy in natural
> language texts' of 19th June.
> 
> M. K. Shen


M. K.,

Thank you for responding to my question.  I have read your post of
the 19th and I find it informative and encouraging.

I am writing also to inquire about this new (to me) use of the word
"entropy."  Is it just a metaphor for uncertainty, or does it have
a precise quantitative definition in cryptography that is different
from older statistical terms?

Does anybody have a formula?

Thank you for your help.


Jim Trek
Future Beacon Technology
http://eznet.net/~progress
[EMAIL PROTECTED]


------------------------------

From: [EMAIL PROTECTED] (Paul Rubin)
Subject: Re: Key agreement in GSM phones
Date: 26 Jun 2000 20:20:10 GMT

In article <8j7rqh$92u$[EMAIL PROTECTED]>,
David A. Wagner <[EMAIL PROTECTED]> wrote:
>In article <[EMAIL PROTECTED]>, Gerard Tel  <[EMAIL PROTECTED]> wrote:
>>  1. What protocol is used by the two parties to agree on the
>>     64 bit keys used?
>
>The base station and handset share a common key, called Ki.  The base
>station sends a random challenge to the handset.  Both ends compute a
>response SRES and a session key Kc from Ki and the challenge using a
>symmetric `hashing' algorithm known as A3/A8.  The handset sends SRES
>back to the base station to authenticate itself.  Finally, both sides
>use Kc as their A5 key for that call.

Is Ki a single key shared by the base station and ALL the handsets?!
Followup questions left to your imagination.
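A sketch of the challenge-response flow Wagner describes. The real A3/A8 algorithms are operator-specific (and not public in this form); HMAC-SHA256 below is only a stand-in to show the data flow, not the actual GSM computation.

```python
# Sketch of the GSM-style flow: both ends derive SRES (auth response)
# and Kc (session key) from the shared secret Ki and a random
# challenge. HMAC-SHA256 is a stand-in for the real A3/A8.
import hashlib
import hmac
import os

def a3_a8(ki: bytes, rand: bytes):
    """Stand-in for A3/A8: derive (SRES, Kc) from Ki and the challenge."""
    digest = hmac.new(ki, rand, hashlib.sha256).digest()
    return digest[:4], digest[4:12]   # SRES (32 bits), Kc (64 bits)

ki = os.urandom(16)       # per-handset secret, provisioned in the SIM
rand = os.urandom(16)     # network's random challenge

sres_handset, kc_handset = a3_a8(ki, rand)   # computed in the SIM
sres_network, kc_network = a3_a8(ki, rand)   # computed by the infrastructure

assert sres_handset == sres_network   # handset authenticates itself
assert kc_handset == kc_network       # both sides now share the A5 key Kc
```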

------------------------------

From: [EMAIL PROTECTED] (S. T. L.)
Subject: Re: Surrendering Keys, I think not.
Date: 26 Jun 2000 20:22:46 GMT

<<So, take a dozen (or 12**12 even) plaintexts; XOR (or equivalent) them
together; XOR with a one time pad. Then there are 12 "keys" which
decrypt into 12 different messages.>>

Goofy.  Take a plaintext, OTP it.  There are infinite (*) keys which decrypt
into infinite (*) understandable messages.  That's what OTPs are for.  No need
for other XORing.

* - You know what I mean.

-*---*-------
S.T.L.  My Quotes Page * http://quote.cjb.net * leads to my NEW site.
Pages up: 395 Quotes, 31 reviews of 165 science books, and a review of
the Foundation series. Newest Page: S.T.L.'s Fighter Jet Paper Airplane!

------------------------------

From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: Algo's with no easy attacks?
Date: Mon, 26 Jun 2000 13:27:00 -0700

The answer is sort of: yes, and no. Given a plaintext you can always
determine more about the key than with no knowledge of the ciphertext, but
with strong algorithms this advantage is typically reduced to an epsilon
(very small). The basic reasoning is that if you have no idea what the
plaintext is, then even if you brute-force the entire key space, all
candidate values will be weighted only by their occurrence; but if the
plaintext is known, the attack can stop on a correct decryption. But yes,
IDEA, Twofish, 3DES, Serpent, Rijndael, RC6, MARS, and many others are
quite good in this regard (meaning that it is currently nearly impossible
to conceive of a successful attack once the ciphertext is determined).
                    Joe
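The "stop on a correct decryption" point above can be sketched with a toy cipher (a 16-bit repeating-XOR key, purely illustrative; the real ciphers named have 112-bit and larger keys): with a known-plaintext crib, the brute-force search halts the moment a trial key reproduces the crib.

```python
# Sketch: known plaintext lets a brute-force search stop early.
# Toy cipher: XOR with a 2-byte repeating key (illustrative only).
import itertools

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % 2] for i, b in enumerate(plaintext))

secret_key = bytes([0xBE, 0xEF])
known_pt = b"known header"
ct = toy_encrypt(known_pt, secret_key)

def brute_force(ciphertext: bytes, crib: bytes) -> bytes:
    for k0, k1 in itertools.product(range(256), repeat=2):
        key = bytes([k0, k1])
        if toy_encrypt(ciphertext, key) == crib:  # XOR is its own inverse
            return key                            # stop: crib matched
    raise ValueError("key not found")

assert brute_force(ct, known_pt) == secret_key
```

Without the crib, every one of the 65536 trial decryptions would have to be weighed for plausibility instead.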

"matt" <[EMAIL PROTECTED]> wrote in message
news:XR_45.21763$[EMAIL PROTECTED]...
> Hi all.
>
> I've been lurking on this NG for a while, and often mention is made of
> various attacks on algorithms such as known plaintext, repeated
> messages etc.
>
> I don't have much experience in matters such as this, so are there
> any/many algorithms which are freely available, which don't suffer
> from any known attacks such as this. Basically, i want something which
> I don't have to worry about that the plaintext may be known, or a
> repeated pattern of messages is sent, or other problems such as that.
> Are IDEA, Twofish, 3DES etc OK in this regard, or are there problems
> with these?
>
> Thanks,
> Matt Johnston.
>
>
>
>



------------------------------

From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: Compression and known plaintext in brute force analysis (restatements 
caused by the missing info .... thread)
Date: Mon, 26 Jun 2000 13:32:08 -0700


"James Felling" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
>
>
> Joseph Ashwood wrote:
>
> > Compression is certainly not a cure-all for cryptographic protection,
> > especially against a known-plaintext attack. For this I am assuming that
> > the compression algorithm uses a fixed table known to the attacker.
> > According to results that were posted here, certainty in a DES-sized
> > cipher should take 2 blocks of attempts with ASCII text. This is due to
> > the amount of entropy and the unicity distance being at 2 blocks.
> >
> > But with compression a slightly different result can occur: we can end
> > up with a deflation down to significantly smaller sizes. A quick test
> > with winzip on a large text file gave a size reduction of 65%; more
> > usefully for this, that means the compression yielded roughly one byte
> > per 3 bytes of plaintext. Moving outward this leads to 3 plaintext
> > blocks per ciphertext block, so with a (semi-)known plaintext, under
> > compression, it is entirely possible that the unicity distance could
> > actually be reduced.
> >                         Joe
>
>
> Ok, I think I see what you are claiming: that given n encrypted blocks
> and a plaintext of known character, with compression you may actually
> need fewer ciphertext blocks than without compression.  I believe this
> to be a possible condition to occur.  OTOH I believe you are missing
> something here.
>
> Given we have a plaintext P composed of blocks p(i), and after
> compression we have a compressed plaintext C composed of blocks c(i):
> it may be possible to recognise the code with fewer c(i)'s, however it
> would still require more p(i) blocks.

But it will still only require the work of decrypting the c(i)'s, which was
the primary interest to me.

> need 2 plaintext blocks for recognition, and in case 2 we need 3 -- we are
still
> better off.

Generally yes; my example would only be applicable in a very, very, very
small number of cases, and I think they may have to be fully crafted to
take advantage (considering the likelihood of it occurring otherwise).
                        Joe
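The "2 blocks" figure in this exchange follows Shannon's unicity distance, U = H(K)/D, where H(K) is the key entropy in bits and D the per-symbol redundancy of the plaintext. A quick check with common textbook estimates (the figures below are assumptions, not taken from the posts):

```python
# Sketch: Shannon's unicity distance U = H(K) / D.
# Figures are common textbook estimates, not from the posts above.
def unicity_distance_bytes(key_bits: float, redundancy_bits_per_byte: float) -> float:
    """Ciphertext length (bytes) beyond which, in expectation, only
    one key yields a plausible plaintext."""
    return key_bits / redundancy_bits_per_byte

# DES: 56-bit key. English ASCII carries roughly 1.5 bits of real
# information per 8-bit byte, so redundancy is about 6.5 bits/byte.
u = unicity_distance_bytes(56, 8 - 1.5)
print(f"{u:.1f} bytes")  # just over one 64-bit block; 2 blocks give margin
```

Compression works by stripping redundancy, so it shrinks D and pushes U up, which is the usual argument for compressing before encrypting; the exchange above is about the narrower case where a known plaintext changes the accounting.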




------------------------------

From: Steve Basford <[EMAIL PROTECTED]>
Subject: Re: breaking encryption - help!
Date: Mon, 26 Jun 2000 21:44:24 +0100

On Mon, 26 Jun 2000 19:39:01 GMT, "Joseph Smith" <[EMAIL PROTECTED]> wrote:

>Steve,
>    The data you provided rules out various simple schemes
>like C = P +/-/xor K, C = K - P, C = (P xor K1) +/- K2,
>for P being a plaintext character, C being ciphertext, and
>K being one or more key bytes.

I've tried a few of those out today and just played... but
a) didn't get anywhere & b) don't really have the knowledge

>Can you try out the following values?

yep, some I couldn't do, as the uppercase letters are auto-converted
to lowercase...


*** 1st char = url length, followed by 3 filler bytes
*** last char = ban index
*** uppercase chars are converted to lowercase


 03-00 00 00-3C 22 FF-01  = ddd
 03-00 00 00-3D CB 35-01  = eee
 03-00 00 00-3E BE 04-01  = fff

 03-00 00 00-39 54 B1-01  = aba
 03-00 00 00-39 55 46-01  = aca

 03-00 00 00-39 52 75-01  = ada
 03-00 00 00-39 53 E8-01  = aea

 03-00 00 00-39 53 E8-01  = aea
 03-00 00 00-39 50 7F-01  = afa

 03-00 00 00-39 54 B2-01  = abb
 03-00 00 00-39 55 44-01  = acc

 03-00 00 00-39 52 70-01  = add
 03-00 00 00-39 53 EC-01  = aee

 06-00 00 00-68 93 22 DF 49 45-01 = 012345
 06-00 00 00-6D 8B 77 04 B7 BE-01 = 543210

If you need a complete .CFG file of real url's, let me know and I'll email
it...

thanks for looking.... 
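A first differential pass one might run on the samples posted above, XORing each plaintext byte against its ciphertext byte (data copied from the post; the interpretation at the end is only what these samples support):

```python
# Sketch: XOR each plaintext byte with its ciphertext byte to expose
# the keystream. Sample data is taken from the post above.
samples = {
    b"ddd":    bytes.fromhex("3C22FF"),
    b"eee":    bytes.fromhex("3DCB35"),
    b"fff":    bytes.fromhex("3EBE04"),
    b"aba":    bytes.fromhex("3954B1"),
    b"aca":    bytes.fromhex("395546"),
    b"012345": bytes.fromhex("689322DF4945"),
    b"543210": bytes.fromhex("6D8B7704B7BE"),
}

for pt, ct in samples.items():
    stream = [p ^ c for p, c in zip(pt, ct)]
    print(pt.decode(), [f"{b:02X}" for b in stream])

# Every sample yields 0x58 at position 0, so the first URL byte is
# simply P xor 0x58. Later positions differ between samples, which
# suggests the keystream depends on earlier plaintext (autokey-like
# chaining) rather than being a fixed repeating key.
first_bytes = {pt[0] ^ ct[0] for pt, ct in samples.items()}
assert first_bytes == {0x58}
```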


------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Surrendering Keys, I think not.
Date: Mon, 26 Jun 2000 20:33:04 GMT

"Tony T. Warnock" wrote:
> So, take a dozen (or 12**12 even) plaintexts; XOR (or equivalent) them
> together; XOR with a one time pad. Then there are 12 "keys" which
> decrypt into 12 different messages.

All you really need is an alternate one-time key made by XORing
the genuine ciphertext with any fake plaintext of the same length.
The problem is, you would have to be prepared to deliver the fake
alternate key for *every* intercepted ciphertext, which doubles
your storage requirement, and if the data were meant to be
decrypted at the other end of a communication link you'd need to
get the fake keys to the other end without the eavesdroppers
being aware.
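The construction Gwyn describes in two lines of code (names illustrative): XOR the genuine ciphertext with any fake plaintext of the same length, and the result is a fake "one-time key" that decrypts that ciphertext to the fake message.

```python
# Sketch of the fake-key construction described above.
import os

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

real_pt = b"meet at the safehouse"
real_key = os.urandom(len(real_pt))        # genuine one-time key
ct = xor(real_pt, real_key)                # the intercepted ciphertext

fake_pt = b"grandma sends cookies"         # same length as real_pt
fake_key = xor(ct, fake_pt)                # the key you surrender

assert xor(ct, fake_key) == fake_pt        # decrypts to the fake message
assert xor(ct, real_key) == real_pt        # genuine key still works
```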

------------------------------

Date: Mon, 26 Jun 2000 15:02:57 -0600
From: Shawn Willden <[EMAIL PROTECTED]>
Subject: Re: Weight of Digital Signatures

Mok-Kong Shen wrote:

> John Savard wrote:
>
> > The difficulty with digital cash is if it is to be anonymous, or if it
> > is to be able to be processed off-line. Otherwise, there is no
> > problem, which is why we already have credit cards.
>
> Do you consider what is done with telephone cards an on-line or
> off-line processing?

There are different kinds of phone cards.  Some are on-line, some are
off-line.

Shawn.


------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: On a notation issue of Feistel ciphers
Date: Mon, 26 Jun 2000 23:19:06 +0200



tomstd wrote:

> With three words you have an unbalanced Feistel cipher.  They
> are not particularly useful for encryption (but good as hash
> functions).  Check this out.
>
> A = A + F(C)
> B = B + F(A)
> C = C + F(B)
>
> Looks good: because the previously modified word is going through
> the F function, the avalanche will be high... but let's look at
> decryption.
>
> C = C - F(B)
> B = B - F(A)
> A = A - F(C)
>
> Now it's all backwards: the previous word is not the input, so the
> avalanche is much less.

Your last sentence is not very clear. Denoting the ciphertext with an
apostrophe, the encryption and decryption are, in your notation, as
follows:

         Encryption                          Decryption
      A' = A + F(C)                       C = C' - F(B')
      B' = B + F(A')                      B = B' - F(A')
      C' = C + F(B')                      A = A' - F(C)

In the first set F operates once on an input block and twice on already
processed blocks, while in the second set F operates twice on input
blocks and only once on an already processed block. I suppose you mean
by this difference that there is less avalanche effect on decryption
than on encryption. However, while on encryption it is evidently
desirable to have a very good avalanche effect in order to maximize the
uncertainty of determining the plaintext bits from the ciphertext bits,
it seems less clear why one needs a very good avalanche effect in the
direction of decryption. Perhaps you could give some convincing
reasons. (I am not arguing that dividing into three parts is good,
though; see my original post.)

M. K. Shen



------------------------------

From: [EMAIL PROTECTED] (David A. Wagner)
Subject: Re: Key agreement in GSM phones
Date: 26 Jun 2000 13:33:55 -0700

In article <8j8dtq$254$[EMAIL PROTECTED]>,
Paul Rubin <[EMAIL PROTECTED]> wrote:
> Is Ki a single key shared by the base station and ALL the handsets?!

No, there's a different Ki for each handset.  When the handset first
connects to the base station, it supplies its identity, so that the GSM
infrastructure knows which Ki to use.

Sorry about the lack of clarity.

P.S. A real bug in my description: Actually, Ki usually lives not on the
base station but in the infrastructure, and the infrastructure computes
Kc and SRES from Ki and the challenge on behalf of the base station.

------------------------------

From: [EMAIL PROTECTED] (David A. Wagner)
Subject: Re: RPK
Date: 26 Jun 2000 13:38:59 -0700

In article <[EMAIL PROTECTED]>,
Doug Kuhlman  <[EMAIL PROTECTED]> wrote:
> What is RPK?  I've never heard of it....

I think it's been discussed here in the past.

If I remember correctly, it combines Diffie-Hellman over GF(2^n) with
some home-brew stream cipher over GF(2^n), re-using the same field
representations for both.  Diffie-Hellman over GF(2^n) is a fine idea;
building your own stream cipher rarely is.

Anyway, I don't know of any advantage to RPK over just using
Diffie-Hellman and your favorite cipher (Blowfish, 3DES, Rijndael, ...).

------------------------------

From: [EMAIL PROTECTED] (Peter Kwangjun Suk)
Subject: SSL CipherSuites in Netscape & Internet Explorer
Date: Mon, 26 Jun 2000 21:31:56 GMT

Folks,

I've been searching for the answer to this for a while, and can't seem
to get a straight answer.  What SSL ciphersuites are in Netscape 4.7
and Internet Explorer 5.0?  

Pointers to docs are appreciated the most.  


--
Peter Kwangjun Suk
Cincom Systems, Inc.
[EMAIL PROTECTED]  http://ostudio.swiki.net 
(Opinions expressed may not be that of my employer.)

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Encryption on missing hard-drives
Date: Mon, 26 Jun 2000 20:40:33 GMT

Mike Andrews wrote:
> Absolutely true. It's a page-turner. I'm surprised the military
> didn't suppress it and turn it into a page-burner.

That's not the business we're in.

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: How Uncertain?
Date: Mon, 26 Jun 2000 20:46:33 GMT

Future Beacon wrote:
> I am writing also to inquire about this new (to me) use of the word
> "entropy."  Is it just a metaphor for uncertainty, or does it have
> a precise quantitative definition in cryptography that is different
> from older statistical terms?

He's using the term in Shannon's sense, more or less.
There is a certain asymptotic "entropy per character"
in newsgroup traffic, reflecting the predictability
of the next character from what has gone before.  It
appears that there are at least 6 bits of entropy per
8-bit octet of uncompressed newsgroup plaintext,
maybe closer to 7.
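Entropy in this sense does have a precise formula (Shannon's): H = -sum p_i log2 p_i. A minimal order-0 estimator is sketched below; note it ignores context between characters, so it over-estimates the entropy of text compared with the conditional, "given what has gone before" figure Gwyn refers to.

```python
# Sketch: order-0 Shannon entropy in bits per byte,
# H = -sum p_i * log2(p_i) over the observed byte frequencies.
import math
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(entropy_bits_per_byte(b"aaaa"))            # fully predictable
print(entropy_bits_per_byte(bytes(range(256))))  # uniform bytes
```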

------------------------------

Subject: Re: Surrendering Keys, I think not.
From: Simon Johnson <[EMAIL PROTECTED]>
Date: Mon, 26 Jun 2000 14:50:02 -0700

The problem with all these suggestions is that all these OTP or
equivalent keys take up space equal to the message.

What I'm suggesting is more subtle:

I'm saying that since the output of a cipher is indistinguishable
from random data, it could be used to masquerade as an OTP.

    You'd take another document and XOR it with the ciphertext; this
would give you a dummy key. You then surrender this key instead
of the proper key used with the algorithm.

I think this is a good idea, because the legitimate owner of the
encrypted file only has to enter their encryption key to decrypt
the file.  It takes nearly no extra effort to make the false key,
and doesn't affect normal use of the file at all.

There is no way they could prove that it isn't an OTP, unless
they brute-forced the underlying algorithm.



------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************

Reply via email to