Cryptography-Digest Digest #89, Volume #11       Thu, 10 Feb 00 13:13:01 EST

Contents:
  Re: Weak Blowfish implementations? (Tom St Denis)
  Re: Compression cannot prevent plaintext recognition (was Re: is  (James Felling)
  UK publishes 'impossible' decryption law  ("NoSpam")
  Re: Message to SCOTT19U.ZIP_GUY (James Felling)
  Re: I'm returning the Dr Dobbs CDROM (Jack Diamond)
  Re: I'm returning the Dr Dobbs CDROM (wtshaw)
  Re: A query method for communications ... (wtshaw)
  Re: New standart for encryption software. (wtshaw)
  Re: Encryption protocol questions (Mike Rosing)

----------------------------------------------------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: Weak Blowfish implementations?
Date: Thu, 10 Feb 2000 16:53:23 GMT

In article <87udr5$7qk$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] wrote:
> I have tried out some Blowfish implementations and found that regular
> plaintext patterns, like entries with line delimiters pasted repeatedly,
> show up as regular patterns in the cyphertext. The regularities are
> visible to the naked eye. As they do not appear when there are no obvious
> regularities in the plaintext, I assume that they cannot be a product of
> any formatting or block encoding after encryption.
>
> 1) I've always thought that a good cryptographic algorithm must produce
> output that at least *looks* random. Is this correct?
>
> 2) Can regular patterns occur in correctly Blowfish-encrypted cyphertext?
> Or is it reasonable to assume that the implementations I've tried are
> buggy?
>
> 3) Is there anybody who is willing to take a look at these
> implementations?
>
> Best regards,
>
> John Stone

Blowfish itself is very safe; the particular system you describe, however,
does not seem to be.  It's important to realize the distinction between a
secure algorithm and a secure system.

To answer a few of the questions:  Blowfish is a symmetric block
cipher.  It will always produce the same output given the same input
and user key.  The most common way to defeat this shortcoming is
to use a chaining mode, in which identical plaintext blocks are not
encrypted the same way.

So if I encrypt 'aaaabbbbaaaabbbb' with Blowfish in ECB mode, I get
two identical 8-byte ciphertext blocks.  In CBC mode I would not.
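
To make the difference concrete, here is a minimal sketch in Python,
assuming the PyCryptodome library (not anything from the implementations
being discussed); the key and IV are throwaway values:

from Crypto.Cipher import Blowfish

key = b'an example key!!'    # any 4-56 byte Blowfish key
pt  = b'aaaabbbbaaaabbbb'    # two identical 8-byte blocks

# ECB: identical plaintext blocks yield identical ciphertext blocks.
ecb = Blowfish.new(key, Blowfish.MODE_ECB).encrypt(pt)
print(ecb[:8] == ecb[8:])    # True -- the pattern leaks

# CBC: each block is XORed with the previous ciphertext block first.
cbc = Blowfish.new(key, Blowfish.MODE_CBC, b'\x00' * 8).encrypt(pt)
print(cbc[:8] == cbc[8:])    # False -- the pattern is hidden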

I would ask the author of that implementation whether he or she knows
what a chaining mode is.  If not, I would not readily trust the software.

Tom



------------------------------

From: James Felling <[EMAIL PROTECTED]>
Subject: Re: Compression cannot prevent plaintext recognition (was Re: is 
Date: Thu, 10 Feb 2000 10:58:45 -0600



Tim Tyler wrote:

> Anton Stiglic <[EMAIL PROTECTED]> wrote:
>
> : Wooo, this is deviating again.  Let me make this as simple as possible:
>
> : E_k: encryption algorithm using a key k,
> : D_k: decryption algorithm using a key k,
> : Z: compression function,
> : U: uncompression function
> : m: a plaintext message
>
> : So, encrypting m, would be done as follows:
> :  -first compress m,
> :   -then encrypt the result
> : That gives E_e(Z(m)), where e is the encryption key
>
> : Now, if you are testing out a decryption key d, you compute
> : y <- D_d(E_e(Z(m)))
> : Now, if you have the correct key, y = Z(m), so you simply
> : unzip y and call the result x (x <- U(y)); if you have the right
> : decryption key, x = m.  So an attacker will simply look
> : for the headers in x.
>
> What if all x (such that x = U(f) for some f) have the headers the
> attacker is looking for?
>
> I.e. what if the information about message content available to the
> analyst is the same as the information about message content which was
> available to the author of the compressor - and the latter designed a
> scheme to correctly exploit it?
>
> : Where does anyone see this as complicating an attacker's
> : job????
>
> Clearly it /can/ complicate the attacker's job - in the case where all
> possible decrypted, decompressed messages contain the "headers" he
> would normally look for.
>
> : Seriously, you can talk about using modes of operations,
> : or come up with a different ciphertext, but compression doesn't
> : help to prevent an attacker from finding out if he has the
> : correct decryption key.
>
> Seriously, it clearly can.
>
> It can reduce the attacker who only knows statistical characteristics of
> the plaintext to virtual helplessness.
>
> Obviously if the attacker has definite known plaintext for a particular
> message - and (importantly) this information was not available to the
> designer of the compressor - he can still mechanically reject keys for
> that message, by checking to see if the decompressed result matches the
> plaintext.  Compression can make no difference to /that/.
> --

Just a couple of quibbles here.

1) If the compressor has some noticeable traits when used on, say, English
text (most leave artifacts of the compression process), we may still be
able to set up a "candidate recogniser" that looks for those traits; in the
worst case it forces decompression of candidate material.  This makes the
attacker spend more effort per candidate, but the work may be pipelinable,
in which case the attack speed is not necessarily reduced greatly.  (A
minimal sketch of such a recogniser follows quibble 2.)

2) The greatest benefit, as far as attack resistance goes, is that less
cyphertext is generated.  This reduces the material available to the
analyst, and thereby may prevent, or make more difficult, some attacks on
the encoding mechanism.
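
On the first quibble, a minimal sketch of such a recogniser in Python,
assuming (purely for illustration) that the plaintext was gzip-compressed
before encryption, so the two-byte gzip magic number is the trait being
exploited:

import gzip
import zlib

GZIP_MAGIC = b'\x1f\x8b'

def plausible_decryption(candidate):
    # Cheap filter first: a correct key should yield a gzip stream.
    if not candidate.startswith(GZIP_MAGIC):
        return False
    try:
        gzip.decompress(candidate)   # the forced, slower decompression step
        return True
    except (OSError, EOFError, zlib.error):
        return False

Candidates failing the two-byte check are rejected almost for free; only
the survivors pay for the full decompression, which is what makes the
search pipelinable.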

>
> __________
>  |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]
>
> I will never lie to you.


------------------------------

From: "NoSpam" <[EMAIL PROTECTED]>
Crossposted-To: talk.politics.crypto
Subject: UK publishes 'impossible' decryption law 
Date: Thu, 10 Feb 2000 17:13:54 -0000

See also http://news.bbc.co.uk/hi/english/sci/tech/newsid_638000/638041.stm

"UK publishes 'impossible' decryption law"

FLASH - FOR IMMEDIATE USE

FOUNDATION FOR INFORMATION POLICY RESEARCH (www.fipr.org)

=========================================================

News Release Thurs 10th Feb 2000

=========================================================

Today Britain became the only country in the world to publish a law which
could imprison users of encryption technology for forgetting or losing
their keys. The Home Office's "REGULATION OF INVESTIGATORY POWERS" (RIP)
bill has been introduced in Parliament: it regulates the use of
informers, requires Internet Service Providers to maintain "reasonable
interception capabilities", and contains powers to compel decryption
under complex interlocking schemes of authorisation.

Caspar Bowden, director of Internet policy think-tank FIPR, said: "This
law could make a criminal out of anyone who uses encryption to protect
their privacy on the Internet."

"The DTI jettisoned decryption powers from its e-Communications Bill last
year because it did not believe that a law which presumes someone guilty
unless they can prove themselves innocent was compatible with the Human
Rights Act. The corpse of a law laid to rest by Stephen Byers has been
stitched up and jolted back into life by Jack Straw."



Decryption Powers: Comparison with Part III of Draft E-Comms Bill (July 99)
===========================================================================

The Home Office have made limited changes that amount to window-dressing,
but the essential human rights issue remains:

(Clause 46): authorities must have "reasonable grounds to believe" the key
is in the possession of a person (previously it had to "appear" to the
authorities that the person had a key). This replaces a subjective test
with one requiring objective evidence, but leaves unaffected the
presumption of guilt if reasonable grounds exist.

(Clause 49): to prove non-compliance with a notice to decrypt, the
prosecution must prove the person "has or has had" possession of the key.
This satisfies the objection concerning a person who may never have had
possession of the key ("encrypted e-mail out of the blue"), but leaves
unchanged the essential reverse burden of proof for someone who has
forgotten or irreplaceably lost a key. It is logically impossible for the
defence to show this reliably.



HUMAN RIGHTS CHALLENGE "INEVITABLE"
===================================

As part of the consultation on the draft proposals last year, FIPR and
JUSTICE jointly obtained a Legal Opinion from leading human rights
experts (http://www.fipr.org/ecomm99/pr.html) which found that requiring
the defence to prove that they do not possess a key was a likely breach
of the European Convention on Human Rights.

Mr Bowden commented: "Following the recent liberalisation of US export
laws, as tens of thousands of ordinary computer users start to use
encryption, a test case looks inevitable after the Human Rights Act comes
into force in October."



R.I.P. RESURRECTS KEY ESCROW BY INTIMIDATION?
=============================================

Bowden said: "After trying and failing to push through mandatory
key-escrow, then voluntary key-escrow, it now looks like the government
is resorting to key-escrow through intimidation."

------------------------------

From: James Felling <[EMAIL PROTECTED]>
Subject: Re: Message to SCOTT19U.ZIP_GUY
Date: Thu, 10 Feb 2000 11:13:53 -0600


Wouldn't a better scheme be:

pass 1: compress with compression X
pass 2: encrypt with cypher A using key 1, in a chained mode starting at
the "middle" of the file (treating the file as a ring buffer)
pass 3: encrypt with cypher B using key 2, in a chained mode starting at
the beginning of the file
pass 4: encrypt with cypher C using key 3, in a chained mode starting at
the "middle" of the file (treating the file as a ring buffer)
(pass 5: compress with compression Y)

where cyphers A, B, and C are different cyphers, and X and Y are good
compressors (X is good for compressing whatever is expected as input,
and Y, if used, is a good general-purpose compressor).  "Middle" is
defined in the spec.

Reasoning:

Compressing first reduces the amount of input to cypher A -- a shorter
input means less data to encode, and less cyphertext generated in the end.

Chaining diffuses the data through the file in a fairly efficient manner,
and in theory every block has a chance to affect all the others.

The final compression is in my opinion unnecessary, but may be done to
save bandwidth.  (A sketch of the first four passes appears below.)
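
A sketch of the first four passes in Python, assuming the PyCryptodome
library; it substitutes a single cipher (AES, with three 16-byte keys)
for the three different cyphers A, B, and C, zlib for compression X, and
a fixed IV, purely to keep the illustration short:

import zlib
from Crypto.Cipher import AES
from Crypto.Util.Padding import pad

def rotate(buf, n):
    # Treat the buffer as a ring: move the first n bytes to the end.
    return buf[n:] + buf[:n]

def layered_encrypt(plaintext, key1, key2, key3):
    data = pad(zlib.compress(plaintext), 16)        # pass 1, padded once
    for key, from_middle in ((key1, True), (key2, False), (key3, True)):
        if from_middle:                             # chain from the "middle"
            data = rotate(data, len(data) // 2)
        data = AES.new(key, AES.MODE_CBC, b'\x00' * 16).encrypt(data)
    return data

Decryption reverses the passes: decrypt, rotate back (by len(buf) - n),
and finally decompress.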



Tim Tyler wrote:

> Douglas A. Gwyn <[EMAIL PROTECTED]> wrote:
> : [EMAIL PROTECTED] wrote:
> :> Pass one encrypt with an "AES" cipher
> :> Pass two  use Compression A
> :> Pass three encrypt with a different key or different cipher
> :> Pass four use Compression B
> :> Pass five encrypt with a different Key
>
> : Those "compression" stages are not likely to compress,
> : so they simply amount to keyless transformations.
> : One might as well substitute real encryptions in their place.
>
> To my ears, the description quoted at the top sounds like an extremely
> garbled version of DS's recommendation of a method to get diffusion of
> plaintext information through the entire message by applying adaptive
> compression programs "in both directions" through the file - in the
> absence of any better whole-message diffusion scheme.
>
> Such "compression in both directions" is mainly designed to produce
> diffusion of the plaintext information through the file.  Of course,
> the first pass would also compress - assuming it is applied to
> unencyphered text.
>
> *If* you were to try to replace such a process with a type of encryption
> - *and* you're trying to get the same effect - the encryption should be
> one that diffuses plaintext information through the file - and
> consequently prevents message fragments from being broken independently of
> the rest of the message.
>
> In other words, you *can't* use (many types of) stream cypher, or an
> ECB/CBC/CFB/OFB mode block cypher - since these constructs don't
> inhibit analysis based on message fragments to the extent that
> simple unbounded (and unkeyed) diffusion would.
> --
> __________
>  |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]
>
> Laugh and the whole world thinks you're an idiot.


------------------------------

From: Jack Diamond <[EMAIL PROTECTED]>
Subject: Re: I'm returning the Dr Dobbs CDROM
Date: Thu, 10 Feb 2000 17:24:54 GMT

You are giving up too quickly.
Consider that if you are able to view the book, you are able to print it.
Simply do a screen capture and save it in a compressed format.
You could, for example, run OCR software on the screen images
and save the output as a simple text file.
Then print from the saved file.
It is just a matter of digital recoding.

Jack

Victor Zandy wrote:
> 
>     A couple weeks ago I asked for opinions of the Dr Dobbs CDROM
> collection of cryptography books.  Overwhelmingly the response was
> positive, so I bought it.  (Thanks again to those of you who replied.)
> 
>     I am returning the CDROM because it is not suitable for printing.
> For example, to print chapter 1 of the Stinson book (44 pages) Adobe
> acroread (x86/Solaris 2.6) creates a 500MB postscript file.  I cannot
> print this file directly, probably because it is too big.  Although I
> might be able to find a way to print the file, at 500MB it would take
> too much time.
> 
>     I don't know how the PDF files on the CDROM were prepared, but
> they look like they were scanned from physical book pages.  For recent
> titles, like Stinson, they should have been generated from the
> computer originals to make a smaller file, with better image quality.
> 
>     Several people who responded to me said they appreciate being able
> to search and cut-and-paste the text on the CDROM.  For those
> features, and for anyone who doesn't mind reading books from computer
> displays, the CDROM is a great deal.  But it is useless for printing
> paper copies of its contents, even a tiny amount.
> 
> Vic Zandy
> 
>

------------------------------

From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: I'm returning the Dr Dobbs CDROM
Date: Thu, 10 Feb 2000 10:50:29 -0600

In article <[EMAIL PROTECTED]>, "Douglas A. Gwyn"
<[EMAIL PROTECTED]> wrote:

> wtshaw wrote:
> > Even html is better than muddy images.
> 
> What does HTML have to do with it?

The complaint is that the formatting of vanilla text files is inadequate
compared to muddy images.  Simple HTML as an option can make some files
somewhat easier to read.

It's easy to have a viewer that does crude font sizing and the like
without requiring a bloated file.  The key question is what kind of
viewing is expected: plain text is usually sufficient, and is the first
choice if it gets the job done, but a little more can help sometimes.
Not knowing where to stop on basic formatting can lead to bloated code,
with diminishing returns when readability is weighed against file length.

If we are generally reading as text, however, HTML just makes things worse.
-- 
If Al Gore wants to be inventor of the internet, complain that he 
did a lousy job.  If he admits to not doing much, complain that he 
is a slacker. Now, do we want him in charge of security?

------------------------------

From: [EMAIL PROTECTED] (wtshaw)
Crossposted-To: alt.politics.org.cia,alt.security,alt.2600
Subject: Re: A query method for communications ...
Date: Thu, 10 Feb 2000 10:58:17 -0600

In article <[EMAIL PROTECTED]>, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:

> I think the trick (if it can be called that) is that one has
> a web page of innocent content but of sufficiently high interest
> to many, so that plenty of people would normally be accessing
> it. This way, it is difficult to trace the real intended
> recipients of the secret intelligence information. Well, this is
> actually nothing but a sort of steganography, isn't it?
> 
Creative use of such a protocol would surely work, provided traffic
analysis does not get you.  The best point of distribution would be a
default internet home page with a few, constantly changing graphics.  If
thousands of people used it as a starting point, the intended recipient
would not stand out.

What goes on with newsgroups dedicated to anonymous posts is that the
poster will check to see that his message made it, and the recipient
will too, but not many people are going to choose to look at garbage
for the fun of it.
-- 
If Al Gore wants to be inventor of the internet, complain that he 
did a lousy job.  If he admits to not doing much, complain that he 
is a slacker. Now, do we want him in charge of security?

------------------------------

From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: New standart for encryption software.
Date: Thu, 10 Feb 2000 11:06:35 -0600

In article <[EMAIL PROTECTED]>, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:

> I believe that in software engineering it is not sufficient to
> stress that one should have reviews; one should also demand
> code that is, by design, easy to review in the first place. ...

Good code is modular and relocatable.  For instance, functions that deal
with key generation should produce the same results whether they are
incorporated into the final product or into a test application.  With
simplified code to study, it is more difficult to hide bad karma.  A
minimal sketch of the idea follows.
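
As an illustration only (hypothetical Python, not anyone's product), a
key-derivation routine written as a pure function behaves identically in
the final product and in a test harness:

import hashlib

def derive_key(passphrase, salt):
    # Pure function: no globals, no hidden state, so a test
    # application and the final product get identical results.
    return hashlib.pbkdf2_hmac('sha256', passphrase, salt, 100000)

# The same call doubles as a regression test:
assert derive_key(b'secret', b'salt') == derive_key(b'secret', b'salt')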
-- 
If Al Gore wants to be inventor of the internet, complain that he 
did a lousy job.  If he admits to not doing much, complain that he 
is a slacker. Now, do we want him in charge of security?

------------------------------

From: Mike Rosing <[EMAIL PROTECTED]>
Subject: Re: Encryption protocol questions
Date: Thu, 10 Feb 2000 12:01:27 -0600

[EMAIL PROTECTED] wrote:
> Say I receive an encrypted message over an insecure channel.  I
> decrypt it, perform error correction on it, and then re-encrypt it
> with the same key and same algorithm.  I then transmit it to another
> place via another insecure channel.  As usual, all algorithms are
> public; only the keys are secret.
>
> Is there a "rule" somewhere that says it's not a good idea to do this?
> What if I re-encrypt using a different key?  (NOT encrypt once with
> key 1 and then again with key 2.)  Do the answers here depend on the
> strength of the encryption algorithm?
> 
> My sense is that in the first case it is not a good idea because the
> corrected bit errors could reveal almost directly the underlying data
> and thus the cryptographic transformation.  This would probably be
> fatal in DES (would it?).  What about a stream cipher like RC4?

If you use something like CFB mode, and you change the first few bits
of the message, then it will look like completely different data.
If the attacker knows the plaintext and the correction algorithm,
then they have two plaintext/ciphertext pairs.  That's still not enough
to crack the next message.  If you use a different key, their problem
is much harder.
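
A quick illustration of that propagation in Python, assuming the
PyCryptodome library (the key and IV are throwaway values): flipping one
bit early in the plaintext changes every ciphertext byte from that point
on.

from Crypto.Cipher import AES

key, iv = b'0' * 16, b'1' * 16
msg = b'attack at dawn -- correct this message before resending'
fixed = bytes([msg[0] ^ 1]) + msg[1:]   # flip one bit in the first byte

ct1 = AES.new(key, AES.MODE_CFB, iv).encrypt(msg)
ct2 = AES.new(key, AES.MODE_CFB, iv).encrypt(fixed)

# Every byte from the change onward differs (with overwhelming
# probability), because each ciphertext byte feeds the cipher input
# used to encrypt the next byte.
print(sum(a != b for a, b in zip(ct1, ct2)), 'of', len(msg), 'bytes differ')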

> I am not a cryptanalyst, so I don't know if re-encrypting with a
> different key would help.  My suspicion is that it wouldn't, but I
> don't know why, except that things that look like simple solutions in
> cryptography are usually flawed.  I haven't read anything of Bruce's
> yet that says this is a bad idea (I have his book and read his
> Counterpane articles).....  Pointers to other useful books or web
> sites are appreciated.

Using a different key will definitely help.  If you change the plaintext
a bit (via error correction or on purpose), you make things harder for
the attacker as well.

What's more important is to change the key for each message.  Then it
really doesn't matter whether you use the same key for the re-transmit;
you'll never give the attacker enough material to work with.
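
One simple way to get a fresh key per message -- a hypothetical sketch,
not a vetted key-derivation scheme -- is to hash a long-term master
secret together with a message counter:

import hashlib

def message_key(master_secret, counter):
    # Derive a fresh 128-bit key for each message number.
    material = master_secret + counter.to_bytes(8, 'big')
    return hashlib.sha256(material).digest()[:16]

k5 = message_key(b'long-term shared secret', 5)   # key for message #5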

Patience, persistence, truth,
Dr. mike

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
