Cryptography-Digest Digest #117, Volume #11      Sun, 13 Feb 00 23:13:01 EST

Contents:
  Re: I'm returning the Dr Dobbs CDROM ("Trevor Jackson, III")
  Re: New encryption algorithm ABC. ("Bogdan Tomaszewski")
  Re: Basic Crypto Question 3 (Bruce Schneier)
  Re: Period of cycles in OFB mode ([EMAIL PROTECTED])
  Re: Using Gray Codes to help crack DES ([EMAIL PROTECTED])
  Re: Basic Crypto Question 3 ("Joseph Ashwood")
  Re: RFC: Reconstruction of XORd data (Tim Tyler)
  Re: Block chaining (David Hopwood)
  Re: Basic Crypto Question 3 (stanislav shalunov)
  Re: Period of cycles in OFB mode (David Wagner)
  Re: Which compression is best? ("Douglas A. Gwyn")
  Re: Using Gray Codes to help crack DES ("Trevor Jackson, III")
  Re: RFC: Reconstruction of XORd data ("Douglas A. Gwyn")

----------------------------------------------------------------------------

Date: Sun, 13 Feb 2000 18:26:08 -0500
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: I'm returning the Dr Dobbs CDROM

Victor Zandy wrote:

>     A couple weeks ago I asked for opinions of the Dr Dobbs CDROM
> collection of cryptography books.  Overwhelmingly the response was
> positive, so I bought it.  (Thanks again to those of you who replied.)
>
>     I am returning the CDROM because it is not suitable for printing.
> For example, to print chapter 1 of the Stinson book (44 pages) Adobe
> acroread (x86/Solaris 2.6) creates a 500MB postscript file.  I cannot
> print this file directly, probably because it is too big.  Although I
> might be able to find a way to print the file, at 500MB it would take
> too much time.
>
>     I don't know how the PDF files on the CDROM were prepared, but
> they look like they were scanned from physical book pages.  For recent
> titles, like Stinson, they should have been generated from the
> computer originals to make a smaller file, with better image quality.
>
>     Several people who responded to me said they appreciate being able
> to search and cut-and-paste the text on the CDROM.  For those
> features, and for anyone who doesn't mind reading books from computer
> displays, the CDROM is a great deal.  But it is useless for printing
> paper copies of its contents, even a tiny amount.

Have you considered running the PDF images through an OCR filter?  You
might shrink the file to a fraction of its size at the cost of a few
recognition errors.



------------------------------

From: "Bogdan Tomaszewski" <[EMAIL PROTECTED]>
Subject: Re: New encryption algorithm ABC.
Date: Sun, 13 Feb 2000 23:15:29 GMT

>Then take the best compression tool
> you know and compress the cyphertext file. If there's a significant
> amount of compression as a result, your algorithm might be weak.

I have tested this: 100% in -> about 100% out....

--
Bogdan Tomaszewski
  [EMAIL PROTECTED]

User <[EMAIL PROTECTED]> wrote in a message to the discussion groups:
8872ri$7ie$[EMAIL PROTECTED]
> In article <T6yp4.8832$[EMAIL PROTECTED]>,
>   "Bogdan Tomaszewski" <[EMAIL PROTECTED]> wrote:
> > I have discovered a new encryption algorithm. I call it ABC - Alternative
> > Binary Code.  It permits the use of keys of unrestricted length, with the
> > level of safety depending only on the length of the key.
> > Do you know of any programs for testing entropy or for breaking keys?
> > Where can I check whether a message, after encoding, satisfies the
> > conditions of a good algorithm?
>
> To find out if the algorithm is good, you should post it here into the
> public. I think here are many people who at least can tell you when it
> happens to be not so good. If you test the cyphertext output with some
> programs, you probably will get to know that it is very insecure if it is
> very insecure. But if your algorithm passes the test, this might just
> mean that it is quite insecure, just not insecure enough to be detected
> by simple statistical analysis.
>
> I'm not an expert, just a newbie, but here's one test that might work:
> Take a very large chunk of plain text (e.g. a book in electronic form)
> and encrypt it to raw binary data. > ...
> Greetings,
>
> John Stone
>
>
> Sent via Deja.com http://www.deja.com/
> Before you buy.
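John Stone's suggestion above relies on a compression tool, but the
simplest version of the same check can be coded directly: a chi-square
test of byte frequencies flags the "very insecure" case that simple
statistical analysis catches.  A minimal sketch (the function name and
the uniform-distribution baseline are mine, not from the post):

```c
#include <assert.h>
#include <stddef.h>

/* Chi-square statistic of byte frequencies against a uniform
 * distribution.  For genuinely random data of length n the expected
 * value is around 255 (the degrees of freedom); values wildly larger
 * than that indicate the kind of structure a compressor would also
 * find and exploit. */
double byte_chi_square(const unsigned char *buf, size_t len)
{
    size_t counts[256] = {0};
    double expected, chi2 = 0.0;
    size_t i;

    if (len == 0)
        return 0.0;
    for (i = 0; i < len; i++)
        counts[buf[i]]++;
    expected = (double)len / 256.0;
    for (i = 0; i < 256; i++) {
        double d = (double)counts[i] - expected;
        chi2 += d * d / expected;
    }
    return chi2;
}
```

As the thread notes, passing such a test proves nothing; failing it is
what is informative.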





------------------------------

From: [EMAIL PROTECTED] (Bruce Schneier)
Subject: Re: Basic Crypto Question 3
Date: Sun, 13 Feb 2000 23:10:30 GMT

First, understand that you can't mathematically prove anything more
than: a cascade of block ciphers is as strong as the weakest block
cipher in the cascade.  Realistically, though, there is every reason
to believe that a cascade is stronger than the individual ciphers.

On Sun, 13 Feb 2000 20:02:48 GMT, [EMAIL PROTECTED] wrote:

>Cascading Multiple Block Ciphers:
>
>1.  If a plain text is encrypted with three Ciphers, is it as strong as
>the strongest Cipher in the Chain or as weak as the weakest?

I believe it is much stronger than any of the three ciphers.

>2. Could there be some subtle interaction between the algorithms that may
>reduce security or reduce keyspace?

Yes, that's the reason you can't prove anything very useful here.  But
such interactions are very unlikely.  It is possible, for example,
that there are attacks at the point where the ciphers change (see
Wagner's boomerang work as an example), but if the individual ciphers
are well designed this is not going to be a problem.

>3. Is there an optimum way of combining ciphers together or
>rules...assuming that the cascade is made up of block ciphers of the
>same size and key length?  i.e. should one choose IDEA, 3DES, CAST128 or
>Blowfish,IDEA,CAST128, or Blowfish, RC5 and 3DES......what are the
>criteria???

Do any feedback modes outside the cascade.  That is, treat the cascade
of ciphers as a single cipher for the purposes of implementing a
feedback mode.  I can't think of any reasons why you might want to
order ciphers in your cascade a certain way.
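Bruce's advice (feedback outside, cascade treated as one cipher) can be
sketched structurally.  The three stage functions below are trivial
keyed placeholders standing in for IDEA, 3DES, CAST-128, etc., so this
shows only the wiring, not real cryptography:

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>

static uint64_t rotl64(uint64_t x, unsigned r)
{
    return (x << r) | (x >> (64 - r));
}

/* Toy keyed stages -- placeholders for real block ciphers. */
static uint64_t e1(uint64_t b, uint64_t k) { return rotl64(b ^ k, 13); }
static uint64_t e2(uint64_t b, uint64_t k) { return rotl64(b + k, 29); }
static uint64_t e3(uint64_t b, uint64_t k) { return rotl64(b ^ k, 47); }

/* The cascade itself: one 64-bit block in, one out. */
static uint64_t cascade(uint64_t b, const uint64_t k[3])
{
    return e3(e2(e1(b, k[0]), k[1]), k[2]);
}

/* CBC applied OUTSIDE the cascade: each plaintext block is xored with
 * the previous ciphertext block, then the whole cascade runs once. */
void cascade_cbc_encrypt(uint64_t *blocks, size_t n, uint64_t iv,
                         const uint64_t k[3])
{
    uint64_t prev = iv;
    size_t i;
    for (i = 0; i < n; i++) {
        blocks[i] = cascade(blocks[i] ^ prev, k);
        prev = blocks[i];
    }
}
```

The point of the structure is that no feedback wiring appears between
e1, e2, and e3; the mode sees the cascade as one opaque cipher.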

>4. What if the ciphers have different block size and key length, is it
>still ok to cascade them?  Blowfish, Twofish, IDEA?

I see no reason why not.  Twofish has a 128-bit block, so you would
have to use two Blowfish or IDEA encryptions in parallel after the
Twofish encryption.  Off the top of my head, it seems to make sense to
put the large block algorithms on the outside of the cascade and the
smaller ones on the inside.

>5. Is it too complex to alternately encrypt the plaintext blocks with
>the different ciphers in one Pass?????  Does that make sense?

I don't know if it makes sense mathematically.  I can't make sense of
the first sentence, so I can't figure it out.

As a final word of caution, please remember that in any practical
cryptographic implementation, errors in the software, hardware,
implementation, and use are far more likely to cause insecurity than
the block cipher is.  Devoting energy to making everything else secure
is much more useful than taking an already-secure block cipher and
making it even more secure.

Bruce
**********************************************************************
Bruce Schneier, Counterpane Internet Security, Inc.  Phone: 612-823-1098
101 E Minnehaha Parkway, Minneapolis, MN  55419      Fax: 612-823-1590
           Free crypto newsletter.  See:  http://www.counterpane.com

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Period of cycles in OFB mode
Date: 13 Feb 2000 23:33:35 GMT

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] (Terry Ritter) 
wrote:

> In contrast, I have always been worried by "counter mode."
> 
> Clearly, an arithmetic count sequence is very regular:  The lsb
> changes every time; the next higher bit half as often, and then only
> when the lsb changes to zero.  That is a lot of statistical
> bit-position information which may be amenable to frequency and phase
> detection techniques.  

There are other ways of counting that can appear more complex:

 initialise:  count = x,  where  0 <= x < 2^64
 repeat:      count += y, where  1 <= y < 2^64 and y is odd
 
Use key material for x and y.  If this proves weak, xor with a 64-bit 
LFSR, using a start point and polynomial also generated from key material. 
Use tables to implement transposition to break up the shift/add 
patterns... I'm sure there are many tricks to add here.
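A minimal sketch of the keyed counter above, assuming a 64-bit state as
in the pseudocode; forcing the low bit of y keeps the increment odd, so
the counter is invertible mod 2^64 and still has full period (the LFSR
whitening step is omitted here):

```c
#include <assert.h>
#include <stdint.h>

/* Keyed counter: start at x, step by an odd y.  Because y is odd the
 * counter visits all 2^64 states before repeating, but the bit-flip
 * pattern is no longer the plain lsb-first ripple of count++. */
typedef struct {
    uint64_t count;
    uint64_t step;
} keyed_counter;

void kc_init(keyed_counter *c, uint64_t x, uint64_t y)
{
    c->count = x;
    c->step = y | 1;   /* force the increment odd */
}

uint64_t kc_next(keyed_counter *c)
{
    uint64_t out = c->count;
    c->count += c->step;   /* wraps mod 2^64; unsigned overflow is defined */
    return out;
}
```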

The main problem is the workload involved in testing out all the 
combinations and the criteria used in the testing :-(


Keith
 http://www.cix.co.uk/~klockstone
 ------------------------
 'Unwise a grave for Arthur'
 -- The Black Book of Carmarthen

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Using Gray Codes to help crack DES
Date: 13 Feb 2000 23:33:36 GMT

In article <fFbp4.967$[EMAIL PROTECTED]>, [EMAIL PROTECTED] 
(Pentafurry Project) wrote:

> 
> > There's another odd algorithm that comes to mind.  It's used for 
> > counting the number of bits set to '1' in a word:
> >
> > count = 0;
> > while(x > 0)
> > {   x = x & (x - 1);
> >     count++;
> > }
> >
> subtracting 1 flips the lowest set bit and the bits below it; the AND
> then clears them. nice.
> Here's another way to do the same. The example is for a 32-bit word.
> 
>   x=(x&0x55555555) + ((x&0xAAAAAAAA)>>1);
>   x=(x&0x33333333) + ((x&0xCCCCCCCC)>>2);
>   x=(x&0x0F0F0F0F) + ((x&0xF0F0F0F0)>>4);
>   x=(x&0x00FF00FF) + ((x&0xFF00FF00)>>8);  /*0F*/
>   x=(x&0x0000FFFF) + ((x&0xFFFF0000)>>16);  /*001F*/
> 
> No branches. It looks bigger, and it is in bytesize, but very fast.

The use of a 256-byte table for counting bits in each byte can be quite 
fast too.
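The 256-byte table method just mentioned, sketched for a 32-bit word
(the function names are mine):

```c
#include <assert.h>
#include <stdint.h>

/* Precompute the bit count of every byte value once, then a 32-bit
 * popcount is just four table lookups and three adds.  The table is
 * filled with the recurrence bits(i) = (i & 1) + bits(i >> 1). */
static unsigned char bit_table[256];

void init_bit_table(void)
{
    int i;
    for (i = 0; i < 256; i++)
        bit_table[i] = (unsigned char)((i & 1) + bit_table[i >> 1]);
}

unsigned table_popcount32(uint32_t x)
{
    return bit_table[x & 0xFF] + bit_table[(x >> 8) & 0xFF]
         + bit_table[(x >> 16) & 0xFF] + bit_table[x >> 24];
}
```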

However, the question I was really asking was: why is it legitimate to mix 
operators from different groups?  Also how many more of these algorithms 
are there?

Keith
 http://www.cix.co.uk/~klockstone
 ------------------------
 'Unwise a grave for Arthur'
 -- The Black Book of Carmarthen

------------------------------

From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: Basic Crypto Question 3
Date: Sun, 13 Feb 2000 15:52:56 -0000

> 1.  If a plain text is encrypted with three Ciphers, is it as strong as
> the strongest Cipher in the Chain or as weak as the weakest?

Under the majority of circumstances, it is believed that it is at least
as strong as the strongest cipher.  However it may be possible to
construct a situation where a set of otherwise strong ciphers may be
compromised.

> 2. Could there be some subtle interaction between the algorithms that
> may reduce security or reduce keyspace?

It is very possible.

> 3. Is there an optimum way of combining ciphers together or
> rules...assuming that the cascade is made up of block ciphers of the
> same size and key length?  i.e. should one choose IDEA, 3DES, CAST128 or
> Blowfish, IDEA, CAST128, or Blowfish, RC5 and 3DES......what are the
> criteria???

Earlier on this NG I posted the sketch of a proof that indicates limits
on cascading: by keeping the same output range you face severe limits,
but it's not likely that you'll get into those ranges.  Outside of
careful constructions, the hard limit for an n-bit cipher is 2^n
encryptions.  I doubt you'll reach that limit.  Personally I'd say it's
probably a good rule of thumb to use the strongest algorithm last for
encryption / first for decryption.

> 4. What if the ciphers have different block size and key length, is it
> still ok to cascade them?  Blowfish, Twofish, IDEA?

I don't see any problem with strong ciphers like those, except that it
would be easiest to implement starting with small blocks on the inside
and large blocks on the outside, i.e.
plaintext->IDEA->Blowfish->Twofish->ciphertext

> 5. Is it too complex to alternately encrypt the plaintext blocks with
> the different ciphers in one Pass?????  Does that make sense?

I don't think it's feasible to do it all in one pass as I understand it,
simply because the internals of the ciphers tend to be quite complex.

I've actually considered writing an API to handle such a cascade.
                Joe



------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: RFC: Reconstruction of XORd data
Reply-To: [EMAIL PROTECTED]
Date: Sun, 13 Feb 2000 23:24:09 GMT

Rick Braddam <[EMAIL PROTECTED]> wrote:

: I'm really disappointed that my choice of programming language hindered
: consideration of the effect I was trying to describe.

It may not have done.  All I can say is that it did so for me.
I was brought up on ARM code - and am known to be an x86 hater, though.
-- 
__________
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

Access denied - nah nah na nah nah!

------------------------------

Date: Sun, 13 Feb 2000 00:29:54 +0000
From: David Hopwood <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: Block chaining

=====BEGIN PGP SIGNED MESSAGE=====

Tim Tyler wrote:
> 
> David Wagner <[EMAIL PROTECTED]> wrote:
> : In article <[EMAIL PROTECTED]>, Tim Tyler  <[EMAIL PROTECTED]> wrote:
> 
> :> Note that you can use parallel processing with some chaining modes
> :> anyway.
> 
> : Apart from counter mode, which ones let you do parallel processing
> : in the encryption direction?  Not OFB, not CBC, not CFB.  Which ones
> : did I forget?
> 
> When writing this, I was (irrelevantly) thinking about decryption ;-/
> 
> However... Plaintext Block Chaining (PBC), and Plaintext FeedBack (PFB)
> modes allow parallel processing in the encryption direction.

PBC and PFB are not at all a good idea, IMHO. There is a reason why these
modes aren't commonly used; they don't ensure that the input to the block
cipher is randomly distributed. I.e. if the plaintext blocks are not
uniformly distributed, neither will the cipher input be.

> You'll need to take care that the fact that, when using these modes,
> chosen plaintext attacks can directly probe the underlying cypher's
> innards isn't relevant, though.

The problems with PBC and PFB are not restricted to chosen-ciphertext
attacks.

Of course interleaved variants of CBC, CFB, and OFB can allow some
block parallelism, but they are not very convenient:

 - you need n IVs to get n-fold parallelism. For packet-based standards
   that don't guarantee ordering, for example, this may mean that you
   need a way to generate n independent IVs per packet, which can be
   inefficient.

 - n has to be chosen in advance (or negotiated), but the same value
   of n is not necessarily ideal for both ends of the connection.
   Regardless of whether n is fixed or negotiated, this makes the system
   more complex even for non-parallel implementations.
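For concreteness, the n-way interleaved CBC described above can be
sketched as follows: block i belongs to chain i mod n, and each chain
keeps its own running feedback value seeded from its own IV, so the
chains can be processed in parallel.  The block cipher here is a
trivial keyed placeholder, and NCHAINS plays the role of the
pre-negotiated n:

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>

#define NCHAINS 4

/* Toy keyed block "cipher" -- a placeholder, not a real algorithm. */
static uint64_t toy_encrypt(uint64_t b, uint64_t k)
{
    b ^= k;
    b = (b << 21) | (b >> 43);       /* fixed rotation */
    return b + 0x9E3779B97F4A7C15ULL; /* odd additive constant */
}

/* n-way interleaved CBC encryption: chain (i % NCHAINS) owns block i.
 * Each chain is an ordinary CBC chain over every NCHAINS-th block. */
void interleaved_cbc(uint64_t *blocks, size_t nblocks,
                     const uint64_t iv[NCHAINS], uint64_t key)
{
    uint64_t prev[NCHAINS];
    size_t i;
    for (i = 0; i < NCHAINS; i++)
        prev[i] = iv[i];
    for (i = 0; i < nblocks; i++) {
        size_t chain = i % NCHAINS;
        blocks[i] = toy_encrypt(blocks[i] ^ prev[chain], key);
        prev[chain] = blocks[i];
    }
}
```

Note the cost Hopwood points out: NCHAINS separate IVs, and the value
of NCHAINS is baked into the ciphertext layout.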

- -- 
David Hopwood <[EMAIL PROTECTED]>
PGP public key: http://www.users.zetnet.co.uk/hopwood/public.asc
RSA 2048-bit; fingerprint 71 8E A6 23 0E D3 4C E5  0F 69 8C D4 FA 66 15 01

"Attempts to control the use of encryption technology are wrong in principle,
unworkable in practice, and damaging to the long-term economic value of the
information networks."  -- UK Labour Party pre-election policy document


=====BEGIN PGP SIGNATURE=====
Version: 2.6.3i
Charset: noconv

iQEVAwUBOKX68jkCAxeYt5gVAQGFQwgAxbWuyI8m2ykm5PTw9S+qEBQXrrZoii3e
bi7uO0m9Fw+gH5UeLDLW1ijPJBuHNyMTMjZblBG1EHFXxzVgP/0h6au1myLFsI27
HGjv2Kb8tHNVAUsOv01FV6Yaro7eqGOtPU10E/E61+GirCUHVnTd96QqEqet7a/f
a8oTg8GXkEZ8NCTLu3JJsJYIrAeE7aDwc1OnDKyUvc6CYLE+Z0CXfEYtjlnNO+Zv
c0Ttg4REz/iux1+uiOzhJRqOFxKpm+E+u2iadGkqGLVVUXkD01Yk4sfjwT7a8Toe
KEhfFyeqGGDuihAJy2lsYNzhTY+PjgNR/7bAOpKaOZRMRTndzmw6XQ==
=sMRU
=====END PGP SIGNATURE=====


------------------------------

Subject: Re: Basic Crypto Question 3
From: stanislav shalunov <[EMAIL PROTECTED]>
Date: Mon, 14 Feb 2000 02:21:39 GMT

[EMAIL PROTECTED] (Bruce Schneier) writes:

[Re cascading block ciphers.]

> I can't think of any reasons why you might want to order ciphers in
> your cascade a certain way.

In AC2, sec. 15.7, pp. 367-368 you essentially recommend putting the
"more trusted" cipher inside.  (With the reason that weak cipher [or
any algorithm] inside can facilitate chosen- and known-plaintext
attacks against the outer cipher.)

It seems that now you're saying that any order is fine.  Is this based
on the assumption that the ciphers that the original poster mentions
are all considered "good," or is this based on some newer research
that appeared after AC2 was published?

As a side note, many practical implementations use compression before
block encryption.  The ones that use libz seem to violate the
above-mentioned recommendation from AC2.  Do you feel that they lose
more than they gain by using libz?  Are you aware of any compression
libraries that would be more suitable for use before a block cipher?
(Anything should probably be legally uncompressible using that
library...)

--Stanislav

------------------------------

From: [EMAIL PROTECTED] (David Wagner)
Subject: Re: Period of cycles in OFB mode
Date: 13 Feb 2000 18:45:45 -0800

In article <887f0f$a59$[EMAIL PROTECTED]>,
 <[EMAIL PROTECTED]> wrote:
> There are other ways of counting that can appear more complex:
> 
>  initialise:  count = x,  where  0 <= x < 2^64
>  repeat:      count += y, where  1 <= y < 2^64 and y is odd

Good point.

Now that you mention it, I recall that GOST uses something similar
for its variant on counter mode.  Checking the source, it appears that
it splits the 64-bit block up into two counters modulo 2^32 - 1,
and the update operation adds a different constant to each half (y, in
your notation).

This still leaves a few mysteries, though:
* Why 2^32 - 1 rather than 2^32?
  (I suppose one can see that powers of two may be bad in the sense that
   they still leave some regularity -- e.g., the lsb flips every time.)
* Why use the same modulus in both counters?
  (It seems to me that it would be better to use two co-prime moduli.)
* Are two half-size counters better than a single full-size counter?
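The counter Wagner describes can be sketched as below.  The per-half
increments y1 and y2 are parameters here, NOT GOST's actual constants,
and the modulus choice follows the description above rather than the
GOST standard itself:

```c
#include <assert.h>
#include <stdint.h>

/* Two 32-bit counters, each stepped by its own constant mod 2^32 - 1,
 * concatenated into one 64-bit counter-mode input block. */
typedef struct {
    uint32_t lo, hi;
} gost_counter;

static uint32_t add_mod_2_32_minus_1(uint32_t a, uint32_t b)
{
    uint64_t s = (uint64_t)a + b;
    if (s >= 0xFFFFFFFFULL)   /* reduce modulo 2^32 - 1 */
        s -= 0xFFFFFFFFULL;
    return (uint32_t)s;
}

/* Emit the current 64-bit value, then step both halves. */
uint64_t gost_counter_next(gost_counter *c, uint32_t y1, uint32_t y2)
{
    uint64_t out = ((uint64_t)c->hi << 32) | c->lo;
    c->lo = add_mod_2_32_minus_1(c->lo, y1);
    c->hi = add_mod_2_32_minus_1(c->hi, y2);
    return out;
}
```

Reducing mod 2^32 - 1 means the lsb no longer flips on every step,
which is one plausible answer to the first question above.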

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Which compression is best?
Date: Mon, 14 Feb 2000 04:01:24 GMT

Tim Tyler wrote:
> What "appropriate theorems" should be used to quantify these things?

Proper theorems always state their assumed conditions.  To apply a
theorem to a particular case, you have to verify that these conditions
do hold.  If you can do that, the conclusions of the theorem apply
with as much confidence as you have that the theorem was established
(proven) correctly.

> Bear in mind that some attackers have been practising their art in
> secret for an extended period, and consequently the capabilities of
> these opponents are very difficult to say much about.

You miss the point.  You don't *need* to know particular attacks
if you can *prove* security against sufficiently broad classes of
attacks.

> Quantifying security is /really/ difficult.

I suppose so, when you don't know what you're doing.

> People have been estimating the strength of their systems wildly
> incorrectly for centuries.

People have been squaring the circle for millennia -- so what?
The *pros* don't make that mistake.

------------------------------

Date: Sun, 13 Feb 2000 23:14:10 -0500
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Using Gray Codes to help crack DES

[EMAIL PROTECTED] wrote:

> In article <87v9jl$[EMAIL PROTECTED]>, [EMAIL PROTECTED] (Nick) wrote:
> > Does Anyone know where I can get an explanation of what greycodes are
> > and their uses in cryptography?
>
> A Gray code is a system of counting where there is only ever one bit
> difference between adjacent values.  A 2-bit sequence would be:
>
>   00
>   01
>   11
>   10
>
> Their main (non-crypto) use has been in devices such as angle encoders
> where the 1-bit difference meant that the maximum error was only
> +/- 1 LS bit.

To be a little clearer, it has little to do with the LSB of a code, and a lot
to do with the transition from one state to the next.  In mechanical systems
it's essentially impossible to change all the bits at the same instant.  So
the transition from  0001 to 0010, which needs two bits to change, will
either go 0001 -> 0000 -> 0010, or 0001 -> 0011 -> 0010.  Both of the
intermediate states are invalid.  For example, as an axle turns it goes from
0 to 359 degrees in a monotonic sequence.  But if you (electronically) sample
a straight binary encoder you'll see the invalid intermediate states as a
kind of noise at every degree boundary.

When you sample a gray code such invalid intermediate states don't happen
because only a single bit changes.  Either you get the old state or the new
state -- no noise.
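The standard binary-reflected Gray code, which has exactly the
single-bit-transition property described above, is a one-line
conversion; the inverse folds the bits back down:

```c
#include <assert.h>
#include <stdint.h>

/* Binary to Gray: each Gray bit is the xor of two adjacent binary
 * bits, so incrementing the binary value changes exactly one Gray bit. */
uint32_t to_gray(uint32_t b)
{
    return b ^ (b >> 1);
}

/* Gray back to binary: xor-fold the high bits down so every bit of g
 * above position i contributes to bit i of the result. */
uint32_t from_gray(uint32_t g)
{
    g ^= g >> 16;
    g ^= g >> 8;
    g ^= g >> 4;
    g ^= g >> 2;
    g ^= g >> 1;
    return g;
}
```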


------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: RFC: Reconstruction of XORd data
Date: Mon, 14 Feb 2000 04:09:27 GMT

Rick Braddam wrote:
> This appears to me to only work when shifting left, using signed
> values.

No, the example involved only carry resulting from addition,
not shifting, and would not have worked right for some signed
operand values.  In fact, the case for shifting is easy, since
the about-to-become-Carry bit is easily identified and captured
before the shift.  Depending on what you do with it, the peephole
optimizer has a good chance of turning it into an actual C-bit
operation.

For nearly all bitwise operations in C, unsigned integer types
are easiest to operate with correctly.
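A sketch of the point about shifts and carries with unsigned types
(the function names are mine): the about-to-be-lost top bit is read
before the shift, and an addition's carry shows up as wraparound:

```c
#include <assert.h>
#include <stdint.h>

/* Left shift by one, capturing the bit that falls off the top.  With
 * unsigned types the shift itself is well defined; the "carry" is just
 * the old top bit, read before shifting. */
uint32_t shl1_with_carry(uint32_t x, unsigned *carry)
{
    *carry = (x >> 31) & 1;
    return x << 1;
}

/* Addition with carry out.  Unsigned addition wraps mod 2^32, and it
 * wrapped if and only if the sum is smaller than an operand -- a test a
 * peephole optimizer can often turn into a real C-bit operation. */
uint32_t add_with_carry(uint32_t a, uint32_t b, unsigned *carry)
{
    uint32_t sum = a + b;
    *carry = (sum < a);
    return sum;
}
```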

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
