Cryptography-Digest Digest #110, Volume #11      Sat, 12 Feb 00 21:13:00 EST

Contents:
  Counter mode (was Re: Period of cycles in OFB mode) (David Hopwood)
  Re: Which compression is best? (David Hopwood)
  Re: Which compression is best? (lordcow77)
  Re: Do 3 encryptions w/ a 40-bit key = 120 bits? (Jerry Coffin)
  Re: "Farscape" CD-Rom ("Nanoforge")
  Re: RFC: Reconstruction of XORd data (Jerry Coffin)
  Re: Basic Crypto Question 2 ("Joseph Ashwood")
  Re: Which compression is best? (Tim Tyler)
  Re: Period of cycles in OFB mode (David Wagner)
  Re: Re: CHEATING AT PARADISE POKER ("Joseph Ashwood")

----------------------------------------------------------------------------

Date: Sun, 13 Feb 2000 00:02:46 +0000
From: David Hopwood <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Counter mode (was Re: Period of cycles in OFB mode)

=====BEGIN PGP SIGNED MESSAGE=====

David Wagner wrote:
> 
> In article <[EMAIL PROTECTED]>,
> Paul Koning  <[EMAIL PROTECTED]> wrote:
> > That's why I've always liked counter mode.  Its cycle size is
> > obvious (2^64).
> 
> But it is worth mentioning that counter mode also starts to exhibit
> some regularities (a birthday paradox-like issue) after about 2^32
> blocks, so it would only be prudent to change keys well before that
> point.

Really? I agree that it's probably prudent not to encrypt that much
data with the same key, but I thought the regularities in, say, CBC mode
were due to collisions in the cipher input (i.e. with random 64-bit inputs
you expect a collision after about 2^32 blocks, and since the encryption
function is a fixed permutation, that corresponds to a collision in the
output). In counter mode there will be no collisions in the input.

Technically you can infer (under known plaintext) that an output block
that has been seen will not occur again, but after 2^32 known plaintext
blocks that only limits the output to 2^64 - 2^32 possibilities. I can
see why that would be detectable, in the sense that you're not seeing
repeats when you would expect to do so for other modes, but why would it
be a weakness? Unless I'm missing something, all it really tells you is
that counter mode (or some close variant of it) is being used.

OTOH, a criticism of counter mode is that it uses highly structured input
to the cipher, which could potentially allow attacks that are intermediate
in power between known- and chosen-plaintext.

> (I'm assuming you're using a block cipher with 64-bit block size.)
> 
> > It also has the nice property that you can
> > parallelize it for higher speed.
> 
> Yup.  From my point of view, this is the big win of counter mode.

It's also useful if you need to generate pseudo-random numbers that are
*guaranteed* not to repeat.
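
As a throwaway illustration of both points, here is a toy sketch in Python,
with a random permutation on 16-bit blocks standing in for the block cipher
(purely illustrative - not a real cipher, and not anyone's implementation):

    import random

    # Toy stand-in for a block cipher: a fixed random permutation of 16-bit blocks.
    perm = list(range(2**16))
    random.shuffle(perm)

    n = 4096
    ctr_blocks  = [perm[i] for i in range(n)]                  # counter mode: E(0), E(1), ...
    rand_blocks = [random.randrange(2**16) for _ in range(n)]  # ideal random blocks

    # Counter mode never repeats an output block; random data shows birthday
    # collisions once n approaches 2^(16/2) = 256.
    print("counter-mode repeats:", n - len(set(ctr_blocks)))   # always 0
    print("random repeats:      ", n - len(set(rand_blocks)))  # typically well over 100

Scaled up to a 64-bit block, the same effect is what makes the keystream
distinguishable from random after around 2^32 blocks, and also what makes
the outputs usable as guaranteed-non-repeating values.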

- -- 
David Hopwood <[EMAIL PROTECTED]>
PGP public key: http://www.users.zetnet.co.uk/hopwood/public.asc
RSA 2048-bit; fingerprint 71 8E A6 23 0E D3 4C E5  0F 69 8C D4 FA 66 15 01

"Attempts to control the use of encryption technology are wrong in principle,
unworkable in practice, and damaging to the long-term economic value of the
information networks."  -- UK Labour Party pre-election policy document


=====BEGIN PGP SIGNATURE=====
Version: 2.6.3i
Charset: noconv

iQEVAwUBOKUOEzkCAxeYt5gVAQE27wgAtWoPLNseiYzjWEdAW5vMS7VGwrqm6zOj
a0WSnRO3VD5KZGx5qdxWp7qL9MNpiBNhn3Gh5F6jfriC1PdzyxXT1Q+kPt7fJBQc
8120xf+DUtDX95FQ3s6GzSkjPacdAX6NMhVBKUvgm0DL0J+b+GpuoF1vX/+Ru7Tf
6Pr0C2xZ2oiMM2VL94+dQk3qeKx6zoNC4kG3/fnqEvKsVdyHghfNZ+sCX61tPH5y
1Lo2Tvu3Mef8gPLta0pw6F6R7M36WtdIQIRxIqKDkFBTiwcFuH225jeBJlJCLGoS
CM+9ZCIki/P0MDSKxNW+vU6SgPV0AlEGWSOopohXniLcmTKsmZbKSg==
=MkX6
=====END PGP SIGNATURE=====


------------------------------

Date: Sun, 13 Feb 2000 00:01:48 +0000
From: David Hopwood <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: Which compression is best?

=====BEGIN PGP SIGNED MESSAGE=====

Tim Tyler wrote:
> 
> Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
> : John Chandler wrote:
> 
> :> I just noticed a post crediting "DS" with suggesting using
> :> adaptive compression in _both_ directions before encrypting.
> 
> : If the forward pass does a good compression job, I wonder how much
> : additional compression could be expected from the backward pass.
> 
> The backward pass is not intended to compress.  AFAICS, its purpose is
> primarily to make sure that information necessary to decrypt plaintext is
> not present in any fragment of the message.

Depending on the compression algorithm, the backward pass is not guaranteed
to do anything substantive at all. Many adaptive algorithms will simply
give up and copy the data verbatim (with a tag indicating that is what has
been done) when they find that it isn't compressible by a significant
amount - which it won't be after the forward pass. In fact if an algorithm
*doesn't* do this, it can end up substantially expanding some inputs; this
is one of the problems with the Unix "compress" (.Z) format.
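
A rough check of this with zlib's deflate (just an example format - deflate
does have a stored-block fallback, so the second pass costs only a few bytes
of framing):

    import os, zlib

    original = os.urandom(50000) + b"some compressible text " * 2000
    first    = zlib.compress(original, 9)      # forward pass
    second   = zlib.compress(first, 9)         # second pass over incompressible data

    print(len(original), len(first), len(second))
    # len(second) typically comes out slightly *larger* than len(first):
    # after the first pass there is nothing left to squeeze, so the second
    # pass mostly falls back to storing the data verbatim.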

> This means (for one thing) that an analyst is effectively forced to
> decrypt the whole message in order to recover any decompressed text at all
> - which you will probably need to do before you can reject a key.

The analyst is not necessarily forced to do any such thing, unless you
place much stronger constraints on the compression algorithm than are
normally assumed (different constraints than the "good enough" property we
have been discussing in another thread). The problem seems to be that this
construction tries to use a compression algorithm for mixing, which it is
not designed for.

If you need an all-or-nothing transform, use something that has been
analysed for that specific use; don't try to construct one badly from a
compression algorithm.

- -- 
David Hopwood <[EMAIL PROTECTED]>
PGP public key: http://www.users.zetnet.co.uk/hopwood/public.asc
RSA 2048-bit; fingerprint 71 8E A6 23 0E D3 4C E5  0F 69 8C D4 FA 66 15 01

"Attempts to control the use of encryption technology are wrong in principle,
unworkable in practice, and damaging to the long-term economic value of the
information networks."  -- UK Labour Party pre-election policy document


=====BEGIN PGP SIGNATURE=====
Version: 2.6.3i
Charset: noconv

iQEVAwUBOKUV9jkCAxeYt5gVAQF4XwgAy/+7/Y/fcIgfix8VtknihRMvthBVrJgj
dTBkids/Uu3R2CICrL/cZDbGVnRo46AjV1E8eSbSjFKDr4oQIXuQa9hXReM9cqBa
Ei/kXCPvBDXDO5e7oHEblftWmkW+B6RnPjxMW1qfiGcUS+ZcON3OjOklmTNflXZl
APz7OPyjX/QKv9tQh+HQPPP7CI25lEzwDuBMYQoakhVkv0PQ8cFxzpcZVK2/GXOk
1LGR0spZq0r8CIGsPPNGcCQH85b5Aa4R1M22Iu4qQoDNEd4fYaBw/kI1IY5vUuKJ
wfyd28R+7pKz7cfE+yGyDqzLz3Q2HJvBKMZy+zanZmqiLFhUslSe7g==
=bOIS
=====END PGP SIGNATURE=====



------------------------------

Subject: Re: Which compression is best?
From: lordcow77 <[EMAIL PROTECTED]>
Date: Sat, 12 Feb 2000 16:34:56 -0800

> - although possibly less so than cryptography, compression is
> still a specialist field in which it is very difficult for
> newcomers to design an efficient or effective algorithm.

A specialist field? I agree with the rest of your post, but it
truly doesn't help your position to misstate the difficulty of
building an efficient compression algorithm. It's pretty much a
part of any introductory algorithms course; I would expect the
average undergrad CS major to be able to spit out a compression
algorithm and explain how it works.
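
To be fair, a workable (if unoptimised) Huffman coder really is short -
something like the following sketch, which just builds a code table and
reports the coded length:

    import heapq
    from collections import Counter

    def huffman_code(data):
        """Return a {symbol: bitstring} prefix code for the bytes in `data`."""
        freq = Counter(data)
        # Heap entries: (frequency, tiebreaker, tree); a tree is a symbol or a pair.
        heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
        heapq.heapify(heap)
        if len(heap) == 1:                       # degenerate one-symbol input
            return {heap[0][2]: "0"}
        count = len(heap)
        while len(heap) > 1:
            f1, _, left = heapq.heappop(heap)
            f2, _, right = heapq.heappop(heap)
            count += 1
            heapq.heappush(heap, (f1 + f2, count, (left, right)))
        codes = {}
        def walk(node, prefix):
            if isinstance(node, tuple):          # internal node
                walk(node[0], prefix + "0")
                walk(node[1], prefix + "1")
            else:                                # leaf: a byte value
                codes[node] = prefix
        walk(heap[0][2], "")
        return codes

    text = b"this is a small example of huffman coding"
    table = huffman_code(text)
    bits = sum(len(table[b]) for b in text)
    print(bits, "coded bits vs", 8 * len(text), "raw bits")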


* Sent from RemarQ http://www.remarq.com The Internet's Discussion Network *
The fastest and easiest way to search and participate in Usenet - Free!


------------------------------

From: Jerry Coffin <[EMAIL PROTECTED]>
Subject: Re: Do 3 encryptions w/ a 40-bit key = 120 bits?
Date: Sat, 12 Feb 2000 17:41:48 -0700

In article <883jib$vai$[EMAIL PROTECTED]>, [EMAIL PROTECTED] says...
> In article <[EMAIL PROTECTED]>,
>   Jerry Coffin <[EMAIL PROTECTED]> wrote:
> > Most triple encryption schemes (e.g. 3DES) only effectively double
> the
> > key size.  OTOH, an 80-bit key is only RIGHT at the very edge of
> being
> > breakable at the present time -- there was a thread a while back in
> > which I did some extrapolation putting it at around $200 million (or
> > so) to exhaust an 80-bit key space in a reasonable length of time.
> 
> You are of course assuming a type of mitm attack will work.  Sometimes
> they require a lot of memory and are not feasible.

The memory depends directly on the key size.  It's generally assumed 
that it at least borders on feasible for DES; for a 40-bit key, it 
requires roughly 1/65536th the memory (though this could be modified 
somewhat based on block-size).
 
> Again, let's say you have a million computers [not far fetched] running
> at 2^19 keys/sec [which is about right since not everyone owns a PIII];
> you can exhaust an 80-bit key in 2^(80-20-19-1) = 2^40 seconds [34,865
> years on average].  To search the keyspace in a day you would have to
> search 2^63.60 keys a second, which is far fetched.

To exhaust an 80-bit key-space in a reasonable length of time, you'd 
want to build a machine especially for that purpose, not simply use a 
large collection of general-purpose computers.  The very nature of a 
general purpose computer is that it's good at a lot of things, but NOT 
necessarily at its best at any one of them.  The number of operations 
involved in exhausting an 80-bit key-space justifies the effort of 
designing a machine that's optimized for exactly that purpose.
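
For reference, the arithmetic in the quoted paragraph checks out (a quick
sketch, assuming 2^20 machines at 2^19 keys/sec each, and crediting the
attacker with finding the key halfway through the search on average):

    import math

    machines     = 2**20        # "a million computers", rounded to a power of two
    keys_per_sec = 2**19        # per machine
    keyspace     = 2**80

    avg_seconds = keyspace // 2 // (machines * keys_per_sec)   # 2^(80-1-20-19) = 2^40
    print(avg_seconds == 2**40)                                # True
    print(avg_seconds / (86400 * 365))                         # ~34,865 years

    # Keys per second needed to cover the whole space in one day:
    print(math.log2(keyspace / 86400))                         # ~63.6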

-- 
    Later,
    Jerry.
 
The universe is a figment of its own imagination.

------------------------------

From: "Nanoforge" <[EMAIL PROTECTED]>
Subject: Re: "Farscape" CD-Rom
Date: Sat, 12 Feb 2000 19:43:36 -0500
Crossposted-To: alt.tv.farscape

I just got the issue myself, and unfortunately I am going to have to
return it. It seems someone inventive found a way to crack the disk inside
its packaging. I hate Barnes and Noble. As for unlocking: I would record that
premiere episode in case there is some code word flashed on screen or
something. You would think the Sci-Fi Channel would be advertising this, as it
is their best show with a hip product now attached to it.



------------------------------

From: Jerry Coffin <[EMAIL PROTECTED]>
Subject: Re: RFC: Reconstruction of XORd data
Date: Sat, 12 Feb 2000 18:01:48 -0700

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] 
says...

[ ... ] 

> It should also be noted that there are two forms of auto-key
> encryptions, see Menezes et al.  In the auto-key component used
> in my humble algorithm WEAK3-EX, I used addition instead of XOR.
> This causes some diffusion among neighbouring bit positions
> due to carry-overs.

Right -- this certainly helps diffusion but it's not quite the perfect 
diffusion function since fewer of the inputs create a carry than not.  
OTOH, even slightly less than perfect diffusion is still a LOT better 
than none at all.

> By using larger units (32 bits) than bytes
> ('IV' is correspondingly also 32 bits) and doing two passes, from
> left to right and back, I think that this as a whole does a quite
> good job of diffusion. Auto-key by itself is rather weak in my
> opinion. But in my own algorithm it is only one component among 
> several and is employed mainly for its diffusion effect to aid
> the other components.

Right -- when you get down to it, the individual components of most 
ciphers aren't worth a whole lot by themselves.  It's only when you 
combine a number of these components that you gain a lot of strength.  
You've pointed out the importance of diffusion, and Doug Gwyn 
mentioned the importance of non-linearity, typically implemented in S-
boxes.

At least in a block cipher, you want to ensure that you use enough 
rounds that your diffusion is complete throughout the block -- i.e. 
that each bit of input has had a chance to affect each bit of the 
output.  Addition gives a diffusion rate of roughly 3/8ths, meaning 
you want a minimum of three rounds of addition in each direction to 
give each bit of input a good chance of affecting each bit of the 
output.
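
A quick empirical sketch of how far a single flipped input bit spreads
through 32-bit addition via carries (this is just modular addition in
isolation, not WEAK3-EX or any particular cipher):

    import random

    MASK, trials, changed = 0xFFFFFFFF, 100000, 0
    for _ in range(trials):
        a, b = random.getrandbits(32), random.getrandbits(32)
        flip = 1 << random.randrange(32)
        changed += bin(((a + b) ^ ((a ^ flip) + b)) & MASK).count("1")

    # Averages out just under 2: the flipped bit itself plus, on average,
    # roughly one more bit reached by carry propagation.
    print(changed / trials)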

-- 
    Later,
    Jerry.
 
The universe is a figment of its own imagination.

------------------------------

From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: Basic Crypto Question 2
Date: Sat, 12 Feb 2000 17:03:11 -0000

> 1. Which are the Top 5 PRNG's for practical cryptosystems?
Well, the most commonly used one is probably RC4/ARCFOUR, and it
is fairly good. Basically, it's hard to identify a "best" in a
field where we can't even prove whether perfection is possible.
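
For anyone who hasn't seen it, the whole of RC4/ARCFOUR as commonly
described fits in a few lines (check the output against published test
vectors before trusting any transcription, including this one):

    def rc4_keystream(key, n):
        # Key scheduling (KSA).
        S = list(range(256))
        j = 0
        for i in range(256):
            j = (j + S[i] + key[i % len(key)]) % 256
            S[i], S[j] = S[j], S[i]
        # Pseudo-random generation (PRGA).
        i = j = 0
        out = bytearray()
        for _ in range(n):
            i = (i + 1) % 256
            j = (j + S[i]) % 256
            S[i], S[j] = S[j], S[i]
            out.append(S[(S[i] + S[j]) % 256])
        return bytes(out)

    print(rc4_keystream(b"Key", 8).hex())   # compare against a published vector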

>
> 2. Is there a Universal Tester to test the top 5 for Randomness?
DIEHARD seems to have been quite well accepted.

>
> 3. Is it possible to test that a Crypto system is generating truly
> random keys...and the PRNG has not been chilled??...Difficult task
> huh..with all the possible combination of keys..
We can't even really identify true randomness, and we know that
true randomness cannot be generated with a pRNG (pseudo-random
number generator).

>
> 4. What about using PRNG's based on thermodynamics and random motion of
> molecules...or a theoretical model of some other natural random
> phenomena  ....

As others have said, there are several variants available. Probably
the one that has been accepted the most is the Intel RNG. But, these
being hardware, it becomes cheaper to use a believed true random
number generator.

>
> 5. Has any vendor come up with a Physical Random Number generator box
> ... white heat ..whatever, which sits on a network supplying random
> numbers to client machines on a wan/lan?..  Need some kind of a tamper
> proof secure box...

It shouldn't be too hard to take a system with the Intel RNG
and broadcast bits across the network. This isn't likely to
be of much use in crypto because your adversary should be
assumed to have access to the network.
                Joe



------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Which compression is best?
Reply-To: [EMAIL PROTECTED]
Date: Sun, 13 Feb 2000 01:36:17 GMT

Jerry Coffin <[EMAIL PROTECTED]> wrote:
: In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] says...

:> : A few people consider it extremely important, but most of us realize 
:> : otherwise.  It's of no use at all against a known-plaintext attack.
:> 
:> ...though it can help no end with partial plaintexts.

: As long as you define "no end" as "maybe a tiny bit", sure.

If you have a partial plaintext - and a plaintext attack - it seems to be
next to useless if the information in the plaintext is diffused before
encryption, as it would be by, for example, bi-directional compression.

:> Protecting against known /partial/ plaintexts may well be worth doing.

: If it really did so, yes.  It doesn't.

This seems to be plain false.  A partial plaintext is very
likely to be mangled into an unrecognisable form by adaptive
compression, especially if it is applied in both directions.

How do you justify the flat statement that it doesn't help?

:> Besides - it's not true that it's "no use at all" against known-plaintext
:> attacks.  This is true, *even* if we're talking about *complete* known
:> plaintexts! ;-)

: I suppose if you want to get technical, yes, it can (but doesn't 
: necessarily) make a _minuscule_ difference even if the entire 
: plaintext is known

You're right that the difference is minuscule.  I should probably not have
bothered mentioning it.

:> : If it's designed with encryption in mind, it can be of minimal help in 
:> : slowing a ciphertext-only attack. [...]
:> 
:> It can make the difference between having a broken cypher, and a
:> completely unviolated one.  Isn't this what security is all about?

: I have yet to see or hear you, Dave, or anybody else point out a 
: situation in which it makes anywhere close to that kind of difference.

This would happen when the analyst is effectively deprived of a halting
criterion.

:> : The most fundamental problem is that it doesn't really make enough
:> : difference to care about: just for example, David Scott makes much
:> : of the compression method he uses, and the fact that if you decompress
:> : the wrong text, it produces output that's statistically similar to
:> : normal plaintext.
:> 
:> Does he?  Frankly, I don't recall the occasion.

: Yes, he has.

I guess I'll just have to take your word for it.  I don't recall ever
seeing it - but I can't say I've read every one of his posts.

:> *I* bang on about this point - but I /never/ claim it as a property of
:> David's current Huffman scheme.
:> 
:> I've examined its compression ratios myself and - in the version I
:> looked at - they were nothing to write home about.

: Apparently you didn't read what I said.  It's true that his 
: compression ratios are lousy, but that's NOT what I was talking about.

If the compression were maximal in some sense, decompressed texts would be
indistinguishable from plausible messages.  The failure of decompressed
files to resemble correct messages is one symptom of poor compression.
So it *WAS* what you were talking about ;-)

:> *Where* did David claim this for his scheme?  Can you quote from the post?

: I've long since gotten too sick of reading his crap to bother spending 
: a couple hours on Deja News to find it again.  If you really care, 
: feel free to do so yourself.

I was the one who was questioning the existence of these things.  Why
should I spend time searching for something I doubt even exists, in order
to demonstrate your point for you?

:> In case you are unaware of them, I'll give an example:
:> 
:> Compressing in one direction only diffuses plaintext in one direction.
:> This means any header to the file translates into a header in the
:> compressed file.

: This is simply NOT necessarily true.  Just for one obvious counter-
: example, there's a two-pass variant of Huffman compression in which 
: you collect statistics on an entire data set in the first pass, and 
: then do compression based on those statistics in the second pass.
:
: Your assertions do little beyond demonstrating your ignorance.

Oh, bullshit.

It's questionable whether what you describe qualifies as "compressing in
one direction".  It takes information from near the end of the file and
allows it to influence output from near the start of the file.  You could
not use such a method as a stream compressor.

I never claimed that compressing in both directions was the /only/ way
to get information diffusion through the whole file.  Obviously there are
other ways of getting similar effects.

:> Since you don't seem to appreciate the benefits of the technique in
:> resisting such attacks, I'm surprised that you feel that you are in a
:> position to criticise it :-(

: You may think appreciation is necessary.  I'm convinced that knowledge 
: is enough to put me in a position to both criticize it, and to LACK 
: any appreciation for it.

Well - for what it's worth - compressing in each direction seems to me to
be a type of diffusion that can easily be applied with a serial machine.

If I had a parallel machine to hand, I'd probably use a parallel diffuser
- instead of anything to do with compression - to produce this effect.

So I too would probably avoid using it if I was able.

It /does/ have virtues, though.  You can get bi-directional diffusion
*and* the pick of compression algorithm of your choice, for example.

The reverse "compression" can also occur /after/ an encryption pass -
helping to confound the boundaries between blocks between encryptions.

: The simple reality is this: if your cipher, by itself, is _extremely_ 
: marginal, it's barely possible that this method will make a tiny bit of 
: difference.  I've never said that it made no difference at all: I've 
: only said that IMO, the difference is too small to care about in 
: nearly any case of which I'm aware.  You've demonstrated nothing to 
: the contrary.

Non-1-1 compressors (I assume this is the subject? - or are you referring 
to compressing once in each direction here?) allow systematic rejection of
keys, regardless of the plaintext they transmit.

If your cypher's keyspace is reduced to dimensions which allow an
automated search, whether by a bad RNG used to generate the keys, a cypher
break, key leakage through a programming bug, key knowledge gained through
captured key documents, or any of a zillion other factors, use of 1-1
compression can make the difference between having one document which is
identified as the genuine article, and having millions of possible
plaintexts, with no known automatic method of choosing between them.
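
To make the key-rejection point concrete, here is a minimal sketch
(assuming, purely for illustration, that the plaintext was zlib-compressed
before encryption, and modelling a trial decryption under a wrong key as
random bytes):

    import os, zlib

    def plausible(trial_decryption):
        try:
            zlib.decompress(trial_decryption)
            return True                # well-formed stream: worth a closer look
        except zlib.error:
            return False               # malformed stream: reject this key outright

    right = zlib.compress(b"attack at dawn")   # what the correct key would reveal
    wrong = os.urandom(len(right))             # what a typical wrong key reveals

    print(plausible(right), plausible(wrong))  # usually True False

With a 1-1 compressor there is no such malformed-stream test: every
candidate key decrypts to *some* syntactically valid message, which is
exactly the property being argued for here.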

:> : For these purposes, special forms of compression make little 
:> : difference.  Instead, you're simply looking for whatever produces the 
:> : smallest output.
:> 
:> If you don't care about compressors systematically adding information to
:> the files they compress, that's your look out.

: "that's your look out"?  You're apparently trying your own version of 
: security through obscurity, saying things that simply make no sense.

? Did I say something that made no sense?

:> Most compressors allow automated rejection of keys based on the compressed
:> file alone.  For non arithmetic/Huffman techniques this is often possible
:> given only a small fragment of the file.  It seems clear enough to me that
:> as this is something that can be avoided, it *should* be avoided.

: Yes and no.  If that were the only available criterion, it should be 
: avoided.  Unfortunately, it's not.  If you'd ever bother to sit down 
: and do some of the math, you'd quickly find that the superior 
: compression ratio provided by a different algorithm will _usually_ 
: make more difference.

It's hardly a case of "doing the maths".

Whether you want good compression in terms of size, or good compression in
terms of lack of added information, depends on all kinds of factors: what
type of data will typically be transmitted, what sorts of weakness are
most likely in the rest of your system, and what information you can
reasonably expect an analyst to have access to.

I'm not blindly advocating choosing 1-1 compression over compression
that targets structure in your data as effectively as possible.  There are
cases where either choice is justified.

1-1 compression /is/ generally a good thing, though.  Algorithms that
aren't 1-1 are wasting their range, and losing their efficiency.

Huffman and arithmetic compressors already have the fundamental technology
behind their compression available in 1-1 variants - and shorter files
result.  Hopefully, other systems will follow.
-- 
__________
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

Cleanliness is next to impossible.

------------------------------

From: [EMAIL PROTECTED] (David Wagner)
Subject: Re: Period of cycles in OFB mode
Date: 12 Feb 2000 17:52:18 -0800

In article <[EMAIL PROTECTED]>, Tim Tyler  <[EMAIL PROTECTED]> wrote:
> From the point of view of an attacking analyst, it can sometimes be a plus
> if some cryptographic process can be parallelised - since then they can
> build a big parallel machine to break it more rapidly.

I see your point now.
Hmm, on second thought, I don't think it's a problem.
Exhaustive keysearch is inherently parallelizable in a
way that ordinary operation of a cryptosystem isn't: namely,
you can distribute a different chunk of the keyspace to
each processor participating in the keysearch.
So, I'm not worried: I don't think the parallelizability
property buys the brute-force attacker anything.
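
In other words, the partitioning is trivial - something like this
hypothetical helper is all the "parallelism" a keysearch needs:

    def key_range(processor, num_processors, keybits=80):
        # Give each processor an equal contiguous slice of the keyspace to try.
        per = (1 << keybits) // num_processors
        return processor * per, (processor + 1) * per     # [start, stop)

    start, stop = key_range(3, 1024)
    print(hex(start), hex(stop))   # processor 3 of 1024 gets 2^70 keys of its own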

> I suppose I can conclude from your response that this wasn't what you meant.

You're right, I didn't!  Sorry to be slow on the uptake.

------------------------------

From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: Re: CHEATING AT PARADISE POKER
Date: Sat, 12 Feb 2000 18:01:14 -0000
Crossposted-To: rec.gambling.poker

Their statements are almost all fluff.

The facts they do give:
Their choice of using modular division in their so-called
random number generation leads to a bias towards the low-order
cards; this results in a tendency for the cards that started
at the top of the deck to end up at the top.
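
The bias itself is easy to see (a toy sketch assuming rand() returns
uniform values in 0..32767, i.e. RAND_MAX = 32767, reduced mod 52):

    from collections import Counter

    counts = Counter(r % 52 for r in range(32768))   # exhaustive, no sampling noise
    print(counts[0], counts[51])   # 631 vs 630: low cards come up slightly more often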

Their choice of using the C/C++ rand() function allows a
good cryptographer to begin computing the future values
without too much difficulty.

They state where they get their random seeds from; it's
almost a joke it's so poor. The low-order bits of the
counter on a computer are only random when sampled at
suitably random, fairly distant intervals; if they're doing it
to get the stated 17 bits per second, it's no longer random.
                    Joe





------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
