Cryptography-Digest Digest #340, Volume #14      Sat, 12 May 01 04:13:01 EDT

Contents:
  Re: How do boomerang attacks work? ("Tom St Denis")
  Re: Serpent LT is not complete? ("Tom St Denis")
  Re: PRNG question from newbie ("Scott Fluhrer")
  Re: Best encrypting algoritme (SCOTT19U.ZIP_GUY)
  Re: Best encrypting algoritme ("Tom St Denis")
  Re: extracting random bits from low-entropy data (wtshaw)
  Re: Best encrypting algoritme (wtshaw)
  DES Crypto Myth?? ([EMAIL PROTECTED])
  Re: Micro Video Camera Suitable for Documents? (Benjamin Goldberg)
  Is Differential Cryptanalysis practical? ([EMAIL PROTECTED])
  Re: OAP-L3:  "The absurd weakness." (Xcott Craver)
  Re: Security proof for Steak (Benjamin Goldberg)

----------------------------------------------------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: How do boomerang attacks work?
Date: Sat, 12 May 2001 01:14:25 GMT


"John Savard" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> On Fri, 11 May 2001 22:46:58 GMT, "Tom St Denis"
> <[EMAIL PROTECTED]> wrote, in part:
>
> >I am currently reading the boomerang paper very closely... if I get it
> >right, we take the cipher and break it into two halves (or take the
rounds
> >we are attacking) ie ENC(x) = E1 o E0(x),
>
> >Next we have the four inputs P, P', Q and Q' where P xor P' = Q xor Q' =
D1.
>
> My web page at
>
> http://home.ecn.ab.ca/~jsavard/crypto/co041001.htm
>
> explains the boomerang attack, I hope simply and understandably, if I
> may toot my own horn.
>
>
> The thing about the boomerang attack is that you start out only with P
> and P', such that P xor P' is D1.
>
> Enciphering them, then, you create from E(P) and E(P') two other
> values which differ from them by D2 ... and these two other values
> shall be E(Q) and E(Q'). Then you decipher them to get Q and Q'.
>
> _That's_ how it works, and that's why everything adds up.

Yes, re-reading it I see they state that from the single pair they go
up and down the cipher... cool.

I will read your page shortly...

Thanks,
Tom
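In the meantime, here is a toy sketch of the quartet John describes.  The
"cipher" below is purely hypothetical and deliberately linear (XOR with
round keys plus rotations), so every differential holds with probability
1 and the boomerang always comes back; a real attack only gets this with
some probability.

```python
# Toy boomerang quartet: ENC(x) = E1 o E0(x), with E0/E1 linear so the
# differences D1 and D2 propagate deterministically.

MASK = 0xFF                           # 8-bit toy block

def rotl(x, n): return ((x << n) | (x >> (8 - n))) & MASK
def rotr(x, n): return ((x >> n) | (x << (8 - n))) & MASK

K0, K1 = 0x3A, 0xC5                   # arbitrary round keys

def E0(x):  return rotl(x ^ K0, 3)
def E0i(y): return rotr(y, 3) ^ K0
def E1(x):  return rotl(x ^ K1, 5)
def E1i(y): return rotr(y, 5) ^ K1

def E(x):  return E1(E0(x))           # the full cipher
def Ei(y): return E0i(E1i(y))         # its inverse

D1, D2 = 0x0F, 0x33                   # input / output differences

P  = 0x6B
Pp = P ^ D1                           # start with P xor P' = D1
C, Cp = E(P), E(Pp)                   # encrypt both up the cipher
Q  = Ei(C  ^ D2)                      # flip D2 on the ciphertext side...
Qp = Ei(Cp ^ D2)                      # ...and decrypt back down

print(hex(Q ^ Qp))                    # → 0xf, i.e. Q xor Q' = D1 again
```

That last line is "why everything adds up": the D2 flips cancel through
the top half and the quartet closes with Q xor Q' = D1.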



------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: Serpent LT is not complete?
Date: Sat, 12 May 2001 01:19:35 GMT


"Tom St Denis" <[EMAIL PROTECTED]> wrote in message
news:Yk_K6.68827$[EMAIL PROTECTED]...
> I was tinkering with my LT analyzer I wrote for TC15.  I think I adapted
it
> to the LT from serpent but it tells me that each output bit is only a
linear
> function of 44 other input bits.

A bit more analysis for the curious lurker.

More stats from my LTA program:

Serpent:
1: 0.036743, 4.703125, (2 => 7)
2: 0.181458, 23.226562, (7 => 37)
3: 0.596252, 76.320312, (31 => 106)
4: 0.954224, 122.140625, (92 => 128)
5: 0.999390, 127.921875, (126 => 128)
6: 1.000000, 128.000000, (128 => 128)

Current TC15:
1: 0.039062, 5.000000, (3 => 7)
2: 0.171875, 22.000000, (15 => 29)
3: 0.515625, 66.000000, (49 => 81)
4: 0.914062, 117.000000, (106 => 125)
5: 1.000000, 128.000000, (128 => 128)

The first number, before the colon, is the current iteration of the LT.

The second is the fraction of dependent bits.  So 1/128, or 0.0078125,
would mean each output bit depends on just one input bit.

The third value is the average dependence for that iteration.  In other
words, each output bit depends on average (over all output bits) on X
input bits.

The last values, "X => Y", are the min and max dependencies.  So "3 => 7"
means the output bits depend on a minimum of 3 and a maximum of 7 input
bits.

Notice that in Serpent the min of one round can be below the max of the
previous round (e.g. the 31 in round 3's "31 => 106" is below round 2's
max of 37), whereas in TC15 each round's min is above the previous
round's max.  I think this is a sign that TC15 has better avalanche,
since the number of dependent bits always increases across the rounds.

A new copy of my LTA program is at

http://tomstdenis.home.dhs.org/tc15_lta.c

Tom
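For anyone who wants to play with the idea without the full program,
here is a hedged sketch of the kind of dependency count reported above,
on a made-up 16-bit linear transform (not Serpent's LT and not TC15's).
Each unit vector is pushed through the LT repeatedly; bit i of cols[j]
says whether output bit i is a linear function of input bit j.

```python
# Per-output-bit dependency counts for a toy linear transform, printed
# in the same "iteration: average, (min => max)" shape as the LTA table.

N = 16
MASK = (1 << N) - 1

def rotl(x, n):
    return ((x << n) | (x >> (N - n))) & MASK

def LT(x):
    # toy mixing layer: one rotation plus one non-cyclic shift
    return (x ^ rotl(x, 1) ^ (x >> 3)) & MASK

cols = [1 << j for j in range(N)]       # start from the identity
for it in range(1, 7):
    cols = [LT(c) for c in cols]        # compose the LT once more
    counts = [sum((c >> i) & 1 for c in cols) for i in range(N)]
    avg = sum(counts) / N
    print(f"{it}: {avg:.4f}, ({min(counts)} => {max(counts)})")
```

Note that over GF(2) dependencies can cancel as well as accumulate,
which is one reason such counts need not grow monotonically.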



------------------------------

From: "Scott Fluhrer" <[EMAIL PROTECTED]>
Subject: Re: PRNG question from newbie
Date: Fri, 11 May 2001 18:07:14 -0700


John Myre <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> Bugs Bunny wrote:
> <snip>
> > So, a good PRNG is really dependent on a good hash function?
> <snip>
>
> The required qualities for a PRNG and a hash function are
> indeed quite similar.  And hashing a simple counter to get
> a PRNG is often proposed (and, so far as I know, should work).
> It doesn't *have* to be that way, however.
>
> The creators of Rijndael (*) have said that "PRNG" and "hash
> function" ought to be considered as the same idea, they just
> have different input and output sizes.  In support of this
> opinion, they have created an algorithm, Panama, that can be
> used both ways.  (I don't know how far they've gone with this
> line of thought, or if they still support the idea.)
You are aware that Panama was broken as a hash function (but not as a
stream cipher) at FSE 2001 by Rijmen, Van Rompay, Preneel and
Vandewalle?
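For reference, the hash-a-counter construction John mentions can be
sketched in a few lines.  SHA-256 here is just a stand-in for "a good
hash function"; this is an illustration of the idea, not a vetted
design.

```python
# CTR-style PRNG: hash (seed || counter) and concatenate the digests.

import hashlib

def hash_counter_prng(seed: bytes, nbytes: int) -> bytes:
    out = bytearray()
    counter = 0
    while len(out) < nbytes:
        block = hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        out += block
        counter += 1
    return bytes(out[:nbytes])

ks1 = hash_counter_prng(b"secret seed", 32)
ks2 = hash_counter_prng(b"secret seed", 32)
assert ks1 == ks2            # deterministic for a given seed
print(ks1.hex())
```

The output is reproducible from the seed alone, which is exactly the
property a keystream generator needs.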

--
poncho




------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Best encrypting algoritme
Date: 12 May 2001 01:38:11 GMT

[EMAIL PROTECTED] (Tom St Denis) wrote in
<MY%K6.70162$[EMAIL PROTECTED]>: 

>Um so I have to reverse engineer the program to figure it out?  That's
>not very scientific.  Can't you post a snippet of the code or
>pseudo-code for the benefit of the group?

   Now Tom, you don't have to reverse engineer it.  The source code
comes with it.  The Rijndael is right from the AES people, so why post
it?  The concept has been hand fed to you.  I think you're just trying,
as usual, not to learn anything but to be a pain in the ass.  Reverse
engineering usually involves looking at outputs, or if you're lucky,
the binary executables.  So no, I won't post any snippets; they would
only confuse more.

David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE "OLD VERSION"
        http://www.jim.com/jamesd/Kong/scott19u.zip
My website http://members.nbci.com/ecil/index.htm
My crypto code http://radiusnet.net/crypto/archive/scott/
MY Compression Page http://members.nbci.com/ecil/compress.htm
**NOTE FOR EMAIL drop the roman "five" ***
Disclaimer:I am in no way responsible for any of the statements
 made in the above text. For all I know I might be drugged or
 something..
 No I'm not paranoid. You all think I'm paranoid, don't you!


------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: Best encrypting algoritme
Date: Sat, 12 May 2001 01:43:45 GMT


"SCOTT19U.ZIP_GUY" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> [EMAIL PROTECTED] (Tom St Denis) wrote in
> <MY%K6.70162$[EMAIL PROTECTED]>:
>
> >Um so I have to reverse engineer the program to figure it out?  That's
> >not very scientific.  Can't you post a snippet of the code or
> >pseudo-code for the benefit of the group?
>
>    Now Tom, you don't have to reverse engineer it.  The source code
> comes with it.  The Rijndael is right from the AES people, so why post
> it?  The concept has been hand fed to you.  I think you're just trying,
> as usual, not to learn anything but to be a pain in the ass.  Reverse
> engineering usually involves looking at outputs, or if you're lucky,
> the binary executables.  So no, I won't post any snippets; they would
> only confuse more.

If you can't explain the concept in at most a few hundred lines of
English text, it's not worth it.

Why is it that competent cryptographers can explain things like
differential cryptanalysis in under five minutes, but over the course
of YEARS you can't explain any of your ideas to the point where they
a) make sense and b) demonstrate an improvement over any current
technology?

Sure, your stuff is different... the question is: is it any better?

Tom



------------------------------

From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: extracting random bits from low-entropy data
Date: Fri, 11 May 2001 22:14:20 -0600

In article <[EMAIL PROTECTED]>,
[EMAIL PROTECTED] (Mark) wrote:

> [Apologies if this is a naive question, but I did try the FAQ first.] 
> 
> How do I extract the entropy from low-entropy data to produce high-quality
> random bits, assuming that I know a lower bound for the amount of entropy
> in the input? 
> 
> For example, suppose I want 100 uniform random bits, and my source of 
> randomness is the conversation in a chat room.  Making the assumption that 
> there are at least 2 bits of entropy per line, it seems reasonable to 
> assume that I can convert 50 lines of chat into 100 bits, each bit of very 
> high (approaching 1) entropy.
> 
> (1) Is this a reasonable assumption and (2) is there a cookbook solution?

Figure that text has 1 to 3 bits of randomness per letter on average,
so there is lots to start with.  You could sum the bit values of each
group of N characters to produce a series of bits.  Try 10 for N and
see what you get.
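One way to read the group-summing suggestion is to take the parity of
the 1-bits in each N-character group as one output bit.  A sketch of
that interpretation (the safer cookbook answer is still to feed the
whole input through a cryptographic hash):

```python
# One output bit per N-byte group: the parity of the group's 1-bits.

def parity_bits(text: str, n: int = 10) -> list[int]:
    data = text.encode()
    bits = []
    for i in range(0, len(data) - n + 1, n):
        ones = sum(bin(b).count("1") for b in data[i:i + n])
        bits.append(ones & 1)            # parity of the group
    return bits

chat = "hello all / anyone seen the new cipher paper? / reading it now"
print(parity_bits(chat))
```

Parity is only a reasonable extractor if each group really carries at
least a bit or so of entropy, which is exactly the assumption being
questioned above.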
-- 
Mother Nature always gets her revenge. 

------------------------------

From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: Best encrypting algoritme
Date: Fri, 11 May 2001 23:44:39 -0600

In article <[EMAIL PROTECTED]>, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:

> 
> It is also possible to consider the entire message as
> a single block. Thus a strict distinction of stream and
> block processing is only a conceptual aid in my humble
> view and is not a necessity.

Again, there is another option to block handling, not just fixed blocks
and sometimes padding to fill one out, nor a single block for a big
message, but variable length blocks to adapt to natural text divisions. 
Space terminated blocks are what I like.
> 
> There are many different posssiblities to dynamically 
> effect 'variability' of processing of individual blocks. 
> Use of a pseudo-random source may be considered in this
> connection. I suppose in the last sentence you are 
> referring to use of homophones, including introduction of 
> random dummies.

More the former than the latter.  The pseudorandom series can also be made
obscure by inserting extraneous factors derived from the text being
encoded.  As I just need a little randomness, it is easy to pluck some
modifying values from a many character block, even hash the whole block
with the generator output to produce new values as needed. 
...
> 
> It is better to have larger blocks (the extreme is the
> whole message). But one has somehow decided to use block
> encryption and has chosen a small block size (e.g. for
> hardware reasons). In order to get nevertheless a bit the 
> benefits of large block processing, one employs a (particular) 
> technique, which is the chaining. So chaining is a compromise,
> or an afterthought, if you like.

I think I addressed this somewhere above.  With a variable block length
cipher like the GVA, large blocks are slightly more efficient in that each
block regardless of size carries the same overhead.

> It may be mentioned that
> Scott has for years advocated whole file processing in the
> group (though personally I am not fond of the specific
> methods he uses).

There are pluses and minuses in such implementations.  But strength
need not be traded against fragility: strength should be maximized and
fragility minimized.  An all-or-nothing system maximizes both.
> 

I wrote:

> > Better block ciphers have some sort of obvious key changes going on that
> > allows for increased variability; best is from an input of randomness.
> > The problem of reestablishing a random sequence at both ends can be
> > removed if the encryptive algorithm stores the path back as part of the
> > cipher text.  Therefore, original random generation need not be
> > duplicated.
> 
> I agree on the benefit of having variability (see above). 
> I don't understand what you mean by 'stores the path back as 
> part of the ciphertext'? (Is that utilization of previously
> sent messages?)

One way of viewing the GVA is as a maze determined by random values. 
Knowing a key called the pathcode or pathword means that the decryptive
algorithm looks for its characters as signposts back through the maze. 
Each block stands alone.  The key characters sought represent the flags
for reversing random offsets used in encrypting that particular block.
> 
> Commonly, a good block cipher has a few simple and clearly 
> defined layers in each round. I suppose you are not objecting 
> to using a number of rounds or to using multiple encryptions 
> with different algorithms (eventually dynamically changing
> the mix).

Fine.  I object to an algorithm strung together like the rat's nest of
wires in the typical PC.  It is a matter of personal preference but good
designs are laid out well, not a patchwork or statement of demented
organization.  Surely complication is an admired hurdle for some, but
artistic integration is always a statement of beauty.

> 
> I suppose that two ways are available: use a larger key
> in the algorithm, or vary the key for different blocks
> in some manner. It seems that there is the popular opinion 
> that key materials are fairly costly. However, the cost 
> issue surely depends on the concrete economical factors of 
> one's applications. One can certainly provide for the use 
> of keys of widely different sizes for different classes 
> of messages.
> 
> M. K. Shen

For a variable length situation, peel off as much key as needed.   Blocks
except the last one should be close to the maximum allowed, so the duty
factor is fairly well distributed for the key alphabets.  However, the
true percent usage of any alphabet is problematic.  While on the subject,
the GVA is, as I see it, a problematic process, at least as far as the
overview suggests as characteristic of problematic encryption.
-- 
Mother Nature always gets her revenge. 

------------------------------

From: [EMAIL PROTECTED]
Subject: DES Crypto Myth??
Date: Fri, 11 May 2001 21:51:40 -0800


I've been reading through papers on differential cryptanalysis and s-box
generation and I seem to have uncovered a myth regarding DES that I've
run across several times (i.e. _Applied Cryptography_, posts on this
newsgroup, etc.).

The myth is that the DES team knew about differential cryptanalysis (I
think Coppersmith makes this claim), especially iterative
characteristics, and specifically designed the S-boxes to be resistant
("robust") to differential cryptanalysis.  I've also read in several
different places that the changes the NSA made to the S-boxes were
intended to increase their resistance to differential cryptanalysis.


The problem with this "common crypto knowledge" is that the DES S-boxes
aren't very robust!  According to Seberry, Zhang, and Zheng in
"Systematic Generation of Cryptographically Robust S-boxes" (1994), the
robustness of the DES S-boxes ranges from 0.316 to 0.469, which is much
lower than the upper bound of 0.861 for 6x4 S-boxes.

It seems to me that either the claim that the DES team knew about
differential cryptanalysis isn't true, or they didn't understand it
well enough to design S-boxes with close to optimal robustness.

Am I missing something?
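For anyone who wants to check such figures themselves: robustness is
derived from an S-box's difference distribution table (DDT), which is
easy to build.  A sketch using the first DES S-box, S1 (values from
FIPS 46; the outer two input bits select the row, the middle four the
column):

```python
# Build the DDT of DES S1 and report its largest non-trivial entry.

S1 = [
    [14,  4, 13,  1,  2, 15, 11,  8,  3, 10,  6, 12,  5,  9,  0,  7],
    [ 0, 15,  7,  4, 14,  2, 13,  1, 10,  6, 12, 11,  9,  5,  3,  8],
    [ 4,  1, 14,  8, 13,  6,  2, 11, 15, 12,  9,  7,  3, 10,  5,  0],
    [15, 12,  8,  2,  4,  9,  1,  7,  5, 11,  3, 14, 10,  0,  6, 13],
]

def sbox(x):                          # 6-bit input, 4-bit output
    row = ((x >> 4) & 2) | (x & 1)    # outer bits pick the row
    col = (x >> 1) & 0xF              # middle four bits pick the column
    return S1[row][col]

ddt = [[0] * 16 for _ in range(64)]
for a in range(64):                   # input difference
    for x in range(64):
        ddt[a][sbox(x) ^ sbox(x ^ a)] += 1

best = max(max(row) for row in ddt[1:])   # skip the trivial a = 0 row
print("largest non-trivial DDT entry:", best, "of 64 pairs")
```

The robustness measure in the Seberry/Zhang/Zheng paper is a function of
exactly these counts, so the same table lets you reproduce their
numbers.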



veb3
5/11/01


=====BEGIN PGP MESSAGE=====
Version: PGP 6.5.8

pCxibgtG9gGHw3oOaZIu0w6iUz27izdTRvp0EdNSrMJ8La93oiygo6+eQtU4Tw==
=IZJ2
=====END PGP MESSAGE=====

------------------------------

From: Benjamin Goldberg <[EMAIL PROTECTED]>
Subject: Re: Micro Video Camera Suitable for Documents?
Date: Sat, 12 May 2001 02:50:00 -0400

-- 
Shift to the left, shift to the right, mask in, mask out, BYTE, BYTE,
BYTE !!!

------------------------------

From: [EMAIL PROTECTED]
Subject: Is Differential Cryptanalysis practical?
Date: Fri, 11 May 2001 22:41:01 -0800


I read through the Biham and Shamir 1990 paper on "Differential
Cryptanalysis of DES-like Cryptosystems".  The technique is
fascinating.

However, when the technique was applied to 16-round DES, 2^57 plaintext
pairs were required (plus huge counter arrays).

A later paper by Biham and Shamir (1994) detailed an improved technique
which could be implemented with 2^49 chosen ASCII plaintexts (and the
counter arrays eliminated).

I think it's realistic to assume that the algorithm is known by the
cryptanalyst.  I think it's realistic to assume that the analyst has
SOME chosen plaintexts with their corresponding ciphertexts (ye olde
"send it to the embassy" trick).  I think it's CRAP to assume that even
an embassy is going to encrypt 2^(large # here) carefully chosen
plaintext pairs for the analyst, and I think it's CRAP to assume that
Joe PC User is going to encrypt 2^(small # here) carefully chosen
plaintext pairs for the analyst (feel free to do whatever differential
analysis you can with the commonly encrypted file headers).

I'm not even sure that 2^49 chosen ASCII plaintexts is realistic.

"Certifiable weakness"? Yes. Practical cryptanalysis technique? Er, help
me out 'cause I don't see it. I've only been through two papers so there
are probably improved techniques. Give me URLs, pointers to other
papers...

Maybe a probability of getting 2^(whatever) carefully chosen plaintexts 
encrypted should be calculated and included as part of the round
probabilities
(kidding, kidding).


veb3
5/11/01

=====BEGIN PGP MESSAGE=====
Version: PGP 6.5.8

pCxibgtG9gGHw3oOaZIu0w6iUz27izdTRvp0EdNSrMJ8La93oiygo6+eQtU4Tw==
=IZJ2
=====END PGP MESSAGE=====

------------------------------

Crossposted-To: alt.hacker,talk.politics.crypto
Subject: Re: OAP-L3:  "The absurd weakness."
From: [EMAIL PROTECTED] (Xcott Craver)
Date: Sat, 12 May 2001 07:50:37 GMT

Anthony Stephen Szopa  <[EMAIL PROTECTED]> wrote:
>Xcott Craver wrote:
>> 
>>         You might be under the false impression that people respond
>>         to your posts out of interest in your cipher, when in fact
>>         they only do so to inform 3rd parties, who may have recently
>>         taken a peek into sci.crypt, that you are one of the regular
>>         crackpots.
>
>Anyone who would let you do their thinking for them deserves 
>no better.

        Look, if you think you know so fricking much about crypto,
        I'll be happy to challenge you with some basic questions
        about permutation groups.  Your cipher is based so much
        on permutations, after all, that if you can't answer some
        basic Qs about permutations, then you are unmasked as a 
        con-man.  Game?  Agree?  Disagree?
        
        I'm going to go out on a limb, and accuse you of hawking
        a cipher you just cobbled together with no actual
        mathematical or cryptological expertise.  I would like
        to see you prove me wrong.  For instance, how about a 
        proof of the orbit counting lemma?  Do you even know what
        the orbit counting lemma is?
                                                        -S

        [That's actually an easy one, 'cos you can look it up.]
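For lurkers: the lemma is easy to state.  For a finite group G acting
on a set X, the number of orbits equals the average, over G, of the
number of points each group element fixes.  A quick numerical sanity
check for the cyclic group C4 rotating 4-bead, two-color necklaces:

```python
# Burnside / orbit counting lemma, verified by brute force.

from itertools import product

beads = list(product([0, 1], repeat=4))              # all 16 colorings
rotations = [lambda t, k=k: t[k:] + t[:k] for k in range(4)]

# average number of fixed points over the group
fixed = sum(sum(1 for t in beads if g(t) == t) for g in rotations)
burnside = fixed // len(rotations)

# count orbits directly via a canonical representative per orbit
orbits = {min(g(t) for g in rotations) for t in beads}

print(burnside, len(orbits))     # → 6 6
```

The identity fixes all 16 colorings, the quarter turns fix 2 each, and
the half turn fixes 4, so (16 + 2 + 4 + 2) / 4 = 6 distinct necklaces.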

------------------------------

From: Benjamin Goldberg <[EMAIL PROTECTED]>
Subject: Re: Security proof for Steak
Date: Sat, 12 May 2001 03:58:48 -0400

Henrick Hellström wrote:
> 
> Have a look at http://www.streamsec.com/sattacks.asp
> 
> It might be considered to be a draft so far. I would appreciate any
> comments.

I didn't look at the security proof, but did glance at the source code.

My comments:
        I think more people can read C/C++ than Delphi code.  This is
relatively trivial, as I can wade my way through it (I learned some
Pascal way back in high school), but you get negative brownie points for
not having pseudocode or C/C++ source code available.
        Having any asm code in the reference implementation is a no-no.
If you want an optimized version, fine, but have it separate from the
reference implementation (i.e., two versions).  Some people [even highly
experienced programmers] can't read asm.  I'll admit that I wouldn't
call myself *highly* experienced, but it takes me quite some effort to
wade through asm.
        Wtf are those big huge tables?  How the bleep do I know that they
aren't some sort of back door?  If they are supposed to be "random"
strings, then you should at least document where (what generator) you
got them from, and comment in the code that they are random.  Actually,
you should *not* try to use "real" random data, but instead take some
well known but random looking constants, as MD5 and Blowfish do.  MD5
uses as its constants floor(2**32*abs(sin(i))) for 1<=i<=64.  Blowfish
uses the first hexadecimal digits of pi (less the initial 3).
        By using some simple transcendental(s) as your constant(s), which
anyone can calculate, you significantly decrease the chance of there
being any backdoor into your cipher.  As an additional benefit, if your
arrays are very large (and in your case they are), it may be simpler to
allocate them as whatever size they are to be, and include some code
which generates the data to fill them as a static initializer; this may
result in smaller source code, and may be less prone to errors.
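For example, the MD5 table can be regenerated by anyone in a couple of
lines, which is the whole point of such "nothing up my sleeve"
constants:

```python
# Recompute MD5's T table: T[i] = floor(2^32 * abs(sin(i))), i = 1..64
# (i in radians), as defined in RFC 1321.

import math

def md5_constants():
    return [int((1 << 32) * abs(math.sin(i))) for i in range(1, 65)]

T = md5_constants()
print(hex(T[0]))    # → 0xd76aa478, the first constant in RFC 1321
```

Anyone suspicious of a back door can run this and diff the output
against the constants in the specification.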
        Your encryption and decryption functions are more important to most
readers than your key schedule.  Put them up at the beginning of the
file, encryption first, then decryption, and then your key schedule, and
lastly any clean up.  This keeps things in the order people want to look
at them.
        I'll reiterate that asm is a no-no in a reference implementation. You
should have as reference a "pure" version written for clarity, with no
asm, no difficult to understand optimization.  If you want to have a
second version, optimized, which might have asm, that's fine, but don't
expect people to use it for anything other than timing your code; if I'm
analyzing code for weaknesses, I look at the most legible version, not
the fastest version.  Even if I could read asm easily, it's much less
compact than C or whatever.

-- 
Shift to the left, shift to the right, mask in, mask out, BYTE, BYTE,
BYTE !!!

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list by posting to sci.crypt.

End of Cryptography-Digest Digest
******************************
