Cryptography-Digest Digest #510, Volume #12      Tue, 22 Aug 00 23:13:01 EDT

Contents:
  Re: On pseudo-random permutation (Tim Tyler)
  Re: On pseudo-random permutation (Tim Tyler)
  Re: SHA-1 program (cool!) ("Ed Suominen")
  Re: Intermittent stream cipher? (Joaquim Southby)
  Re: The DeCSS ruling (lcs Mixmaster Remailer)
  Re: WinACE encryption algorithm (Joaquim Southby)
  Re: New algorithm for the cipher contest ([EMAIL PROTECTED])
  Re: New algorithm for the cipher contest ([EMAIL PROTECTED])
  Re: The DeCSS ruling and the big shots (wtshaw)
  320-bit Block Cipher ([EMAIL PROTECTED])
  Re: New algorithm for the cipher contest ([EMAIL PROTECTED])
  Re: Bytes, octets, chars, and characters (Dennis Ritchie)
  Re: blowfish problem (Dan Pop)
  Re: New algorithm for the cipher contest ([EMAIL PROTECTED])
  Re: New algorithm for the cipher contest (David A. Wagner)

----------------------------------------------------------------------------

Crossposted-To: comp.programming
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: On pseudo-random permutation
Reply-To: [EMAIL PROTECTED]
Date: Wed, 23 Aug 2000 00:20:13 GMT

In sci.crypt Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
: Tim Tyler schrieb:
:> Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
:> 
:> I observe that you omit any form of detection of collisions between the
:> first components of B.  Without such a check the result does not form a
:> truly unbiased random permutation (on the assumption that the RNG is
:> good).

: That collision is to be resolved by the sorting process. 
: One has to decide on a resolution rule, though.

No resolution in the sort routine can possibly produce an unbiased
sequence.

You can see this by a counting argument, after observing that n! doesn't
usually divide 2^(n * x) exactly [x is the width of the RNG output in bits].

I usually suggest rejecting any collisions in the sort routine - and
starting again - if an /exactly/ even distribution is needed.
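To make the counting argument concrete, here is a quick sketch (my own illustration, not from the thread): n! contains the odd prime factor 3 for every n >= 3, so it can never divide a power of two, and any scheme that maps fixed-width random words onto permutations must over-represent some of them.

```python
from math import factorial

def divides_evenly(n: int, bits: int) -> bool:
    """Could 2**bits equally likely bit strings map uniformly onto
    the n! permutations?  Only if n! divides 2**bits."""
    return (2 ** bits) % factorial(n) == 0

# Only n <= 2 works: for n >= 3, n! has an odd prime factor.
print([n for n in range(2, 8) if divides_evenly(n, 64)])  # -> [2]
```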
-- 
__________                  http://alife.co.uk/  http://mandala.co.uk/
 |im |yler  [EMAIL PROTECTED]  http://hex.org.uk/   http://atoms.org.uk/

------------------------------

Crossposted-To: comp.programming
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: On pseudo-random permutation
Reply-To: [EMAIL PROTECTED]
Date: Wed, 23 Aug 2000 00:27:45 GMT

In sci.crypt Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
: Mok-Kong Shen wrote:

:> My assumption limits me to the availability of a bit
:> sequence. I can get from the beginning only random
:> integers in [0, 2^s-1] for any s, but not random
:> integers in arbitrary ranges.

: Sorry. What I said above is nonsense. One can get a
: random number in a smaller range from a generator of
: a larger range by simply discarding the numbers that 
: are outside of range.

Yes.  java.util.Random, for example, does:

        int bits, val;
        do {
            bits = next(31);                  // 31 fresh random bits
            val = bits % n;
        } while (bits - val + (n-1) < 0);     // rejects the biased top range
        return val;

...where next(n) returns a value from 0 to 2^n - 1.

: BTW, one thought that motivated my proposal is that
: sort routines applicable to arrays of almost arbitrary
: sizes are often available in one's programming
: environment and can hence be conveniently utilized.

FWIW, Java has java.util.Collections.shuffle(List, Random).

This saves tedious re-invention of the wheel in many circumstances.
-- 
__________                  http://alife.co.uk/  http://mandala.co.uk/
 |im |yler  [EMAIL PROTECTED]  http://hex.org.uk/   http://atoms.org.uk/

------------------------------

From: "Ed Suominen" <[EMAIL PROTECTED]>
Subject: Re: SHA-1 program (cool!)
Date: Tue, 22 Aug 2000 18:25:59 -0700

STL wrote:
> I've gone over the algorithm as it's implemented a few times in my head,
> and I would have thought that I could confidently say that after twenty
> revisions to version 0.01 (the first "finished" version that gave a hash,
> albeit a very corrupted one), the implementation is logically perfect and
> should cope with all possible inputs under 4GB.  Sadly, I just found that
> one exists, although I have NO idea where it's coming from as it hasn't
> shown up either in the 1M-"a" vector or your 2MB file.  I took the largest
> darn file I had on my hard drive, a hulking behemoth called "pak0.pak" in
> the \valve\ directory of the game Half-Life (weighing in at a crushing
> 302,907,835 bytes), and the reference I use (HASHcipher demonstration by
> Bokler.com) output a different hash than my SHA-1 program.  Something is
> afoot, and I'm not sure exactly what it could be.

It seems to me that you really pushed things to the limit by testing a 300
MB file. If there is some way of showing that your program is good for files
below a certain size, it would still be very useful for my purposes. So how
do we prove that, to a statistically acceptable level of certainty? Would you
need to somehow run your program and a "gold standard" program with N random
files and, if no differences are seen, compute the chances of there being a
problem in your program with the Poisson distribution? Or is the code simple
enough to inspect and attribute the 300 MB file "error" to something other
than your code?

NIST has some test vectors (beyond those in FIPS-180) that may be useful,
though they look kind of hard to use:
http://csrc.nist.gov/cryptval/shs/sha1-vectors.zip

Ed Suominen
Registered Patent Agent
Web Site: http://eepatents.com
PGP Public Key: http://eepatents.com/key

------------------------------

From: Joaquim Southby <[EMAIL PROTECTED]>
Subject: Re: Intermittent stream cipher?
Date: 23 Aug 2000 01:34:54 GMT

In article <hggn5.5650$[EMAIL PROTECTED]> mconroy,
[EMAIL PROTECTED] writes:
>I would like to know where I might find a cipher algorithm that allows me to
>stream plaintext input on an intermittent basis.  The application is logging
>messages to a file which must be encrypted.  The file has to remain
>encrypted at all times.  However, opening the file, decrypting, adding ONE
>line, and reencrypting just doesn't make sense.  I know there are a few
>commercial encrypted volume solutions, but I can't believe there isn't an
>algorithm out there built to do this sort of thing.  Then again, I'm in
>Dilbert-hell and I've only had this assignment for a week.  Using blowfish
>and twofish (with mcrypt under RH6.2) I can write one message and decrypt
>the file successfully.  However, all subsequent messages decrypt with the
>first few bytes mangled.  This happens regardless of the mode I choose.
>(which may, in fact, be a problem with mcrypt).  If this isn't the forum for
>this type of question, please accept my apologies...and please point me in
>the right direction.
>
Look at state machines whose output is a stream cipher.  At the end of
enciphering each piece of input, all you have to do is save the state of
the machine and use that as initialization the next time you have to do
more encipherment.  To decipher the entire file, you just start with your
original conditions and continue to the end of the file.

An example of such a machine would be a bank of Linear Feedback Shift
Registers (LFSR).  (How many depends on how wide a stream you want: 8
registers for a byte, 16 for a word, etc.)  Initialize each register with
a known (to you) vector and clock off as many bytes (words, what have
you) as you need to encipher your input.  Save the internal state of each
register at the end of the encipherment and use it as the init vector the
next time you encipher.
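A rough Python sketch of the save-and-resume idea (my own illustration, not the poster's code; the tap mask, seed, and byte packing are arbitrary choices -- check a tap table before trusting the period claim):

```python
class LFSR:
    """Galois LFSR over 32 bits.  The tap mask corresponds to the tap
    set (32, 22, 2, 1), which appears in maximal-length tap tables."""
    TAPS = 0x80200003

    def __init__(self, state):
        assert state & 0xFFFFFFFF, "the all-zero state is a fixed point"
        self.state = state & 0xFFFFFFFF

    def next_bit(self):
        lsb = self.state & 1
        self.state >>= 1
        if lsb:
            self.state ^= self.TAPS
        return lsb

    def next_byte(self):
        # Pack eight successive output bits, LSB first.
        b = 0
        for i in range(8):
            b |= self.next_bit() << i
        return b

def crypt(lfsr, data):
    """XOR with the keystream; the state advances as a side effect,
    which is exactly what lets you stop and resume between messages."""
    return bytes(d ^ lfsr.next_byte() for d in data)

# First message: encrypt, then persist the state (and protect it!).
k = LFSR(0xDEADBEEF)
c1 = crypt(k, b"first log line\n")
saved_state = k.state

# Later run: resume from the saved state for the next message.
c2 = crypt(LFSR(saved_state), b"second line\n")

# Decryption replays the keystream from the original seed.
assert crypt(LFSR(0xDEADBEEF), c1 + c2) == b"first log line\nsecond line\n"
```

Decrypting is the same XOR operation, so replaying the keystream from the original seed recovers every message appended so far.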

Cautions on this:  an LFSR by itself has been shown to be vulnerable to
attack.  Look at Applied Cryptography for some suggestions as to how to
combine LFSRs to strengthen them.  My personal recommendation for your
needs would be to use a bilateral stop-and-go generator for each bit in
your key stream bank.

Besides the weakness of lone LFSRs, there's the problem of protecting
the interim states of the machine.  These are the equivalent of keys in
other cipher schemes and must be protected with the same energy -- if
they are compromised, your data will be, too.  If you store that info in
a file somewhere, you should take steps to make sure it doesn't fall into
the wrong hands.  Keep in mind that a machine like this is completely
predictable.  If you know the present state of the machine, you know all
future and past states.  That knowledge gives the entire keystream and
therefore the data you enciphered with that keystream.

Should you choose to implement such a scheme, there are some refinements
you could add: 1) besides storing the interim states of the machine,
store a pointer to the spot in the log file where each encipherment began
as well as the number of bytes (words, etc.) used.  That way you can pull
stuff out of the middle of the file without deciphering the whole thing. 
2) You can use this state machine to encipher multiple files and maintain
them concurrently -- each file would require its own record of interim
states, of course.

A 32 bit LFSR with a properly chosen tap sequence will produce a stream
of over 4 billion bits before repeating.  The bilateral stop-and-go
generator is hard to quantify like this, but it will give at least that
number as well as providing immunity to attack.  If you model 8 of these
generators to produce a byte-wide keystream, that will give you a minimum
of 4 Gigabytes of key material before the stream starts to repeat.

There's a good primer on LFSRs as well as some C code to model one at:
http://homepage.mac.com/afj/lfsr.html

Besides the code and primer, there are lists of tap sequences that give
maximum periods for varying lengths of registers.  There are several
thousand at the 32-bit length, so you have plenty to choose from.

------------------------------

Date: 23 Aug 2000 01:40:04 -0000
From: lcs Mixmaster Remailer <[EMAIL PROTECTED]>
Subject: Re: The DeCSS ruling

The New York Times would only be publishers, not reverse-engineers.
I think they would have mounted a much tougher defense.

A note: in the European Union, reverse engineering is legal, and the
reverse engineering here was not done in the United States.

What was the feeling, as it came up in the trial, about the method being
translated from C into plain English?  Could talk about the method be
banned as well?  I see the 2600 web site has the links in English, not
as hyperlinks; they claim that is okay??

The ruling did not really touch on the T-shirt, right?

In article <fDAo5.42262$[EMAIL PROTECTED]>
[EMAIL PROTECTED] wrote:
>
> A. Melon <[EMAIL PROTECTED]> wrote:
> > If the New York Times published the source (a la Pentagon Papers)
> > would the judge have ruled against them?  Another example would be
> > the magazine that published detailed directions to making atomic
> > bombs.
>
> Yes, because there's a law against reverse-engineering copy
> protection, not against publishing.
>
> --
> Matt Gauthier <[EMAIL PROTECTED]>


------------------------------

From: Joaquim Southby <[EMAIL PROTECTED]>
Subject: Re: WinACE encryption algorithm
Date: 23 Aug 2000 01:47:52 GMT

In article <[EMAIL PROTECTED]> Leroy Kimna,
[EMAIL PROTECTED] writes:
>Just keep in mind that there's no such thing as "basically a 160 bit
>Blowfish code". There's only one Blowfish and "basically" doesn't cut it.
>Your people have either properly implemented it or they haven't, and you're
>not giving us very much confidence at this point. In fact, the only way we
>can really know for sure is if the source code is released.
>
What would you call a Blowfish implementation that changed the
initialization values from the digits of pi to the digits of e?  That
would produce a different output than the original one even using the
same key, wouldn't it?  Is that one or two Blowfish?  (Blowfishes?)

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: New algorithm for the cipher contest
Date: Wed, 23 Aug 2000 01:36:35 GMT

In article <8ntl0l$dv$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] wrote:
>
> Multiplication is not a group, this means the function doesn't depend
> on all of the bits.  In a field it would be better.
>

I agree that using only multiplications in the rounds, some output bits
will behave differently from others. One of the reasons I use a bit-
reversal function (Mirror) is just to *avoid* this effect. Then I can
keep the multiplication mod 2^64, which is much faster than
multiplication in a field (mod p, p prime).
In the cipher documentation I report a test that does small variations
(increment by 1 or random bit flipping) in only 8% of the input bits
(plain-block + primary-key (S)). These input bits can be, for example,
the first 3 bytes of the plain-block. The result is: the encrypted
block flips **all** the bits with probability **1/2**. This is called
the generalized avalanche effect. I did this with as many inputs as I
could (for example S = {0, 0} and B = 0).

> > Remember I said K[0]..K[3] are *allways* *odd* integers, so K[0] lsb
> > will never be zero. I need this to avoid the effect you pointed and
to
> > invert the rounds.
>
> Yes, but weak keys still exist.

I agree. The derived-key K = {0,0,0,0,0,0,0,0} is a weak key. But the
probability of the MakeKey function generating a derived-key like this
is very, very low. Even if this happens we can throw away such a K
(remember that the derived-key space (K) is much larger than the
primary-key space (S)).

Currently I'm doing tests on the cipher function differentials. Using a
fixed primary-key S and doing small variations (increment by 1 and
random bit flipping) on the plain-block, several derived-keys K are
generated to encrypt the plain-block. Then I compute the difference
(subtraction and xor) of the encrypted blocks, the difference of the
difference, etc., up to order 100. Tests indicate that the bits of the
cipher-block differentials flip randomly. I think the attacker can't
deduce much of K based on ciphertext differences. This suggests to me
that the MakeKey function is not generating weak keys, at least against
differential attacks.
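The iterated-difference test he describes can be sketched roughly like this (my own illustration; SHA-1 stands in for the contest cipher, which isn't reproduced here):

```python
import hashlib

def enc(x: int) -> int:
    """Stand-in 'cipher': hash a 64-bit plaintext, keep 64 output bits."""
    digest = hashlib.sha1(x.to_bytes(8, "big")).digest()
    return int.from_bytes(digest[:8], "big")

def xor_differences(values, order):
    """Iterated XOR differences: order 1 is neighbour XORs, order 2 the
    XORs of those, and so on."""
    for _ in range(order):
        values = [a ^ b for a, b in zip(values, values[1:])]
    return values

cts = [enc(x) for x in range(40)]   # ciphertexts of incremented plaintexts
d3 = xor_differences(cts, 3)        # third-order differences, 37 values
```

For a cipher with exploitable differential structure, bias would show up in these difference values; feeding them to a randomness tester such as ENT is the natural next step.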


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: New algorithm for the cipher contest
Date: Wed, 23 Aug 2000 01:42:20 GMT

In article <8nv9r1$vvn$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] wrote:
> In article <8ntl0l$dv$[EMAIL PROTECTED]>,
>   [EMAIL PROTECTED] wrote:
> >
> > Multiplication is not a group, this means the function doesn't depend
> > on all of the bits.  In a field it would be better.
> >
>
> I agree that using only multiplications in the rounds, some output bits
> will behave differently from others. One of the reasons I use a bit-
> reversal function (Mirror) is just to *avoid* this effect. Then I could
> maintain the multiplication mod 2^64 that is much faster then
> multiplication in a field (mod p, p prime).
> In the cipher documentation I report a test that do small variations
> (increment by 1 or random bit flipping) in only 8% of the input bits
> (plainblock + primary-key (S) ). This input bits can be, for example,
> the first 3 bytes of the plain-block. The result is: the encrypted
> block flips **all** the bits with probabability **1/2**. This is called
> the generalized avalanche effect. I did this with as much inputs as I
> could (for example S = {0, 0} and B = 0).

Yes, but one-round differentials will be trivial to find in your
cipher.  The only trick now is to hook them up across more than one
round.  Multiplication in Zn is such a silly idea if n is not prime
simply because not all of the input bits affect all of the output bits,
which means the function fails to meet SAC.  It's hardly non-linear and
also has bad differentials.

> > > Remember I said K[0]..K[3] are *allways* *odd* integers, so K[0] lsb
> > > will never be zero. I need this to avoid the effect you pointed and
> > > to invert the rounds.
> >
> > Yes, but weak keys still exist.
>
> I agree. The derived-key K = {0,0,0,0,0,0,0,0} is a weak key. But the
> probability of MakeKey function generate a derived-key like this is
> very very low. Even if this happens we can throw away such K (remember
> that the derived-key space (K) is much larger then the primary-key
> space (S))
>
> Currently I'm doing tests on the cipher function differentials. Using a
> fixed primary-key S and doing small variations (increment by 1 and
> random bit flipping) on the plain-block, several derived-keys K are
> generated to encrypt the plain-block. Then I compute the difference
> (subtraction and xor) of the encrypted blocks, the difference of the
> difference, etc, up to order 100. Tests indicates that the bits of
> cypher-blocks differentials flips randomly. I think the attacker can't
> deduce much of K based on cypher-text difference. This suggest me that
> the MakeKey function is not generating weak keys, at least for
> differential attacks.


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: The DeCSS ruling and the big shots
Date: Tue, 22 Aug 2000 19:19:37 -0600

In article <%AAo5.42250$[EMAIL PROTECTED]>,
[EMAIL PROTECTED] wrote:

> rot26 <[EMAIL PROTECTED]> wrote:
> > DeCSS case in a nutshell:
> > Someone designed and used an insecure encryption scheme. Before long
> > someone else broke it. And the someone else was sued.
> 
> > What the hell?
> > Exactly.
> 
> The what the hell is that it's clearly illegal in the US to reverse
> engineer copy protection mechanisms. The real problem is that the
> defendants tried to blow everything out of proportion in this case by
> making grandious arguments about free speech, and the expressiveness
> of source code.

Remember the bozo bit that won its name through its sheer stupidity.
-- 
Now, lazily juxtapose five objects to make a quincunx.

------------------------------

From: [EMAIL PROTECTED]
Subject: 320-bit Block Cipher
Date: Wed, 23 Aug 2000 01:50:31 GMT

For fun I made a 320-bit block cipher that uses SHA as the round
function.  The input to SHA is one 160-bit half of the block plus 352
bits of key material, so it's really

L = L xor SHA(R||K1)
R = R xor SHA(L||K2)
L = L xor SHA(R||K3)

It runs at about 2.2 Mbyte/sec on my K6-2 (with the reference source
code) and only requires 132 bytes of RAM.  It's sorta like what they
talked about in the BEAR/LION paper, except no stream cipher is
involved.

http://www.geocities.com/tomstdenis/tc7.c
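A minimal Python sketch of the three unbalanced-Feistel steps above (my own reconstruction from the formulas, not the tc7.c reference code; the split of the key into three 44-byte round keys is an assumption):

```python
import hashlib

HALF = 20   # 160-bit half-block
RKEY = 44   # 352 bits of key per round: 160 + 352 = one SHA-1 input block

def F(half: bytes, rkey: bytes) -> bytes:
    """Round function: SHA-1 over one half plus its round key."""
    return hashlib.sha1(half + rkey).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encrypt(block: bytes, k1: bytes, k2: bytes, k3: bytes) -> bytes:
    L, R = block[:HALF], block[HALF:]
    L = xor(L, F(R, k1))
    R = xor(R, F(L, k2))
    L = xor(L, F(R, k3))
    return L + R

def decrypt(block: bytes, k1: bytes, k2: bytes, k3: bytes) -> bytes:
    # Undo the xors in reverse order; each F input is recoverable in turn.
    L, R = block[:HALF], block[HALF:]
    L = xor(L, F(R, k3))
    R = xor(R, F(L, k2))
    L = xor(L, F(R, k1))
    return L + R
```

Each xor is its own inverse and SHA is only ever applied to data that is visible at decryption time, so the scheme inverts without inverting SHA.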

Tom


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: New algorithm for the cipher contest
Date: Wed, 23 Aug 2000 01:59:31 GMT

In article <8nug8a$1fe$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] wrote:
>
> Since in his cipher Multiplication is not a group operation, and since
> there is no modulus it's a poor method of diffusion/confusion.
>
Normally, one operation alone doesn't make a good cipher. This is why I
combine multiplication (mod 2^64) with bit-reversal and XOR. My tests
indicate very good diffusion/confusion.

If you don't believe it, do the avalanche (diffusion) test: do small
variations (for example bit flipping) on a plain-block, encrypt each
one, and save a specific bit value of the encrypted blocks. Then run
the ENT random tester on the saved file.
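His recipe can be sketched like this (my own illustration; SHA-1 stands in for the cipher, and watching output bit 0 is an arbitrary choice):

```python
import hashlib

def cipher_bit(block: bytes, bit_index: int) -> int:
    """Stand-in 'cipher': hash the block, extract one output bit."""
    digest = hashlib.sha1(block).digest()
    return (digest[bit_index // 8] >> (bit_index % 8)) & 1

def avalanche_samples(base: bytes, watch_bit: int = 0):
    """Flip each input bit in turn and record one fixed output bit;
    a good cipher makes these look like fair coin flips."""
    samples = []
    for i in range(8 * len(base)):
        flipped = bytearray(base)
        flipped[i // 8] ^= 1 << (i % 8)
        samples.append(cipher_bit(bytes(flipped), watch_bit))
    return samples

samples = avalanche_samples(b"\x00" * 8)
print(sum(samples), "ones out of", len(samples))
```

Packing many such samples into a file and running ENT over it is exactly the test described above.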

Alexis


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: Dennis Ritchie <[EMAIL PROTECTED]>
Crossposted-To: comp.lang.c,alt.folklore.computers
Subject: Re: Bytes, octets, chars, and characters
Date: Wed, 23 Aug 2000 02:17:19 +0000



[EMAIL PROTECTED] wrote:
 ...
> The I/O on Stretch was done in 8-bit bytes. The VFL instruction let you
> access any byte up to 64 bits in length. There was an add-on processor
> called Harvest that was limited to 8-bit bytes.
> 
> Does anyone have a copy of either the 7030 manual, the book "Design of a
> Computer System" or the procedings of the 1959 Eastern Joint Computer
> Conference?

Yes, yes, and no.

Stretch was a very interesting project.  It was in some ways
utterly at odds with ISA architecture as it evolved in
even the fairly-near future (e.g. the /360): it had
a single 64-bit accumulator which in some contexts could
be considered as 128 bits, a bunch of very fancy index
registers, and bit-addressable memory.

At the same time, in many implementation details
(the 8-bit byte and general 2^n bit datapath,
peripherals and memory interface, OOO execution and instruction
look-ahead) it was an effort that had far-reaching
effects.

        Dennis

------------------------------

From: [EMAIL PROTECTED] (Dan Pop)
Crossposted-To: comp.lang.c
Subject: Re: blowfish problem
Date: 23 Aug 2000 01:42:26 GMT

In <45Eo5.2405$[EMAIL PROTECTED]> "Kelsey Bjarnason" 
<[EMAIL PROTECTED]> writes:

>Right - but an 8-bit char with no traps could handily fit inside a 17-bit
>byte which itself _does_ have traps - just not ones representable within the
>limits of a char.

If this is the case, the extra 9 bits DO NOT EXIST for the C 
implementation, i.e. they CANNOT contribute to the value of ANY C
object (because the representation of the value can be accessed via
unsigned chars). 

If no C program can "see" those bits, how do we know they actually
exist, in the context of the C programming language?

Dan
--
Dan Pop
CERN, IT Division
Email: [EMAIL PROTECTED] 
Mail:  CERN - IT, Bat. 31 1-014, CH-1211 Geneve 23, Switzerland

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: New algorithm for the cipher contest
Date: Wed, 23 Aug 2000 02:06:23 GMT

In article <8nvb5p$1cn$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] wrote:
> In article <8nug8a$1fe$[EMAIL PROTECTED]>,
>   [EMAIL PROTECTED] wrote:
> >
> > Since in his cipher Multiplication is not a group operation, and since
> > there is no modulus it's a poor method of diffusion/confusion.
> >
> Normally, an operation alone don't make a good cipher. This is wy I
> combine multiplication (mod 2^64) with bit-reversal and XOR. My tests
> indicates very good diffusion/confusion.
>
> If you don't believe, do the avalanche (difusion) test: Do small
> variations (for example bit flipping) on a plain-block, encrypt each
> one and save an specific bit value of the encrypted-blocks. Then run
> the ENT random tester on the saved file.

Look, dopey: flipping bit i of the input to a multiplication in Zn (n
not prime) will not affect any lower bits (positions x < i).  That
demonstrates the very poor rate of diffusion.  Even with the bit
reversal there is a stronger bias towards the upper words... Figure
this out, please.

I am not saying your cipher is immediately weak, I am just pointing
things out.
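The low-bit claim is easy to check numerically (a sketch of my own; the multiplier and values are arbitrary): flipping input bit i of a multiplication mod 2^64 cannot change any product bit below position i.

```python
M = 2 ** 64
k = 0x9E3779B97F4A7C15       # arbitrary odd 64-bit multiplier

x = 0x0123456789ABCDEF
i = 40
y = x ^ (1 << i)             # flip input bit 40

p, q = (x * k) % M, (y * k) % M
low_mask = (1 << i) - 1
# p - q is a multiple of 2**40, so the low 40 bits agree exactly:
print(((p ^ q) & low_mask) == 0)  # -> True
```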

Tom


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: [EMAIL PROTECTED] (David A. Wagner)
Subject: Re: New algorithm for the cipher contest
Date: 22 Aug 2000 19:37:41 -0700

Fascinating.  The round function is
   X_{i+1} := (X xor K_{i,0}) * K_{i,1} mod 2^64,
and there are four rounds.  That's it.  Now that's simplicity.
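For concreteness, the four rounds can be sketched as follows (subkeys here are hypothetical placeholders of my own, not from the actual key schedule; decryption works because each multiplier is odd and therefore invertible mod 2^64 -- pow(k, -1, M) needs Python 3.8+):

```python
M = 2 ** 64

# Hypothetical (xor key, odd multiplier) pairs per round.
KEYS = [
    (0x0123456789ABCDEF, 0x9E3779B97F4A7C15),
    (0xFEDCBA9876543210, 0xC2B2AE3D27D4EB4F),
    (0x0F1E2D3C4B5A6978, 0x165667B19E3779F9),
    (0x1133557799BBDD11, 0x27D4EB2F165667C5),
]

def encrypt(x: int) -> int:
    for k0, k1 in KEYS:
        x = ((x ^ k0) * k1) % M     # X := (X xor K0) * K1 mod 2^64
    return x

def decrypt(x: int) -> int:
    for k0, k1 in reversed(KEYS):
        x = ((x * pow(k1, -1, M)) % M) ^ k0
    return x
```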

If the middle xor is changed to addition, then I _think_ there is a
boomerang attack (unverified).  But the xor destroys the boomerang.

If you consider a three-round variant, I _think_ there is a similar
boomerang (unverified).  But four rounds seems to kill the attack.

It's an appealingly-simple cipher.  But can it really be secure?
Very interesting...

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
