Cryptography-Digest Digest #506, Volume #14       Sun, 3 Jun 01 18:13:00 EDT

Contents:
  Re: Unicity distance and compression for AES (Tim Tyler)
  Re: Unicity distance and compression for AES (SCOTT19U.ZIP_GUY)
  Re: Best, Strongest Algorithm (gone from any reasonable topic) ("Scott Fluhrer")
  Re: Luby-Rackoff Theorems? (David Wagner)
  Re: Luby-Rackoff Theorems? (David Wagner)
  Re: Dynamic Transposition Revisited Again (long) (Mok-Kong Shen)
  Re: Unicity distance and compression for AES (SCOTT19U.ZIP_GUY)
  PRP vs PRF (was Luby-Rackoff Theorems) ("Tom St Denis")
  Re: BBS implementation ([EMAIL PROTECTED])
  Sv: Top Secret Crypto ("Peter Nielsen")
  Re: Best, Strongest Algorithm (gone from any reasonable topic) (SCOTT19U.ZIP_GUY)
  Re: BBS implementation ("Tom St Denis")
  S-Boxes ("Yashar Alishenas")
  Re: S-Boxes ("Tom St Denis")
  Re: James Felling:  Sorry to break your bubble (Matthew Montchalin)
  Re: Dynamic Transposition Revisited Again (long) (David Wagner)

----------------------------------------------------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Reply-To: [EMAIL PROTECTED]
Date: Sun, 3 Jun 2001 19:59:44 GMT

SCOTT19U.ZIP_GUY <[EMAIL PROTECTED]> wrote:
: [EMAIL PROTECTED] (Tim Tyler) wrote in <[EMAIL PROTECTED]>:

:>That's not /quite/ what Shannon's perfect secrecy implies:
:>
:>Perfect secrecy means that the cyphertext contains no information about
:>the plaintext.  This can happen if one cyphertext maps to one plaintext -
:>but can also happen if exactly ten cyphertexts map to every plaintext.

:    I have to look it up, but there is a difference between ideal and perfect.

http://www.io.com/~ritter/GLOSSARY.HTM#PerfectSecrecy
http://www.io.com/~ritter/GLOSSARY.HTM#IdealSecrecy

: I thought in perfect secrecy every key tested has to lead to a unique
: input message, so the bijection is among the set of 3: the key, the
: plaintext and the message.

All that's needed is for there to be no statistical relationship between
the plaintext and cyphertext.

Having said this, the only way to build such a system without using a
bijection is by using a random oracle.

Since no random oracles are known to exist, a bijection is indeed the only
practical construction that demonstrably manages to get perfect secrecy.
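A concrete illustration of the point (an editor's sketch, not part of the original post): a one-time pad over a tiny 4-bit domain achieves perfect secrecy, because for any fixed ciphertext every plaintext is reachable under exactly one key, so the ciphertext carries no information about the plaintext.

```python
# Sketch: perfect secrecy of a tiny XOR one-time pad over 4-bit values.
# For every ciphertext c, each plaintext p is produced by exactly one
# key (k = p XOR c), so Pr[P = p | C = c] = Pr[P = p]: the ciphertext
# reveals nothing about the plaintext.
from collections import Counter

M = 16  # domain size: 4-bit "messages"

for c in range(M):
    counts = Counter()
    for p in range(M):
        for k in range(M):
            if p ^ k == c:          # XOR pad: encrypt(p, k) = p ^ k
                counts[p] += 1
    assert len(counts) == M                       # every plaintext reachable
    assert all(n == 1 for n in counts.values())   # via exactly one key

print("OTP: every ciphertext is consistent with every plaintext")
```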
-- 
__________
 |im |yler  [EMAIL PROTECTED]  Home page: http://alife.co.uk/tim/

------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Unicity distance and compression for AES
Date: 3 Jun 2001 20:32:29 GMT

[EMAIL PROTECTED] (Tom St Denis) wrote in
<BGwS6.19308$[EMAIL PROTECTED]>: 

>
>> If you have a compressor that succeeds in compressing your target
>> data, then compression before encryption will increase the unicity
>> distance of the resulting system.
>

  You're off my KILLFILE since it's a new month, at least for a while.
First of all, as you know, I believe in bijective compression, something
far from the likes of deflate and LZ77. Those compression methods
are not bijective, and things like deflate are designed to make it
easier for groups like the NSA to break the code.

>That's not true.  There are limits to data compression too.  You can use
>codewords that haven't been defined yet, etc...  So in fact one could
>argue that the # of 16 byte blocks which are valid compressed stream
>data *AND* turn into english is actually smaller.  [For example in
>Deflate you can't reach 1500 bytes back when you're only 12 bytes into
>the file, etc...] 
>
>Let's look at a 3 byte LZ77 system (12 bit index, 4 bit length, 8 byte
>immediate).  A 16 byte block will have 5 code words.  When this is the
>first block of the message obviously the 12 bit index must be between
>0 and 80 (5*length=80), so there are (2048-80)^5 ~= 2^54 invalid 12 bit
>lengths. There are (256-96)^5=2^36 invalid immediates.  (Together this
>is 2^90 invalid blocks) etc...  For the first few bytes of an english
>message the lengths will most likely be 0 (immediate) since matches
>really only occur after a few 100 bytes.  So this is another (16-1)^5 =
>2^20 for a grand total of 2^110 invalid texts.
>

   There are two reasons for this. One, the methods you're talking about
should never be used for compression before encryption. Two, most
compressors are so sucky on short files that it's hard to get real
compression; you actually get expansion. This is a well-known effect.
That's why, especially for small files, you need to compress and whiten
at least the first part if you're using most types of encryption. A
scott19u, which treats the whole file as a single block, would have
fewer problems.
  Since compressors start so poorly, the entropy per bit could drop
for the whole message if it is long. But unfortunately the data does
not compress smoothly, and especially the data at the front of the
file is actually expanded by most compressors. So in effect, since you
measure unicity distance from the start of the file, where the bad
compressors are at their worst and the text is actually expanded, the
real unicity distance gets smaller. But that's because no real
compression is done on the front end; it's actually expansion. But
you're just being an ass, since you know all this. Why do you play so
dumb?

>Just looking at 16 bytes of normal text right away you can only throw
>away (256-96)^16=2^117 invalid texts.
>
>Keep in mind this is off the top of my head.  I don't see a big argument
>for filtering out compressed data being very hard...

  That's because it wasn't off the top of your head; it was rectal
retrieval.

<SNIP> rest of BS misguided rant cut.

 
  If you really want to learn, which I doubt: consider text being
compressed by mine or Matt's arithmetic compressor. Reverse the file,
uncompress using h2unc.exe (it expands the compressed file by at most
8 times), then reverse the file again.
Now compress with mine or Matt's again. Now you have a whitened
compressed file.
  Encrypt with RC-whatever and look at the first 50 blocks. What are
you going to do? Any key you test will look like crap. You need the
whole decrypted file for each key you want to test before you can even
look at it. If the file is longer than the UNICITY distance, one
key would lead to a solution. But if shorter, then several keys
will lead to possible solutions.
  But I have stated all this before. You just act like an ass and
never really seem to look at the problem or concept. Frankly, Tim
is wasting his time with you. Only some false god you worship
could set you straight at this time. But telling you the truth
might weaken him in your eyes. He would rather have you deceived
and confused than actually help you.
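For reference, the unicity distance both posters keep invoking comes from Shannon's formula U = H(K)/D, where H(K) is the key entropy in bits and D is the plaintext redundancy in bits per character. A rough sketch (editor's illustration; the ~1.5 bits/char entropy figure for English, hence D of about 3.2 bits/char, is a textbook estimate, not from the thread):

```python
import math

def unicity_distance(key_bits, redundancy_bits_per_char):
    # Shannon: U = H(K) / D -- expected number of ciphertext characters
    # needed before only one key yields a sensible plaintext.
    return key_bits / redundancy_bits_per_char

# English: log2(26) ~ 4.7 bits/char possible, ~1.5 bits/char actual,
# so redundancy D ~ 3.2 bits/char (rough textbook figure).
D = math.log2(26) - 1.5

print(round(unicity_distance(128, D)))  # 128-bit key -> 40 characters
print(round(unicity_distance(256, D)))  # 256-bit key -> 80 characters
# Perfect compression would drive D toward 0 and U toward infinity,
# which is the point of compressing before encrypting.
```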


David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE "OLD VERSION"
        http://www.jim.com/jamesd/Kong/scott19u.zip
My website http://members.nbci.com/ecil/index.htm
My crypto code http://radiusnet.net/crypto/archive/scott/
MY Compression Page http://members.nbci.com/ecil/compress.htm
**NOTE FOR EMAIL drop the roman "five" ***
Disclaimer:I am in no way responsible for any of the statements
 made in the above text. For all I know I might be drugged or
 something..
 No I'm not paranoid. You all think I'm paranoid, don't you!


------------------------------

From: "Scott Fluhrer" <[EMAIL PROTECTED]>
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic)
Date: Sun, 3 Jun 2001 13:42:02 -0700


Tom St Denis <[EMAIL PROTECTED]> wrote in message
news:UyuS6.17399$[EMAIL PROTECTED]...
>
> "Tim Tyler" <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED]...
> > Tom St Denis <[EMAIL PROTECTED]> wrote:
> > :[D. Scott wrote:]
> >
> > :>   If you encrypt one byte with BICOM you most likely will
> > :> get several bytes out. However if you did get one byte
> > :> out the input file could be far more than one byte. This
> > :> is beacuse if the key is not known many many thousands of
> > :> different possible input files could have mapped to that
> > :> single one byte output file.
> >
> > : Ok bud, there is more to the world than "encryption".  BICOM as I
> pointed
> > : out earlier is vastly inefficient as compared to CTR modes and as you
> have
> > : failed to point out not any more secure.
> >
> > He explained it - you just didn't understand the explanation.
>
> What explanation?  All he does is flame me.
>
> > : Besides BICOM can't be a "bijection" if one input maps to several
bytes.
> >
> > Yes it can.  Indeed it is.
> >
> > : Let's say a 1 byte file maps to 3 bytes.
> >
> > OK.
> >
> > : Now to be fully bijective you must map all 3 byte files to 1 byte.
> >
> > A false statement on your part - bijectivity in no way requires such a
map
> > to exist.  Want to try again?
>
> Um wrong.  A bijection is where the domain and range are the same set.
I.e
> a permutation of 0..255 is a bijection.  Squaring modulo a prime is not a
> bijection, etc.. (since not all roots have roots, etc..)

Tom, no.  A bijection is a function that is both an injection and a
surjection.  That is, a function F is a bijection iff both of the following
are true:

For any i!=j, F(i)!=F(j)

For any x, there exists an i s.t. F(i) = x

(where i, j are members of the domain of F, and x is a member of the range
of F).


In addition, for BICOM, I believe that the domain and the range are, in
fact, identical: all possible bitstrings (limited to lengths a multiple of
8?  I never examined it, so I am unclear on that subtlety).  This is an
infinite set, but that is not a problem from the above definition, which
never assumes finiteness.
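Fluhrer's two conditions translate directly into code for finite domains; a small checker (an editor's illustrative sketch, nothing BICOM-specific):

```python
def is_injection(f, domain):
    # F(i) != F(j) whenever i != j
    images = [f(i) for i in domain]
    return len(set(images)) == len(images)

def is_surjection(f, domain, codomain):
    # every x in the codomain has a preimage
    return {f(i) for i in domain} == set(codomain)

def is_bijection(f, domain, codomain):
    return is_injection(f, domain) and is_surjection(f, domain, codomain)

# A permutation of 0..255 is a bijection on that set...
assert is_bijection(lambda x: (x + 37) % 256, range(256), range(256))
# ...but squaring mod a prime is not (it is 2-to-1 on nonzero values).
assert not is_bijection(lambda x: (x * x) % 251, range(251), range(251))
```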


> So we don't get confused, what is the real def of bijection?  I'm not sure
> off hand... but from what I gather it's based on permutations.
No, it isn't.  For example, it is possible to define a bijection between the
integers and the rationals.  See above for the real definition.

--
poncho





------------------------------

From: [EMAIL PROTECTED] (David Wagner)
Subject: Re: Luby-Rackoff Theorems?
Date: Sun, 3 Jun 2001 21:02:32 +0000 (UTC)

Nicol So  wrote:
>The construction is secure iff *asymptotically* [...]

Yeah, yeah, I know.  I was hoping to not have to get into the
difference between asymptotic treatments and concrete security,
but I guess I should have known. :-)  I am too lazy to give all
the details.

Concrete security is where you measure exactly the number of
texts and the advantage (the distinguishing probability), rather
than talking about asymptotics and polynomials and big-O notation.
Personally, I prefer the concrete security, and in this case,
you can remove the asymptotics pretty easily.  In a concrete security
treatment of the Luby-Rackoff theorem, one can show that m^2/2^n is
an exact bound on the distinguishing probability (no constant factors
needed), if I'm remembering correctly.

This is why I used an expression like m ~ 2^{n/2}, rather than
m = 2^{n/2}: the "~" is intended to be a warning that this isn't
an exact equality (or if you want to think of it as an equality,
you shouldn't take it too seriously).  In particular, if you want
an upper bound e on the distinguishing probability, you want
m^2/2^n <= e, and in general, e << 1.
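In concrete terms (a quick sketch of the arithmetic, using the m^2/2^n form above; editor's illustration):

```python
import math

def lr_advantage_bound(m, n):
    # Concrete form of the Luby-Rackoff bound: any distinguisher
    # making m queries has advantage at most m^2 / 2^n.
    return m * m / 2.0 ** n

def max_queries(epsilon, n):
    # largest m with m^2 / 2^n <= epsilon
    return math.isqrt(int(epsilon * 2 ** n))

n = 64
print(lr_advantage_bound(2 ** 32, n))   # m ~ 2^{n/2}: bound reaches 1.0
print(max_queries(2 ** -20, n))         # 2^22 queries for advantage 2^-20
```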

If you prefer asymptotics, you can think of "~" as representing
asymptotic equivalence, disregarding constant (and polynomial) factors.
So, if you prefer to use asymptotics, you should interpret my claim
to mean that the LR cipher is secure so long as m/2^{n/2} -> 0 faster
than any polynomial as n -> infinity, or something like that.  
But I personally happen to feel that worrying over these technical
issues distracts from what's really going on in the theorem.

>We cannot take the expression of an upper bound, and
>then argue that if you choose m to violate it, the construction suddenly
>becomes insecure.  The upper bound in the main lemma represents *a*
>sufficient condition for security, [...]

Actually, as I mentioned in my original post, the m^2/2^n bound
is tight.  In particular, there are explicit attacks (lower bounds)
that are within a constant factor of the LR upper bound.  As a
consequence, the LR bound is a sufficient *and necessary* condition.

The original Luby-Rackoff paper is probably not the best place to
be looking for what is currently known on the Luby-Rackoff construction.
More is known today.

>I think there's another issue in the way the main lemma is interpreted
>in your argument. It seems to assume that it is OK so long as the *upper
>bound* on the distinguishing probability is <= 1.

Yeah, yeah, ok.  As I wrote above, my post was only intended to
give the rough estimate.  You can make this all precise by using
the notion of advantage (the distinguishing probability, a difference
of two probabilities); then the LR bound says that the advantage of
any adversary is <= m^2/2^n.

>I'm not familiar with the newer results so I can't discuss them
>intelligently, but I'd appreciate some pointers so that I can get
>updated on the developments.

Alas, now that I looked again, I didn't find any that really capture
everything I wanted, i.e., (1) concrete security (rather than
asymptotics), and (2) the PRF->PRP theorem (rather than just the
information-theoretic case where the Feistel function is a truly
random function).  But here are the best two references:

Naor & Reingold: http://www.wisdom.weizmann.ac.il/~naor/PAPERS/lr_abs.html
 (see Theorem 3.1; covers part (2) but not (1))
Rogaway's class notes: http://www.cs.ucdavis.edu/~rogaway/classes/227/winter99/
 (see Lecture 3; covers part (1) but not (2))

Here are two other papers that are probably less useful, but I'll
include a pointer just for completeness:

Lucks: http://th.informatik.uni-mannheim.de/m/lucks/papers/LR-fast.ps.gz
Maurer: ftp://ftp.inf.ethz.ch/pub/publications/papers/ti/isc/wwwisc/Maurer92d.ps

>You don't need to use newer and more precise results to argue about the
>"security" of recursive Luby-Rackoff constructions. It's true that the
>main lemma was proved assuming the F functions are truly random. But the
>security of a recursive construction comes from the definition of
>pseudorandomness. Basically if you can find a distinguisher for the
>recursive construction, you can turn it into a distinguisher for the
>inner construction, contradicting the main lemma.

Yes, that's true, but the key thing is to measure concretely the
lower bound on how many texts would be needed by this distinguisher.
In particular, you'll find that, by this measure, security degrades
as you use the recursive construction, once you remove the asymptotics.
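For readers following along, the construction under discussion is short: a few Feistel rounds turn a (pseudo)random round function into a permutation that can be inverted without ever inverting the round function. A toy sketch (editor's illustration; the round function here is an arbitrary stand-in, not a real PRF):

```python
# Toy Luby-Rackoff / Feistel sketch: rounds of (L, R) -> (R, L ^ F(R))
# give a permutation even when F itself is not invertible.
# F below is a toy stand-in, NOT a secure PRF -- illustration only.
MASK = 0xFFFF  # 16-bit halves, 32-bit block

def F(k, x):
    # arbitrary non-invertible round function keyed by k
    return (x * x + k) & MASK

def feistel_encrypt(keys, block):
    L, R = block >> 16, block & MASK
    for k in keys:
        L, R = R, L ^ F(k, R)
    return (L << 16) | R

def feistel_decrypt(keys, block):
    L, R = block >> 16, block & MASK
    for k in reversed(keys):
        L, R = R ^ F(k, L), L
    return (L << 16) | R

keys = [0x1234, 0x5678, 0x9ABC]  # 3 rounds, as in the LR theorem
for block in (0, 1, 0xDEADBEEF):
    assert feistel_decrypt(keys, feistel_encrypt(keys, block)) == block
```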

------------------------------

From: [EMAIL PROTECTED] (David Wagner)
Subject: Re: Luby-Rackoff Theorems?
Date: Sun, 3 Jun 2001 21:04:29 +0000 (UTC)

Nicol So  wrote:
>I just realized that allowing 2^{n/4} queries (n = block size) is
>incompatible with the notion of pseudorandomness used in the
>Luby-Rackoff paper. (Polynomial-size oracle circuits cannot have a
>superpolynomial # of oracle gates).

Yes, Luby-Rackoff only considered attackers that make a polynomial
number of chosen-text queries to the cipher.  But it turns out that
this restriction is not essential; the LR result was subsequently
generalized to hold for any number of queries.  See the introduction to
the following paper for a long list of references and further discussion.
http://www.cs.berkeley.edu/~daw/papers/prf-crypto98-long.ps

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Dynamic Transposition Revisited Again (long)
Date: Sun, 03 Jun 2001 23:01:11 +0200



[EMAIL PROTECTED] wrote:
> 
> [EMAIL PROTECTED] (Mok-Kong Shen) wrote:
> 
> > My article 'On encryption through bit permutations' posted
> > on 6th March may interest you.
> 
> Yes that seems to be a good general solution.
> 
> By the way I should have said:
> 
>   invert half the bytes
> 
> not bits as this is a byte-oriented problem. Other strategies might
> be better for other file types.

I don't yet understand what you mean by 'inverting half 
the bytes'. Do you mean randomly choosing one half of the 
bytes in the message and inverting all bits in those bytes 
while leaving the other half of the bytes in the message 
unchanged, or what? (How good is that for crypto purposes?)

> 
> There seems to be the need for a cryptographically oriented compression
> system that has the facility for bit balancing and other devious tricks.
> Have a look at:
> 
>     http://www.cix.co.uk/~klockstone/cbrev.htm
> 
> for some proposals for an updated codebook system, featuring once-only
> substitution on the placode and transposition over the full message
> block. One of the advantages of the codebook is the resulting compression.

Using a code book is commonly considered (like compression)
to be processing that is in a sense 'orthogonal' to what is
done by block (or stream) encryption, if I am not mistaken.
Some time ago I considered the possibility of coding each
word of a medium/large dictionary with some 15/20 bits (see
'On dictionary encoding' of 14th Sep 2000).

M. K. Shen

------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Unicity distance and compression for AES
Date: 3 Jun 2001 21:06:35 GMT

[EMAIL PROTECTED] (Tim Tyler) wrote in <[EMAIL PROTECTED]>:

>SCOTT19U.ZIP_GUY <[EMAIL PROTECTED]> wrote:
>: [EMAIL PROTECTED] (Tim Tyler) wrote in <[EMAIL PROTECTED]>:
>
>:>That's not /quite/ what Shannon's perfect secrecy implies:
>:>
>:>Perfect secrecy means that the cyphertext contains no information
>:>about the plaintext.  This can happen if one cyphertext maps to one
>:>plaintext - but can also happen if exactly ten cyphertexts map to
>:>every plaintext. 
>
>:    I have to look it up, but there is a difference between ideal
>:    and perfect.
>
>http://www.io.com/~ritter/GLOSSARY.HTM#PerfectSecrecy
>http://www.io.com/~ritter/GLOSSARY.HTM#IdealSecrecy

   I found Terry's definitions, but I am not sure they are complete;
however, I will go with them. I have to find my Shannon papers. I
lost them for a while, but I know I have them somewhere.
In his perfect secrecy section Terry states: "This sort of cipher
needs as much keying information as there is message information to
be protected. A cipher with perfect secrecy has at least as many
keys as messages, and may be seen as a (huge) "
However, I don't really care about definitions all that much; what
I care about is systems where each key leads to a possible valid
plaintext.
 

>
>: I thought in perfect secrecy every key tested has to lead to a unique
>: input message, so the bijection is among the set of 3: the key, the
>: plaintext and the message.
>
>All that's needed is for there to be no statistical relationship between
>the plaintext and cyphertext.
>
>Having said this, the only way to build such a system without using a
>bijection is by using a random oracle.
>
>Since no random oracles are known to exist, a bijection is indeed the
>only practical construction that demonstrably manages to get perfect
>secrecy. 

  I can live with this.



David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE "OLD VERSION"
        http://www.jim.com/jamesd/Kong/scott19u.zip
My website http://members.nbci.com/ecil/index.htm
My crypto code http://radiusnet.net/crypto/archive/scott/
MY Compression Page http://members.nbci.com/ecil/compress.htm
**NOTE FOR EMAIL drop the roman "five" ***
Disclaimer:I am in no way responsible for any of the statements
 made in the above text. For all I know I might be drugged or
 something..
 No I'm not paranoid. You all think I'm paranoid, don't you!


------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: PRP vs PRF (was Luby-Rackoff Theorems)
Date: Sun, 03 Jun 2001 21:12:18 GMT

A PRF is a pseudo-random function and a PRP is a pseudo-random permutation?
(I'm reading the paper Wagner posted today).

So basically a PRF is any type of mapping and a PRP is any injection?
--
Tom St Denis
---
http://tomstdenis.home.dhs.org
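Roughly, yes: a PRF is indistinguishable from a uniformly random function (which may repeat outputs), and a PRP is indistinguishable from a uniformly random permutation (which never does); "any injection" is better read as "a bijection" on the block space. A sketch of the distinction using truly random objects (editor's illustration, not from the paper):

```python
import random

rng = random.Random(1)
N = 256  # domain/range size

# A random *function*: each output drawn independently -- collisions allowed.
rand_func = [rng.randrange(N) for _ in range(N)]

# A random *permutation*: a bijection -- no collisions possible.
rand_perm = list(range(N))
rng.shuffle(rand_perm)

assert len(set(rand_perm)) == N   # permutation: all outputs distinct
print("function collisions:", N - len(set(rand_func)))
# A random function over N points repeats outputs with overwhelming
# probability (birthday bound); spotting a repeat is exactly how a
# distinguisher tells a random function from a random permutation
# after roughly 2^{n/2} queries.
```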



------------------------------

From: [EMAIL PROTECTED]
Reply-To: [EMAIL PROTECTED]
Subject: Re: BBS implementation
Date: Sun, 03 Jun 2001 14:18:33 -0700

How do I know it is not on a short or degenerate cycle?

Tom St Denis wrote:

> <[EMAIL PROTECTED]> wrote in message
> news:[EMAIL PROTECTED]...
> > I want to know how to initialize it from a key.
>
> Do you already have a blum integer?
>
> Just make a random integer X (between 1 and N) and use X^2 mod N as your
> starting value.
>
> Tom
>
> >
> > Tom St Denis wrote:
> >
> > > <[EMAIL PROTECTED]> wrote in message
> > > news:[EMAIL PROTECTED]...
> > > > Where can I find some info on practical BBS implementation?
> > >
> > > Um, square an integer modulo a composite blum integer repeatedly and
> output
> > > the log log n lower bits
> > >
> > > What else do you want?
> > >
> > > Tom
> >
> >
> >




------------------------------

Reply-To: "Peter Nielsen" <[EMAIL PROTECTED]>
From: "Peter Nielsen" <[EMAIL PROTECTED]>
Subject: Sv: Top Secret Crypto
Date: Sun, 3 Jun 2001 23:18:54 +0200

It is very easy to refer to:
http://www.interhack.net/people/cmcurtin/snake-oil-faq.html
or something similar.

But that kind of information is the first thing you throw yourself into
when you take an interest in encryption.
I am probably the only one of those who have responded to awn's letter
who has also worked with the program, and in particular has read the
documentation that comes with it. By the way, there is a complete
section with program code in the help menu, but I do not have good
enough knowledge of programming to be in a position to evaluate it.
Instead of smart remarks about the material which awn put into this
newsgroup (probably as an advertisement for the program), I think one
should expect a serious contribution, which means that you test the
program and examine how it works.
From this you could then discuss whether there are things in the
program which perhaps could be changed or added. The comments which
have appeared up to now indicate arrogance and a lack of interest in
getting to know a new and exciting program.

Peter Nielsen.




------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic)
Date: 3 Jun 2001 21:16:35 GMT

[EMAIL PROTECTED] (Scott Fluhrer) wrote in
<9fe878$r73$[EMAIL PROTECTED]>: 

>
>Tom St Denis <[EMAIL PROTECTED]> wrote in message
>news:UyuS6.17399$[EMAIL PROTECTED]...
>>
>> "Tim Tyler" <[EMAIL PROTECTED]> wrote in message
>> news:[EMAIL PROTECTED]... 
>> > Tom St Denis <[EMAIL PROTECTED]> wrote:
>> > :[D. Scott wrote:]
>> >
>> > :>   If you encrypt one byte with BICOM you most likely will
>> > :> get several bytes out. However if you did get one byte
>> > :> out the input file could be far more than one byte. This
>> > :> is beacuse if the key is not known many many thousands of
>> > :> different possible input files could have mapped to that
>> > :> single one byte output file.
>> >
>> > : Ok bud, there is more to the world than "encryption".  BICOM as I
>> pointed
>> > : out earlier is vastly inefficient as compared to CTR modes and as
>> > : you 
>> have
>> > : failed to point out not any more secure.
>> >
>> > He explained it - you just didn't understand the explanation.
>>
>> What explanation?  All he does is flame me.
>>
>> > : Besides BICOM can't be a "bijection" if one input maps to several
>bytes.
>> >
>> > Yes it can.  Indeed it is.
>> >
>> > : Let's say a 1 byte file maps to 3 bytes.
>> >
>> > OK.
>> >
>> > : Now to be fully bijective you must map all 3 byte files to 1 byte.
>> >
>> > A false statement on your part - bijectivity in no way requires such
>> > a 
>map
>> > to exist.  Want to try again?
>>
>> Um wrong.  A bijection is where the domain and range are the same set.
>I.e
>> a permutation of 0..255 is a bijection.  Squaring modulo a prime is
>> not a bijection, etc.. (since not all roots have roots, etc..)
>
>Tom, no.  A bijection is a function that is both an injection and a
>surjection.  That is, a function F is a bijection iff both of the
>following are true:

  In BICOM there are 2**256 different F's, one for each key.
>
>For any i!=j, F(i)!=F(j)

  In BICOM, i and j come from the set of 8-bit binary files.

>
>For any x, there exists an i s.t. F(i) = x

   Likewise, x and Fwhateverkey(i) are also members of the
8-bit binary file set.

>
>(where i, j are members of the domain of F, and x is a member of the
>range of F)

   In BICOM's case, the domain and range are the 8-bit byte
files that are in common use on most computer systems.
>
>
>In addition, for BICOM, (I believe) that the domain and the range are,
>in fact, identical: all possible bitstrings (limited to lengths a
>multiple of 8?  I never examined it, so I am unclear on that subtlety). 
>This is an infinite set, but that is not a problem from the above
>definition, which never assumes finiteness.

   However, by various matching transforms it is trivial to make
matching front-end and back-end programs for BICOM to make it
bijective on the set of all bit strings. It is just that most people
work with 8-bit byte files.

>
>
>> So we don't get confused, what is the real def of bijection?  I'm not
>> sure off hand... but from what I gather it's based on permutations.
>No, it isn't.  For example, it is possible to define a bijection between
>the integers and the rationals.  See above for the real definition.
>
>--
>poncho
>

  I still doubt TOMMY will grasp it. He doesn't really
want to.


David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE "OLD VERSION"
        http://www.jim.com/jamesd/Kong/scott19u.zip
My website http://members.nbci.com/ecil/index.htm
My crypto code http://radiusnet.net/crypto/archive/scott/
MY Compression Page http://members.nbci.com/ecil/compress.htm
**NOTE FOR EMAIL drop the roman "five" ***
Disclaimer:I am in no way responsible for any of the statements
 made in the above text. For all I know I might be drugged or
 something..
 No I'm not paranoid. You all think I'm paranoid, don't you!


------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: BBS implementation
Date: Sun, 03 Jun 2001 21:25:53 GMT


<[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> How do I know it is not on a short or degenerate cycle?

Good question.  I don't know.

I heard the average cycle length approaches SQRT(N) over randomly selected
starting values.
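For reference, the whole generator fits in a few lines. A toy sketch with tiny primes (editor's illustration; real use needs large secret primes p and q, both congruent to 3 mod 4, and the values below are placeholders only):

```python
import math

# Blum Blum Shub sketch.  p and q must be primes congruent to 3 mod 4;
# these tiny values are for illustration only.
p, q = 499, 547           # both primes, both 3 mod 4
N = p * q                 # the Blum integer

def bbs_seed(x):
    # Seed from key material x: require gcd(x, N) == 1 to avoid the
    # degenerate fixed points, then square to land on a residue.
    assert 1 < x < N and math.gcd(x, N) == 1
    return (x * x) % N

def bbs_bits(state, count):
    # Square repeatedly, emitting the low bit of each new state.
    out = []
    for _ in range(count):
        state = (state * state) % N
        out.append(state & 1)
    return state, out

# Ruling out short cycles in general needs extra number-theoretic
# conditions on p and q; see the literature on BBS parameter choice.
state = bbs_seed(0xC0FFEE % N)
state, bits = bbs_bits(state, 16)
print(bits)
```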


>
> Tom St Denis wrote:
>
> > <[EMAIL PROTECTED]> wrote in message
> > news:[EMAIL PROTECTED]...
> > > I want to know how to initialize it from a key.
> >
> > Do you already have a blum integer?
> >
> > Just make a random integer X (between 1 and N) and use X^2 mod N as your
> > starting value.
> >
> > Tom
> >
> > >
> > > Tom St Denis wrote:
> > >
> > > > <[EMAIL PROTECTED]> wrote in message
> > > > news:[EMAIL PROTECTED]...
> > > > > Where can I find some info on practical BBS implementation?
> > > >
> > > > Um, square an integer modulo a composite blum integer repeatedly and
> > output
> > > > the log log n lower bits
> > > >
> > > > What else do you want?
> > > >
> > > > Tom
> > >
> > >
> > >
>
>
>



------------------------------

From: "Yashar Alishenas" <[EMAIL PROTECTED]>
Subject: S-Boxes
Date: Sun, 3 Jun 2001 23:37:31 +0200

Why exactly do the S-Boxes in the DES algorithm play such an important
role? And what exact impact do they have on the main attacks on DES,
differential and linear cryptanalysis? How do these forms of attack
take advantage of the S-Boxes?

I am trying to understand the influence that these boxes have on the
security of DES, but I can't figure it out. Grateful for any kind of
help...

/ya, a newbie



------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: S-Boxes
Date: Sun, 03 Jun 2001 21:32:46 GMT


"Yashar Alishenas" <[EMAIL PROTECTED]> wrote in message
news:9fea5u$gtv$[EMAIL PROTECTED]...
> Why exactly do the S-Boxes in the DES-algorithm play such an important
role?
> And which exact impact do they have on the main attacks, differential and
> linear cryptanalysis, of DES? How do these forms of attack take advantage
of
> the S-Boxes?
>
> I am trying to understand the influence that these Boxes have on the
> security of DES, but I can't figure it out. Grateful for any kind of
help...

Try to break DES without the sboxes :-)

That will answer all your questions.

(Hint:  The sboxes are non-linear)

Tom
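Tom's hint can be made concrete: with the S-boxes removed, each round is linear over XOR, so the whole cipher satisfies E(a) ^ E(b) ^ E(c) = E(a ^ b ^ c) and falls to simple known-plaintext algebra. A toy demonstration (an editor's mini-cipher, not DES itself):

```python
# Toy Feistel rounds.  With a linear round function the cipher is
# affine over XOR, so E(a) ^ E(b) ^ E(c) == E(a ^ b ^ c) and known
# plaintexts break it by linear algebra.  An S-box lookup destroys
# this relation.  (Toy cipher for illustration, not DES.)
KEYS = [0x3, 0x5, 0x9, 0x6]
SBOX = [0x6, 0x4, 0xC, 0x5, 0x0, 0x7, 0x2, 0xE,
        0x1, 0xF, 0x3, 0xD, 0x8, 0xA, 0x9, 0xB]

def encrypt(x, round_fn):
    L, R = x >> 4, x & 0xF          # 8-bit block, 4-bit halves
    for k in KEYS:
        L, R = R, L ^ round_fn(R ^ k)
    return (L << 4) | R

linear    = lambda x: x             # no S-box: cipher is affine
nonlinear = lambda x: SBOX[x]       # S-box lookup

a, b, c = 0x13, 0x5A, 0xC7
E = lambda x: encrypt(x, linear)
assert E(a) ^ E(b) ^ E(c) == E(a ^ b ^ c)   # linear version: relation holds

E = lambda x: encrypt(x, nonlinear)
print(E(a) ^ E(b) ^ E(c) == E(a ^ b ^ c))   # S-box version: relation breaks
```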



------------------------------

From: Matthew Montchalin <[EMAIL PROTECTED]>
Crossposted-To: alt.hacker,talk.politics.crypto
Subject: Re: James Felling:  Sorry to break your bubble
Date: Sun, 3 Jun 2001 14:29:02 -0700

On Sun, 3 Jun 2001, Tim Tyler wrote:
|: Array number:   101     0 1 2 3 4 9 5 8 7 6 
|: Array number:   102     0 1 2 3 4 9 6 5 7 8 
|: Array number:   103     0 1 2 3 4 9 6 5 8 7 
|: Array number:   104     0 1 2 3 4 9 6 7 5 8 [...]
|
|...if so, you just bored them to tears :-(

Well, >I< thought it was interesting.  Of course, it really should
have been posted to sci.crypt.random-numbers.  If your newsserver
doesn't have that newsgroup, you should ask your newsadmin to add
it.



------------------------------

From: [EMAIL PROTECTED] (David Wagner)
Subject: Re: Dynamic Transposition Revisited Again (long)
Date: Sun, 3 Jun 2001 21:40:20 +0000 (UTC)

>  invert half the bytes

I would expect that there will still be noticeable correlations between
bits within the same byte.

I suppose one view is that reducing the bias is conservative
crypto-engineering, can't hurt, and just might help (who knows?).

Another view might be as follows.  If your encryption algorithm is strong
against known-plaintext attacks, maybe it is not critical to remove these
biases.  If the encryption algorithm is not strong against known-plaintext
attacks, then inverting half the bytes may not be sufficient.  In either
case, there is not much point to doing this, or so this view would say.

I don't have any strong personal opinion about which viewpoint is better.
Probably it depends on the intended usage, how much assurance you need,
and whether there are other things you can do to improve security that are
a better use of your time.
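One reading of the "invert half the bytes" step (as Shen asked upthread: complement a randomly chosen half of the bytes, remembering which) can be sketched as follows; note Wagner's caveat above that within-byte correlations survive it (editor's sketch, details not fixed by the thread):

```python
import random

# Complement a randomly chosen half of the bytes, keeping the
# selection so the step is reversible.
def invert_half(data, rng):
    idx = rng.sample(range(len(data)), len(data) // 2)
    out = bytearray(data)
    for i in idx:
        out[i] ^= 0xFF            # complement every bit of the byte
    return bytes(out), sorted(idx)

def restore(data, idx):
    out = bytearray(data)
    for i in idx:
        out[i] ^= 0xFF
    return bytes(out)

def ones(data):
    return sum(bin(b).count("1") for b in data)

rng = random.Random(42)
text = b"the quick brown fox jumps over the lazy dog " * 4
mixed, idx = invert_half(text, rng)

assert restore(mixed, idx) == text     # the step is invertible
print(ones(text), ones(mixed))         # bit count moves toward 50% ones
# Per Wagner's point: this balances the overall 0/1 ratio, but the
# bits within any one byte remain as correlated as before.
```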

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list by posting to sci.crypt.

End of Cryptography-Digest Digest
******************************
