Cryptography-Digest Digest #525, Volume #14       Tue, 5 Jun 01 15:13:00 EDT

Contents:
  Re: Best, Strongest Algorithm (gone from any reasonable topic) (Mark Wooding)
  Re: BBS implementation ("Tom St Denis")
  Re: BigNum Question ("Harris Georgiou")
  Re: PRP => PRF (TRUNCATE) ("Tom St Denis")
  Re: Best, Strongest Algorithm (gone from any reasonable topic) ("Tom St Denis")
  Re: Best, Strongest Algorithm (gone from any reasonable topic) ("Tom St Denis")
  Re: Best, Strongest Algorithm (gone from any reasonable topic) ("Tom St Denis")
  Re: Def'n of bijection ([EMAIL PROTECTED])
  Re: Def'n of bijection ([EMAIL PROTECTED])
  Re: Best, Strongest Algorithm (gone from any reasonable topic) (Tim Tyler)
  Re: Def'n of bijection (Tim Tyler)
  Re: Welcoming another Anti-Evidence Eliminator stooge to USENET (P. (Kyle Paskewitz)
  Re: Def'n of bijection (Tim Tyler)
  Re: Def'n of bijection (Tim Tyler)
  Re: National Security Nightmare? (Mok-Kong Shen)
  Re: Best, Strongest Algorithm (gone from any reasonable topic) (Tim Tyler)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (Mark Wooding)
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic)
Date: 5 Jun 2001 17:19:30 GMT

Tim Tyler <[EMAIL PROTECTED]> wrote:

> DS> And you never answered the FACT that a one-byte output file
> DS> from CTR mode (though you have no working program) would immediately
> DS> lead an attacker to realize that the input file could only have
> DS> come from 1 of 256 possible messages. With BICOM you have many
> DS> many more messages. That alone makes it more secure. [...]

This is wrong.  I assume here that the BICOM encryption scheme is
something along the lines of

  B_k = E_k o C

where E_k is some conventional cipher (Rijndael using some bizarre
chaining mode, I think, but it doesn't matter) and C is some permutation
over finite bitstrings {0, 1}^* (called a compression -- this is
irrelevant here).  If there is actually some unkeyed invertible
transformation following the encryption step then we can ignore that,
because it won't affect the cardinalities of any of the sets we're
interested in.

Consider the set of 8-bit strings {0, 1}^8.  Since we expect a cipher to
be invertible, we must have |E_k^{-1}({0, 1}^8)| <= |{0, 1}^8| = 256
(since otherwise we'd be unable to recover some distinct plaintexts by
decrypting).  Now, since C is bijective, we must also have

  |B_k^{-1}({0, 1}^8)| <= 256

Hence, there are at most 256 possible plaintext messages for any
one-byte ciphertext.  They might not all be one-byte long, but there are
at most 256 of them.
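The cardinality argument can be checked exhaustively on a finite toy model. In the Python sketch below, seeded random permutations of a small universe of bitstrings stand in for both C and E_k (purely an illustration -- nothing here resembles Rijndael or BICOM's actual compressor):

```python
import random
from itertools import product

# Universe: all bitstrings of length 0..12, a finite slice of {0, 1}^*.
universe = ["".join(bits) for n in range(13) for bits in product("01", repeat=n)]

def keyed_permutation(seed):
    """Toy stand-in for a cipher or bijective compressor: a seeded
    random permutation of the universe (illustration only)."""
    rng = random.Random(seed)
    shuffled = universe[:]
    rng.shuffle(shuffled)
    return dict(zip(universe, shuffled))

C = keyed_permutation("compressor")   # plays the role of the bijective compression
E = keyed_permutation("cipher-key")   # plays the role of E_k

def B(p):                             # B_k = E_k o C, as in the post
    return E[C[p]]

one_byte_outputs = {s for s in universe if len(s) == 8}
preimage = [p for p in universe if B(p) in one_byte_outputs]

# Because B_k is a bijection, the preimage of the 256 one-byte
# ciphertexts has exactly 256 elements -- of assorted lengths.
assert len(preimage) == 256
```

The plaintext candidates are not all one byte long, but, exactly as the post says, there are never more than 256 of them.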

-- [mdw]

------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: BBS implementation
Date: Tue, 05 Jun 2001 17:42:48 GMT


"Mark Wooding" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> Tom St Denis <[EMAIL PROTECTED]> wrote:
>
> > As I understand it, if you know the length of the cycle you only know
> > factors of (p-1)(q-1) right?
>
> I'm caught out.
>
> If you can find short cycles with nonnegligible probability then you can
> factor.  Just falling over one by accident I believe has a nontrivial
> probability of allowing you to factor given additional polynomial-time
> effort, but won't automatically drop the factors out.

Depends.  If your primes are SG (Sophie Germain) primes then I think it
will :-)

If you can do it repeatedly then yes, you can factor.
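For concreteness, the cycle length of a BBS sequence is directly measurable at toy sizes. A Python sketch with deliberately tiny parameters (p = 11, q = 23, both congruent to 3 mod 4 -- nothing like real BBS moduli):

```python
# Toy Blum-Blum-Shub: x_{i+1} = x_i^2 mod n with n = p*q,
# where p and q are both congruent to 3 mod 4.
p, q = 11, 23
n = p * q                      # 253 -- illustration only

def bbs_cycle_length(seed, n):
    """Iterate x -> x^2 mod n from `seed` and return the length of
    the cycle the orbit eventually falls into."""
    seen = {}
    x, i = seed, 0
    while x not in seen:
        seen[x] = i
        x = (x * x) % n
        i += 1
    return i - seen[x]

# With these parameters the orbit of 3 has period 20, which indeed
# divides lambda(lambda(n)) = lambda(110) = 20 -- the kind of
# relationship to (p-1)(q-1) the thread is discussing.
period = bbs_cycle_length(3, n)
assert period == 20
```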

Tom



------------------------------

From: "Harris Georgiou" <[EMAIL PROTECTED]>
Subject: Re: BigNum Question
Date: Tue, 5 Jun 2001 20:40:12 +0300

Tim Tyler <[EMAIL PROTECTED]> wrote in the discussion message:
[EMAIL PROTECTED]
> JGuru <[EMAIL PROTECTED]> wrote:
> : "George" <[EMAIL PROTECTED]> wrote in message
>
> :> I'm trying to develop a program for Macintosh and I need to operate on
> :> very large numbers.  What is the best BigNum library for Macintosh where
> :> the source code is also available?
>
> : Java has BigInteger and BigDecimal. As long as there is a JDK available for
> : your platform, you can write code that can run on any platform.
>
> OS >= 9 Java: http://www.apple.com/java/
> OS X    Java: http://www.apple.com/macosx/java2.html
>
> The source code for BigInteger and BigDecimal is available.

The big-number packages in the JDK work quite well; they even have built-in
functions for most cryptosystem implementations (like secure random prime
number generation, modular exponentiation, etc.).  I have actually implemented
basic RSA encryption from scratch in less than 100 lines of code.  The only
problem is that Java is slow, and encryption using Java is even slower.
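That claim is easy to believe. A rough Python equivalent of such a from-scratch textbook RSA fits in about a dozen lines (toy primes, no padding, no security -- illustration of the arithmetic only; the Java version would use BigInteger's modPow and modInverse):

```python
# Textbook RSA with toy primes -- shows why a big-number package with
# modular exponentiation makes the basic scheme this short.
p, q = 61, 53                  # toy primes; real keys use ~1024-bit primes
n = p * q                      # modulus, 3233
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, coprime to phi
d = pow(e, -1, phi)            # private exponent: e*d = 1 (mod phi)

def encrypt(m):                # requires 0 <= m < n
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

assert decrypt(encrypt(42)) == 42
```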



--

Harris

- 'Malo e lelei ki he pongipongi!'




------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: PRP => PRF (TRUNCATE)
Date: Tue, 05 Jun 2001 17:46:49 GMT


"Scott Fluhrer" <[EMAIL PROTECTED]> wrote in message
news:9fisbo$ngc$[EMAIL PROTECTED]...
>
> Tom St Denis <[EMAIL PROTECTED]> wrote in message
> news:VC2T6.31386$[EMAIL PROTECTED]...
> > Reading the paper David pointed to a bit ago I see they have one way to go
> > from PRP to PRF by truncating bits of the output.
> >
> > Obviously there will be a lot of PRPs that make the same PRF.  Wouldn't a
> > better method of truncating be reducing modulo a prime?
> >
> > I.e. if you want a four-bit PRF you do (PRP mod 17) mod 16 = PRF.  That way
> > the higher-order bits will affect the output.  Did I misunderstand the
> > original intent?
> Well, yes.  For one, if you assume a perfect PRP, what advantage would your
> approach have over simple truncation?  He achieved a provable result -- can
> you make the analogous proof with your more complicated method?

Well, the problem is there will be many collisions.  For example, if you
truncate 7 bits to get an 8 => 1 function, any permutation of the upper
7 bits gives an equivalent system (i.e. (2^7)! of them).

>
> In addition, if the two moduli are too close, you'll get problems.  In the
> simplistic example you gave, 0 is output twice as often as any other
> value.

Ah true.
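Fluhrer's point about the two moduli being close is easy to quantify; a quick Python tally over a uniform 8-bit PRP output:

```python
from collections import Counter

# Tally (x mod 17) mod 16 over all 256 equally likely PRP outputs.
counts = Counter((x % 17) % 16 for x in range(256))

# Over 0..255, residue 0 mod 17 occurs 16 times and residues 1..16
# occur 15 times each; residues 0 and 16 then collide under "mod 16",
# so output 0 appears 31 times versus 15 for every other value --
# roughly twice as often, as stated above.
assert counts[0] == 31
assert all(counts[v] == 15 for v in range(1, 16))
```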

Tom



------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic)
Date: Tue, 05 Jun 2001 17:47:47 GMT


"Tim Tyler" <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED]...
> Tom St Denis <[EMAIL PROTECTED]> wrote:
> : "Tim Tyler" <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED]...
> :> Tom St Denis <[EMAIL PROTECTED]> wrote:
> :> : "Tim Tyler" <[EMAIL PROTECTED]> wrote in message
>
> :> :> Knowledge that a message comes from a set of billions of possible key
> :> :> selected messages, rather than a set of 256 possible key selected
> :> :> messages *is* a feature that has an immediate impact on security.
> :> :>
> :> :> If you can narrow the plaintext down to one of 256 possibilities, then
> :> :> that is already a significant leak of information about the message
> :> :> contents.
> :>
> :> : OTP encrypted message.
> :>
> :> : C=1101111010001
> :>
> :> : What is P?
> :>
> :> : (How long must this go on?)
> :>
> :> I don't know:
> :>
> :> Maybe until you realise that an OTP doesn't have perfect secrecy if it's
> :> dealing with finite files, and converting them to cyphertexts of the same
> :> length as the plaintexts?
>
> : SO WHAT?
>
> : What is your problem?
>
> : All OTP's are named OTP, ah a flaw!
>
> : That's about as valid as your logic.  Tell me what P was in the system above
> : and we can talk.
>
> I can't say what P was *exactly* - but I can narrow it down to one of only
> 256 possibilities.
>
> In a system with a 128 bit key, being able to narrow the message down
> from one of 2^128 possibilities to one of 2^8 possibilities surely
> represents a gigantic loss of security, no?

In this case the key is the same length as the message.

What cipher are you talking about that uses a 128-bit key and an 8-bit block?

Tom



------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic)
Date: Tue, 05 Jun 2001 17:52:38 GMT


"Tim Tyler" <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED]...
> Tom St Denis <[EMAIL PROTECTED]> wrote:
> : "Tim Tyler" <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED]...
> :> Tom St Denis <[EMAIL PROTECTED]> wrote:
>
> :> : So what? [...]
> :>
> :> So a 1-byte cyphertext can only represent 256 possible plaintexts.
> :>
> :> Do you think it is acceptable to have a 128 bit key and then
> :> throw away 120 bits of entropy from it when dealing with
> :> particular plaintexts?
> :>
> :> What sort of "provable security" do you think that is?
>
> : This is meaningless and you should know it.  First off there are 256! =
> : 2^1680 ways to permute a byte.  So technically you are not throwing away
> : entropy.  So even if it *were* a 1-byte block cipher you're not reducing the
> : key space.
>
> We're *not* talking about "permuting a byte"!
>
> We're talking about *XOR*ing it with another 1-byte value!
>
> You *do* know what CTR mode is, don't you? ;-)

You lost me here.  If you encrypt 32 bytes with CTR, for example, you are
encoding the bytes by encrypting different counters.  If the cipher is a
64-bit block cipher there are 2^64! possible permutations, so a 128-bit key
would not lose entropy here.

I really don't get the comparison.  Why is encrypting via CTR throwing key
material away (as you claim)?

> : Second, in CTR mode the counter is not fixed in size.  You do encode full
> : blocks.  You just end up not having to use all of the output.
>
> If you like - so you encode 128 bits and then throw away 120 of them, leaving
> you with 8.  How on earth do you think this is a point in your favour?

Why is it a point against?  Yes, there will be equivalent keys, but not
enough to tell from random.  If you are encoding a single byte, the unicity
(argh) distance will be larger than the byte, so you can't tell the correct
key from a random one.

So how are you throwing key material away?  The only way I could see is if
you encode larger messages where many keys collide and encrypt the counter to
the same value.  This is, however, not a property of a good cipher.
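Both halves of this disagreement can be made concrete for a one-byte message. A minimal Python sketch, assuming only the defining property of CTR mode (ciphertext = plaintext XOR keystream byte):

```python
# In CTR mode a 1-byte message P becomes C = P XOR K_s, where K_s is the
# first byte of E_k(counter).  Given only C, every keystream byte is
# consistent with *some* plaintext, so an attacker can narrow P to at
# most 256 candidates -- but cannot pick among them from one byte alone.
ciphertext = 0xA7                      # an arbitrary observed byte

candidates = {ciphertext ^ ks for ks in range(256)}
assert candidates == set(range(256))   # all 256 byte values remain possible
```

This is why the two positions are compatible: the leak is the reduction from the whole message space to 256 candidates, not a recovery of the key.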

Tom



------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic)
Date: Tue, 05 Jun 2001 17:53:55 GMT


"Mark Wooding" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> Tom St Denis <[EMAIL PROTECTED]> wrote:
>
> > So what?  I know that 45893475893475893475378893573895 is an n-digit
> > composite, but I don't know the factors...
>
> 45893475893475893475378893573895 =
>   3.5.53.5741.10055328798694131799486841
>
> Glad to be of service.

Pollard-rho or just trial division? :-)

My point was that the length is not the crucial information.  Did you factor
that because you knew the length?
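For what it's worth, simple trial division does most of the work on that particular number, which is why its length alone protected nothing. A minimal Python sketch (the large cofactor is left alone; per Wooding's post the small prime factors are 3, 5, 53 and 5741):

```python
def small_factors(n, bound=10_000):
    """Strip factors below `bound` by trial division; return the list
    of small prime factors found and the remaining cofactor."""
    factors = []
    d = 2
    while d < bound and d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    return factors, n

n = 45893475893475893475378893573895
factors, cofactor = small_factors(n)

# The product of everything found must reconstruct n.
prod = 1
for f in factors:
    prod *= f
assert prod * cofactor == n
assert 3 in factors and 5 in factors   # the number ends in 5; digit sum is 192
```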

Tom



------------------------------

Subject: Re: Def'n of bijection
From: [EMAIL PROTECTED]
Date: 05 Jun 2001 13:57:56 -0400

Tim Tyler <[EMAIL PROTECTED]> writes:
> 
> [Compression] also helps with bandwidth - but it doesn't help with
> security.

That's what I keep telling you--but you only half believe it. Because next
you say: 

> Do you understand that "checking for meaningful content" can become a
> less effective strategy after compression...?

So apparently you *do* believe that compression is intended to add
security. Make up your mind.

But no, checking for meaningful content does *not* become a ``less
effective strategy'' after compression--it just takes more work. In
particular, BICOM simply adds work: it adds a decompression step. Why
is this simple fact eluding you?

> Say the keys are stored in a key book which is partially destroyed during
> capture.  Then a brute force search of the remaining keyspace becomes
> possible.

And your choice of compression algorithm has essentially no effect on that
situation. It simply adds the work of a decompression step, or some other
constant-factor overhead.

> Say the keys are generated by a faulty RNG that is 80% predictable.
> Then a brute force search of the remaining keyspace becomes possible.

And BICOM has no serious effect on that situation, either.

Len.

-- 
Could you describe the mess, and explain why you think that the lack
of opportunistic bombardment is a contributing factor?
                                -- Dan Bernstein

------------------------------

Subject: Re: Def'n of bijection
From: [EMAIL PROTECTED]
Date: 05 Jun 2001 14:09:10 -0400

"Douglas A. Gwyn" <[EMAIL PROTECTED]> writes:
> [EMAIL PROTECTED] wrote:
>> Anyway, that seems to be the problem here: Scott (and some others)
>> are conflating the notions of ``compression'' with the desirable
>> properties of a OTP, and then expressing their confused ideas with
>> confusing language.
> 
> No, there is more to it.  If you take a 2048-bit ciphertext from
> a system that uses a 128-bit key, there are only 2^128 possible
> plaintexts, nowhere near the 2^2048 that an ideal system would have.
> D.Scott's goal, in general terms, seems to be to develop a system that
> avoids having that property.

Precisely--that's the desirable OTP property that Scott is after.
But he's failing to recognize (or at least describe) his goal clearly. If
he put it as clearly as you have, it would become clear that the number
of decrypts can be no larger than the size of the keyspace, period.

In particular, if the key has fewer bits of entropy than the message, then
potential decrypts will not include all potential messages, period. This
follows from an elementary combinatorial argument.
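Len's bound is easy to exhibit with a toy cipher whose key space can be exhausted. A minimal Python sketch, with sizes (4-bit key, 8-bit messages) chosen purely for illustration:

```python
import random

# Toy cipher: each 4-bit key selects a permutation of the 8-bit
# message space (illustration only, not a real cipher).
def toy_encrypt(key, m):
    rng = random.Random(key)
    perm = list(range(256))
    rng.shuffle(perm)
    return perm[m]

def toy_decrypt(key, c):
    rng = random.Random(key)
    perm = list(range(256))
    rng.shuffle(perm)
    return perm.index(c)

assert toy_encrypt(7, toy_decrypt(7, 0x5C)) == 0x5C  # round-trips

# Decrypt one ciphertext under every possible key: the candidate
# plaintexts number at most 2^4 = 16, out of 256 possible messages.
ciphertext = 0x5C
decrypts = {toy_decrypt(k, ciphertext) for k in range(16)}
assert len(decrypts) <= 16
```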

Len.

-- 
Just because it's automatic doesn't mean it works.
                                -- Dan Bernstein

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic)
Reply-To: [EMAIL PROTECTED]
Date: Tue, 5 Jun 2001 18:03:18 GMT

Paul Pires <[EMAIL PROTECTED]> wrote:
: Tim Tyler <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED]...
:> Paul Pires <[EMAIL PROTECTED]> wrote:
:> : Tim Tyler <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED]...
:> :> Tom St Denis <[EMAIL PROTECTED]> wrote:

:> :> : (How long must this go on?) [...]
:> :>
:> :> Maybe until you realise that an OTP doesn't have perfect secrecy if it's
:> :> dealing with finite files, and converting them to cyphertexts of the same
:> :> length as the plaintexts?
:>
:> : Ehrr?  Why not? [...]
:>
:> See the definition of "perfect secrecy" - e.g.:
:>   http://www.io.com/~ritter/GLOSSARY.HTM#PerfectSecrecy

[...]

:> : It seems to me that a Ct could be from any possible Plaintext of
:> : exactly the same size.
:>
:> That's perfectly correct.
:>
:> : Are you saying that just leaking the size is a lapse in perfect secrecy?
:>
:> Yes.  It is - unless your system is a bizarre one which can only transmit
:> one byte messages.

: I don't believe you answered the question.

Not even when I said "Yes"...?

: You pointed me to Terry's site and a description of perfect secrecy
: in which OTP is an example. It clearly states that a requirement
: is that it have "as much keying information as there is message
: information" and "at least as many keys as messages".

Yes...

: Does "a finite file converted to a ciphertext of the same size" have
: something about it that cannot fulfill this requirement?

It does - provided we're not dealing with messages of only one length.

I am *not* restricting the discussion to a system where all the files are
the same length.

I said "finite files" - and wasn't referring to some subset of them.

: If you're saying that perfect secrecy requires the masking of the true length

Yes.  It does - provided some of the possible messages have different
lengths from one another.

: Well, that's in there already if one chooses to use it. If the information is
: secure, there is nothing to stop someone from securely encrypting an
: instruction to remove some random nonsense that was added to change
: the file length. It is still a finite file where the ciphertext is the
: same size as the message encrypted but the true file size is now
: unknowable right?

It is not clear to me what you're proposing - but you *seem* to be
proposing making the cyphertexts larger than the plaintexts - and that
violates my original condition that the plaintexts and cyphertexts are
the same size:

``an OTP doesn't have perfect secrecy if it's dealing with finite files,
  and converting them to cyphertexts of the same length as the plaintexts''
-- 
__________
 |im |yler  Try my latest game - it rockz - http://rockz.co.uk/

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Def'n of bijection
Reply-To: [EMAIL PROTECTED]
Date: Tue, 5 Jun 2001 18:17:02 GMT

Mark Wooding <[EMAIL PROTECTED]> wrote:
: Tim Tyler <[EMAIL PROTECTED]> wrote:

:> You are totally mistaken.  That strategy can fail after compression, due
:> to the greater number of decompressions that look like plausible
:> messages.

: I'm not sure this can work.

: Let C (our bijective `compression' function) be a permutation on
: {0, 1}^* -- the set of finite bitstrings.  There is a subset P \subseteq
: {0, 1}^* of `plausible' strings.  I assume that P is relatively small
: compared to {0, 1}^*, otherwise the whole effort is rather pointless.

: We claimed that C was a permutation on bitstrings.  So |C(P)| = |P|.
: That is, the proportion of compressed strings whose decompressions are
: plausible is equal to the proportion of plausible strings among all
: bitstrings.

: What have I missed?

It /looks/ like you're doing something like performing sums with
finite sets and expecting the results to make sense when considering
infinite ones.

Here's a function that's bijective from the set of non-negative integers to
itself - yet it makes even numbers over-represented among the small outputs.

0  <-> 0
1  <-> 2
2  <-> 1
3  <-> 4
4  <-> 6
5  <-> 3
6  <-> 8
7  <-> 10
8  <-> 5
9  <-> 12
10 <-> 14

"Even numbers" are like files that compress, and "odd numbers" are like
files that expand.

For any LHS's less than N there are more even numbers than odd on the RHS
- but that doesn't mean that any odd numbers have been missed out -
or that we don't have a bijection.
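For the record, the table follows a closed form: inputs of the shape 3k+2 supply the odd outputs, and the remaining inputs supply the evens in order. The formula below is a reconstruction from the table, not something stated in the post; a Python sketch:

```python
def f(n):
    """Bijection on the non-negative integers matching the table:
    n = 3k+2 -> 2k+1 (the odd outputs); every other n -> the next
    even number.  (Closed form reconstructed from the table above.)"""
    if n % 3 == 2:
        return 2 * (n // 3) + 1
    return 2 * (n - (n + 1) // 3)

# Reproduces the table from the post:
table = {0: 0, 1: 2, 2: 1, 3: 4, 4: 6, 5: 3, 6: 8, 7: 10, 8: 5, 9: 12, 10: 14}
assert all(f(n) == v for n, v in table.items())

# Injective on a large range, yet among the first 999 inputs the even
# outputs are exactly twice as common as the odd ones:
outs = [f(n) for n in range(999)]
assert len(set(outs)) == len(outs)
evens = sum(1 for v in outs if v % 2 == 0)
assert evens == 2 * (len(outs) - evens)
```

No odd number is ever skipped (every 2k+1 appears at input 3k+2), so nothing about the skew breaks bijectivity -- which is the point of the analogy with files that compress and files that expand.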

I /think/ this should cover things.  If not post again.

:> For any possible *cyphertext* there are many possible decompressions -
:> probably about one for each key.

: No more than there are possible plaintexts prior to compression, since
: as claimed the decompression function is bijective.

Indeed not.  Even if there were an infinite number of possible plaintexts
there might still not be one for each key due to duplications.
-- 
__________
 |im |yler  [EMAIL PROTECTED]  Home page: http://alife.co.uk/tim/

------------------------------

From: Kyle Paskewitz <[EMAIL PROTECTED]>
Crossposted-To: alt.privacy,alt.security,alt.security.pgp,alt.privacy.anon-server
Subject: Re: Welcoming another Anti-Evidence Eliminator stooge to USENET (P.
Date: Tue, 5 Jun 2001 14:24:09 -0400


> > You've forgotten that 2 is also prime.  If you take the product of any
> > number of consecutive primes beginning with 2 (the first prime) and add 1,
> > you will get another prime.  E.G.
> >
> > 2*3 + 1 = 7
> > 2*3*5 + 1 = 31
> > 2*3*5*7 + 1 = 211 , etc...
> 
> Really???  I was under the impression that:
> 
> 2*3*5*7*11*13+1 = 30031 = 59*509
> 2*3*5*7*11*13*17+1 = 510511 = 19*97*277
> 2*3*5*7*11*13*17*19+1 = 9699691 = 347*27953
> 2*3*5*7*11*13*17*19*23+1 = 223092871 = 317*703763
> 
> weren't prime.  I must be delusional, I suppose...

Yes, you are correct; I stated that incorrectly.  The correct proof is as
follows: assume N is the product of all the (finitely many) primes.  If
N + 1 is prime, you've contradicted your assumption that N was the product
of all primes.  Otherwise N + 1 is composite, and is a product of primes,
none of which divide N (each leaves remainder 1 when dividing N + 1 into N),
therefore also contradicting your assumption.  QED
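The quoted counterexamples are easy to verify mechanically; a short Python check of the primorial-plus-one numbers (trial-division primality testing is fine at these sizes):

```python
def is_prime(n):
    """Trial-division primality test -- adequate for small n."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

# Primorial + 1 for the first six primes:
primes = [2, 3, 5, 7, 11, 13]
prod, results = 1, []
for p in primes:
    prod *= p
    results.append((prod + 1, is_prime(prod + 1)))

# 3, 7, 31, 211 and 2311 are prime, but 2*3*5*7*11*13 + 1 = 30031
# is composite, 59 * 509 -- exactly the first counterexample quoted.
assert results[3] == (211, True)
assert results[5] == (30031, False)
assert 59 * 509 == 30031
```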



------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Def'n of bijection
Reply-To: [EMAIL PROTECTED]
Date: Tue, 5 Jun 2001 18:44:14 GMT

[EMAIL PROTECTED] wrote:
: Tim Tyler <[EMAIL PROTECTED]> writes:

:> [Compression] also helps with bandwidth - but it doesn't help with
:> security.

: That's what I keep telling you--but you only half believe it. Because next
: you say: 

:> Do you understand that "checking for meaningful content" can become a
:> less effective strategy after compression...?

: So apparently you *do* believe that compression is intended to add
: security. Make up your mind.

You wicked twister ;-)

You snipped the context for the former statement.

This was:

``The case where the key is as large as the original message is
  not where compression helps.''

Compression *doesn't* help with security *if* your key is already the
size of the message.

Under other circumstances, it might well help.

: But no, checking for meaningful content does *not* become a ``less
: effective strategy'' after compression--it just takes more work.

Repeating this error won't make it true.

: In particular, BICOM simply adds work: it adds a decompression step. Why
: is this simple fact eluding you?

It is an error - on your part - not a "simple fact".

:> Say the keys are stored in a key book which is partially destroyed during
:> capture.  Then a brute force search of the remaining keyspace becomes
:> possible.

: And your choice of compression algorithm has essentially no effect on that
: situation. [...]

It can make or break whether a brute force search of the remaining
keyspace uncovers a unique plausible message or not.

Is that "essentially no effect"?

:> Say the keys are generated by a faulty RNG that is 80% predictable.
:> Then a brute force search of the remaining keyspace becomes possible.

: And BICOM has no serious effect on that situation, either.

Oh well, at least it is mainly the same error that you keep repeating.

I assume you're new to this particular discussion.  You apparently hadn't
even heard of BICOM when we started.  I've had these same discussions quite
a few times now, so I certainly /should/ know the ropes by now.

Compression might well help with this.  The cases where it is most likely
to help are where the message is relatively short - in which case it can
increase the number of plausible messages a cryptanalyst has to examine.
-- 
__________
 |im |yler  [EMAIL PROTECTED]  Home page: http://alife.co.uk/tim/

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Def'n of bijection
Reply-To: [EMAIL PROTECTED]
Date: Tue, 5 Jun 2001 18:51:48 GMT

[EMAIL PROTECTED] wrote:
: "Douglas A. Gwyn" <[EMAIL PROTECTED]> writes:

:> If you take a 2048-bit ciphertext from a system that uses a 128-bit
:> key, there are only 2^128 possible plaintexts, nowhere near the 2^2048
:> that an ideal system would have. D.Scott's goal, in general terms,
:> seems to be to develop a system that avoids having that property.

: Precisely--that's the desirable OTP property that Scott is after.
: But he's failing to recognize (or at least describe) his goal clearly. If
: he put it as clearly as you have, it would become clear that the number
: of decrypts can be no larger than the size of the keyspace, period.

AFAICS, neither of you is discussing anything like what David Scott is doing.

: In particular, if the key has fewer bits of entropy than the message, then
: potential decrypts will not include all potential messages, period. This
: follows from an elementary combinatorial argument.

...and is another mistake :-(

The message can have *huge* entropy - even if there are only two possible
messages: "0" and "1".

All that is necessary for the "0" message to have a very large entropy is
for it to be *extremely* rare.  This follows from Shannon's definition of
entropy.
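In Shannon's terms the quantity being invoked here is the self-information of the individual message, -log2 of its probability, which can be large even when the source's average entropy is tiny. A small Python illustration with an assumed probability:

```python
from math import log2

# Suppose the message "0" is sent with probability 2^-20 and "1"
# otherwise (probability chosen purely for illustration).
p0 = 2.0 ** -20
p1 = 1.0 - p0

surprisal_0 = -log2(p0)                       # information in seeing "0"
entropy = -(p0 * log2(p0) + p1 * log2(p1))    # average over the source

assert abs(surprisal_0 - 20.0) < 1e-9         # the rare message carries 20 bits
assert entropy < 0.001                        # yet the *average* is tiny
```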
-- 
__________
 |im |yler  [EMAIL PROTECTED]  Home page: http://alife.co.uk/tim/

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: National Security Nightmare?
Date: Tue, 05 Jun 2001 20:56:30 +0200



"Douglas A. Gwyn" wrote:
> 
> Mok-Kong Shen wrote:
> > If everyone of the world were honest and behaved correctly,
> > laws would have been an absolute nonsense from the very
> > beginning.
> 
> That's nonsense, since "behaving correctly" necessarily
> translates to "following some laws", such as obeying
> traffic signals.  Also, disputes can arise even among
> reasonable people, and a system is necessary to resolve
> them, unless you really prefer that disputes be settled
> by uncontrolled brute force.
> 
> It is possible, and useful, to make a distinction between
> arbitrary regulations vs. rational, objectively defined
> rules.  Nobody of good will should object to the latter.

One can behave correctly according to some unwritten, commonly
esteemed moral, ethical, or religious ideals instead of
paragraphs issued by a governmental organ.  If you
consider such to be 'laws', then yes, these are what I
had in mind in the quote, but that's certainly quite
different from the 'laws' or regulations that were
considered in previous posts in this thread.  BTW, I was
hinting at an evidently impossible utopia (does anyone think
it could ever come true that everyone in the world
would be honest??).

M. K. Shen

------------------------------

From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic)
Reply-To: [EMAIL PROTECTED]
Date: Tue, 5 Jun 2001 18:56:53 GMT

Tom St Denis <[EMAIL PROTECTED]> wrote:
: "Tim Tyler" <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED]...

:> In a system with a 128 bit key, being able to narrow the message down
:> from one of 2^128 possibilities to one of 2^8 possibilities surely
:> represents a gigantic loss of security, no?

: In this case the key is the same length as the message.

In bits: Len(key) = 128.  Len(msg) = 8.  128 != 8.

: What cipher are you talking about that uses a 128-bit key and a 8-bit block?

I never mentioned an 8 bit block cypher.  We're talking about an ordinary
block cypher in CTR mode.
-- 
__________
 |im |yler  [EMAIL PROTECTED]  Home page: http://alife.co.uk/tim/

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list by posting to sci.crypt.

End of Cryptography-Digest Digest
******************************
