Cryptography-Digest Digest #105, Volume #12      Sun, 25 Jun 00 21:13:01 EDT

Contents:
  Analysis of IS-95 CDMA Voice Privacy (Chuck)
  Codebook pages complete at last (Mike Andrews)
  Re: "And the survey says" (Simon Johnson)
  Re: Quantum computing (Simon Johnson)
  Re: Linear Feedback Shift Register *with* lock-up? ("Trevor L. Jackson, III")
  How Uncertain? (Future Beacon)
  Re: "And the survey says" (Jonathan Edwards)
  Re: Tiny hash? (Benjamin Goldberg)
  Re: Variability of chaining modes of block ciphers ("Scott Fluhrer")
  Re: Variability of chaining modes of block ciphers ("Scott Fluhrer")
  Re: Quantum computing ("Douglas A. Gwyn")
  Re: Quantum computing ("Douglas A. Gwyn")
  Re: "And the survey says" ("Paul Pires")
  Re: How Uncertain? (tomstd)
  Re: DES and questions (tomstd)
  Re: Compression and known plaintext in brute force analysis (restatements caused by 
the missing info .... thread) (zapzing)
  Re: "And the survey says" ("Paul Pires")

----------------------------------------------------------------------------

From: Chuck <[EMAIL PROTECTED]>
Subject: Analysis of IS-95 CDMA Voice Privacy
Date: Sun, 25 Jun 2000 17:19:09 -0500
Reply-To: [EMAIL PROTECTED]

Hello,

I understand that this paper will be presented at CAS 2000; however, I
will be at USENIX at the same time.  Is there any way I can get a copy of
this without attending CAS 2000?

The paper is authored by M. Zhang, C. Carroll and A. Chan

Thank you,
Chuck


------------------------------

From: [EMAIL PROTECTED] (Mike Andrews)
Subject: Codebook pages complete at last
Date: Sun, 25 Jun 2000 22:21:25 GMT

The US Army Division Field Code, Training Edition No. 2
pages are all scanned in and connected up, as far as I 
can tell, after a rather long hiatus during which nothing
got done. 

        http://mikea.ath.cx/codebook

-- 
"Tam byl, to delal, futbolka byla defitsit..." [2]
[2] BTDT, T-shirts were in short supply
        -- [EMAIL PROTECTED] (Beebit), 
                in news.admin.net-abuse.email

------------------------------

Subject: Re: "And the survey says"
From: Simon Johnson <[EMAIL PROTECTED]>
Date: Sun, 25 Jun 2000 15:14:49 -0700

Has it actually been proven|disproven that such a perfect cipher
could ever be constructed?

Got questions?  Get answers over the phone at Keen.com.
Up to 100 minutes free!
http://www.keen.com


------------------------------

Subject: Re: Quantum computing
From: Simon Johnson <[EMAIL PROTECTED]>
Date: Sun, 25 Jun 2000 15:25:43 -0700

Hasn't exponential != P been proven?


Got questions?  Get answers over the phone at Keen.com.
Up to 100 minutes free!
http://www.keen.com


------------------------------

Date: Sun, 25 Jun 2000 19:14:12 -0400
From: "Trevor L. Jackson, III" <[EMAIL PROTECTED]>
Crossposted-To: comp.theory,sci.math
Subject: Re: Linear Feedback Shift Register *with* lock-up?

Chris F Clark wrote:

> Ponder wrote:
>
> > I'm looking to use a Linear Feedback Shift Register to implement
> > a counter with minimal resources (i.e. fewer gates and wires than
> > an additive incrementor).
>
> Trevor L. Jackson, III wrote:
>
> > Assume you have a two-cycle LFSR with periods 1 and 2^N-1, and that
> > the singular cycle has the value zero (the classic situation for a
> > maximal-period LFSR).
>
> I played around with this a little bit on paper yesterday.  I assumed
> you were going to implement that LFSR with a set of flip-flops (FFs)
> where the output of each FF chained to the next (in a cyclic ring with
> the Nth FF chaining to the 1st one) with a little logic to implement
> the feedback. I will call the array of FFs a[0] .. a[n].
>
> The 1st LFSR that came to mind was a[i] = a[i] ^ a[i-1] except a[0] =
> a[n]--no xor for the loop-around stage.  That has some nice
> properties.  000... is the 1 period loop.  00001 -> ... -> 1111111 is
> the 2**n-1 period loop.  Trevor's pattern detector of an and of all
> bits works as the saturation detector.

The key to the low gate count is using wire rather than extra gates for the
detector.

The proposed feedback pattern is not maximal for all values of N.

>
>
> Another LFSR is even more minimal, with a[i] = a[i-1], except a[0] =
> a[n] and a[1] = a[0] ^ a[n].  This uses only 1 xor gate. 000... is the
> 1 period loop and 0001 -> ... -> 100000 is the 2**n-1 period loop.
> The saturation detector is almost the same as the previous one, except
> all the bits but the high bit are fed in complemented.  If you are
> doing this on an ASIC with FFs but without ROM cells, I don't think
> you can get (much) more minimal.

This feedback pattern is not classical because the input bit is labeled one rather
than zero.  If all bits i are relabeled as i-1 and bit zero relabeled as bit n-1,
you get a classical bit ordering.  Only a few values of N give maximal period with
this feedback pattern.
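
(For concreteness, a small C sketch of an LFSR used as a counter with a
lock-up state.  The 16-bit Galois register and tap mask 0xB400 below are
just a standard maximal-period example, not the exact feedback pattern
discussed in this thread; in software, detecting the all-zero lock-up
state is a single compare.)

#include <stdint.h>
#include <stdio.h>

/* LFSR-as-counter sketch.  The 16-bit Galois register with tap mask
 * 0xB400 is a standard maximal-period example (period 2^16 - 1); the
 * all-zero state is the length-1 "lock-up" cycle that never advances. */
static uint16_t lfsr_step(uint16_t s)
{
    uint16_t lsb = s & 1u;
    s >>= 1;
    if (lsb)
        s ^= 0xB400u;       /* feedback taps */
    return s;
}

int main(void)
{
    uint16_t s = 1u;        /* any nonzero seed; 0 would stay at 0 */
    unsigned long count = 0;

    do {
        s = lfsr_step(s);
        ++count;
    } while (s != 1u);

    printf("period = %lu (expect 65535)\n", count);
    return 0;
}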

>
>
> As to figuring out the values for defining which settings of the FFs
> give which counts before looping, it can be done in linear time (with
> linear space) if you count operations the same way I do.  (It's n**2
> if you count them at the bit level.)

Hmm, linear in 2^N maybe, but that's exponential in the length of the register.

>
>
> Here is some pseudo C-code for it:
>
> int next[n], prev[n], count[n], which[n];
> int i, c;
>
> for (i = 0; i < n; ++i) {
>         next[i] = 0;
>         prev[i] = 0;
>         count[i] = 0;
>         which[i] = 0;
>         }
>
> // now the values are correct for [0] and initialized for all others
>
> c = n - 1;
> for (i = 1; next[i] == 0; i = next[i]) {
>         next[i] = a[i];
>         prev[next[i]] = i;
>         count[i] = c;
>         which[c] = i;
>         --c;
>         }
>
> // now the values are correct for the entire array
> // thus, count[i] gives the number of entries before saturation,
> // and, which[i] gives the value to pick to be i away from saturation

How do you plan to do this for a 64-bit register?  Even if you could allocate four
arrays with 8-byte elements containing 2^64 such elements (2^69 bytes), how long
do you expect it to take your computer to count to 2^64?

>
>
> This works well for LFSR that have two cycles, 1 and 2**n-1.  With
> another loop or two, one can get a linear algorithm for any FSA's

I think you left out an adjective:  " ... algorithm for any _*/small/*_ FSA's ..."

>
> (with state numbered 0..n) that end with a cycle, even if there are
> some elements that are not part of the cycle but only lead into it at
> various points.
>
> Hope this helps,
> -Chris
>
> *****************************************************************************
> Chris Clark                    Internet   :  [EMAIL PROTECTED]
> Compiler Resources, Inc.       Web Site   :  http://world.std.com/~compres
> 3 Proctor Street               voice      :  (508) 435-5016
> Hopkinton, MA  01748  USA      fax        :  (508) 435-4847  (24 hours)
> ------------------------------------------------------------------------------


------------------------------

From: Future Beacon <[EMAIL PROTECTED]>
Subject: How Uncertain?
Date: Sun, 25 Jun 2000 19:03:04 -0400



Files A and B are composed of the least significant two bits of
bytes found in news group messages (excluding headers, carriage
returns, line feeds, and spaces).  A and B contain one megabyte
each of this data.  In each case, these bits are strung together,
four pairs per byte.  At least three quarters of the original data
available in the news group messages is not used.

Then, file B is divided into two files (BP and BQ) this way:  If the
first bit in A is a 0, then the first bit in B becomes the first bit
in BP.  If the first bit in A is a 1, the first bit in B becomes the
first bit in BQ.  In this way, each next bit in A determines whether
the next bit in B becomes the next bit in BP or BQ.  When the bits
of A and B are exhausted, BQ is appended to BP and the resultant
file is called RAND.
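
(For reference, a small C sketch of the split just described, on toy-sized
buffers rather than one-megabyte files.  The bit order within a byte, MSB
first here, and the sample contents of A and B are assumptions; the
description above does not specify them.)

#include <stdio.h>
#include <stdint.h>

#define NBYTES 8

static int get_bit(const uint8_t *buf, size_t i)
{
    return (buf[i / 8] >> (7 - i % 8)) & 1;
}

static void put_bit(uint8_t *buf, size_t i, int b)
{
    if (b)
        buf[i / 8] |= (uint8_t)(1u << (7 - i % 8));
}

int main(void)
{
    uint8_t A[NBYTES]  = {0xA5, 0x3C, 0x0F, 0xF0, 0x55, 0xAA, 0x99, 0x66};
    uint8_t B[NBYTES]  = {0x12, 0x34, 0x56, 0x78, 0x9A, 0xBC, 0xDE, 0xF0};
    uint8_t BP[NBYTES] = {0}, BQ[NBYTES] = {0}, rand_out[2 * NBYTES] = {0};
    size_t np = 0, nq = 0, i;

    /* Each bit of A routes the matching bit of B into BP (A bit 0) or BQ (A bit 1). */
    for (i = 0; i < 8 * NBYTES; i++) {
        if (get_bit(A, i) == 0)
            put_bit(BP, np++, get_bit(B, i));
        else
            put_bit(BQ, nq++, get_bit(B, i));
    }

    /* RAND is BP followed by BQ (bit-level concatenation). */
    for (i = 0; i < np; i++)
        put_bit(rand_out, i, get_bit(BP, i));
    for (i = 0; i < nq; i++)
        put_bit(rand_out, np + i, get_bit(BQ, i));

    printf("BP bits: %zu, BQ bits: %zu, first RAND byte: %02X\n",
           np, nq, rand_out[0]);
    return 0;
}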

Two people wishing to send and receive encrypted messages use the
one-time-pad method but instead of using a random number generator
to create their shared random numbers, they use the file called
RAND.

Nobody except these two people knows how the news group messages are
selected, and all of the files are kept secret.  Does anybody know
how to attack the encrypted messages?  Can anybody characterize the
degree of orderliness or predictability of RAND (without knowing A
or B)?

Thank you for your help.


Jim Trek
Future Beacon Technology
http://eznet.net/~progress
[EMAIL PROTECTED]



------------------------------

From: Jonathan Edwards <[EMAIL PROTECTED]>
Subject: Re: "And the survey says"
Date: Sun, 25 Jun 2000 20:01:23 -0400



On Sun, 25 Jun 2000, Scott Fluhrer wrote:

> 
> For that matter, I don't know of any cipher (practical or impractical)
that
> has property #1 (with an Oracle, you can break an OTP trivially). 
> 
> [1] The constraint on the Oracle is that you can't ask the decryption of the
> input ciphertext.

OK, I'm dense.

How does the Oracle break the OTP?  Do you ask it the bits of the key?


------------------------------

From: Benjamin Goldberg <[EMAIL PROTECTED]>
Subject: Re: Tiny hash?
Date: Mon, 26 Jun 2000 00:05:53 GMT

Actually, what I want it for is to make something similar to the lja1
cipher I saw in a contest on this NG.

The important properties that the function should have, are 1) if any
bit in the key changes, approximately half the bits of the hash output
change for any fixed input, and 2) if any bit in the input changes,
approximately half the bits of the output change for any fixed key.

The input lengths that I'll be using the hash function with will be 1,
3, 7, 15, or 31 bytes.  (Sizes 1 and 3 will be for testing, not use,
though).

One should note that the keyspace of a well-implemented version of
lja1 is log2(255!) bits.  I don't, however, know if there are any weak
keys.  

I found the LJA1 cipher at
<http://www.wizard.net/~echo/crypto-contest.html>, and I recently posted
my own implementation to the NG asking if anyone had done any work on
breaking it.  (No-one seems to have done so).

A symbolic version of a single round of the cipher is:
for i = 1 to blocksize
        x = H_K( data[i+1..blocksize] || data[0..i-1] )
        data[i] ^= x + counter
        counter = counter + 1
end for
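
(In C, one round of that structure might look like the sketch below.
hk() is only a placeholder for the keyed hash H_K, not the real LJA1
component, and 0-based indexing is assumed.)

#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

/* One round of the structure sketched above.  hk() stands in for the
 * keyed hash H_K; indexing is 0-based. */

typedef uint8_t (*keyed_hash_fn)(const uint8_t *msg, size_t len,
                                 const uint8_t *key, size_t keylen);

static void lja1_like_round(uint8_t *data, size_t blocksize,
                            const uint8_t *key, size_t keylen,
                            uint8_t *counter, keyed_hash_fn hk)
{
    uint8_t buf[64];                      /* assumes blocksize <= 64 */

    for (size_t i = 0; i < blocksize; i++) {
        /* Hash every byte except data[i]: the bytes after it, then before it. */
        size_t n = 0;
        for (size_t j = i + 1; j < blocksize; j++) buf[n++] = data[j];
        for (size_t j = 0; j < i; j++)             buf[n++] = data[j];

        uint8_t x = hk(buf, n, key, keylen);
        data[i] ^= (uint8_t)(x + *counter);
        (*counter)++;
    }
}

/* Toy stand-in hash, just so the sketch compiles and runs. */
static uint8_t toy_hk(const uint8_t *msg, size_t len,
                      const uint8_t *key, size_t keylen)
{
    uint8_t h = 0x55;
    for (size_t i = 0; i < len; i++)    h = (uint8_t)((h ^ msg[i]) * 167u + 1u);
    for (size_t i = 0; i < keylen; i++) h = (uint8_t)((h ^ key[i]) * 167u + 1u);
    return h;
}

int main(void)
{
    uint8_t data[7] = "secret";          /* 7-byte block, one of the sizes above */
    uint8_t key[4]  = {1, 2, 3, 4};
    uint8_t counter = 0;

    lja1_like_round(data, sizeof data, key, sizeof key, &counter, toy_hk);
    for (size_t i = 0; i < sizeof data; i++) printf("%02X ", data[i]);
    printf("\n");
    return 0;
}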

Rex Stewart wrote:
> 
> Maybe he mis-stated his needs.
> 
> Perhaps what he really needs is a keyed CRC.  If his opponet is
> a mindless virus, or non crypto educated hacker, and he keeps
> both the CRC and Key off line he might think 8 bits is enough.
> (I disagree, but He might think that.)
> 
> In this case, what he probably needs to find is a good
> ASM programmer who can fiddle with the old CRC-16 routine
> to create a keyed version. (I still think he would need at
> least 32 bits, and in most cases 32 bits would actually be
> faster - but that is my opinion.)
> 
> --
> Rex Stewart
> PGP Print 9526288F3D0C292D  783D3AB640C2416A
> 
> In article <[EMAIL PROTECTED]>,
>   Simon Johnson <[EMAIL PROTECTED]> wrote:
> > None exist because an 8-bit output is useless.....
> > Using the Birthday attack you need to search just 2^4 = 16
> > texts to find a collision.
> >
> > Like i said, useless
> >
> > Got questions?  Get answers over the phone at Keen.com.
> > Up to 100 minutes free!
> > http://www.keen.com
> >
> >
> 
> Sent via Deja.com http://www.deja.com/
> Before you buy.


------------------------------

From: "Scott Fluhrer" <[EMAIL PROTECTED]>
Subject: Re: Variability of chaining modes of block ciphers
Date: Sun, 25 Jun 2000 16:49:41 -0700


Mok-Kong Shen <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
>
>
> Mark Wooding wrote:
>
> > Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
> >
> > > O.k. We could discuss on that. Please give your arguments. (Note that
> > > this issue is independent of the issue of whether chaining only adds
> > > a couple of bits to the key space.)
> >
> > No.  Just using *a* chaining mode doesn't affect the key space at all.
> > Using a secretly chosen chaining mode will add a few effective key
> > bits.
>
> If you use a secret IV, that should help a little bit.

Depends on the mode.  If we're talking about some variant of CBC or CFB, it
won't help at all (as long as the message is at least two blocks long) --
the attacker selects some block other than the first.

>
> > > (I could have instead given an example like this: My boss, who happens
> > > to be a friend of an amateur crypto designer, insists on using and
> > > buying the software developed by the latter, the security of which I
> > > am however not very sure. Do you find this version of an example
> > > better for you?)
> >
> > Not really.  Either explain to your boss that he needs to get a clue
> > from somewhere, or find a different boss.
>
> I am not sure that's always a recommendable strategy (especially
> in times of unemployment problems).

And even if that's not a concern, it'd be a shame to blow off a good boss
with one quirk.  Alternative strategy: implement the amateur crypto design,
but preprocess the plaintext just a bit, sort of like a "prewhitening" phase.
For example, pass it through Rijndael first. (1/2 :-)

>
> [snip]
>
> > > O.k. Suppose that the best cipher you could manage to get is not
> > > good enough, but almost. Suppose adding the strength due to chaining
> > > renders it above the capability of the opponent. Would you care to use
> > > chaining in this situation or not?
> >
> > I'm disputing whether this situation can happen at all!  It calls for an
> > impractically fine judgement of the adversary's capabilities.
>
> I was providing a 'boundary' case for consideration. One's judgement
> in crypto is indeed in my opinion necessarily fairly subjective and gross
> in general. But there could also be (admittedly rare) special situations
> where one could have sufficient materials to do fairly accurate estimate
> of the opponent's capability. For example, if one somehow knows
> the computing power of the opponent and brute force is the analysis
> method being used.

You rarely get precise estimates of the computer power available to the
opponent.  If nothing else, opponents tend to stay around for a while, and
the computer power available tends to rise unpredictably over time.


--
poncho




------------------------------

From: "Scott Fluhrer" <[EMAIL PROTECTED]>
Subject: Re: Variability of chaining modes of block ciphers
Date: Sun, 25 Jun 2000 16:56:20 -0700


Mok-Kong Shen <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
>
>
> Scott Fluhrer wrote:
>
> > Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
> > > The most popular block chaining mode seems to be CBC.
> > > There is also PBC which chains with plaintext blocks.
> > > One can also accumulate the previous blocks for doing the
> > > chaining and use plaintext as well as ciphertext for
> > > chaining. (I used this in one of my own designs.) By
> > > combinatorics this gives 8 variants. Obviously one can
> > > also use modular addition instead of xor and do some
> > > random rotations if one likes. I think that the variability
> > > of chaining modes could be advantageously exploited such
> > > that the actual chaining mode used in a message has to be
> > > guessed by the opponent, thus rendering his task much more
> > > difficult.
> > Why would it?  All these operations can be summarized as:
> >
> >   C_i = Encrypt( F( P_i, C_{i-1}, P_{i-1} ) )
> >
> > for a relatively small set of F's, where:
> >
> >   P_i is the ith plaintext block
> >   C_i is the ith ciphertext block
> >   Encrypt is the underlying block cipher (on a single block in encrypt mode)
> >
> > Given that the number of possible F's is relatively small, and assuming
> > that P_i, P_{i-1} are known (that is, this is the known-plaintext attack),
> > the attacker can easily compute all 8 possible values of F( P_i, C_{i-1},
> > P_{i-1} ), and then do a brute force calculation of:
> >
> >    Decrypt( C_i )
> >
> > looking for any of the 8 possible values.  When he finds one, it is easy
> > to verify if he got the key, or a false hit.
> >
> > This allows the attacker to look through all 8 possibilities with a
> > relatively small additional effort beyond a brute force search of the
> > underlying block cipher.  Thus, you have gained no additional security
> > for your additional complexity.
>
> How about the case where IVs are secret? I am not claiming tremendous
> gain but some gain that may overweigh the cost.
As I mentioned upthread, it doesn't help if the attacker can select i>1.

> As to added complexity,
> I'll say that's very trivial according to experience of implementation
> of one of my own algorithm.
I take it you didn't try to implement the algorithm in hardware.  In any
case, it appears that to implement this efficiently, you need either to have
N copies of the encryption/chaining routine (uses more memory), or a single
copy that can dynamically select between chaining modes (which is a bit
slower).  Now, since it appears that this doesn't gain any additional
security, I don't see how either cost is worth it, even though the costs are
fairly small in both cases.
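
(To make the earlier known-plaintext point concrete, here is a toy C
sketch of the search: one trial decryption per candidate key, compared
against all eight candidate F() outputs.  The 8-bit "cipher" and the
particular set of F variants are illustrative stand-ins, not anyone's
actual design.)

#include <stdio.h>
#include <stdint.h>

/* Brute force over the key while testing all chaining variants at once.
 * The point is only that the secret chaining mode costs the attacker
 * eight cheap comparisons per trial key, i.e. at most ~3 key bits. */

static uint8_t toy_encrypt(uint8_t p, uint8_t k)
{
    return (uint8_t)((uint8_t)((p + k) * 167u) ^ 0x5Au);
}

static uint8_t toy_decrypt(uint8_t c, uint8_t k)
{
    return (uint8_t)((uint8_t)((c ^ 0x5Au) * 23u) - k);   /* 23 = 167^-1 mod 256 */
}

static uint8_t F(int v, uint8_t p, uint8_t cprev, uint8_t pprev)
{
    switch (v) {
    case 0: return p;                                  /* ECB              */
    case 1: return p ^ cprev;                          /* CBC              */
    case 2: return p ^ pprev;                          /* plaintext chain  */
    case 3: return p ^ cprev ^ pprev;                  /* both             */
    case 4: return (uint8_t)(p + cprev);               /* additive chains  */
    case 5: return (uint8_t)(p + pprev);
    case 6: return (uint8_t)(p + cprev + pprev);
    default: return (uint8_t)((p + cprev) ^ pprev);
    }
}

int main(void)
{
    const uint8_t key = 0xC3;          /* secret */
    const int variant = 3;             /* secret chaining mode */

    /* Known plaintext blocks P_{i-1}, P_i and observed C_{i-1}, C_i
       (for the toy, C_{i-1} is simply taken as observed on the wire). */
    const uint8_t p_prev = 'H', p_i = 'I';
    const uint8_t c_prev = toy_encrypt(p_prev, key);
    const uint8_t c_i    = toy_encrypt(F(variant, p_i, c_prev, p_prev), key);

    for (unsigned k = 0; k < 256; k++) {
        uint8_t d = toy_decrypt(c_i, (uint8_t)k);      /* one trial decryption */
        for (int v = 0; v < 8; v++)                    /* eight cheap compares */
            if (d == F(v, p_i, c_prev, p_prev))
                printf("candidate: key %02X, variant %d%s\n",
                       k, v, (k == key && v == variant) ? "  <-- true" : "");
    }
    return 0;
}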


--
poncho





------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Quantum computing
Date: Mon, 26 Jun 2000 00:26:50 GMT

Bill Unruh wrote:
> We have no idea at all of what is involved with building a quantum
> computer with an internal memory of more than say 10 bits.

Wrong.

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Quantum computing
Date: Mon, 26 Jun 2000 00:27:16 GMT

Bill Unruh wrote:
> Uh, no. Error correction itself requires incredibly unnoisy qbits. The
> raw level of error must be less than about 10^-4 for one even to dream
> of error correction.

Wrong again.

------------------------------

From: "Paul Pires" <[EMAIL PROTECTED]>
Subject: Re: "And the survey says"
Date: Sun, 25 Jun 2000 17:23:04 -0700


David A Molnar <[EMAIL PROTECTED]> wrote in message
news:8j5spb$1gc$[EMAIL PROTECTED]...
> Paul Pires <[EMAIL PROTECTED]> wrote:
> > Scott Fluhrer <[EMAIL PROTECTED]> wrote in message
> > news:8j5c0i$tr6$[EMAIL PROTECTED]...
> >>
>
> >> 1. Provable security.  A formal proof that any Turing machine with an
> >> encryption/decryption constrained [1]oracle, given a ciphertext encrypted
> >> with a random key, is unable to find the corresponding plaintext with
> >> probability greater than epsilon in less than (huge number) of steps.  Or, a
> >> proof of something at least as interesting.
>
> > You should probably incude the inability of the attack to determine any
> > portion of the key with any probability greater than random chance.
>
> well, first of all, you care about the message, not about the key. right?
> In a previous thread here, someone (my apologies, cannot remember who)
> pointed out LION and BEAR as examples of schemes where you could prove
> that the key could never be recovered from the ciphertext...but where
> the cipher turned out to be an identity permutation for some keys and
> some plaintexts.

Seems to me that if a key is used more than once, the secrecy of the key has
value separate from any one plaintext.  Your comment is not taken lightly,
though: a secure key does not necessarily equal secure plaintext.

> second, you should also maybe consider a cipher where even if you DO know
> part of the key, it does not help you any more than cutting down on your
> guessing time.

Or an indication that it is beginning to succumb to cryptanalysis?

> (question of the moment : is this notion of "no help from partial key
> exposure" implied by semantic security? my guess would be no, because
> you can take any semantically secure cryptosystem C and create a C'
> which uses only the first half of the key or something like that).
>
> > But that's cheating! I feel bad enough fantasizing about the Ideal cipher;
> > a postulated God machine makes even me a little edgy. More to the point,
> > what would be the attributes of the cipher that make it pass such a test
> > if it existed?
>
> The thing is, we *have* public-key ciphers which *do* satisfy those
> kinds of definitions, subject to the major caveat that we have a
> trapdoor one-way function. The definitions are so "strong" because
> they are trying to capture all the chosen plaintext and chosen ciphertext
> attacks which anyone could ever think of. Yes, it's a "God Machine,"
> but only because our adversary must be considered as next to God if
> we are not to fall into the trap of underestimation.

You cause my head to hurt. Might be good. I specifically focused on
symmetric ciphers as a concept I can wrap my brain around, not as an
indication of the relative merits of symmetric vs. asymmetric. I plead
self-defense, not intelligence.

>
> RSA with Optimal Asymmetric Encryption Padding
> is an "almost"-example of such a system. I say "almost" because the
> proof uses "random oracles" and so is not considered by some to be
> totally kosher (I can expand on that if anyone wants, but it's a tangent).
> A better example would be the Cramer-Shoup system from Crypto '98,
> which they managed to show is "secure" in the kind of sense given above
> if and only if the "decision diffie hellman problem" is hard.
>
> Such ciphers tend to be randomized. Encrypting involves not only the key,
> but some random bits as well to make the output look more random.
> This requires a better treatment that I don't have time to write
> right now. You can try looking at Bellare & Goldwasser's lecture notes
> and/or Goldreich's fragments of a chapter on encryption schemes to
> see some of the flavor of the ciphers which satisfy these definitions.
> Also the Optimal Asymmetric Encryption Padding paper.

Thanks for the links.

> check http://www-cse.ucsd.edu/users/mihir/
>
> and http://theory.lcs.mit.edu/~oded/
>
> and follow the publication links.
>
> There's a lot of work that I know of on building public-key randomized
> cryptosystems, but I don't know of that much on building randomized
> symmetric ciphers...maybe because the people working in each area do not
> seem to overlap very much.
>
> Jon Katz and Moti Yung had a neat paper in the last STOC on what
> definitions of security for symmetric cryptography would look like.
> I don't have it on me now, but looking at it might be instructive.

I heard somewhere (and I could be wrong)  that Phil Rogaway (SP?) was doing
something similar.

> -dmolnar

Thanks

Paul






------------------------------

Subject: Re: How Uncertain?
From: tomstd <[EMAIL PROTECTED]>
Date: Sun, 25 Jun 2000 17:20:35 -0700

Future Beacon <[EMAIL PROTECTED]> wrote:
>
>
>Files A and B are composed of the least significant two bits of
>bytes found in news group messages (excluding headers, carriage
>returns, line feeds, and spaces).  A and B contain one megabyte
>each of this data.  In each case, these bits are strung together,
>four pairs per byte.  At least three quarters of the original data
>available in the news group messages is not used.


Maybe you missed a very important point.  ASCII DATA IS NOT
RANDOM!!!

There will be strong biases from ANY bit you take from ascii
characters...

Tom

Got questions?  Get answers over the phone at Keen.com.
Up to 100 minutes free!
http://www.keen.com


------------------------------

Subject: Re: DES and questions
From: tomstd <[EMAIL PROTECTED]>
Date: Sun, 25 Jun 2000 17:23:33 -0700

rick2 <[EMAIL PROTECTED]> wrote:
>I have two more questions about implementing DES in a program.  (Feel
>free to Dope-Slap(TM) me if these are too close to outright idiotic.)
>
>1. It turns out that 3-DES is kind of slow, so any speed-up tricks
>would be worth it. So, not very brilliant, but what about 2-DES?
>Would that have an encryption strength of 128 (or should it be 112)
>bits? Geez, that seems pretty strong, no? Especially if people have
>been using only two different keys for 3 rounds of DES, why not just
>do the two with two keys?
>

I don't know the very specifics, but I believe the meet-in-the-middle
attack is much more efficient for an even number of DES applications,
so 2-DES or 4-DES would be easier to break than 3-DES or 5-DES.

Here's a tip.  DONT USE DES!
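
(A rough illustration of the meet-in-the-middle idea, with a toy 8-bit
"cipher" and 8-bit keys standing in for DES: instead of trying all 2^16
key pairs, tabulate the 2^8 forward encryptions and match them against
2^8 trial decryptions.  The leftover false hits are what a second known
plaintext/ciphertext pair would weed out.)

#include <stdio.h>
#include <stdint.h>

/* Toy meet-in-the-middle against double encryption.  The attack needs
 * roughly 2*256 cipher operations instead of 256*256 key-pair trials. */

static uint8_t toy_e(uint8_t p, uint8_t k)
{
    return (uint8_t)((uint8_t)((p ^ k) * 197u) + 11u);
}

static uint8_t toy_d(uint8_t c, uint8_t k)
{
    return (uint8_t)((uint8_t)((uint8_t)(c - 11u) * 13u) ^ k);  /* 13 = 197^-1 mod 256 */
}

int main(void)
{
    const uint8_t k1 = 0x3A, k2 = 0xC5;         /* secret key pair        */
    const uint8_t p  = 'X';
    const uint8_t c  = toy_e(toy_e(p, k1), k2); /* one known (p, c) pair  */

    /* Forward table: middle value -> the k1 that produced it
       (bijective for this toy cipher, so a flat array suffices). */
    int k1_for_mid[256];
    for (int a = 0; a < 256; a++)
        k1_for_mid[toy_e(p, (uint8_t)a)] = a;

    /* Meet in the middle: one trial decryption per k2, then a table lookup. */
    int candidates = 0;
    for (int b = 0; b < 256; b++) {
        int a = k1_for_mid[toy_d(c, (uint8_t)b)];
        candidates++;
        if ((uint8_t)a == k1 && (uint8_t)b == k2)
            printf("true pair found: k1=%02X k2=%02X\n", a, b);
    }
    printf("%d candidate pairs from ~512 cipher operations; "
           "a second known pair would weed out the false hits\n", candidates);
    return 0;
}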

>2. I think it would also be convenient for the user to have some sort
>of a check when they enter the password, to alert them if they made
>a mistake, but obviously I don't want to store the real password in
>their data file. How can I do this without compromising the security
>too much? Would it be acceptable to simply add up all the characters
>in the password and then save that as a single byte "checksum" of
>the password? I guess that would reduce the security, but by how much
>(like maybe by 8 bits worth, if that is a measure of security?)
>Would it be better to save a stronger type of hash of the password, or
>would that also not really help the security either? I think I only
>need to save 8 bits because it's OK with me if they make an error
>that isn't rejected 1 out of 256 times...

Why not just encrypt a fixed block?  The chances of getting the
password wrong and the block right would be slim.
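
(Something like the sketch below, with a toy stand-in for the real
cipher and key setup, since neither is specified here.)

#include <stdio.h>
#include <string.h>
#include <stdint.h>

/* Store E_K(constant) instead of the password; on entry, re-derive the
 * key, re-encrypt the same constant, and compare.  toy_encrypt() and
 * key_from_password() are placeholders, not DES or a real KDF. */

static void toy_encrypt(const uint8_t key[8], const uint8_t in[8], uint8_t out[8])
{
    for (int i = 0; i < 8; i++)                     /* xor then rotate left by 3 */
        out[i] = (uint8_t)(((in[i] ^ key[i]) << 3) | ((in[i] ^ key[i]) >> 5));
}

static void key_from_password(const char *pw, uint8_t key[8])
{
    memset(key, 0, 8);
    for (size_t i = 0; pw[i]; i++)
        key[i % 8] ^= (uint8_t)pw[i];               /* toy key derivation */
}

int main(void)
{
    static const uint8_t fixed_block[8] = "verify!";     /* public constant */
    uint8_t key[8], stored[8], check[8];

    /* At setup: save E_K(constant) in the data file, not the password. */
    key_from_password("correct horse", key);
    toy_encrypt(key, fixed_block, stored);

    /* At entry: re-derive the key from what the user typed and compare. */
    const char *attempts[2] = { "correct horse", "correct hoarse" };
    for (int i = 0; i < 2; i++) {
        key_from_password(attempts[i], key);
        toy_encrypt(key, fixed_block, check);
        printf("\"%s\": %s\n", attempts[i],
               memcmp(check, stored, 8) == 0 ? "accepted" : "typo detected");
    }
    return 0;
}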

Tom


Got questions?  Get answers over the phone at Keen.com.
Up to 100 minutes free!
http://www.keen.com


------------------------------

From: zapzing <[EMAIL PROTECTED]>
Subject: Re: Compression and known plaintext in brute force analysis (restatements 
caused by the missing info .... thread)
Date: Mon, 26 Jun 2000 00:20:33 GMT

In article <8j5ak1$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] (Guy Macon) wrote:
> zapzing wrote:
> >
> >
> >In article <[EMAIL PROTECTED]>,
> >  "Douglas A. Gwyn" <[EMAIL PROTECTED]> wrote:
> >> zapzing wrote:
> >> > And another problem: suppose a random (or already encrypted)
> >> > file is encrypted. Then an amount of predictable
> >> > information will be added to a file that was previously
> >> > perfectly unpredictable.
> >>
> >> ? That is false in general.
> >
> >A "compression" program cannot "compress" all
> >files, while still maintaining the same amount
> >of information. If you attempt to compress a
> >random file, therefore, the resultant
> >"compressed" file will actually be larger.
> >Either there must be a short header (perhaps
> >only one bit) that states "this looks random
> >to me - I couldn't compress it" or some sort
> >of table must be transmitted which states
> >something like "each character represents
> >itself" or something like that (I cannot
> >help being vague because the exact situation
> >depends on the exact compression algorithm
> >used.)
> >
> >If you still disagree, I challenge you to
> >present a "compression" algorithm that will
> >compress *all* files without loss of
> >information.
>
> One slight correction;  The above is true for some random files
> and not true for others.  A 100K random file has a very small
> but still non-zero chance of being all zeros, a chapter from
> shakespeare, etc.
>
> No compression program can compress all random files, but all
> compression programs can compress some random files.

I still maintain that, with the exception of your extremely improbable
(but, I concede, possible) scenario, if random numbers are sent through
a compression program, the result will in all probability have less
entropy than the input file.
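
(For what it's worth, the counting argument behind the challenge quoted
above is easy to tabulate:)

#include <stdio.h>

/* Pigeonhole tabulation of the "can't compress everything" point: there
 * are 2^n distinct n-bit inputs but only 2^n - 1 outputs of length
 * strictly less than n bits, so no lossless scheme shrinks them all. */
int main(void)
{
    for (int n = 1; n <= 16; n++) {
        unsigned long inputs  = 1ul << n;
        unsigned long shorter = (1ul << n) - 1;   /* 2^0 + 2^1 + ... + 2^(n-1) */
        printf("n = %2d: %6lu inputs, %6lu shorter outputs\n",
               n, inputs, shorter);
    }
    return 0;
}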

--
If you know about a retail source of
inexpensive DES chips, please let
me know,  thanks.


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: "Paul Pires" <[EMAIL PROTECTED]>
Subject: Re: "And the survey says"
Date: Sun, 25 Jun 2000 17:35:22 -0700


Simon Johnson <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> Has it actually been proven|disproven that such a perfect cipher
> could ever be constructed?

I have no idea how such a proof could be attained. My knee jerk reaction is
"Of course not".

Paul

>
> Got questions?  Get answers over the phone at Keen.com.
> Up to 100 minutes free!
> http://www.keen.com
>





------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
