Cryptography-Digest Digest #932, Volume #13      Sun, 18 Mar 01 11:13:00 EST

Contents:
  IDEAL ENGLISH TEXT RIJNDAEL ENCRYPTION (SCOTT19U.ZIP_GUY)
  Re: What do we mean when we say a cipher is broken?  (Was Art of Cryptography) 
(Benjamin Goldberg)
  Re: How to eliminate redondancy? (SCOTT19U.ZIP_GUY)
  Re: How to eliminate redondancy? (Nicol So)
  Re: An extremely difficult (possibly original) cryptogram (Factory)
  Re: What do we mean when we say a cipher is broken?  (Was Art of    (Benjamin 
Goldberg)
  Re: PGP "flaw" (Tom McCune)
  Profile analysis and known plaintext  (Steve Portly)
  Re: How to eliminate redondancy? (SCOTT19U.ZIP_GUY)
  Re: How to eliminate redondancy? ("Tom St Denis")
  Re: IDEAL ENGLISH TEXT RIJNDAEL ENCRYPTION ("Tom St Denis")

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: IDEAL ENGLISH TEXT RIJNDAEL ENCRYPTION
Date: 18 Mar 2001 15:08:33 GMT

 Dear folks, I know it's hard to get most of you motivated.
But would there be interest in making a "perfect English"
encryption using anyone's favorite fixed-block-size cipher,
such that any wrong key would lead to valid English text?

 I could help in such a project, but I am not sure I could use
it myself since I spell so badly. But here is the outline of
how to do it.

Get a list of allowed English words. Each word is its spelling
plus a trailing space. Only one case is allowed, upper or lower.
The list has each word followed by its weighted frequency of
occurrence in English.

A computer program takes the list and builds a Huffman tree,
or we use a weighted PPM model along the lines of Matt
Timmermans' compressor code.

When a message is to be encoded, only the dictionary words are
allowed. The compression is fully bijective to an 8-bit-byte
binary file. Then you encrypt using your favorite block cipher.

When an attacker tries a wrong key, it will always decrypt to
a valid message full of English words.

What do you folks say to this? It could easily be automated.
For words that are not in the dictionary, one could also have
40 symbols for the 40 English sounds that make up the language.
(I heard there were 40, but use whatever the number is.)

Hell, you could just use the sounds for words and have a
dictionary that only associates word spellings with the sounds,
and use symbols for the sounds if one wants.
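
Here is a rough sketch in Python of the decode side of the idea.
It is only an illustration: the five-word dictionary and weights
are made up, and a real version must also handle the leftover
bits at the end of the stream to be truly bijective.

    import heapq

    def huffman_code(weights):
        # weights: word -> frequency; returns word -> bit string
        heap = [(w, i, {word: ""})
                for i, (word, w) in enumerate(weights.items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            w1, _, c1 = heapq.heappop(heap)
            w2, _, c2 = heapq.heappop(heap)
            merged = {k: "0" + v for k, v in c1.items()}
            merged.update({k: "1" + v for k, v in c2.items()})
            heapq.heappush(heap, (w1 + w2, count, merged))
            count += 1
        return heap[0][2]

    # Made-up dictionary: each "word" carries its trailing space.
    words = {"the ": 50, "cat ": 10, "sat ": 10, "on ": 20, "mat ": 10}
    decode = {bits: word for word, bits in huffman_code(words).items()}

    # Decode an arbitrary byte string (say, a wrong-key decryption):
    # every codeword boundary lands on a dictionary word.
    bits = "".join(format(b, "08b") for b in b"\x8f\x3a\xc1")
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in decode:
            out.append(decode[cur])
            cur = ""
    print("".join(out))   # prints a string of dictionary words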


Anyway, I hope you folks get the idea. Truly bijective compressed
encryption of English or Navajo or whatever is within reach. Only
a few people need to get motivated to do it.


David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
        http://www.jim.com/jamesd/Kong/scott19u.zip
Scott famous encryption website **now all allowed**
        http://members.xoom.com/ecil/index.htm
Scott LATEST UPDATED source for scott*u.zip
        http://radiusnet.net/crypto/  then look for
  sub directory scott after pressing CRYPTO
Scott famous Compression Page
        http://members.xoom.com/ecil/compress.htm
**NOTE EMAIL address is for SPAMERS***
I leave you with this final thought from President Bill Clinton:

------------------------------

From: Benjamin Goldberg <[EMAIL PROTECTED]>
Subject: Re: What do we mean when we say a cipher is broken?  (Was Art of Cryptography)
Date: Sun, 18 Mar 2001 15:18:23 GMT

wtshaw wrote:
> 
> In article <[EMAIL PROTECTED]>,
> [EMAIL PROTECTED] wrote:
> >
> > I say, "A cipher is clearly insecure when the of cost of a
> > cryptanlyitic attack is lower than the value of success."  I
> > normally measure the cost of attack as  work, access, indifference
> > to detection, special knowledge, and time to detection and
> > corrective action (WAIST).  The value of success can be measured in
> > dollars or alternative values such as vengeance.
> >
> > Because vengeance is hard to measure and crypto is cheap, I use it
> > in such a way as to raise the cost of attack several orders of
> > magnitude higher than the value of
> > success.
>
> But, measures of strength need not be subjective.  An absolute
> scientific statement that one cipher can be more secure and have a
> longer unicity distance than another can be an objective fact not
> based on the value of the messages involved.

Murray is describing the strength of the cipher in the context of it
being a component of a real crypto system, and you are considering the
strength of a cipher in isolation.

A cipher in isolation should have some objective [theoretical] strength
value... or rather a strength function in terms of time and memory.  But
ciphers aren't used in theory, they are used in practice (remember my
sig).

When a cipher is part of a real system, we *can* try to measure the
real, practical (not theoretical or subjective) cost of breaking the
system, and we *can* try to measure the real, practical (not
theoretical or subjective) value of breaking the system.  If the cost
to break the cryptosystem is less than the real value of the
information which would be gained, then the cryptosystem is broken.

Cryptography is one of those fields where the difference between theory
and practice is not a thin grey line, but a vast chasm.  Good strong
bridges with scrupulous inspections are a must.

-- 
The difference between theory and practice is that in theory, theory and
practice are identical, but in practice, they are not.

------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: How to eliminate redondancy?
Date: 18 Mar 2001 15:20:12 GMT

[EMAIL PROTECTED] (Douglas A. Gwyn) wrote in <[EMAIL PROTECTED]>:


>>   Compression goes a long way towards solving the problem if it
>> allows for any possible input block.
>
>All commonly used compression schemes work for any input.
>
>

   Here is where you may be missing it. Compression is a two-sided
coin: compression/decompression. Most people are so focused on the
front side that they fail to check the back door. True, most
compressors at the front door, the compression side, do handle all
8-bit-byte binary files. But they leave the back door wide open:
they don't handle the decompression of all possible files. If they
did, then
1) for any file X, compress(uncompress(X)) = X
would be true for all files.
 This is something anyone can check. Using Notepad, enter a message
like "hello world" and save it as a file. Now uncompress it with any
of your favorite compressors. Most will fail on the spot, possibly
giving an error message. You may have to go to DOS or use some method
to change the file name extension to the type your compressor uses.
A few will actually do something. Next, if you get this far, compress
the resultant file using the compressor side of your program. Now
check it with "fc /b file1 file2" to see if the two files match. If
not, your compressor failed.
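
If you would rather script the test than do it by hand, here is a
small Python sketch of the same round-trip check. zlib is only a
stand-in for "your favorite compressor":

    import zlib

    def roundtrip_ok(data, compress, decompress):
        # The bijectivity test: decompress an arbitrary file,
        # recompress it, and see if the original bytes come back.
        try:
            return compress(decompress(data)) == data
        except Exception:
            return False   # decompressor rejected the input outright

    # Most byte strings are not a valid zlib stream, so this fails
    # on the spot, just like the Notepad experiment above.
    print(roundtrip_ok(b"hello world", zlib.compress, zlib.decompress))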




David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
        http://www.jim.com/jamesd/Kong/scott19u.zip
Scott famous encryption website **now all allowed**
        http://members.xoom.com/ecil/index.htm
Scott LATEST UPDATED source for scott*u.zip
        http://radiusnet.net/crypto/  then look for
  sub directory scott after pressing CRYPTO
Scott famous Compression Page
        http://members.xoom.com/ecil/compress.htm
**NOTE EMAIL address is for SPAMERS***
I leave you with this final thought from President Bill Clinton:

------------------------------

From: Nicol So <[EMAIL PROTECTED]>
Subject: Re: How to eliminate redondancy?
Date: Sun, 18 Mar 2001 10:25:32 -0500
Reply-To: see.signature

"Trevor L. Jackson, III" wrote:
> 
> "Douglas A. Gwyn" wrote:
> 
> > "Trevor L. Jackson, III" wrote:
> > > Given a highly redundant plaintext one can eliminate the redundancy
> > > by masking with a good PRNG.
> >
> > I guess at this point we ought to ask what people mean by "redundancy".
> > To me, that scheme doesn't reduce redundancy by more than the bits in
> > the PRNG parameters.  It does make it more "latent", however.
> 
> The same complaint can be leveled against any lossless transform.

That's not true. Lossless compression works exactly by reducing the
redundancy in the representation of information.

The per-symbol redundancy of a source is the difference between: (1) the
maximum per-symbol entropy possible based on the alphabet, and (2) the
actual (average) per-symbol entropy of the source.
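
A quick worked example of that definition, with made-up numbers
(the Python fragment below is only an illustration): for a
four-symbol alphabet the maximum is log2(4) = 2 bits/symbol, and a
skewed source falls short of that by its redundancy.

    from math import log2

    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}  # assumed source
    h_max = log2(len(probs))                           # 2 bits/symbol
    h_src = -sum(p * log2(p) for p in probs.values())  # 1.75 bits/symbol
    print(h_max - h_src)                  # redundancy: 0.25 bits/symbol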

The information content of a stream of symbols is what the receiver
cannot predict a priori, w.r.t. the best possible (predictive)
receiver. There is no limitation on the computing power of the
hypothetical receiver, and it can have any a priori knowledge about
the statistical properties of the source (but not about the
individual sequences of symbols that the source emits).

Redundancy is about the density of information in a representation
scheme.

From an information-theoretic viewpoint, masking plaintext using a PRNG
does not change its redundancy. What it does change is what would be a
best possible encoder. The best encoder in this case will have knowledge
about the PRNG & its parameters (in order to encode efficiently). The
masking has no effect on the size of the information representation.

Lossless compression, on the other hand, reduces redundancy because it
makes the information representation more "compact".

I suspect that looking at redundancy from an information-theoretic
viewpoint was not what you had in mind. Maybe you're thinking about the
*apparent* redundancy as seen by an adversary, which has limitations on
computing power and a priori knowledge, and is *not* the theoretically
best encoder of the masked plaintext stream.

-- 
Nicol So, CISSP // paranoid 'at' engineer 'dot' com
Disclaimer: Views expressed here are casual comments and should
not be relied upon as the basis for decisions of consequence.

------------------------------

From: Factory <[EMAIL PROTECTED]>
Crossposted-To: rec.puzzles
Subject: Re: An extremely difficult (possibly original) cryptogram
Date: Mon, 12 Mar 2001 18:38:58 +1100

In article <98tku5$3jud5$[EMAIL PROTECTED]>, 
[EMAIL PROTECTED] of the mothership connection said...
> > Was it *absolutely* necessary to quote the entire original message ?
> 
> Yes and no....
> 
> Not all news servers are created equal.... By quoting the whole message, it
> gives one more chance to those servers that didn't get the original message
> to get the quoted copy of it.. so more people have a chance to see it.
> 
> I have found many times a reply to a message on a server where the original
> message does not exist....
> 
> On the other hand, if the message has already been quoted 2 or 3 times... it
> can safely be snipped in any subsequent reply..
> 
> Do you understand, my friend?

  Hmm, criticizing a message for being too long, in a message which is
even longer, is like, very, very stupid.

  - Factory

------------------------------

From: Benjamin Goldberg <[EMAIL PROTECTED]>
Subject: Re: What do we mean when we say a cipher is broken?  (Was Art of   
Date: Sun, 18 Mar 2001 15:45:30 GMT

Douglas A. Gwyn wrote:
> 
> Paul Crowley wrote:
> > ...  The attacker doesn't have to recover plaintext; they just
> > have to demonstrate that it's not like an "ideal" component of its
> > type.  Stream ciphers are a good example here: if there's a cheaper
> > way than brute force of detecting that the stream cipher is in use
> > at all, that's a problem.
> 
> I have to disagree.  That would be the case for *steganography*,
> but not for encryption.  In general it is *obvious* when encryption
> is being used, and if the adversary cannot recover any of the
> hidden information then he cannot be said to have broken the system.
> There can certainly be strong patterns in ciphertext without
> implying that the encryption is easily broken; for example every
> 8th bit could be a parity bit.  (Not usually the case, but it
> shows that nonrandomness does not imply breakability.)

When Crowley said "if there's a cheaper way than brute force of
detecting that the stream cipher is in use at all, that's a problem," I
believe he meant "detecting *that particular* stream cipher," not
"detecting that there *is* a stream cipher."

For instance, suppose we suspect (but don't know) that an LFSR is in
use, with a particular order-32 polynomial.  It takes only about 64 bits
of known plaintext to be absolutely certain that the cipher in use is an
LFSR with that particular polynomial.
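
A rough sketch of what that check looks like (my own illustration;
the tap positions below are just an example order-32 feedback
polynomial, and the keystream bits would come from XORing ciphertext
with the known plaintext):

    def matches_lfsr(bits, taps, order=32):
        # Does every bit past the first 'order' satisfy the recurrence?
        return all(bits[i] == sum(bits[i - t] for t in taps) % 2
                   for i in range(order, len(bits)))

    def lfsr_bits(seed, taps, n):
        # Generate n bits from the LFSR for the demonstration.
        bits = list(seed)
        while len(bits) < n:
            bits.append(sum(bits[-t] for t in taps) % 2)
        return bits

    stream = lfsr_bits([1] * 32, (32, 22, 2, 1), 64)   # 64 known bits
    print(matches_lfsr(stream, (32, 22, 2, 1)))  # True: polynomial confirmed
    print(matches_lfsr(stream, (32, 7, 5, 3)))   # False: wrong guess rejected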

OTOH, if we suspect (but don't know) that RC4 is in use, it takes many
megabytes of known plaintext to detect the characteristic bias that RC4
puts in its output keystream.
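
That single-keystream bias takes a huge amount of data to see, but a
related bias is easy to demonstrate: across many random keys, RC4's
second output byte is zero about twice as often as it should be
(roughly 1/128 instead of 1/256). A quick Python sketch, purely as an
illustration:

    import os

    def rc4_keystream(key, n):
        S = list(range(256))
        j = 0
        for i in range(256):                        # key scheduling
            j = (j + S[i] + key[i % len(key)]) % 256
            S[i], S[j] = S[j], S[i]
        i = j = 0
        out = []
        for _ in range(n):                          # output generation
            i = (i + 1) % 256
            j = (j + S[i]) % 256
            S[i], S[j] = S[j], S[i]
            out.append(S[(S[i] + S[j]) % 256])
        return out

    trials = 20000
    zeros = sum(rc4_keystream(os.urandom(16), 2)[1] == 0
                for _ in range(trials))
    print(zeros / trials)   # about 1/128, versus 1/256 for an ideal stream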

-- 
The difference between theory and practice is that in theory, theory and
practice are identical, but in practice, they are not.

------------------------------

From: Tom McCune <[EMAIL PROTECTED]>
Subject: Re: PGP "flaw"
Date: Sun, 18 Mar 2001 15:44:36 GMT

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] wrote:

<snip>
>Viewed in that light, then, PGP's original backdoor implementation, and
>its need for having one, really -isn't- necessarily a dark guv'mint
>conspiracy!

Although as an employee, I wouldn't like the ADK, I agree that this meets a 
legitimate management purpose.  But calling it a "backdoor" is not correct, 
and may produce unnecessary confusion.

Tom McCune
My PGP Page & FAQ: http://www.McCune.cc

------------------------------

From: Steve Portly <[EMAIL PROTECTED]>
Subject: Profile analysis and known plaintext 
Date: Sun, 18 Mar 2001 10:50:07 -0500

Several people have mentioned the use of non-numerically-intensive
methods to determine the content of encrypted plaintext.  Is there a
formal methodology for these techniques?  Can an actual numerical
measure of the *advantages gained* through these techniques be
assigned?  You would think that a cross-disciplinary investigation of
the subject would give it some numerical legitimacy as well as a
better understanding of the subject.


------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: How to eliminate redondancy?
Date: 18 Mar 2001 15:49:17 GMT

see.signature (Nicol So) wrote in <[EMAIL PROTECTED]>:

>
>That's not true. Lossless compression works exactly by reducing the
>redundancy in the representation of information.
>

   Actually, most lossless compressors that are not bijective can
actually increase redundancy for random files. It is just that you
hope the file you are attempting to compress is one from which the
compressor will remove some redundancy, so that the entropy per bit
goes up.
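
A trivial demonstration of the first point, using zlib as an example
of a common non-bijective compressor (illustration only):

    import os, zlib

    data = os.urandom(100000)                    # incompressible input
    print(len(data), len(zlib.compress(data)))   # output is slightly larger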

>The per-symbol redundancy of a source is the difference between: (1) the
>maximum per-symbol entropy possible based on the alphabet, and (2) the
>actual (average) per-symbol entropy of the source.
>
>The information content of a stream of symbols is what the receiver
>cannot predict a priori, w.r.t. the best possible (predictive)
>receiver. There is no limitation on the computing power of the
>hypothetical receiver, and it can have any a priori knowledge about
>the statistical properties of the source (but not about the
>individual sequences of symbols that the source emits).
>
>Redundancy is about the density of information in a representation
>scheme.
>
>From an information-theoretic viewpoint, masking plaintext using a PRNG
>does not change its redundancy. What it does change is what would be a
>best possible encoder. The best encoder in this case will have knowledge
>about the PRNG & its parameters (in order to encode efficiently). The
>masking has no effect on the size of the information representation.
>
>Lossless compression, on the other hand, reduces redundancy because it
>makes the information representation more "compact".


   However, if one knows what compressor is being used, one can
create input files that appear highly redundant and yet, after
the lossless compression takes place, the resultant file looks
even more redundant to the eye.
   One should never lose sight of the fact that lossless compression
is at best only a transform of files from one set to another, and if
it is fully bijective one could pick what the compressed file will be.

>
>I suspect that looking at redundancy from an information-theoretic
>viewpoint was not what you had in mind. Maybe you're thinking about the
>*apparent* redundancy as seen by an adversary, which has limitations on
>computing power and a priori knowledge, and is *not* the theoretically
>best encoder of the masked plaintext stream.
>

  About the best way one could use a compressor is to design one for
the type of files you are using. True, most general compressors seem
to make large classes of files smaller. But they also seem to add
information, so that when encrypted the result can be easier to break.
One way to avoid this is to use a compressor built for the type of
file you are going to encrypt. Assuming one is going to compress to
an 8-bit-byte binary file, and one is using a general block cipher to
encrypt, one should use a compressor that will decompress any binary
file to the target set, and that decompressed file should compress
back to the same binary file. Otherwise, if a false key is tested, it
will lead to something impossible, due to the information that the
poorly designed compressor added to the compressed file.

  In short, if you are going to compress and encrypt, don't use 99.99%
of the compression products. If you want general compression and
encryption, the only product I know of that uses good bijective
compression and bijective encryption is Matt's BICOM.


David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
        http://www.jim.com/jamesd/Kong/scott19u.zip
Scott famous encryption website **now all allowed**
        http://members.xoom.com/ecil/index.htm
Scott LATEST UPDATED source for scott*u.zip
        http://radiusnet.net/crypto/  then look for
  sub directory scott after pressing CRYPTO
Scott famous Compression Page
        http://members.xoom.com/ecil/compress.htm
**NOTE EMAIL address is for SPAMERS***
I leave you with this final thought from President Bill Clinton:

------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: How to eliminate redondancy?
Date: Sun, 18 Mar 2001 16:09:13 GMT


"SCOTT19U.ZIP_GUY" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> see.signature (Nicol So) wrote in <[EMAIL PROTECTED]>:
>
> >
> >That's not true. Lossless compression works exactly by reducing the
> >redundancy in the representation of information.
> >
>
>    Actually, most lossless compressors that are not bijective can
> actually increase redundancy for random files. It is just that you
> hope the file you are attempting to compress is one from which the
> compressor will remove some redundancy, so that the entropy per bit
> goes up.

That makes no sense whatsoever.  Good codecs reduce the redundancy of
the output; that's why they are good.  If I take X bytes and pack them
into Y bytes where Y << X, then I have a good codec *and* the
information density per bit must be higher.  Whereas if I use your
method and pack X bytes into Z bytes where Y << Z < X, the information
density per bit must be lower than with the good codec.

Tom



------------------------------

From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: IDEAL ENGLISH TEXT RIJNDAEL ENCRYPTION
Date: Sun, 18 Mar 2001 16:09:49 GMT


"SCOTT19U.ZIP_GUY" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> Dear folks, I know it's hard to get most of you motivated.
> But would there be interest in making a "perfect English"
> encryption using anyone's favorite fixed-block-size cipher,
> such that any wrong key would lead to valid English text?

WTF?  Have you ever considered the fact that RIJNDAEL is not used only
to encrypt English text?  What if I have ASCII symbols in my text?

Tom



------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list by posting to sci.crypt.

End of Cryptography-Digest Digest
******************************
