Cryptography-Digest Digest #450, Volume #10 Tue, 26 Oct 99 10:13:03 EDT
Contents:
Re: This compression argument must end now (SCOTT19U.ZIP_GUY)
Keyed permutation (Adam Durana)
Re: Unbiased One to One Compression (SCOTT19U.ZIP_GUY)
Re: Keyed permutation (Hideo Shimizu)
Re: RC 4: security & law ([EMAIL PROTECTED])
Twofish: Optimization for GF(2^8) multiplication desired ([EMAIL PROTECTED])
Re: OAP-L3: How Do You Spell S-H-A-R-E-W-A-R-E (Jerry Coffin)
Newbie question ("The Gorf")
multiple signature scheme ("dino")
Re: This compression argument must end now (Tom St Denis)
Re: This compression argument must end now (Tom St Denis)
----------------------------------------------------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: This compression argument must end now
Date: Tue, 26 Oct 1999 05:56:34 GMT
In article <_d6R3.6864$[EMAIL PROTECTED]>, gtf[@]cirp.org (Geoffrey T.
Falk) wrote:
>In article <[EMAIL PROTECTED]>, Tim Tyler <[EMAIL PROTECTED]> wrote:
>>David Scott has recently invented perhaps the first compression system
>>suitable for cryptography known to man. Since *all* other known
>>compression routines potentially weaken your encypherment - and add
>>"unnecessary" information to the file - I believe this event should not be
>>under-estimated.
>
>That is quite an overstatement. David's system might be suitable for
>cryptography, if it compresses the data well. The much-hyped "one-on-one"
>property does not guarantee this at all.
>
>For example, here is my favourite compression algorithm: Copy the input
>file to the output file. This algorithm is "one-on-one." But it is
>worthless for cryptography.
I don't think most people would consider a direct copy a compression
program, but you are free to think so if you wish.
>
>I understand that David uses adaptive Huffman coding, which is not
>a very good compression scheme by itself. Adaptive arithmetic coding
>is a bit better. But neither will give you very good compression by
>itself, because they only look at the frequency count of symbols in
>portions of the input file.
My understanding is that arithmetic coding is patented, but I may look at
it some day in the future. I think Huffman compression is very good
for many things if you don't stick with the direct copying that you
favored in the paragraph above.
>On the other hand, either of these methods makes a great backend
>for a decent predictive algorithm. Perhaps David could adapt his
>program in such a manner.
>
>>At the moment we have very few "one-on-one" compressors.
>
>To the contrary, there are any number of "one-on-one" compressors.
>My null compressor (above) is the simplest. Do you want more examples?
How about some that would compress general text better, or is that too
hard?
>In short, there is no reason to think that this "one-on-one" property
>by itself is helpful in any way. Much more is needed.
>
>The quality of the compression makes vastly more difference to the
>difficulty of attacking the system. This is because the attacker
>must decrypt the entire message anyways before he can test if it
>is a valid compressed file. Whereas, to test source statistics he
>only needs a partial break.
What are you talking about? So little thought is given to
compression routines that it is likely only a portion of a message
would have to be decrypted before it could be tested as a valid
file for the compression that was used. Even worse, if a purely
random file was compressed, the attacker could rule out many
possible solutions by examining only the first few blocks. This is
a case where bad compression is much worse than no compression
at all, since it gives information to the attacker.
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
http://members.xoom.com/ecil/index.htm
NOTE EMAIL address is for SPAMERS
------------------------------
From: Adam Durana <[EMAIL PROTECTED]>
Subject: Keyed permutation
Date: Tue, 26 Oct 1999 05:33:13 GMT
Hi,
I've been playing around with this for a while. I have talked to a few
people about it, and they seem to think it's interesting and different.
I would like to know what all of you think about it. You can read
about it at http://www.nine.net/~adam/slotting.html I also plan on
posting a mirror of it at http://www.wizard.net/~echo/slotting.html
since nine.net is on my home network, which might be kind of slow for
some of you. Please CC all posts to me at [EMAIL PROTECTED], since
recently I have been having problems downloading posts from the group.
Thanks,
Adam Durana
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Unbiased One to One Compression
Date: Tue, 26 Oct 1999 06:55:42 GMT
In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] wrote:
>[EMAIL PROTECTED] wrote:
>: Tim Tyler ([EMAIL PROTECTED]) wrote:
>
>: : Making the cypher "one-on-one" avoids the possibility of any information
>: : or regularity *added* by the compressor being used in the attack on the
>: : cypher - since no information has been added.
>
>: : Attacks can still be based on any regularities left by sub-optimal
>: : compression, though.
>
>: There is a problem with one-to-one compression.
>
>Hmm?
>
>: As I recently noted, I had found a way to perform Huffman compression so
>: as to approximate the one-to-one condition between the files to be
>: compressed, and strings of bits of arbitrary length.
>
>: But the next step, converting to a message that is a whole number of bytes
>: in length, is the problem.
>
>: Here is an example of a mapping between messages an arbitrary number of
>: bits in length, and messages that are an even number of bytes long.
>
>: Leave all of the message alone, except for the last 0 through 7 bits.
>: These may have 255 possible values, so these can be coded in the last byte
>: as follows:
>
>: Code Represents
>: 00000000 not used
>: 00000001 <nothing>
>: 00000010 0
>: 00000011 1
>: 00000100 00
>: 00000101 01
>: 00000110 10
>: 00000111 11
>: 00001000 000
>: ...
>: 00001111 111
>: 00010000 0000
>: ...
>: 00011111 1111
>: 00100000 00000
>: ...
>: 00111111 11111
>: 01000000 000000
>: ...
>: 01111111 111111
>: 10000000 0000000
>: ...
>: 11111111 1111111
>
>: Each byte represents the bits that follow the first 1 in the byte.
>
>: This is the most efficient way to represent arbitrary-length bit strings.
>: What is wrong with it?
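[The quoted mapping can be sketched in a few lines of Python -- an
illustrative reconstruction, not code from the thread. The code byte is
simply a leading 1 followed by the trailing bits themselves, which also
makes the probability figures claimed below easy to check:]

```python
def encode_tail(bits: str) -> int:
    """Map 0-7 trailing bits (a string like "101") to the final-byte
    code: a leading 1 followed by the bits themselves."""
    assert 0 <= len(bits) <= 7
    return (1 << len(bits)) | int(bits, 2) if bits else 0b00000001

def decode_tail(code: int) -> str:
    """Recover the trailing bits: everything after the leading 1."""
    assert 1 <= code <= 255
    return bin(code)[3:]  # strip "0b" and the leading 1

def last_byte_prob(code: int) -> float:
    """Probability of this code byte, under the post's assumptions of
    random compressed bits and a uniform 8n+k length distribution."""
    n = code.bit_length() - 1    # number of trailing bits encoded
    return 0.125 * 0.5 ** n      # 12.5%, 6.25%, 3.125%, ...

# The mapping is a bijection between codes 1..255 and strings of 0-7 bits:
assert decode_tail(encode_tail("101")) == "101"
assert encode_tail("") == 0b00000001
assert decode_tail(0b11111111) == "1111111"
```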
>
>: When we compress a message, it is reasonable to expect that, for good
>: compression:
>
>: - the probability of each bit in the compressed message being a 0 or a 1
>: is 50% each way, and
>
>: - the probability of the number of bits in the message being of the form
>: 8n+k, for k from 0 to 7, is 12.5% for each value of k.
>
>: Given that, we find that the last byte of the message, using the
>: representation above, has the probabilities:
>
>: 00000001 12.5%
>: 00000010 6.25%
>: 00000011 6.25%
>: 00000100 3.125%
>: ...
>: 00000111 3.125%
>: 00001000 1.5625%
>
>: and so on.
>
>: This problem is _fundamental_, and not due to my particular choice of
>: representations.
>
>No way. You're claiming that this problem affects *all* "one-on-one"
>compression algorithms ("There is a problem with one-to-one compression")
>and then using an argument which appears to be tied to one particular
>implementation to prove your point. I don't think this is a valid form
>of argument.
>
>"One-on-one" compression routines may be considered to be a bijection
>between two sets. This is a mapping which includes every element in each
>set, and has at most one link from each element (which always links to
>an element in the other set).
>
>When thinking about compression in terms of such a pair of linked sets,
>questions about terminating bits and padding the message to a multiple of
>a certain number of bits seem to become rather irrelevant.
>
>: The probability distribution of "random strings of bits of arbitrary
>: length" and of "random strings of bits an even number of bytes in length"
>: are different in shape.
>
>Well, I won't argue with that, but one-on-one compression does *not*
>require an even number of bytes in either the pre-compressed text or
>the result.
>
>I wonder if you can imagine a world with 7-bit bytes, and visualise the
>Huffman compression scheme used there. In such a world less padding would
>be required. Can you imagine a Huffman scheme based on a single bit?
>This would require no padding at all.
>
>Of course in practice you may lose compressibility if you try to employ
>a compression routine along these lines. However, they /would/ slowly
>eliminate any EOF problems - indicating they are somewhat less than
>fundamental to one-on-one compression systems as a class.
>
>I understand David has himself developed some code to deal with
>end-of-file issues. I'm not well placed to comment on
>what he has done in this area.
If you go to http://members.xoom.com/ecil/compress.htm
you get the start of the compression pages. The others are compres1.htm,
compres2.htm, and so on, if you don't want to follow the links. My
original program on the first page is my main, straightforward adaptive
Huffman compression program. On the following pages I describe how the
file ending is handled, since all input and output files are multiples of
8 bits in length. What confuses many people is that if one were not using
multiples of 8-bit bytes but a continuous stream of data, the matching
file that is created is sometimes longer than the continuous stream,
sometimes shorter, and sometimes the same length. When the length is
shorter there are hidden dropped ones, and when longer there are added
zeros; no more than 8 ones are dropped and no more than 7 zeros are
added. This still allows for every possible ending byte, and if one
changes the last byte of the compressed file to every possible value, you
get the 256 input files that map to that compressed file. But John is
hung up on thinking that certain endings are more likely than others.
Even if he were correct, and even if he has heartburn about chopping ones
and adding zeros: if you follow the links to the FOCUSED one-to-one
adaptive Huffman compression, the ending situation is far more
complicated, and depending on the text compressed there can be any
possible pattern used for the fill or chop. The focused code produces
both output files at once so that either can be used. This should ease
John's hang-ups about only ones being chopped in the original method. But
I doubt he will follow this, since he still has no real clue how the
other one operates.
>
>However, let's not get this out of proportion. Your objection only
>applies to one particular form of one-on-one compression. Further,
>it only applies when a certain alignment problem occurs. Most computer
>files are only stored to the nearest byte anyway, so I have difficulty
>in seeing your objection as much of a practical problem. I doubt even
>embedded applications typically try to encrypt files which are not
>8-bit aligned.
>
>: Of course, simple techniques can distribute this bias throughout the
>: message. For example, XOR the last byte with a checksum of the rest of the
>: message.
>
>I'd have thought some sort of diffusion process was necessary to do this.
>
>What you've just described seems more like concentrating a synopsis of
>the rest of the message in the last byte.
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
http://members.xoom.com/ecil/index.htm
NOTE EMAIL address is for SPAMERS
------------------------------
From: Hideo Shimizu <[EMAIL PROTECTED]>
Subject: Re: Keyed permutation
Date: Tue, 26 Oct 1999 15:23:10 +0900
I think a 1-round keyed permutation is weak against a chosen-plaintext
attack, or has other weak points, because there is no diffusion.
Hideo Shimizu
TAO, Japan
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: RC 4: security & law
Date: Tue, 26 Oct 1999 06:56:16 GMT
Thank you fungus!
Your answers are a great help!
------------------------------
From: [EMAIL PROTECTED]
Subject: Twofish: Optimization for GF(2^8) multiplication desired
Date: Tue, 26 Oct 1999 07:04:19 GMT
Hi all,
Thanks to help from some of you, I understood the basics of Galois field
multiplication in the Twofish MDS and RS steps, but the direct method
turned out to be extremely slow. So I generated a 255-element table for
Galois field multiplication, basically by raising a generator polynomial
to the powers 1 through 254. I now use this table to map each one-byte
multiplication:
say x = g^m
and y = g^n, where g is the generator
for the given MDS polynomial.
Then x * y in the MDS field is g^((m + n) mod 255). But this is still
slow, because for each h-function I end up doing this mapping and the
reverse mapping 16 times, once per one-byte value, to produce the final
32-bit output.
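[The log/antilog table method described above can be sketched as follows.
Illustrative Python, not code from the thread; the polynomial 0x169 is the
Twofish MDS field polynomial as given in the Twofish paper, and taking x
(i.e. 0x02) as the generator assumes that polynomial is primitive, which
the paper states.]

```python
MDS_POLY = 0x169  # x^8 + x^6 + x^5 + x^3 + 1, the Twofish MDS field

def gf_mul(a: int, b: int) -> int:
    """Slow bit-serial ("peasant") multiply in GF(2^8) mod MDS_POLY."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & 0x100:
            a ^= MDS_POLY
    return r

# Build log/antilog tables once: EXP[i] = g^i with g = x = 0x02.
EXP = [0] * 255
LOG = [0] * 256
g = 1
for i in range(255):
    EXP[i] = g
    LOG[g] = i
    g = gf_mul(g, 2)

def gf_mul_fast(a: int, b: int) -> int:
    """Table multiply: x * y = g^((m + n) mod 255)."""
    if a == 0 or b == 0:
        return 0
    return EXP[(LOG[a] + LOG[b]) % 255]
```

Note the exponent sum must be reduced mod 255 (the multiplicative group
has order 255); without that reduction the lookup runs off the table.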
The code by Brian Gladman does something more: he has tables whose
entries are whole 32-bit words, and the output is formed by directly
mapping each input byte to a 32-bit word. This I can't understand. Can
someone be kind enough to explain this, or any other better method, and
enlighten me?
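[As far as I understand the trick being asked about, it is this -- a
hedged sketch, and the MDS matrix constants below are the ones given in
the Twofish paper, so double-check them. Since the MDS output is linear
in each input byte, you can precompute, for each of the four byte
positions, a 256-entry table of 32-bit words; the whole MDS multiply is
then four lookups and three XORs, with no log/antilog mapping at all:]

```python
# Twofish MDS matrix (values per the Twofish paper; treat as assumption).
MDS = [
    [0x01, 0xEF, 0x5B, 0x5B],
    [0x5B, 0xEF, 0xEF, 0x01],
    [0xEF, 0x5B, 0x01, 0xEF],
    [0xEF, 0x01, 0xEF, 0x5B],
]

def gf_mul(a, b, poly=0x169):
    """Bit-serial multiply in GF(2^8) mod the MDS field polynomial."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & 0x100:
            a ^= poly
    return r

def mds_direct(xs):
    """Straightforward matrix-vector multiply over GF(2^8)."""
    return [
        gf_mul(row[0], xs[0]) ^ gf_mul(row[1], xs[1])
        ^ gf_mul(row[2], xs[2]) ^ gf_mul(row[3], xs[3])
        for row in MDS
    ]

# Precompute: TABLE[j][b] packs column j's contribution for input byte b
# into one 32-bit word (output byte i in bits 8i..8i+7).
TABLE = [[0] * 256 for _ in range(4)]
for j in range(4):
    for b in range(256):
        word = 0
        for i in range(4):
            word |= gf_mul(MDS[i][j], b) << (8 * i)
        TABLE[j][b] = word

def mds_fast(xs):
    """Four lookups + three XORs replace sixteen field multiplies."""
    w = TABLE[0][xs[0]] ^ TABLE[1][xs[1]] ^ TABLE[2][xs[2]] ^ TABLE[3][xs[3]]
    return [(w >> (8 * i)) & 0xFF for i in range(4)]
```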
-bye
rasane_s
------------------------------
From: [EMAIL PROTECTED] (Jerry Coffin)
Subject: Re: OAP-L3: How Do You Spell S-H-A-R-E-W-A-R-E
Date: Tue, 26 Oct 1999 02:00:56 -0600
In article <[EMAIL PROTECTED]>,
[EMAIL PROTECTED] says...
[ ... ]
> Almost all software includes the usual prohibition against disassembly.
> It's never stopped anyone from reverse-engineering a product if they
> really wanted to.
Quite true -- OTOH, what I had in mind at the time was publishing a
breakdown of what it does and how it does it (most likely along with a
quick discourse on how to break it). While I probably wouldn't let
the license agreement stop me if I had a _really_ good reason, it's
enough to stop me from doing it without any particular reason.
--
Later,
Jerry.
The universe is a figment of its own imagination.
------------------------------
From: "The Gorf" <[EMAIL PROTECTED]>
Subject: Newbie question
Date: Tue, 26 Oct 1999 02:24:47 -0700
OK, I have followed my own advice and watched this group for the
information that I am looking for but have not seen it. So I am now
going to ask it right out. Can someone do me a favor and drop me an
e-mail with one of the two, preferably both, answered below?
1. An explanation of how a hash key works. I need to create some
software that incorporates this functionality in its basic form, but I
don't understand how it works. Basically, I have written a program and I
want to be able to offer several users a unique key each that will
unlock the software after the expiration date of the full working demo.
2. A good recommendation for a solid beginner's book on encryption.
Thank you for humoring me in this and for your time. It is greatly
appreciated.
Geoff Sweet.
Mail to [EMAIL PROTECTED] as the e-mail attached to this letter is
mostly for spam.
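[Not from the thread, but a minimal sketch of one common way such unlock
keys are built: the vendor hashes each user's name together with a secret
only the vendor's key generator knows, and the program recomputes the
same hash at unlock time. Every name and the secret below are invented
for illustration; a real scheme needs more care (HMAC, longer keys, and
so on).]

```python
import hashlib

SECRET = b"vendor-only-secret"  # hypothetical; lives in the vendor keygen

def make_key(user: str) -> str:
    """Vendor side: derive a per-user unlock key from name + secret."""
    return hashlib.sha256(SECRET + user.encode()).hexdigest()[:16]

def check_key(user: str, key: str) -> bool:
    """Program side: recompute the hash and compare with the typed key."""
    return make_key(user) == key
```

The obvious weakness is that SECRET must also be embedded in the shipped
program for check_key to work, so a determined user can extract it; such
schemes deter casual copying only.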
------------------------------
From: "dino" <[EMAIL PROTECTED]>
Subject: multiple signature scheme
Date: Tue, 26 Oct 1999 10:57:47 +0100
Hi,
Is there a standard for multiple signatures on the same file?
When two people sign a paper contract, they sign the same copy, or
maybe two identical copies. What happens when they use digital
signatures?
Thank you,
Dino
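[A hedged sketch, not a standard: the usual practice with detached
signatures is that each party independently signs (a hash of) the very
same bytes, and both signatures are stored alongside the file. HMAC
stands in below for a real public-key signature primitive such as RSA or
DSA, and the keys are invented for illustration.]

```python
import hashlib
import hmac

def detached_sig(key: bytes, document: bytes) -> bytes:
    """Sign a digest of the document (HMAC as a stand-in primitive)."""
    digest = hashlib.sha256(document).digest()
    return hmac.new(key, digest, hashlib.sha256).digest()

contract = b"the contract text"
sig_a = detached_sig(b"key-of-signer-a", contract)
sig_b = detached_sig(b"key-of-signer-b", contract)
# Ship (contract, sig_a, sig_b): both parties have signed the same copy,
# and either signature can be verified independently of the other.
```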
------------------------------
From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: This compression argument must end now
Date: Tue, 26 Oct 1999 12:11:19 GMT
In article <[EMAIL PROTECTED]>,
[EMAIL PROTECTED] wrote:
> In a word: "cryptanalysis". Look at how cyphers have been broken
> historically. Brute-force search of the keyspace is *not* the only
> technique available to cypher-breakers.
But most popular ciphers are 'as-of-yet' unbroken in any canonical sense.
> Um yes, you have to break the cypher. If it is known on a-priori
> grounds that you can't break the cypher, no quantity of lack of
> compression is likely to change this.
Oh my gosh, was that my original point?
> Unfortunately, only with OTPs (and perhaps some obscure, monstrous
> systems) is it /known/ that the cypher is invulnerable to any attack
> faster than brute force.
Ahh, but most popular ciphers... or more specifically: rc4, rc5, rc6,
cast, cast-128, blowfish, 3des, idea, twofish, [any aes] have yet to
be broken.
> I've personally explained the matter a number of times now. If you don't
> understand, perhaps try asking specific questions to highlight where your
> problems lie. Or perhaps use a service like "deja".
No, you have speculated that 'weak' compression (which is ironic because
deflate is better than Huffman any day) makes strong crypto weak. You
guys have never actually cited one example where this is true.
> So far my impression is that you've made very little effort to understand
> the issue. In fact, you don't even seem to consider the /possibility/
> that modern cyphers can be broken by analytic attacks *at all*!
I understand the issue; it's that you guys don't have a clue. If you
have a good compression ratio, each bit of plaintext will contain more
information than if you had a low compression ratio. This means it's
harder to find biases in the input [albeit not impossible]. Just
because p = compress(decompress(p)) is true doesn't mean the
cryptosystem is secure.
Btw, for any lossless deterministic compression algorithm the above is
ALWAYS true. If it weren't, it would not be deterministic. It might
not be possible to do so with PKZIP, but just use deflate (with all
error checking removed) and it's just as easy. Heck, code your own LZ77
compressor around it :)
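[The decompress-then-recompress test alluded to above can be sketched
with zlib's raw deflate mode -- a stand-in for "deflate with all error
checking removed". This checks against one fixed compressor setting
only, so it is illustrative, not a general validity oracle.]

```python
import zlib

def could_be_output_of(data: bytes, level: int = 9) -> bool:
    """True iff this exact compressor setting could have produced `data`:
    decompress it, recompress with identical settings, and compare."""
    try:
        plain = zlib.decompress(data, wbits=-15)   # raw deflate stream
    except zlib.error:
        return False                               # not even decodable
    c = zlib.compressobj(level, zlib.DEFLATED, -15)
    return c.compress(plain) + c.flush() == data

c = zlib.compressobj(9, zlib.DEFLATED, -15)
valid = c.compress(b"hello " * 5) + c.flush()
assert could_be_output_of(valid)                    # genuine output passes
assert not could_be_output_of(b"\x00\x00\x00\x00")  # junk bytes fail
```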
> I find this confidence slightly alarming? What reason do you have for
> having such strong faith in these modern cyphers?
Because they are designed by those who have been in the field much
longer. Because they can [for the most part] present their ideas in a
clear and logical fashion. Because they are more mature about the
matter. Because they can make educated conjectures as to why I should
trust them.
> Various compressed cyphers have already been cracked based on
> non-"one-on-one" properties in the algorithms involved. Someone in this
> forum claimed to have broken a "PKZIP stream cypher" on the basis of such
> clues recently.
PKZIP was not broken through the compression algorithm; it was the bad
quality of the cipher. By your reasoning PGP should be just as easy
to attack, and it simply is not.
Tom
------------------------------
From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: This compression argument must end now
Date: Tue, 26 Oct 1999 12:22:48 GMT
In article <[EMAIL PROTECTED]>,
[EMAIL PROTECTED] wrote:
> That is /one/ aspect of the argument.
>
> I observe you have somehow failed to mention that the opponents get less
> cyphertext to base their attacks on, and what they do get has had its
> entropy-per-bit increased to the point where cryptanalysis is made much
> harder.
You jerk, you stupid jerk. That was my point from the beginning: that
if you used a better compression algorithm you would have more info per
bit of plaintext. Obviously you don't read my posts that carefully.
That's why I said compression should be used to make the message
smaller (i.e., use something other than Huffman coding).
> It is not a good argument against compression in the first place.
> It doesn't even address the issues above. Cryptanalysis based on
> regularities in the file does not rely on brute forcing the
> keyspace, nor on having the whole message decrypted. You've been told
> this before - it makes me wonder if you're paying attention ;-(
For the most part, however, you don't get enough ciphertext to make this
useful. And if you use a chaining mode you are not encrypting ASCII,
are you? Didn't think of that, did you?
> What??? LZ77 is one-on-one? I very much doubt it. To see if a code
> has resulted from an actual compression, simply decompress it and
> compress it again. If you get a different file, you know that it has
> not been made with the specified (deterministic) compression program. If
> you /never/ get such a file, the compressor is "one-on-one".
>
> LZ isn't one-on-one - there's obviously more than one way to encode the
> same file. LZ compression works by crudely identifying repeated strings.
> I believe XXXXXXXX can be represented both as 8 x X and 5 x X + 3 x X
> as far as the decompressor is concerned. No way is this compression
> "one-on-one".
Any one implementation of LZ77 is deterministic, thus it's one-to-one.
If you take 'hello hello hello hello hello' and compress it a million
times with your algorithm, will it not compress to the same thing each
time? I think so. Thus if you decompress that, you will get a long
string of garbage that should compress back to what it originated
from. Even if it doesn't, this one-to-one thing has yet to be proven
useful.
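[The two claims above aren't actually in conflict, and the quoted
XXXXXXXX point is easy to demonstrate with a toy decoder. The
(offset, length, literal) token format here is invented for
illustration:]

```python
def lz_decode(tokens):
    """Decode (offset, length, literal) tokens: copy `length` chars from
    `offset` back (overlapping copies allowed), then append the literal."""
    out = []
    for offset, length, literal in tokens:
        start = len(out) - offset
        for i in range(length):
            out.append(out[start + i])
        if literal is not None:
            out.append(literal)
    return "".join(out)

# Two *different* token streams that decode to the same nine X's --
# so the format is not one-to-one, even though each encoder run is
# deterministic:
a = [(0, 0, "X"), (1, 8, None)]
b = [(0, 0, "X"), (1, 5, None), (1, 3, None)]
assert lz_decode(a) == lz_decode(b) == "X" * 9 and a != b
```

Any single encoder is still deterministic, which is the point made above;
the non-bijectivity shows up on the decode side.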
> I'm sorry, Tom, but I fail to see where you fail to understand the
> security benefits of compressing before encryption. This sort of post
> from you does not help me form a positive image of you as an intelligent
> creature.
I never said compression was bad. I said using stupid entropy coders
with low compression ratios, such as Huffman, is stupid, since you
could do much better. That's why I use LZ77 in peekboo and why PGP
used deflate in their program.
>
> : However I would like to live and see a day where this is dropped.
>
> Then why not start by abandoning your pointless, illogical crusade
> against compression in cryptography?
See above.
>
> : Please consider the facts, make up your own mind (I am not saying
> : which is right or wrong) and stop posting about this.
>
> Why? Since this is a public forum devoted to cryptography, this seems by
> far the best place to discuss the issue.
Because it's stupid. You guys keep yakking on and never have proved
anything. It's like me saying all crypto is bad because I have this
obscure attack that I will never fully explain, just to get attention
once in a while. I will personally attack anyone who challenges my
fixed mentality on crypto.
I just don't care anymore [well, not enough]. Think what you want, but
don't convince newbies with your weird, obscure views.
> David Scott has recently invented perhaps the first compression system
> suitable for cryptography known to man. Since *all* other known
> compression routines potentially weaken your encypherment - and add
> "unnecessary" information to the file - I believe this event should not
> be under-estimated.
Prove it. Which algorithm *adds* to the stream?
> Note (sorry if I've said this too often) that the most efficient
> compressors are - in theory - one-on-one. This is a kind of double-whammy
> as far as cryptography goes. Not only do they add the minimum of
> information to the file (that cryptanalysis can potentially exploit), but
> also, this class of compression entirely contains the class of
> compression routines which offer the analyst the smallest possible
> quantity of cyphertext (for any given finite input set, anyway).
>
> I would urge others who have any theoretical understanding of the design
> of bijective compressors to enter into the current vacuum in the area
> and make some sort of positive contribution to the new field, in order
> to benefit themselves and others.
You keep saying it 'adds info the attacker can use'. What info are you
talking about? I mean, what the heck are you talking about? I don't
get this part. An LZ77-compressed stream is just a series of
<index, length, literal> codewords. Where is the bias? Where is the
hidden info?
Why not cite examples instead of respewing that stupid argument? I
could just as easily say your method leaves some info in the stream,
and that Dave Scott's method is actually easier to crack. But let's
not go there.
Tom
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************