Cryptography-Digest Digest #458, Volume #10 Wed, 27 Oct 99 18:13:03 EDT
Contents:
Re: Unbiased One to One Compression (SCOTT19U.ZIP_GUY)
Re: Unbiased One to One Compression (SCOTT19U.ZIP_GUY)
Quantum Computer Simulators and Source Code ([EMAIL PROTECTED])
Re: Unbiased One to One Compression (SCOTT19U.ZIP_GUY)
Re: the ACM full of Dolts? (SCOTT19U.ZIP_GUY)
Re: Unbiased One to One Compression (Mok-Kong Shen)
Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
([EMAIL PROTECTED])
Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
([EMAIL PROTECTED])
----------------------------------------------------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Unbiased One to One Compression
Date: Wed, 27 Oct 1999 19:50:08 GMT
In article <[EMAIL PROTECTED]>, Mok-Kong Shen <[EMAIL PROTECTED]>
wrote:
>[EMAIL PROTECTED] wrote:
>>
>> Somewhat recently, David A. Scott proposed that for purposes of
>> encryption, a method of compression should be used that is "one to one";
>> that is, it has the property that any random string of bits having the
>> same length as one's message (such as that obtained by deciphering the
>> message with the wrong key) should decompress to something without error.
>
>A tiny remark: The one-to-one argument assumes that the analyst
>knows and hence uses the same compression scheme as the sender.
>How about weakening that assumption a bit? If one uses a dynamic
>Huffman scheme, one can start out with an arbitrary initial frequency
>distribution. If this is hidden from the analyst, he doesn't know
>how to decompress/compress properly from the very beginning, let
>alone to examine the one-to-one property. Of course, this means
>one is effectively using more key bits. On the other hand, it does
>mean that one can now explicitly prefix the ciphertext with a
>length (in plaintext) giving the exact number of bits of the
>ciphertext, which could be useful for ensuring correct transmission.
>
>M. K. Shen
Mok, I have mentioned this before. If you had been reading you would
have noticed this possibility. However, I am trying to separate compression
from encryption; it would be child's play to modify the compression
routines to do this, but it complicates the discussion of compression
as a first pass to encryption. If one has a close friend, one could
easily do this. I would still use compression in both directions
through the file so as to create an all-or-nothing effect. Note that if one
does this, one has an encryption method that is very nonstandard and uses
a much larger effective key than any of the short-keyed AES methods.
A method like this would be unlikely to become standard, since the
government is pushing shorter-keyed systems so the NSA can keep
reading your mail. If people realized how compression programs
could be modified to make for better crypto, the cat would be out of the
bag and the NSA would have a much tougher job, especially if they
used keyed one-to-one compression in series with even a weak AES
method.
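Scott's actual "compression in both directions" routine is not shown here, so the following is only a toy stand-in (all names are made up, and it performs no real compression): two invertible chaining passes, one run forward and one run backward, illustrate why a change anywhere in the input alters every byte of the output while the whole transform stays perfectly reversible.

```python
# Toy illustration of the "both directions" idea -- NOT the scott19u
# compression code.  A forward chaining pass makes byte i depend on
# bytes 0..i; a second pass run backward then makes every output byte
# depend on every input byte (an all-or-nothing-style diffusion).

def chain(data: bytes, reverse: bool = False) -> bytes:
    out = bytearray(data)
    rng = range(len(out) - 2, -1, -1) if reverse else range(1, len(out))
    step = 1 if reverse else -1
    for i in rng:
        out[i] = (out[i] + out[i + step]) % 256   # add the neighbor already chained
    return bytes(out)

def unchain(data: bytes, reverse: bool = False) -> bytes:
    out = bytearray(data)
    rng = range(0, len(out) - 1) if reverse else range(len(out) - 1, 0, -1)
    step = 1 if reverse else -1
    for i in rng:
        out[i] = (out[i] - out[i + step]) % 256   # undo in the opposite order
    return bytes(out)

def two_pass(data: bytes) -> bytes:
    return chain(chain(data), reverse=True)       # forward, then backward

def un_two_pass(data: bytes) -> bytes:
    return unchain(unchain(data, reverse=True))   # invert backward, then forward

msg = bytearray(b"the quick brown fox")
t1 = two_pass(bytes(msg))
msg[-1] ^= 1                                      # flip one bit at the very end
t2 = two_pass(bytes(msg))
diff = sum(a != b for a, b in zip(t1, t2))
print(diff, "of", len(t1), "bytes changed")       # prints: 19 of 19 bytes changed
assert un_two_pass(t1) == b"the quick brown fox"
```

Because the forward pass pushes the change to the last chained byte and the backward pass sums from the end, a single flipped bit anywhere perturbs every byte of the result, which is the all-or-nothing effect described above.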
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
http://members.xoom.com/ecil/index.htm
NOTE EMAIL address is for SPAMMERS
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Unbiased One to One Compression
Date: Wed, 27 Oct 1999 20:09:40 GMT
In article <JoHR3.54$[EMAIL PROTECTED]>, gtf[@]cirp.org (Geoffrey T.
Falk) wrote:
>Please explain how adding a random byte can add information for
>the attacker to use. The byte is random and so gives no useful
>information whatsoever.
I have already stated how, if one compresses even a purely
random file (let alone one with a single random byte added) with a
non-one-to-one compressor before encryption, there may be enough
information in the resulting file that only one key could have
created it. If this concept was over your head, I am not sure what
could convince you.
>
>That makes no sense. If your compression has bad statistical
>properties, it can add information to the file even if it is 1-1.
>Either you have the guts to admit that, or you don't.
I am not a shy person when it comes to admitting things.
Before I was married I loved fucking Nevada whores and
am proud of it. So this talk about guts is nonsense. My
Huffman compressor is not going to add information to
the file. There really is no question about it. It is a PURE
transformation of the file. No information is added or deleted,
since every file uniquely compresses to another file, and every
file uniquely decompresses to another file:
DeComp(Comp(X)) = X for all 8-bit based file systems
Comp(DeComp(Y)) = Y for all 8-bit based file systems
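The two identities above simply say that the compressor is a bijection on byte strings. A minimal sketch (not the scott19u Huffman coder; a keyed byte substitution stands in for the real transform, and it does not actually shrink anything) shows why both round trips must then hold for every file, including one produced by decrypting with a wrong key:

```python
import os
import random

# Stand-in for a one-to-one "compressor": any bijection on byte strings
# satisfies both identities quoted above.  A byte substitution table is
# the simplest such bijection (it performs no actual compression).
_rng = random.Random(19)          # fixed seed so both ends derive the same table
PERM = list(range(256))
_rng.shuffle(PERM)
INV = [0] * 256
for i, p in enumerate(PERM):
    INV[p] = i

def comp(data: bytes) -> bytes:
    return bytes(PERM[b] for b in data)

def decomp(data: bytes) -> bytes:
    return bytes(INV[b] for b in data)

# Every byte string decompresses without error, so the transform alone
# gives an analyst no "this key can't be right" test.
for _ in range(100):
    x = os.urandom(random.randrange(40))
    assert decomp(comp(x)) == x and comp(decomp(x)) == x
```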
Something your handlers might want you to discuss is
a compression routine, like the copy routine you are so proud of,
that is actually nothing more than a reverse of the encryption
process, so that when you apply the encryption the original plain
text comes out. You could just arbitrarily call the reverse
encryption a compression. Even in this case no information was
added, but even someone like you can see that it would be easier
to break.
>
>g.
>
>PS: In case I didn't make it sufficiently clear, I am a firm
>advocate of the benefits of using compression before encrypting.
>And, the idea of 1-1 compression is mathematically very
>interesting, at least in theory. But I do not think it is very
>practical. And finally, I am not interested in insults,
>so you can take it elsewhere.
>
Bullshit. If you were the firm advocate you pretend to be,
you would have a grasp of what one-to-one compression is doing.
Either you don't have the IQ to grasp it or you are putting out
disinformation. Which is it? I don't really mean to insult you;
maybe your IQ is higher than you show and you are just having a
bad day, which is an unlikely third possibility.
David A. Scott
------------------------------
From: [EMAIL PROTECTED]
Subject: Quantum Computer Simulators and Source Code
Date: Wed, 27 Oct 1999 19:03:45 GMT
http://www.dcs.ex.ac.uk/~jwallace/simtable.htm
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Unbiased One to One Compression
Date: Wed, 27 Oct 1999 20:38:45 GMT
In article <[EMAIL PROTECTED]>, "Trevor Jackson, III" <[EMAIL PROTECTED]>
wrote:
>
>Example: static Huffman compression against an alphabet consisting of three
>VHF characters amounting to 1/2, 1/4, and 1/4 of the message text, and 32
>other characters each amounting to 1/128 of the message text. Thus a
>"perfect" compression. Assume at the head of the compressed file you have a
>table describing the decompression tree -- the encoded frequency info.
This is not a perfect compression. It is only optimal under certain
narrow conditions, namely that all you know about the input is the
character frequencies. But even your example is flawed, since the
frequencies have to add up to one: 1/2 + 1/4 + 1/4 + 32(1/128) is larger
than one, so your example sucks. How about 1/2 + 1/4 + 32(1/128), which
at least adds up to one? Static Huffman compression is optimal in
certain cases, but if you know, for example, that the character that
appears half the time actually appears at every even location, you could
drop it altogether and add it back in during decompression; in that case
you don't need any bits in the main body to waste space on the character
that occurs half the time. So the word "perfect" is a mistake.
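The arithmetic in question can be checked exactly with rational numbers:

```python
from fractions import Fraction as F

# Trevor's stated model: 1/2, 1/4, 1/4, plus 32 characters at 1/128 each.
original = F(1, 2) + F(1, 4) + F(1, 4) + 32 * F(1, 128)
print(original)            # 5/4 -- exceeds 1, so not a valid distribution

# Dropping one 1/4 term, as suggested above, repairs it:
repaired = F(1, 2) + F(1, 4) + 32 * F(1, 128)
print(repaired)            # 1
```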
>
>You are free to build one of a large number of decompression trees, randomly
>choosing between them, without adding any extra bulk to the message.
>
>If the decompression tree is a priori known to the recipient, one has to count
>the decompressor as part of the message size, and the freedom to selecting bit
>encodings remains.
>
>Example: All of the text that I will type in the remainder of my lifetime is
>a fairly small sample. To compress it perfectly one could simply map each
>message onto a number and communicate the numbers (like early word processor
>form letters).
>
>Since I will send each message only once, any mapping of messages onto
>integers (bitstrings) is as optimal as another.
>
>Query: Where is the extra bulk in these examples?
>
>>
>>
>> : To insure that there is no residual confusion, ;et me state that any
>> : given compressor with the property you seek requires that it have a
>> : unique bidirectional mapping from plain to compressed text and back.
>> : But there is no unique compressor with that property for any
>> : probability model.
>>
>> I *completely* agree with this.
>>
>> However, I doubt that you can make "random" decisions while compressing
>> with an optimal compressor - *and* still expect a partner to
>> decompress it - without fattening up your file.
>
>Example, a simple-minded RLE form of compression always has a size-neutral
>entry. By this I mean that if a run is represented by a tuple
>char+runcode+run_len taking N bytes one can either encode runs of length N
>or ignore them. A program doing so randomly, encoding some and not encoding
>others, does not increase the size of the message.
Yes, but if it is random you are assuming the attacker does not know
it, so it should really be considered part of an encryption key. And
again you have the problem of actually creating a random number.
Clearly we are looking into the use of compression that compresses
the same way every time, since then it would not need this extra code
to obtain random data, which in itself is a major problem that can be
avoided by just using a one-to-one compressor in the first place.
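Trevor's size-neutral RLE entry is easy to sketch. This is a toy (it assumes the escape byte 0xFF never occurs in the input and caps runs at 255 bytes): a run of exactly 3 bytes costs 3 bytes whether it is emitted as a char+runcode+length tuple or left literal, so the encoder can flip a coin on those runs without changing the output size -- and, as noted above, that only helps if the coin flips are effectively part of the key.

```python
import random

RUNCODE = 0xFF     # escape byte; assumed absent from the input (toy)
TUPLE_LEN = 3      # char + runcode + run_len

def rle_encode(data: bytes, rng: random.Random) -> bytes:
    out = bytearray()
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i] and j - i < 255:
            j += 1
        run = j - i
        # Runs longer than the tuple always pay off; runs of exactly
        # TUPLE_LEN bytes are size-neutral, so choose randomly.
        if run > TUPLE_LEN or (run == TUPLE_LEN and rng.random() < 0.5):
            out += bytes([data[i], RUNCODE, run])   # encoded run
        else:
            out += data[i:j]                        # literal bytes
        i = j
    return bytes(out)

def rle_decode(enc: bytes) -> bytes:
    out = bytearray()
    i = 0
    while i < len(enc):
        if i + 2 < len(enc) and enc[i + 1] == RUNCODE:
            out += bytes([enc[i]]) * enc[i + 2]     # expand a run tuple
            i += 3
        else:
            out.append(enc[i])                      # literal byte
            i += 1
    return bytes(out)

data = b"xxxyyyyz"
e1 = rle_encode(data, random.Random(0))
e2 = rle_encode(data, random.Random(1))
assert rle_decode(e1) == rle_decode(e2) == data
assert len(e1) == len(e2)      # size-neutral regardless of the coin flips
```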
David A. Scott
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: the ACM full of Dolts?
Date: Wed, 27 Oct 1999 20:44:26 GMT
In article <7v7g1j$[EMAIL PROTECTED]>, "Roger Schlafly" <[EMAIL PROTECTED]> wrote:
>SCOTT19U.ZIP_GUY <[EMAIL PROTECTED]> wrote in message
>news:7v6ubj$jj2$[EMAIL PROTECTED]...
>> Below is a comment of why my article was rejected by the ACM.
>
>Don't feel too bad. The ACM rejected the first paper on public
>key cryptography many times. Meanwhile, it publishes a lot of
>garbage papers.
>
>
>
Well, I guess that is why I like the IEEE better.
David A. Scott
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Unbiased One to One Compression
Date: Wed, 27 Oct 1999 22:17:29 +0200
SCOTT19U.ZIP_GUY wrote:
>
> >A tiny remark: The one-to-one argument assumes that the analyst
> >knows and hence uses the same compression scheme as the sender.
> >How about weakening that assumption a bit? If one uses a dynamic
> >Huffman scheme, one can start out with an arbitrary initial frequency
> >distribution. If this is hidden from the analyst, he doesn't know
> >how to decompress/compress properly from the very beginning, let
> >alone to examine the one-to-one property. Of course, this means
> >one is effectively using more key bits. On the other hand, it does
> >mean that one can now explicitly prefix the ciphertext with a
> >length (in plaintext) giving the exact number of bits of the
> >ciphertext, which could be useful for ensuring correct transmission.
> Mok, I have mentioned this before. If you had been reading you would
> have noticed this possibility. However, I am trying to separate compression
> from encryption; it would be child's play to modify the compression
> routines to do this, but it complicates the discussion of compression
> as a first pass to encryption. If one has a close friend, one could
> easily do this. I would still use compression in both directions
> through the file so as to create an all-or-nothing effect. Note that if one
> does this, one has an encryption method that is very nonstandard and uses
> a much larger effective key than any of the short-keyed AES methods.
> A method like this would be unlikely to become standard, since the
> government is pushing shorter-keyed systems so the NSA can keep
> reading your mail. If people realized how compression programs
> could be modified to make for better crypto, the cat would be out of the
> bag and the NSA would have a much tougher job, especially if they
> used keyed one-to-one compression in series with even a weak AES
> method.
While I think that in principle one shouldn't count on any compression
scheme to provide any essential strength (for these are not designed
with the goals of an encryption scheme), making use of possibilities in
compression to confound the analyst appears not to be a bad idea.
(It is cheap, so why should one neglect it if it can help a little
bit?) There is no issue of standards here: adaptive Huffman is a
well-known technique. The simplest way to obtain an initial frequency
distribution is to prime the scheme with a piece of text, which can be
changed according to a schedule agreed upon by the communication
partners. The analyst, who can't guess the distribution, can't even
start to decompress or compress. He must try quite a lot, which may
render his work very time consuming or even futile.
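A rough sketch of the priming idea: an order-0 adaptive Huffman model, rebuilt from scratch after every symbol. This is far too slow for real use, but it shows that compressor and decompressor only agree when they share the same secret priming text (the priming strings below are of course made-up examples, not a recommended scheme):

```python
import heapq
from collections import Counter

def build_code(counts):
    """Huffman code for the current counts.  Ties in the heap are broken
    by the (sorted) symbol groups, so both parties derive the same code."""
    heap = [(c, [s]) for s, c in counts.items()]
    heapq.heapify(heap)
    code = {s: "" for s in counts}
    while len(heap) > 1:
        c1, g1 = heapq.heappop(heap)
        c2, g2 = heapq.heappop(heap)
        for s in g1:
            code[s] = "0" + code[s]
        for s in g2:
            code[s] = "1" + code[s]
        heapq.heappush(heap, (c1 + c2, sorted(g1 + g2)))
    return code

def initial_counts(priming: bytes):
    counts = Counter({s: 1 for s in range(256)})  # every byte stays codable
    counts.update(priming)                        # the secret head start
    return counts

def compress(data: bytes, priming: bytes) -> str:
    counts = initial_counts(priming)
    bits = []
    for b in data:
        bits.append(build_code(counts)[b])
        counts[b] += 1                            # adapt as we go
    return "".join(bits)

def decompress(bits: str, priming: bytes) -> bytes:
    counts = initial_counts(priming)
    out = bytearray()
    i = 0
    while i < len(bits):
        inv = {code: sym for sym, code in build_code(counts).items()}
        j = i + 1
        while j <= len(bits) and bits[i:j] not in inv:
            j += 1
        if bits[i:j] not in inv:
            break             # leftover bits form no codeword (wrong priming)
        sym = inv[bits[i:j]]
        out.append(sym)
        counts[sym] += 1
        i = j
    return bytes(out)

msg = b"attack at dawn"
key_text = b"the secret priming text"     # shared out of band (made up)
bits = compress(msg, key_text)
assert decompress(bits, key_text) == msg
# An analyst priming with the wrong text derives different codes and
# almost certainly recovers garbage:
wrong = decompress(bits, b"a wrong guess")
```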
M. K. Shen
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: Wed, 27 Oct 1999 20:05:21 GMT
In article <7v4tv7$pv5$[EMAIL PROTECTED]>,
[EMAIL PROTECTED] (David Wagner) wrote:
> In article <7utv1b$ldc$[EMAIL PROTECTED]>, <[EMAIL PROTECTED]>
wrote:
>>Take two ciphers A and B with keys of N bits. These ciphers must be
>>independent in the sense that physically breaking one does not help in
>>breaking the other. [...] For ciphers A and B
>>there exists then a method to break them at a cost comparable to 2^
>>(N*k) operations where k<1 (2^N corresponds to brute forcing them). If
>>you increase the key length by 1 bit, then you would need 2^((N+1)*k)
>>=2^N * 2^k operations to break A or B [...]
>
>Why do you assume that the complexity of attack is linear in the
>keylength? To put the question differently: why should we expect that
>increasing the key length will increase the complexity of attack
>proportionally?
>
>In most of the cryptanalytic (shortcut) attacks I can think of, the
>complexity of attack is essentially independent of the key length.
>
>Since the argument is predicated on this assumption, and since the
>assumption seems unlikely, I'm not sure how much we can really
>conclude from the argument. (Correct me if I'm misunderstanding.)
It seems to me that if an attack's complexity grows more slowly than
linearly in the key length, then the argument becomes stronger: the
method described would produce a stronger cipher even if that attack's
complexity did grow linearly. In other words, even if we assume that
the component ciphers are stronger than they probably are, we can
construct an even stronger cipher. A similar argument can be made for
the observation that the component ciphers will not really be equally
strong: assume that all are as strong as the strongest one and then
proceed to construct a cipher that is stronger still.
Formally we can express this as follows:
For all sets of ciphers and for key bit lengths N in the range
[Nmin,Nmax] there exists k<=1 such that the cost of breaking any one of
them is greater than or equal to 2^(k*N).
Nmin, Nmax is the fixed range of key lengths that interests us.
2^(Nmax-Nmin) must be greater than the number of ciphers in the set.
Let us now choose any two independent ciphers A and B and specify
N=Nmin and Nmax=Nmin+1. Then
cost(A,N) >= 2^(k*N)
cost(B,N) >= 2^(k*N)
cost(A,N+1) >= 2^(k*(N+1))
cost(B,N+1) >= 2^(k*(N+1))
Construct a cipher C with keys of N+1 bits and use the last bit of the
key to decide whether to apply A or B. An attacker must search A's key
space and, half the time, B's as well, so
cost(C,N+1) = cost(A,N) + cost(B,N)/2
            >= 3/2 * 2^(k*N)
            =  3/2 * 2^(-k) * 2^(k*(N+1))
If the bound is tight for A at N+1 bits, i.e. cost(A,N+1) = 2^(k*(N+1)),
then
3/2 * 2^(-k) > 1 => cost(C,N+1) > cost(A,N+1)
or:
k < ln(3/2)/ln(2) => cost(C,N+1) > cost(A,N+1)
So we have constructed a cipher C which is on average stronger than A
or B (as long as A and B are rather weak, i.e. k < 0.585).
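The inequality chain can be checked numerically, under the extra assumption that the bound is tight for A at N+1 bits, i.e. cost(A,N+1) = 2^(k*(N+1)) (N = 64 below is just an arbitrary key length):

```python
import math

N = 64                           # any key length in [Nmin, Nmax]

def cost_C(k: float) -> float:
    # search A's 2^(k*N) keys, plus B's half the time
    return 2 ** (k * N) + 2 ** (k * N) / 2

for k in (0.3, 0.5, 0.7):
    ratio = cost_C(k) / 2 ** (k * (N + 1))   # vs cost(A, N+1)
    print(f"k={k}: cost(C)/cost(A) = {ratio:.3f}")

# The ratio equals (3/2)*2^(-k), independent of N, and exceeds 1
# exactly when k < ln(3/2)/ln(2) ~= 0.585:
assert cost_C(0.5) > 2 ** (0.5 * (N + 1))    # weak ciphers: C is stronger
assert cost_C(0.7) < 2 ** (0.7 * (N + 1))    # strong ciphers: no gain
assert math.isclose(math.log(1.5) / math.log(2), 0.585, abs_tol=1e-3)
```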
If the set has 2^M ciphers then the cost of the attack is
2^(k*N)*(2^M+1). If M is large then no matter how weak the component
ciphers are (k->0), any attack against C will have complexity greater
than 2^(M-1).
I am not sure about the practical significance of this. Proving that a
huge set of ciphers is independent may be as difficult as proving that
a single cipher is strong. On the other hand, if we build a generator
that produces desperately different ciphers we may trust its
independence more than we now trust the strength of any conventional
design.
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: "Risks of Relying on Cryptography," Oct 99 CACM "Inside Risks" column
Date: Wed, 27 Oct 1999 20:13:08 GMT
In article <7v4vbm$q05$[EMAIL PROTECTED]>,
[EMAIL PROTECTED] (David Wagner) wrote:
> In article <7uvtj0$t7e$[EMAIL PROTECTED]>, <[EMAIL PROTECTED]>
wrote:
>>The best way I can imagine to achieve this is to create a cipher
>>generator: take a large number of bits out of the key and use them to
>>produce a cipher (code+tables). Use the rest of the key plus the
>>plaintext blocks to drive that cipher and produce the ciphertext.
>>
>>This is almost exactly what my Frog algorithm does. Only about 5.6% of
>>the key material is really used as key data. The rest is used for
>>building tables and for "expressing" code.
>
>The problem is that it is very challenging to ensure that all of the
>resulting ciphers truly are "independent" of each other.
Very challenging indeed. Any ideas about how it could be done?
>If they're not "independent", your argument for security breaks down.
>In particular, it might be that you could attack N "dependent" ciphers
>with not much more complexity than that required to attack 1 such
>cipher.
>
>I can think of some very natural examples of this phenomenon. FROG is
>one good example; (...)
What I wanted to say was that Frog can be viewed as a cipher generator
- not that it is a generator that produces independent ciphers. Sorry
for the ambiguity.
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************