Cryptography-Digest Digest #331, Volume #10 Wed, 29 Sep 99 16:13:05 EDT
Contents:
Re: Increasing password security dramatically without making it harder to remember
("Gary")
Re: public/private/session keys (jerome)
Q's about OFB mode in DES and Triple DES (Jon Pierre Fortney)
Re: msg for Dave Scott (JPeschel)
Re: Please review proposed rebuttal... (Bob Silverman)
Re: About differential cryptanalysis.... (jerome)
Re: review of peekboo please? (Medical Electronics Lab)
Re: NEMA, Swiss cipher machine (John Savard)
Re: Compress before Encryption (James Felling)
Re: Electronic envelopes (Tom St Denis)
Re: RSA-512: Weizmann Institute: London Times (Bill Unruh)
Re: msg for Dave Scott (SCOTT19U.ZIP_GUY)
Re: Perfect Shuffle Algorithm? ("Clive Tooth")
Re: Compress before Encryption ("karl malbrain")
Re: RSA Variation (John Bailey)
Re: Electronic envelopes (Mok-Kong Shen)
Re: About differential cryptanalysis.... (Tom St Denis)
Re: Requirement for Uniqueness in Decryption Keys (wtshaw)
Re: Compress before Encryption (Tom St Denis)
----------------------------------------------------------------------------
From: "Gary" <[EMAIL PROTECTED]>
Crossposted-To: alt.security.pgp,comp.security.pgp
Subject: Re: Increasing password security dramatically without making it harder to
remember
Date: Tue, 28 Sep 1999 14:08:51 +0100
By iterating the one-way secure hash (say SHA-1, 160 bits) of the password
(say originally 40 bits) a fixed number of times, chosen so that the
iteration takes about 10 seconds on an average machine, you force the
attacker to spend about 10 seconds per brute-force try. (2^40)*10 seconds
for all combinations is a long time compared to 2^40 plain computing
cycles. Remember, he will only know whether a guess is correct if the
iterated hash opens the encrypted file.
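A minimal sketch of the iteration described above, using SHA-1; the password, iteration count, and helper name are illustrative, and a deployed scheme would also salt the password before stretching it:

```python
import hashlib

def stretch(password: str, iterations: int) -> bytes:
    """Iterate SHA-1 so each brute-force guess costs `iterations` hashes."""
    digest = password.encode()
    for _ in range(iterations):
        digest = hashlib.sha1(digest).digest()
    return digest

key = stretch("correct horse", 100_000)
assert key == stretch("correct horse", 100_000)    # deterministic
assert key != stretch("correct horsf", 100_000)    # a wrong guess differs
```

The attacker cannot shortcut the loop, so each candidate password costs the full iteration count.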
Johnny Bravo wrote in message <[EMAIL PROTECTED]>...
> A weak key is going to stay a weak key, no matter what you do to it. It
is
>easier to choose a strong key in the first place. If you need to memorize
a
>bunch of strong keys for a number of applications, use one stronger key to
>protect all the others in something like PasswordSafe or in a ScramDisk
File.
>
> Johnny Bravo
>
A weak key does not have to stay weak!
See the pointers above from David Wagner.
------------------------------
From: [EMAIL PROTECTED] (jerome)
Subject: Re: public/private/session keys
Reply-To: [EMAIL PROTECTED]
Date: Wed, 29 Sep 1999 16:01:48 GMT
On Wed, 29 Sep 1999 11:38:42 GMT, Tom St Denis wrote:
>In article <[EMAIL PROTECTED]>,
> [EMAIL PROTECTED] wrote:
>> how can both parties have the same salt?
>> i wonder, because the salt depends on the current time.
>
>Don't quite follow you here. The SALT is sent in the clear.
I misunderstood that.
>SALT1 uses the
>time in seconds, and SALT2 uses the time in milliseconds since bootup. These
>times are used for variance when the content remains the same (i.e YES/NO
>replies)
>
>Tom
>
>
>Sent via Deja.com http://www.deja.com/
>Before you buy.
------------------------------
From: [EMAIL PROTECTED] (Jon Pierre Fortney)
Subject: Q's about OFB mode in DES and Triple DES
Date: Wed, 29 Sep 1999 18:15:01 GMT
I've got a question concerning the use of the OFB (output feedback) mode in
DES. Given a random initial vector, is the keystream generated by OFB
ultimately periodic? How would this be proved or disproved?
My first guess would be yes: after long enough we would eventually repeat an
input, after which point the stream would be periodic, but I know this is
probably very naive, and certainly not a mathematical proof.
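One way to see the answer: with full 64-bit feedback, the map x -> E_k(x) is a permutation of the block space, so the orbit of the IV is a pure cycle and the keystream is periodic from the very first block. A toy model makes this visible; the 12-bit "cipher" below is a stand-in random permutation, not DES:

```python
import random

def ofb_cycle_length(perm, iv):
    """Iterate x -> perm[x] starting from iv until iv recurs.
    Because perm is a bijection, the orbit of iv is a pure cycle,
    so the keystream is periodic from the first block onward."""
    x = perm[iv]
    length = 1
    while x != iv:
        x = perm[x]
        length += 1
    return length

random.seed(1)
n = 1 << 12                   # toy 12-bit block space
perm = list(range(n))
random.shuffle(perm)          # a random permutation stands in for E_k
period = ofb_cycle_length(perm, iv=0)
print(period)                 # some value between 1 and n
```

For a random permutation the expected cycle length through a given point is about half the block space, so for 64-bit full-feedback OFB one expects a period near 2^63. Feeding back fewer than 64 bits destroys the bijection, and the stream is then only eventually periodic, with much shorter expected cycles.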
I also have another question:
If we let |k1| = |k3| = 64 and |k2| = 56,
and we encrypt a message M by either of the following:
1) M ----> k1 XOR DES(M XOR k3, k2)
2) M ----> DES(M, k2) XOR k1
then we get encryptions with a nominal key length of 184 (or 120) bits
that are only marginally better than simply encrypting with DES(M, k2),
and much worse than using triple DES.
Any comments/ideas/proofs/pointers would be very much appreciated.
Thanks
Pierre Fortney
------------------------------
From: [EMAIL PROTECTED] (JPeschel)
Subject: Re: msg for Dave Scott
Date: 29 Sep 1999 16:03:34 GMT
> [EMAIL PROTECTED] (Patrick Juola) writes:
>In article <[EMAIL PROTECTED]>,
>JPeschel <[EMAIL PROTECTED]> wrote:
>>>[EMAIL PROTECTED] (Patrick Juola) writes:
>>
>>
>>>However, *because* DES has the reflexivity property, you should be able
>>>to find the proper key in an expected 2^54 operations, yes?
>>
>>Sorry, Patrick, can't say I've heard of it.
>
>Schneier, pg. 281 (2nd ed.) :
>
>"Take the bitwise complement of a key, that is, replace all the 0s with
>1s and the 1s with 0s. Now if the original key encrypts a block of
>plaintext, then the complement of the key will encrypt the complement
>of the plaintext block into the complement of the cyphertext block.
>
>If x' is the complement of x, then the identity is as follows :
>
> E_k(P) = C
> E_k'(P') = C'
>
>What this means is that a chosen-plaintext attack against DES only has
>to test half the possible keys : 2^55 instead of 2^56."
>
>And, of course, a brute-force search of the 2^55 keys is expected to
>find the answer halfway through, after 2^54 operations.
>
Thanks, Patrick, but this is the complementation property. I've never heard it
called reflexivity.
I don't think you can extend Schneier's assertion to a brute-force attack.
In a real brute-force search, for example, EFF.org expected to search
through at least 2^55 keys.
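As a check, the complementation identity quoted above can be verified on a toy Feistel cipher that shares DES's XOR-the-subkey structure; the round function and rotating key schedule below are illustrative stand-ins, not DES itself:

```python
MASK8 = 0xFF

def f(x):
    # any fixed round function works; the complementation property
    # comes only from XORing the subkey into the round-function input
    return (x * 167 + 13) & MASK8

def rot(k, i):
    # key schedule = bit rotation, which commutes with complementation
    i %= 8
    return ((k << i) | (k >> (8 - i))) & MASK8 if i else k

def encrypt(p, key, rounds=8):
    L, R = (p >> 8) & MASK8, p & MASK8
    for i in range(rounds):
        k_i = rot(key, i)
        L, R = R, L ^ f(R ^ k_i)     # Feistel round
    return (L << 8) | R

comp16 = lambda x: x ^ 0xFFFF        # complement of a 16-bit block
comp8  = lambda k: k ^ 0xFF          # complement of an 8-bit key

# E_k(P) = C  implies  E_{k'}(P') = C'
for key in (0x3C, 0xA7):
    for p in (0x1234, 0xBEEF):
        assert encrypt(comp16(p), comp8(key)) == comp16(encrypt(p, key))
```

The property holds because R' XOR k' = R XOR k when both are complemented, so the round-function output is unchanged while both halves stay complemented.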
Joe
__________________________________________
Joe Peschel
D.O.E. SysWorks
http://members.aol.com/jpeschel/index.htm
__________________________________________
------------------------------
From: Bob Silverman <[EMAIL PROTECTED]>
Subject: Re: Please review proposed rebuttal...
Date: Wed, 29 Sep 1999 17:19:46 GMT
In article <7st8sf$55u$[EMAIL PROTECTED]>,
Don Leclair <[EMAIL PROTECTED]> wrote:
> > One needs a very large Cray-class machine
> > to deal with the matrix. (fairly big bucks,
> > i.e. $5-10 Million)
>
> Question:
>
> I understand that the block Lanczos algorithm is now the optimal
method
> for the matrix step but can structured Gaussian elimination still be
> useful for reducing the initial matrix to a more reasonable size?
No.
You would start with a very sparse large matrix (e.g. 12 million
rows, 60 entries/row) then reduce down to (say) 4 million rows.
But the matrix would become VERY dense (close to 50%), so it would
actually require a lot MORE storage than the original one.
Further, it would take MUCH longer to solve. Block Lanczos runs
in time O(n^2 d), where d is the number of entries per row and n the
number of rows.
(12M)^2 * 60 is a lot less than (4M)^2 * (2M).
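The operation-count comparison can be checked directly, using the figures given above:

```python
# rough block Lanczos cost, time ~ n^2 * d
sparse = (12_000_000 ** 2) * 60          # original: 12M rows, 60 entries/row
dense  = (4_000_000 ** 2) * 2_000_000    # reduced: 4M rows, ~50% dense
print(dense / sparse)                    # ≈ 3704: the "smaller" matrix costs far more
```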
>
> I have an idea, that I would be happy to share, for a distributed
> implementation of structured Gaussian elimination.
A tightly coupled parallel version has already been done (by me).
It isn't terribly useful now.
--
Bob Silverman
"You can lead a horse's ass to knowledge, but you can't make him think"
------------------------------
From: [EMAIL PROTECTED] (jerome)
Subject: Re: About differential cryptanalysis....
Reply-To: [EMAIL PROTECTED]
Date: Wed, 29 Sep 1999 16:08:11 GMT
On Wed, 29 Sep 1999 21:34:50 +0800, OTTO wrote:
>Hi,
>
> I want to know more details of differential cryptanalysis.
> Who can send me the related documents?
Shamir & Biham describe it in technical reports which are available
online; check www.counterpane.com, which has a very good collection
of papers.
Biham & Shamir also wrote a book about differential cryptanalysis of DES
(but it is out of print, as far as I know).
BTW, I have never read the book and would like to know whether the
technical reports contain the same information or are a 'light' version
of the book.
------------------------------
From: Medical Electronics Lab <[EMAIL PROTECTED]>
Subject: Re: review of peekboo please?
Date: Wed, 29 Sep 1999 12:19:57 -0500
Tom St Denis wrote:
>
> Anything I could make more user attractive?
A couple of things would be nice for me: knowing that something is
running (a blinking box in a corner during key generation, for
example) and seeing which keys I've got set up for the clipboard
(like "my key" and "their key" windows on the main screen).
I think working from the clipboard is very nice, but it's hard to
tell whether something has been done or not. Some way to mark that
there's new, or stale, data in the clipboard might be nice. Not sure
it's possible, though.
Patience, persistence, truth,
Dr. mike
------------------------------
From: [EMAIL PROTECTED] (John Savard)
Subject: Re: NEMA, Swiss cipher machine
Date: Tue, 28 Sep 1999 13:31:09 GMT
[EMAIL PROTECTED] (John Savard) wrote, in part:
>move). While the machine has a large number of initial settings, that
>means its period is only 676, but I suppose that was considered
Oops. A ring and rotor pair has a period of 676. With the red ring,
the period is 15,756, which is quite adequate.
John Savard ( teneerf<- )
http://www.ecn.ab.ca/~jsavard/crypto.htm
------------------------------
From: James Felling <[EMAIL PROTECTED]>
Subject: Re: Compress before Encryption
Date: Wed, 29 Sep 1999 11:22:20 -0500
1-1 compression algorithms are rare because:
1) They are computationally more demanding than non-1-1 compression
(catching the special cases).
2) They are not robust as to error detection: a problem in a compressed
file will not be detected until use of that data is attempted. As a user
of compression products for archiving and data transmission, I would
rather have the ability to easily detect such errors.
3) They are more demanding to program than non-1-1 compressions (those
special cases again).
4) Their efficiency in compressing data is not appreciably different
from non-1-1 methods.
5) Generally a 1-1 compressor will have a larger code footprint than a
non-1-1 compressor.
So it becomes difficult to justify writing or using them except for
special-purpose applications.
I can see some advantages to them for some crypto applications, but in
general I feel that the payback does not justify the associated
downsides. A "headerless" compression which is not 1-1, but is very
efficient at reducing file size, will in my opinion give all the
benefits of a 1-1 compression except for the possibly faster discarding
of non-working keys by an adversary. Moreover, a highly efficient
compression should result in a very large ratio of
(number of possible compressed files)/(number of impossible compressed files).
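A minimal illustration of point 2, and of why ordinary compressed formats are far from 1-1: almost no byte string parses as a valid compressed stream, so corruption (and, in the crypto setting, a wrong key) is detected immediately. A sketch using Python's zlib as a stand-in for any headered format:

```python
import os
import zlib

# count how many random 64-byte strings parse as valid zlib streams
valid = 0
for _ in range(1000):
    try:
        zlib.decompress(os.urandom(64))
        valid += 1
    except zlib.error:
        pass
print(valid)   # almost certainly 0: the format rejects nearly everything
```

With a true 1-1 (bijective) compressor, every byte string would decompress to something, so this rejection test would give an attacker nothing.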
Tim Tyler wrote:
> SCOTT19U.ZIP_GUY <[EMAIL PROTECTED]> wrote:
>
> ["one to one" compression that never gives decompression errors]
>
> : Can anyone (but Tommy) speculate as to why they are not common,
> : and as to why the phony crypto gods, even in their books, give zero
> : space to this topic.
>
> I find it rather hard to believe that people have not even considered
> the subject before now.
>
> Since so much crypto-research goes on behind closed doors, it may
> eventually emerge (when the technique becomes more widespread), that
> it was in fact invented years ago - but never escaped from government
> custody.
> --
> __________
> |im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
>
> It's not hard to meet expenses - they're everywhere.
------------------------------
From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: Electronic envelopes
Date: Tue, 28 Sep 1999 13:20:48 GMT
In article <[EMAIL PROTECTED]>,
Art Dardia <[EMAIL PROTECTED]> wrote:
> You have to assume that if you're using cryptography to keep the
> envelope sealed until the year 2020, as given in your example, then by
> the year 2010 processors should be powerful enough to crack it a lot
> quicker than you'd want them to. You'd need to use a disgustingly
> large key size.
>
> Art Dardia
>
> PS - I'm a newbie. My posts may mean nothing.
Well, you have to think reasonably. Even in the year 2020 I don't think
128-bit keys will be searchable the way 56-bit keys are now (56-bit keys
are considered small). A 128-bit keyspace is 2^72, or about 4.7e+21,
times larger than a 56-bit one. Assuming you can search a billion keys a
second, it will still take on average about 5.4e+21 years to find the
key. Most computers now can only search around half a million keys a
second, and the doubling-every-18-months trend predicts only about
2^(20*12/18), or roughly 10,000 times, improvement in 20 years, which
still leaves 128-bit keys far out of reach.
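The arithmetic can be checked directly, assuming a rate of 10^9 keys per second:

```python
ratio = 2 ** 128 // 2 ** 56              # keyspace ratio: 2^72
avg_keys = 2 ** 127                      # expect to search half the keyspace
years = avg_keys / 1e9 / (3600 * 24 * 365.25)
print(f"{ratio:.3e}")                    # ≈ 4.722e+21
print(f"{years:.3e}")                    # ≈ 5.39e+21 years at 1e9 keys/s
```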
Tom
------------------------------
From: [EMAIL PROTECTED] (Bill Unruh)
Subject: Re: RSA-512: Weizmann Institute: London Times
Date: 29 Sep 1999 18:39:27 GMT
In <[EMAIL PROTECTED]> [EMAIL PROTECTED] writes:
> After an Israeli research institute said it could break Europe's
> banking codes in less than a second, an initiative has been launched
> that could result in unbreakable codes.
...
> the system used by the European banking system. It claims it has
> developed a hand-held device that can break the code in
> 12 microseconds....
I guess in the Jewish calendar, April 1 falls on a different day.
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: msg for Dave Scott
Date: Wed, 29 Sep 1999 18:41:16 GMT
In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED]
(JPeschel) wrote:
>>[EMAIL PROTECTED] (Patrick Juola) writes:
>
>
>>However, *because* DES has the reflexivity property, you should be able
>>to find the proper key in an expected 2^54 operations, yes?
>
>Sorry, Patrick, can't say I've heard of it.
>
>Joe
>__________________________________________
Joe, it just means that with careful observation, when you test one key
you automatically test another key. In DES, if you use the ones-complement
of a key you get the ones-complement of the block out (given complemented
plaintext). This reduces the number of full keys that need to be searched
by a factor of 2. I am sure the NSA knew this when they designed the
cipher; it just makes it easier for them to recover the data if someone
is dumb enough to use DES. One could then actually say DES is based on a
55-bit key instead of 56. There may be other automatic features that cut
this search down even more. I thought I read 10 years ago that there
were other reductions, but I'm not 100% sure about that. However, I would
hope the people who coded the EFF machine were smart enough to check into
this sort of thing.
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
http://members.xoom.com/ecil/index.htm
NOTE EMAIL address is for SPAMERS
------------------------------
From: "Clive Tooth" <[EMAIL PROTECTED]>
Crossposted-To: sci.stat.math,sci.math
Subject: Re: Perfect Shuffle Algorithm?
Date: Tue, 28 Sep 1999 14:46:59 +0100
I wrote in message <7sqato$33e$[EMAIL PROTECTED]>...
>Hmmm... seems like 97,020 to me. Oh well...
There appear to be four cycles:
Length
Card 1: 735
Card 3: 77
Card 9: 180
Card 19: 9
-----
Total: 1,001
-----
lcm=97,020
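The order of the shuffle is the least common multiple of its cycle lengths; a quick check of the figures above:

```python
from functools import reduce
from math import gcd

cycles = [735, 77, 180, 9]          # cycle lengths found above
assert sum(cycles) == 1001          # every card sits in exactly one cycle

order = reduce(lambda a, b: a * b // gcd(a, b), cycles)
print(order)                        # 97020
```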
--
Clive Tooth
http://www.pisquaredoversix.force9.co.uk/
End of document
------------------------------
Reply-To: "karl malbrain" <[EMAIL PROTECTED]>
From: "karl malbrain" <[EMAIL PROTECTED]>
Subject: Re: Compress before Encryption
Date: Wed, 29 Sep 1999 11:56:40 -0700
SCOTT19U.ZIP_GUY <[EMAIL PROTECTED]> wrote in message
news:7stk34$2oik$[EMAIL PROTECTED]...
(...)
> This whole thread was around the concept of "if one is using
> compression" before one encrypts, then what kind of characteristics
> would be desirable in the compression method so that you're not giving
> information to an attacker in solving or breaking the message. The
> problem with most compression methods is that they can't treat an
> arbitrary file of bits as a valid compressed file that, when
> uncompressed and recompressed, gives you back the same file.
What you say here is BIPOLAR in relation to before and/or after compression.
No wonder you're having trouble making this DIFFERENTIATION stick.
> And besides, you may lack the intelligence to realize that one could
> still guess a key and only look at the last half of the file, minus a
> few blocks, to see if the header is there. We are making the assumption
> the attacker may not know what your file is, but he knows what you are
> doing with it.
Most people are unable to GAUGE their own intelligence. So what you say
here is called a TRUISM.
Karl M
------------------------------
From: [EMAIL PROTECTED] (John Bailey)
Subject: Re: RSA Variation
Date: Wed, 29 Sep 1999 18:12:33 GMT
On Wed, 29 Sep 1999 10:21:25 +0100, "Gary Partis"
<[EMAIL PROTECTED]> wrote:
>Hi,
>
>We have inherited a sub-set of RSA in an embedded system.
>What we do not have is an algorithm for generating keys - the standard
>algorithm does not work! :-(
>Has any one come across this subset before, and if so, do they have the
>algorithm for generating keys?
Not THIS subset, but depending on what you know, you can possibly
adapt the script at:
http://www.frontiernet.net/~jmb184/interests/cryptography/old_trunk/PAIRMKR.BC
This uses the Unix utility BC for unlimited-precision decimal
generation of matched keys. If you don't have BC, it may nonetheless
be running on your ISP's server, if it's running Unix. Mine does, but
I never dared use it for heavy math manipulation. BC does provide
built-in conversion for other bases, but my application was base 10.
The script is sparsely commented but may be obvious, depending on your
point of departure, computationally speaking. It does use one
number-theory trick, relying on a property of phi(n), the originality
of which I never determined.
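For readers without BC, the same matched-key computation (textbook RSA via phi(n)) can be sketched in a few lines. This is not the poster's PAIRMKR.BC script; the primes and exponent below are illustrative toys, and real keys need large random primes and padding:

```python
from math import gcd

p, q = 10007, 10009                 # toy primes for demonstration only
n = p * q
phi = (p - 1) * (q - 1)             # Euler's phi(n) for n = p*q
e = 65537                           # common public exponent
assert gcd(e, phi) == 1
d = pow(e, -1, phi)                 # private exponent: e*d = 1 (mod phi(n))

m = 42
c = pow(m, e, n)                    # encrypt with the public key (e, n)
assert pow(c, d, n) == m            # the matched private key recovers m
```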
Feel free to move around in the directory to see other scraps I may
have forgotten to mention. Also a listing of a series of pairs
generated is there.
Please let me know if this helps.
John
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Electronic envelopes
Date: Wed, 29 Sep 1999 20:58:28 +0200
Richard Parker schrieb:
>
> Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
> > Is it the case that the notary doesn't have to run ANY computing
> > process between time 0 (when the envelope is deposited) and time
> > T (when the enveloped is opened) and only one very short-durated
> > computing process at time 0 and T?
>
> Yes - actually, the notary does even less computing than that. The
> notary does not have to do any cryptographic computation when each
> envelope is deposited. The notary does, however, have to perform key
> generation some time before he can begin to accept any envelopes to be
> deposited.
>
> > Allow me an additional question: Could a person at the server
> > collaborate with the notary to decrypt the message before T or else
> > render a certain set of messages encrypted with the help of the server
> > to be earlier decryptable?
>
> Yes, a malicious time server could collude with a malicious notary to
> open Alice's envelope before the time she specified. If this is a
> concern, use a threshold secret-sharing scheme in addition to the
> time-release protocol.
I haven't yet studied the references you gave, but I surmise that
one of the possibilities could be temporarily setting forward the
system clock at both sites. Using secret sharing, if I don't err,
greatly reduces the chance of collusion, though it does not reduce it
to zero. Actually, however, if I were the depositor of the secret,
I would most likely choose the notaries carefully and would
personally rely on their being honest. However, the bigger problem
seems to be how the public (those who have a certain interest in
matters connected with the secret) can be adequately convinced that
there is indeed no foul play.
It has been my impression so far that, in general, for issues of this
or a similar nature to be handled by software alone, at some point one
ultimately needs a trusted person or agency, one that is 'believed' by
all parties involved to be infallible (without necessarily being
actually infallible) and always available. This is true for
e.g. e-commerce, I believe.
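The secret-sharing countermeasure mentioned above can be illustrated in its simplest 2-of-2 XOR form; Shamir's scheme generalizes this to arbitrary k-of-n thresholds. A hypothetical sketch, with illustrative function names:

```python
import os

def split(secret: bytes):
    """2-of-2 XOR sharing: either share alone is uniformly random,
    so one malicious party learns nothing about the secret."""
    share1 = os.urandom(len(secret))
    share2 = bytes(a ^ b for a, b in zip(secret, share1))
    return share1, share2

def combine(share1: bytes, share2: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share1, share2))

s1, s2 = split(b"open at time T")
assert combine(s1, s2) == b"open at time T"
```

Only the two shares together reveal the envelope's contents, so the notary and the time server must both defect before time T for the attack to work.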
M. K. Shen
===============================
http://home.t-online.de/home/mok-kong.shen
------------------------------
From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: About differential cryptanalysis....
Date: Wed, 29 Sep 1999 18:14:33 GMT
In article <[EMAIL PROTECTED]>,
OTTO <[EMAIL PROTECTED]> wrote:
> Hi,
>
> I want to know more details of differential cryptanalysis.
> Who can send me the related documents?
>
> Thanks....
It depends on the system. I have many papers on cryptanalysis of block
ciphers; just email me ([EMAIL PROTECTED]) and I can send some.
tom
------------------------------
From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: Requirement for Uniqueness in Decryption Keys
Date: Wed, 29 Sep 1999 12:27:35 -0600
In article <8E505DBFFrcarbol@news>, Roger Carbol <[EMAIL PROTECTED]> wrote:
>
> The passage I was having difficulty with is:
>
> >>In the Handbook of Applied Cryptography:
> >>An encryption scheme consists of a set of encryption
> >>transformations and a corresponding set of decryption
> >>transformations with the property that for each e there
> >>is a unique key d such that
> >>[d decrypts the message that e encrypted.]
>
> This does *not* discuss what wtshaw is discussing, which is
> mapping one plain text message M to one or more ciphertext
> messages C. Rather, the passage seems to claim that
>
> For every {M,C} there exists exactly one unique decipher
> key d such that d deciphers C back into M.
I thought it important to mention the subtle difference between the
decrypted material and the decryption process. Otherwise, I would
agree that the handbook appears to be correct for well-designed
inductive/deductive algorithms. But for some algorithms it is hard to
quantify an instruction like, "Pick out the words that seem to make
sense."
>
> I am merely wondering if the Handbook is correct in this
> case -- is it really necessary that d be unique? Or is
> it true in the more general case that there may exist more
> than one d which deciphers C back into M?
Otherwise, a specific decryption key need not be allowed in cyclic
algorithms, as some number x of chained *encryptions* might return you
to plaintext. The difference here is that if the decryption process is
merely an inversion of encryption, the mathematics of one really
contains the essence of the other.
There are certain algorithms that do not even behave in the same way,
what I would call imperfect cyclics, and I have seen series that do the
strange thing of entering some sort of loop at some point; really weird
and unexpected stuff.
--
Still a good idea from Einstein: If you can't explain something clearly to a child,
you do not understand it well enough.
So much for models of trust, they generally are ill-founded.
------------------------------
From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: Compress before Encryption
Date: Wed, 29 Sep 1999 18:12:55 GMT
In article <7st5oh$c9m$[EMAIL PROTECTED]>,
[EMAIL PROTECTED] (SCOTT19U.ZIP_GUY) wrote:
> In article <7sstjs$snq$[EMAIL PROTECTED]>, Tom St Denis <[EMAIL PROTECTED]>
>wrote:
> >In article <[EMAIL PROTECTED]>,
> > [EMAIL PROTECTED] wrote:
> >> Tom St Denis <[EMAIL PROTECTED]> wrote:
> >> : [EMAIL PROTECTED] wrote:
> >>
> >> :> Hasn't this holy war gone on long enough? When do you ever quit?
> >>
> >> : What I don't get is almost EVERYONE agrees that compression before
> >> : encryption is a good idea. So why is he carrying this on?
> >>
> >> Perhaps because he has recently developed the only vaugely sensible
> >> compression product targetted at encrypted data that anyone knows of?
> >>
> >> As he mentions, using an unsuitable compression routine may actually
> >> weaken the encryption.
> >
> >But if the compression is 'weak' cryptographically that means it's bad
> >compressionally as well.
> >
> >Take LZSS for example (or LZSS+Huffman, which a lot of people use).
> >You can just run it on the data without putting headers or anything,
> >and ten bucks says it will compress 30% better than Huffman alone. Or
> >DEFLATE, for example, is a very finely tuned algorithm. Since its
> >output is highly efficient, predicting the output is as easy as
> >predicting its input.
> >
> >I personally don't see where this comes in. Obviously encrypting known
> >headers is a bad idea. No matter what compression you use, if you
> >encrypt the header you are gonna give out known plaintext, whether you
> >use LZSS, DEFLATE or his Huffman coder. Possibly his most flawed line
> >of thinking is that a compression routine doesn't output headers. The
> >program does. For example you can LZSS a 10kb buffer and have zero
> >bytes of headers or anything (well, maybe 4 bytes for the compressed
> >size). Nuff rambling. Show some proof and I will follow.
> >
> >Tom
> >
> The problem, Tom, is you're too stupid to understand proof. The point
> is I have stated how one can test for "one to one" compression and
> you're too damn lazy to think. I started with adaptive Huffman
> compression because it was the easiest to make "one to one", but the
> concept is beyond your small brain.
> I am currently testing a Huffman with RLE and a limited LZSS
> capability. I have been told that the second may violate patents. As
> for the other, I'm waiting for the status of a paper.
> Yes, I tried to write a paper for the ACM. I try about once a year to
> write a paper, but of course most, like the AES thing, are really
> closed. The AES group did not really want good ciphers, since the NSA
> candidate will obviously win. The ACM is taking HTML text. Of course I
> really don't expect to get it published. I just wish I lived near a
> friend of mine who used to write. We could send two papers; he would
> write both but change the names, and guess what, the one with my name
> will not make it. But at least I try, and I know the game is corrupt.
> And brain-dead, I would not have entered scott16u or scott19u; they
> are too secure, and those are not what I would have entered for AES,
> since file security for transfers was not the real criterion.
> For those of you not following this thread, or for Tommy, since his
> brain can't seem to retain anything that a crypto god has hand-fed
> him: if you use a compression/decompression that can treat any file as
> a valid compressed file, then it is safe to use on your data before
> you encrypt. Other methods, and as far as I can tell that is
> everything out there at this time but my adaptive Huffman coder (the
> proof that a static Huffman with a table in front can't was posted
> here earlier), lack this feature. Why they lack this feature I do not
> know. But it is easy to test whether this feature is present. Of
> course Tommy is too stupid to even test a method for this.
> The problem with a non-one-to-one compression is that you can send a
> file that is completely unknown, but if you go through a standard
> compression routine and then encrypt, it may be possible that the only
> valid compressed file that can come out is the one you compressed with
> the bad compression method. This would not happen if you used a
> one-to-one compression in the first place. But Tommy acts like he is
> too stupid to understand this. If he is this stupid, I would not have
> much faith in his program. But the game in crypto is not to give hooks
> for the attacker to use. Why this topic is not a hot topic in the
> books makes me think that the NSA does not want people to use
> compression for a first pass that would make their job harder. And
> while we are at it, a reverse Huffman pass is not a bad idea either.
1) First off, LZSS is not patented. 2) LZSS can treat anything as valid
input and produce crap output. Only dictionary methods like LZW have
trouble, because the output is codes/literals which could point to empty
strings. At any rate, it is possible to allow any LZ method to decompress
anything as long as you work around things like null strings. 3) That is
not a requirement in your view, since even if your magic will decompress
anything, IT WILL NOT DECOMPRESS ANYTHING INTO VALID ASCII TEXT, which
can be used to detect successful decryptions. 4) See #2. 5) Actually read
#2. 6) Make sure you understand #2. 7) Good, let's carry on with life?
Hey Dave, if you have any papers lying around I wouldn't mind reading
them. I would like to see what the ACM/AES turned down (if there really
is anything at all).
Please drop this insane, benign argument. You talk the talk but you
never walk the walk. You say a lot is weak; why not prove it? Your
insane arguments are not proof. I would like to see a deterministic
process for breaking modern cryptosystems like PGP (maybe even Peekboo)
that use 'weak' deflate code (well, Peekboo doesn't use compression, but
I would like to see you break a 'kid's' cryptosystem, since I am really
stupid...).
Anyway, seriously, can we drop this? Please? I am begging you... just
don't reply to this message and NEVER ever start another
compress/encrypt thread...
Tom
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************