Cryptography-Digest Digest #353, Volume #10 Sat, 2 Oct 99 15:13:03 EDT
Contents:
Re: Addition/subtraction mod 256 vs. XOR (Mike DeTuri)
Re: Addition/subtraction mod 256 vs. XOR (Mike DeTuri)
Re: Compress before Encryption (Tim Tyler)
Re: Compress before Encryption (Tim Tyler)
Re: Compress before Encryption (Tim Tyler)
Announcement of results (E C D L Account)
Re: Quantum Cryptography (Tim Tyler)
Re: msg for Dave Scott (Tim Tyler)
Re: Cryptanalysis of 2 key TDES (fungus)
Re: FBI issues warrant for Alice & Bob ("Microsoft Mail Server")
Re: Ciphers Categorized on Web Site
FBI issues warrant for Alice & Bob ("Goyra")
Re: Paper announcement (Tom Knight)
Re: Announcement of results (Bill Unruh)
Re: Compress before Encryption (SCOTT19U.ZIP_GUY)
----------------------------------------------------------------------------
From: [EMAIL PROTECTED] (Mike DeTuri)
Subject: Re: Addition/subtraction mod 256 vs. XOR
Date: Sat, 02 Oct 1999 14:17:54 GMT
On Sat, 02 Oct 1999 12:08:00 GMT, Tom St Denis
<[EMAIL PROTECTED]> wrote:
>The problem there is you have to program a decryption routine. A novel
>feature of RC4 is you only need two functions to use it (one to make the key,
>one to encrypt/decrypt). Why make the code harder?
Doing addition mod 256 would make the code harder, but I was wondering
if this would be of any benefit...
I wasn't really all that clear, sorry. I was thinking of add/subtract
mod 256 at the end of the loop instead of XORing the RC4 byte with the
plaintext. Now you've got me wondering if it would be beneficial to
add/subtract mod 256 throughout, replacing all the XORs. I tend to
think it wouldn't make any difference and would just make the code
needlessly complex, but I'm not a pro so I figured I would ask. :-)
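To make the question concrete, here is a rough, untested sketch in
Python of standard RC4 with only the final combining step changed from
XOR to addition/subtraction mod 256 (names are mine):

def rc4_keystream(key):
    # Standard RC4: key-scheduling algorithm (KSA), then output generator (PRGA).
    S = list(range(256))
    j = 0
    for i in range(256):                              # KSA
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    i = j = 0
    while True:                                       # PRGA
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        yield S[(S[i] + S[j]) % 256]

def encrypt(key, plaintext):
    # Combine keystream and plaintext with addition mod 256 instead of XOR.
    return bytes((p + k) % 256 for p, k in zip(plaintext, rc4_keystream(key)))

def decrypt(key, ciphertext):
    # The cost: decryption is no longer the same routine (it must subtract).
    return bytes((c - k) % 256 for c, k in zip(ciphertext, rc4_keystream(key)))

With XOR a single routine does both jobs; with add/subtract you need
the pair, which I guess is Tom's point about making the code harder.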
Mike
------------------------------
From: [EMAIL PROTECTED] (Mike DeTuri)
Subject: Re: Addition/subtraction mod 256 vs. XOR
Date: Sat, 02 Oct 1999 14:24:59 GMT
On Sat, 02 Oct 1999 14:17:54 GMT, [EMAIL PROTECTED] (Mike DeTuri)
wrote:
>I wasn't really all that clear, sorry. I was thinking of add/subtract
>mod 256 at the end of the loop instead of XORing the RC4 byte with the
>plaintext. Now you've got me wondering if it would be beneficial to
>add/subtract mod 256 throughout, replacing all the XORs. I tend to
>think it wouldn't make any difference and would just make the code
>needlessly complex, but I'm not a pro so I figured I would ask. :-)
Sorry again. Way too early in the morning. I just remembered that
there are no XORs in RC4 other than the one at the end that produces
the ciphertext byte.
Mike
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Compress before Encryption
Reply-To: [EMAIL PROTECTED]
Date: Sat, 2 Oct 1999 14:28:31 GMT
Douglas A. Gwyn <[EMAIL PROTECTED]> wrote:
> "One to one mapping" (injection) is a standard mathematical term,
> typically taught in high school or even earlier. It means that
> the mapping takes distinct elements in the domain to distinct
> elements in the range. Thus, a "one to one compression" would
> merely produce different compressed outputs for different
> inputs. But *all* lossless compression schemes have
> this property.
You [Douglas] are taking the view of the compression program, where the
domain is the original file and the range is the compressed version.
By simply taking the perspective of the decompression program, where
the domain is the compressed file and the range is the expanded text,
you will find that David's one-on-one terminology suddenly makes perfect
sense.
I'd agree that some term, such as "bijective compression", would eliminate
any ambiguity - but "bijective" is not a terribly tongue-friendly word ;-)
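To make the two perspectives concrete, here is a tiny Python
illustration, with zlib standing in purely for a conventional
(non-bijective) compressor:

import os
import zlib

data = b"attack at dawn"

# From the compressor's side every lossless scheme is one-to-one
# (injective): distinct inputs give distinct outputs, and decompression
# recovers the original exactly.
assert zlib.decompress(zlib.compress(data)) == data

# From the decompressor's side it is not a bijection: most byte strings
# are simply not valid compressed files, so random input almost always
# produces an error rather than *some* plaintext.
try:
    zlib.decompress(os.urandom(32))
except zlib.error as err:
    print("not a valid zlib stream:", err)

A compressor that is bijective in this second sense - every byte string
decompresses to something, and compresses back to exactly itself - is
what the "one-on-one" label is getting at.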
[sorry about this article's threading]
--
__________
|im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
Archimedes had no principles.
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Compress before Encryption
Reply-To: [EMAIL PROTECTED]
Date: Sat, 2 Oct 1999 15:23:37 GMT
Douglas A. Gwyn <[EMAIL PROTECTED]> wrote:
: Tim Tyler wrote:
:> Indeed. However, is precompression that lacks error correction - and
:> consequently offers the attacker no clue that they have found a
:> correct key - ever mentioned?
: Why should it be?
It should be mentioned in such texts because offering the opponent
clues as to when they have found a valid compressed file is surely a
bad idea. This doesn't seem to have been properly appreciated until
recently.
: Data compression in general doesn't incorporate any error
: correction; that can be layered on separately in the comm
: system if desired, but is irrelevant to compression.
Data compression *very* frequently incorporates error *detection* -
which is almost as bad from a cryptographic point of view. Neither I,
nor D. Scott, nor - as far as we can tell - anyone else reading this
forum knows of a single other decompression product which will always
produce a unique decompressed file from its input and never report an
error.
[snip one-on-one as this has been covered elsewhere]
:> : I think the plain fact is that the small amount of structure
:> : from the compression header is generally not considered sufficient
:> : to exploit, given that a cipher is already infeasible to brute-force
:> : even in a known-plaintext attack. [...]
:> However, why have less security than is easily possible? Who are
:> you to say that the system is /so/ secure that adding more security is
:> pointless?
: Who are *you* to put words into my mouth? I said no such thing.
D. Scott's method increases the security of the system. You don't
seem to approve of it. I don't see what your problem is.
:> Squelching the source statistics is desirable; but simultaneously
:> introducing an insecurity whereby an automated search through the
:> space of the keys has an increased probability of identifying the
:> plaintext automatically is surely something to be avoided.
: Now, *that* is pointless, because as I said nobody would employ
: a cryptosystem where that was a feasible attack to begin with.
What?
Say Operator Y uses his thumbscrews on The Evil Assassin's assistant.
The assistant is half-way through divulging his key when the Assassin
himself realizes what is going on and assassinates him.
Suddenly you have half the key - and have reduced the keyspace by a
*huge* factor (knowing, say, 64 bits of a 128-bit key cuts the search
from 2^128 to 2^64 trials). Suddenly a brute-force search through the
keyspace becomes practical and - joy of joys - the compression routine
is known to leave clues about its activities carelessly lying around,
so an automated attack is possible.
This is /clearly/ a security issue. The idea that, just because a
brute-force attack is infeasible, it is OK for the compression program
to systematically leave ordered information lying around seems highly
dubious.
:> As far as I can see all these technical points are wrong.
: Evidently, that *is* as far as you can see :-)
The style of compression D. Scott is advocating is obviously the most
appropriate type of compression. In a short period of time, more
one-on-one compression techniques will become available and everyone who
knows anything about cryptography will wind up using them.
There really is no technical argument against using such compression.
At the moment there are not many one-on-one compression routines
available, but - since all the most efficient compression routines are
demonstrably one-on-one - I'm sure it will soon become a selling point,
in conjunction with the security question marks that appear to be
hanging over the use of most other types of compression.
:> If I were D.Scott I would find this hostility bewildering: don't
:> you understand the security implications properly? Are you part of
:> an NSA conspiracy designed to keep people using a technique with
:> unnecessary security problems?
: To the contrary, I am an opponent of people wasting their time
: on solutions to non-problems.
This does not appear to be a non-problem.
: Defenses that make sense only against brute-force keyspace
: search are wasted effort.
The presence of (for example) checksum information in compressed data
blocks is not only helpful to those who would employ brute force attacks.
It offers cryptanalysts clues in much the same way that a high frequency
of the letter "e" in a message does.
: If you employ precompression (which if you had paid attention you
: would have seen that I did recommend), you should do it for the
: right reasons.
Increased security is as right a reason as I can imagine. It's difficult
to quantify how much of a security increase is involved - but hey, it
is effectively for free, so why not?
Compression gets rid of regularities in the data. From a security
POV, letting a compression routine introduce regularities of its own goes
against the whole principle of applying compression before encryption in
the first place.
--
__________
|im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
The mailman bringeth and the trashman taketh away.
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Compress before Encryption
Reply-To: [EMAIL PROTECTED]
Date: Sat, 2 Oct 1999 14:56:46 GMT
James Felling <[EMAIL PROTECTED]> wrote:
: Tim Tyler wrote:
:> Douglas A. Gwyn <[EMAIL PROTECTED]> wrote:
:> : Tim Tyler wrote:
: Assume you are using a strong compression method. Further assume that you are
: using a block cypher with block size B. Also assume that the length of your
: message M when compressed is greater than B. Lastly, assume that the message
: being sent is of a known format (say, English text).
: I feel that these are all reasonable assumptions. For most modern cyphers B
: will be less than 32 bytes, and most messages will be at least 200 characters
: long. Assuming 75% compression, a message will be at least 2 blocks, and more
: probably 3 or more blocks, long. The only assumption I feel is unreasonable is
: the last, but in most cases the analyst will have some idea what kind of
: traffic it is that they monitor.
: With a typical modern compression every possible first block is possible, and
: thus for that block it is 1-1. An analyst attacking this data will take the
: cyphertext (call it C) and, when testing key K, compute Decompress(Decrypt(C,K)).
: While this may end mid-character with the known compression algorithm, it will
: still generate at least 32 bytes of plaintext. If these match the probable
: plaintext (i.e. they are all text characters or whatever), this key is flagged
: as a possible.
: The only time that the whole file will be attempted is over this set of
: possible keys. If there are more than a few of these possibles that result in
: coherent plaintext, I will be very surprised. The odds of such a false match
: occurring for more than a block or two are very low. (It is much more likely
: that the analyst won the lottery that day than that three blocks of coherent
: plaintext emerge at random.)
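[For concreteness, the filtering loop being described looks something
like this in Python - decrypt() and decompress() are placeholders for
whatever cipher and compressor are actually in use:

def looks_like_text(data):
    # Crude probable-plaintext test: printable ASCII plus common whitespace.
    return all(32 <= b < 127 or b in (9, 10, 13) for b in data)

def filter_keys(ciphertext, candidate_keys, decrypt, decompress):
    # Cheap first pass: decrypt only the first block or two under each
    # candidate key, try to decompress, and keep the key if the result
    # looks like the expected kind of plaintext.
    possibles = []
    for key in candidate_keys:
        block = decrypt(ciphertext[:32], key)
        try:
            text = decompress(block)
        except Exception:
            continue                  # not even valid compressed data
        if looks_like_text(text):
            possibles.append(key)     # flag as a possible; test in full later
    return possibles

Only keys that survive this cheap test ever get run against the whole
file.]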
:> : I think the plain fact is that the small amount of structure
:> : from the compression header is generally not considered sufficient
:> : to exploit, given that a cipher is already infeasible to brute-force
:> : even in a known-plaintext attack.
:>
:> It's trivially true that if your cypher is already totally secure then
:> adding more security won't help.
: Even assuming a headerless compression, known-plaintext attacks may easily
: occur, and will be no slower than known-plaintext attacks vs. the same cypher
: in an uncompressed setting.
Someone else has attempted to make this point. Is it correct?
Since you have less cyphertext to deal with you have less information
about the workings of the cypher. Of course having known plaintext helps
immensely, but I fail to see how it eliminates the advantage offered by
having less cyphertext - unless you (unrealistically) presume unbounded
quantities of cyphertext are available.
:> : Therefore nobody other than D.Scott has spent much time worrying about
:> : this aspect. The gain in security from squelching the source statistics
:> : is so significant that it dominates the other issue.
:>
:> Squelching the source statistics is desirable; but simultaneously
:> introducing an insecurity whereby an automated search through the
:> space of the keys has an increased probability of identifying the
:> plaintext automatically is surely something to be avoided.
: I don't see such a weakness being introduced here. There are headerless
: compression schemes that do just fine, and with a short, block-by-block
: decompression this will not be an issue.
If you have a compression technique which is very close to expanding every
input file to a valid output file, then you /may/ be relatively safe.
However, if you have parity bits, checksums, or other error recovery
information in your compressed data, you may be opening a significant
security hole - these are artefacts which may be exploited by
cryptanalysis in much the same way that having a high frequency of
the letter "e" can help.
If this is happening, it's a weakness. Just because you've stripped the
headers off a conventional compressed file, that doesn't mean that
parity and checksum information is no longer present.
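To see how stubborn that redundancy is, try it in Python with zlib
(again just a stand-in for a conventional format): flip a single bit in
the body of the stream and decompression complains, header or no header.

import zlib

blob = bytearray(zlib.compress(b"some perfectly ordinary message"))
blob[len(blob) // 2] ^= 0x01     # corrupt one bit in the middle of the stream

try:
    zlib.decompress(bytes(blob))
    print("decompressed without complaint")   # a one-on-one scheme would always land here
except zlib.error as err:
    print("corruption detected:", err)        # the clue: wrong data is *detectably* wrong

A key-trial attack gets exactly the same signal: a wrong key yields
data that the format itself rejects.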
: Precompression is, in general, an excellent idea. I feel that one should use
: the most effective method of compressing the data available.
I agree. Since methods which aren't one-on-one have short compressed
files which fail miserably to expand to anything useful, it should be
clear enough that the most effective method of compressing the data
will always - in principle - be a one-on-one technique.
: Then, assuming that criterion is met, a 1-1 method will have some minimal
: preference over a non-1-1 method.
I'm surprised you feel able to quantify the benefit as minimal. It can be
the difference between leaving in clues to aid cryptanalysis and
eliminating them. I can't see why anyone would think twice about the
issue - it's a no-brainer that removing as many clues as possible
is a good idea.
: (though speed, memory usage, robustness, error checking/detection,
: etc. must still be considered).
Robustness and error checking/detection are just other ways of talking
about clues left by the compression program that act as aids to
cryptanalysis. An error-detection technique (of the sort which is
common in many compressed data formats) that catches 99.9% of errors
and lets the user detect that the file has been corrupted is *exactly*
the type of thing that leads to the security problems that D. Scott is
trying to avoid.
As for speed and memory usage, I would expect that stripping out all
error detection and checksumming code from a compression routine would
make it faster and leaner.
Unless you have some concrete reason to believe that one-on-one
compression techniques are slower or more memory hungry, this
seems to be invoking problems on the basis of absolutely no evidence.
: However, precompression will not hinder certain types of attack against your
: system. If those forms of attack are the ones you are faced with, it will
: make no difference whether you compress or do not.
Well, if your keys are being stolen from under your operators' noses,
precompression is not going to help.
Precompression is a pretty broad defense against the cracking of cyphers,
though. Less cyphertext is a good idea under practically any
circumstances.
:> As far as I can see all these technical points are wrong.
: Why?
Lack of any good reason for not using this type of compression.
It's true that so far only a small number of such routines are available,
and that not all types of data are well catered for.
However, people should be writing more one-on-one compression routines,
rather than criticising the idea because so few currently exist.
:> Why use encryption technology that compromises security when
:> D.Scott's compression is now available for download?
: Because more efficient compression methods exist, and in my judgement
: efficiency will improve security more than a 1-1 method's being 1-1 will.
Efficiency is /very/ important - and /all/ the most efficient compression
techniques are in theory one-on-one! ;-)
As for efficiency being *more* important - I would say it depends on
the type of clue left to the cryptanalyst.
If your one-on-one compression leaves structures in the data that can help
the analyst, that's bad. However if you increase the compression ratio,
but introduce a bunch of compression artefacts, that's also pretty bad.
--
__________
|im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
Two monologues do not a dialogue make.
------------------------------
From: E C D L Account <[EMAIL PROTECTED]>
Crossposted-To: comp.theory
Subject: Announcement of results
Date: Sat, 02 Oct 1999 14:59:04 +0100
Could I take this opportunity to invite comments on the following paper:
http://www.angelfire.com/nv/papers
The main point of this paper (that I am hosting for the author) is a
construction which claims to use coherent optics and the wavefront
reconstruction property to solve various computationally hard problems
(that is NP-complete problems, and trapdoor functions such as prime
factorisation).
This is an extraordinary claim, and although the paper has now been
distributed for several months to a number of researchers in the field of
optics and computational complexity, the author (perhaps understandably)
wishes to remain anonymous at this stage.
Please direct all e-mail correspondence to either the appropriate
newsgroup, or the e-mail address given at the site mentioned. (i.e. not to
this e-mail address)
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Quantum Cryptography
Reply-To: [EMAIL PROTECTED]
Date: Sat, 2 Oct 1999 15:33:46 GMT
Arthur Dardia <[EMAIL PROTECTED]> wrote:
: A very cool article on quantum cryptography. Makes you think.
: http://www.newscientist.co.uk/ns/19991002/quantumcon.html
: I highly suggest reading it.
Though don't bother if you've read Simon Singh's "The Code Book" - as
it is word-for-word equivalent to his last chapter in many places.
--
__________
|im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
The internet is full, go away.
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: msg for Dave Scott
Reply-To: [EMAIL PROTECTED]
Date: Sat, 2 Oct 1999 15:42:44 GMT
Tom St Denis <[EMAIL PROTECTED]> wrote:
: So generally a 'blind' keysearch is the only way.
I'd say at least 99% of cypher breaks depend on techniques which are
different from a "blind" keysearch.
: And may I remind this group the brute force of the keyspace is not
: the only way to go.
You could if you hadn't just written the preceding sentence, which says
almost exactly the opposite.
--
__________
|im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
The bad news is that the good news was wrong.
------------------------------
From: fungus <[EMAIL PROTECTED]>
Subject: Re: Cryptanalysis of 2 key TDES
Date: Sat, 02 Oct 1999 18:04:58 +0200
David Wagner wrote:
>
> DESX does not have the security level you'd expect from Triple-DES:
> it is susceptible to a differential attack requiring something on
> the order of 2^60 chosen texts
Only 2^60, eh? .... <g>
--
<\___/>
/ O O \
\_____/ FTB.
------------------------------
From: "Microsoft Mail Server" <[EMAIL PROTECTED]>
Subject: Re: FBI issues warrant for Alice & Bob
Date: Sat, 2 Oct 1999 14:10:21 -0400
let's hope they divert most of their resources to this pressing problem!
--
best regards,
hapticzemail at email.msn.com
remove first email, sorry i had to do this!!
------------------------------
From: [EMAIL PROTECTED] ()
Subject: Re: Ciphers Categorized on Web Site
Date: 2 Oct 99 17:54:58 GMT
[EMAIL PROTECTED] wrote:
: There was another mention of that cipher on the same page, which I left in
: with a pointer to where I have moved the merged description, to which I
: have finally added an example:
: http://www.freenet.edmonton.ab.ca/~jsavard/mi060302.htm
Except for adding new descriptions of ciphers to my web page, and
*finally* getting to the section on Kerberos, I think the site is just
about finished! And the section that inspired me to believe I have reached
this sort of completion is appropriately entitled "Tying up Loose Ends".
This section is at:
http://www.freenet.edmonton.ab.ca/~jsavard/mi060303.htm
It is all about what to do if one has an odd bit or byte at the end of
one's message. So it describes the technique of _ciphertext stealing_ from
Applied Cryptography, among other things.
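For readers who have not met it, ciphertext stealing in CBC mode works
roughly as follows. This is an untested Python sketch of my reading of
the construction, with a toy XOR "cipher" standing in for a real block
cipher, so it illustrates the plumbing only, not security:

BLOCK = 8                                   # toy block size in bytes

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def toy_cipher(block, key):
    # Stand-in for a real block cipher: XOR with the key. It is its own
    # inverse, which keeps the sketch short; a real cipher has separate
    # encrypt and decrypt directions.
    return xor(block, key)

def cbc_cs_encrypt(plaintext, key, iv):
    # CBC with ciphertext stealing: no padding, so the ciphertext is
    # exactly as long as the plaintext.
    assert len(plaintext) >= BLOCK
    blocks = [plaintext[i:i + BLOCK] for i in range(0, len(plaintext), BLOCK)]
    tail_len = len(blocks[-1])
    out, prev = [], iv
    for block in blocks[:-1]:               # ordinary CBC over the full blocks
        prev = toy_cipher(xor(block, prev), key)
        out.append(prev)
    if tail_len == BLOCK:                   # nothing odd at the end: plain CBC
        out.append(toy_cipher(xor(blocks[-1], prev), key))
        return b"".join(out)
    e_last = out.pop()                      # ciphertext of the last full block...
    stolen = e_last[:tail_len]              # ...its head is "stolen" and sent last
    padded_tail = blocks[-1] + b"\x00" * (BLOCK - tail_len)
    return b"".join(out) + toy_cipher(xor(padded_tail, e_last), key) + stolen

def cbc_cs_decrypt(ciphertext, key, iv):
    tail_len = len(ciphertext) % BLOCK
    blocks = [ciphertext[i:i + BLOCK]
              for i in range(0, len(ciphertext) - tail_len, BLOCK)]
    if tail_len == 0:                       # plain CBC
        out, prev = [], iv
        for c in blocks:
            out.append(xor(toy_cipher(c, key), prev))
            prev = c
        return b"".join(out)
    stolen = ciphertext[-tail_len:]
    c_pen = blocks.pop()                    # the swapped, full-size final block
    d = toy_cipher(c_pen, key)              # = (short tail || zeros) XOR e_last
    tail_plain = xor(d[:tail_len], stolen)
    e_last = stolen + d[tail_len:]          # rebuild the block the head was stolen from
    prev = blocks[-1] if blocks else iv
    last_full = xor(toy_cipher(e_last, key), prev)
    out, prev = [], iv
    for c in blocks:
        out.append(xor(toy_cipher(c, key), prev))
        prev = c
    return b"".join(out) + last_full + tail_plain

key, iv = bytes(range(8)), b"\x00" * 8
msg = b"an odd-length message!"             # 22 bytes: two full blocks plus six
assert cbc_cs_decrypt(cbc_cs_encrypt(msg, key, iv), key, iv) == msg

The point, for the page above, is that the odd tail costs nothing: the
ciphertext comes out exactly as long as the plaintext, with no padding
to account for.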
However, I advise you to be sitting down when you read it, and not to read
it right after eating.
This is because I obtain many illustrations of different problems of this
kind from my own plan for a PGP-like E-mail encryptor, involving some very
elaborate fractionation steps, so that I can perform encryption this way:
- Compress the message into a binary form;
- Encrypt while in binary using today's encryption technology;
- Convert to letters of the alphabet, using a very efficient, but also
very elaborate, 47-bit to 10-letter encoding, and overlap that conversion
with fractionation based on a 3-symbol alphabet and a 5-symbol alphabet;
- Encrypt the letters using classic methods, including methods similar to
some of the better rotor machines;
- Finally, armor the letters, reducing four letters to three symbols from
a 78-character set, for efficient transmission in text form.
Thus, the principle of padding the end of a message is illustrated in a
number of ways.
John Savard
------------------------------
From: "Goyra" <[EMAIL PROTECTED]>
Subject: FBI issues warrant for Alice & Bob
Date: 2 Oct 1999 16:46:25 GMT
Goyra news service
Dateline 2.Oct.99
The FBI issued today a warrant for the arrest of two
persons known only as Alice and Bob. FBI director
Louis Freeh stated "There are many first-hand
accounts, in the cryptanalytic literature, of these two
individuals communicating with secure systems. In
some cases they are described as being in an unnamed
"foreign country". The equipment they use is clearly
subject to U.S. export restrictions and we want to interview
them about just what equipment they have taken out
of the country."
The surnames of Alice and Bob are unknown, and no
pictures are available. None the less the FBI are determined
to pursue them. "This is an issue pertaining to national
security" stated Freeh, "and we take it very seriously. Also
we would like an individual known as Eve to step forward and
tell us what she knows." Eve, according to all accounts, is
a surveillance expert who has spent years trying to crack Alice
and Bob's codes without success. "Eve has nothing to fear
from us," stated Freeh. "In fact, we want to employ her."
(c) Goyra 1999
------------------------------
From: Tom Knight <[EMAIL PROTECTED]>
Subject: Re: Paper announcement
Date: 02 Oct 1999 12:02:51 -0400
[EMAIL PROTECTED] writes:
> I would like to take this opportunity to invite comments on the
> following paper:
>
> http://www.angelfire.com/nv/papers/
Short version: it won't work.
Longer version: The author pays careful attention to the phase of the
optical signal in going through a "gate", but fails to pay attention
to the amplitude of the signal. An AND or OR operation (but not an
XOR "EOR" in his notation) will have two possible amplitudes of output
signal, depending upon how many of the input terms are satisfied.
When the output signal is used in subsequent stages of logic, the
amplitude of the signal is important, because only signals of equal
amplitude interfere in the desired way to produce the required logical
operation. Put more simply, the logic gate is made entirely of linear
components.
A similar concern exists for signal fanout or fanin, where the
amplitudes are poorly controlled. For fanout, this can be compensated
for, because the losses can be measured. For fanin (or, equivalently,
an "OR" operation) the losses cannot be compensated for, because,
again, the number of inputs that are on is not known.
This whole paper results from a misunderstanding about what is
difficult in making logic gates. The logic is trivial. The hard part
is building the signal restoration and standardization logic, which
requires a nonlinearity -- a component missing in this model (with the
possible exception of the photographic film).
Another obvious cause of concern is his notion of "forgetting". I
would remind the author that what is forgotten in the forward
direction must be invented in the reverse direction. Where, one might
ask, does he expect the information needed to perform the reverse
operation to come from, if it has been forgotten?
------------------------------
From: [EMAIL PROTECTED] (Bill Unruh)
Crossposted-To: comp.theory
Subject: Re: Announcement of results
Date: 2 Oct 1999 18:51:40 GMT
In <[EMAIL PROTECTED]> E C D L Account <[EMAIL PROTECTED]> writes:
>Could I take this opportunity to invite comments on the following paper:
>http://www.angelfire.com/nv/papers
Yes, and I think you have tried hard to make getting the paper as hard
as possible. That page points you to another page, which points you to a
Zip file, which unzips into a directory structure with blanks in the
directory name (this may be OK in Windows, but can cause havoc in, say,
command-line Unix). Furthermore, the technique of incorporating diagrams
within the text has been a standard part of LaTeX for many, many years.
Then the diagrams are crowded six to a page, so they are completely
unreadable on a screen.
And you want people to read them?
>The main point of this paper (that I am hosting for the author) is a
>construction which claims to use coherent optics and the wavefront
>reconstruction property to solve various computationally hard problems
>(that is NP-complete problems, and trapdoor functions such as prime
>factorisation).
It would of course be worthwhile if the author actually showed how, in
detail, to use this technique to solve even one hard problem -- not in
principle, but in practice.
Note that trading exponential complexity in time for that in space is no advantage.
>This is an extraordinary claim, and although the paper has now been
>distributed for several months to a number of researchers in the field of
>optics and computational complexity, the author (perhaps understandably)
>wishes to remain anonymous at this stage.
And what have their remarks been?
>Please direct all e-mail correspondence to either the appropriate
>newsgroup, or the e-mail address given at the site mentioned. (i.e. not to
>this e-mail address)
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Compress before Encryption
Date: Sat, 02 Oct 1999 19:30:53 GMT
In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] wrote:
>Compression gets rid of regularities in the data. From a security
>POV, letting a compression routine introduce regularities of its own goes
>against the whole principle of applying compression before encryption in
>the first place.
It is precisely the above point that makes me think Mr B.S. is a phony:
he should have mentioned this in his book. God knows he had the chance,
but he chose not to. My question to the masses is: why?
I think it is because he does not want people to use cryptography in
the correct way. Why Doug should act so stupid is beyond me. He
claims to be a cryptographer but seems to lack attention to detail.
Again I ask: why?
I would recommend everyone read page 226 of Bruce's book. But don't buy it.
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
http://members.xoom.com/ecil/index.htm
NOTE: EMAIL address is for SPAMMERS
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************