Cryptography-Digest Digest #101, Volume #11 Fri, 11 Feb 00 21:13:01 EST
Contents:
Re: Bounding (p-1)(q-1) given only knowledge of pq (David Hopwood)
Re: Guaranteed Public Key Exchanges (David Hopwood)
Re: Using Gray Codes to help crack DES (David Hopwood)
Advanced Network Authentication ("Joseph Ashwood")
Re: Using Gray Codes to help crack DES (Mok-Kong Shen)
Re: RFC: Reconstruction of XORd data (Jerry Coffin)
Re: Which compression is best? (Tim Tyler)
Re: Bounding (p-1)(q-1) given only knowledge of pq (Anton Stiglic)
Re: Which compression is best? (Tim Tyler)
Re: Which compression is best? (Tim Tyler)
Re: need help with a basic C++ algorithm ([EMAIL PROTECTED])
Re: Message to SCOTT19U.ZIP_GUY (Tim Tyler)
Re: Which compression is best? ("Douglas A. Gwyn")
Re: Message to SCOTT19U.ZIP_GUY (SCOTT19U.ZIP_GUY)
Re: Which compression is best? (SCOTT19U.ZIP_GUY)
Re: help DES encryption ("Douglas A. Gwyn")
Re: Which compression is best? (SCOTT19U.ZIP_GUY)
Re: Which compression is best? (SCOTT19U.ZIP_GUY)
----------------------------------------------------------------------------
Date: Fri, 11 Feb 2000 06:10:36 +0000
From: David Hopwood <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: Bounding (p-1)(q-1) given only knowledge of pq
=====BEGIN PGP SIGNED MESSAGE=====
Joseph Ashwood wrote:
>
> First, let me say that I'm not sure this information is new,
> but to the best of my knowledge it was never told to me.
Note that finding (p-1)(q-1) is provably equivalent in difficulty
to factoring pq. So, although it is possible to put (rather loose)
bounds on (p-1)(q-1), this doesn't help to break RSA.
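To make the "loose bounds" concrete (a standard identity, added here for illustration rather than quoted from the post): (p-1)(q-1) = pq - (p+q) + 1, and since p + q >= 2*sqrt(pq) by AM-GM, the totient is bounded above by roughly (sqrt(pq) - 1)^2. A toy numeric check:

```python
from math import isqrt

p, q = 61, 53                   # toy RSA primes, far too small for real use
n = p * q                       # the public modulus pq = 3233
phi = (p - 1) * (q - 1)         # (p-1)(q-1) = 3120

assert phi == n - (p + q) + 1         # identity relating phi(n) to p + q
assert phi <= n - 2 * isqrt(n) + 1    # AM-GM upper bound, integer-safe form
```

Knowing phi exactly would reveal p + q, and together with pq that factors n by solving a quadratic, which is the equivalence mentioned above.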
- --
David Hopwood <[EMAIL PROTECTED]>
PGP public key: http://www.users.zetnet.co.uk/hopwood/public.asc
RSA 2048-bit; fingerprint 71 8E A6 23 0E D3 4C E5 0F 69 8C D4 FA 66 15 01
"Attempts to control the use of encryption technology are wrong in principle,
unworkable in practice, and damaging to the long-term economic value of the
information networks." -- UK Labour Party pre-election policy document
=====BEGIN PGP SIGNATURE=====
Version: 2.6.3i
Charset: noconv
iQEVAwUBOKOnxDkCAxeYt5gVAQFEmwf/X/72IDfm0JV76HipRiT0YUWMFgizWc5b
Rm3RJj7j2zegSRaJmeQ1cvhcA8TxssE2pzPeB5FtZdM8x/QKpw1ohOp/sITHRHyl
QHHVFdgfwPxPOSLmr9472BtpRY92j6xVmfyQOLwgdUBQTc5rRFEE9lsfp+7zlNmU
2Gc1D4YekmvUsAQWGVmF46w2B/r+KvBvGxaWsOx0OmHr/hlz1F/XRQoN656WeYoA
NdXK5hnh2/JZmxJLmfahEwnl9hD2F/7cpaFHDZyx3bolRYxJ8Ebq/SJp+IEV0Pkw
7RTACgI8LIvSlGm65qAjXHQj/PfvvgO9Z7WAXB1SQHKBWLawrl5SzA==
=Y1gS
=====END PGP SIGNATURE=====
------------------------------
Date: Fri, 11 Feb 2000 05:38:51 +0000
From: David Hopwood <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: Guaranteed Public Key Exchanges
=====BEGIN PGP SIGNED MESSAGE=====
Dan Day wrote:
>
> On Thu, 10 Feb 2000 02:29:58 +0800, No Brainer <[EMAIL PROTECTED]> wrote:
> >
> >I was hoping there was a new protocol of some type that would allow me to
> >exchange public keys with integrity and take the middle man "out of the
> >loop" (without the need for another secure channel of course).
[...]
> Well, I've got at least a partial workaround... As one of your
> first messages, send your recipient an attached EXE file of a popular
> sort, such as the "gerbil in the microwave" animation.
If you're sending executable files, the man-in-the-middle has another type of
attack: replace the .exe with a trojan that installs a back door, and then
runs the original executable (using the same techniques as are used by
viruses). Then even if you discover the switched public keys, the attacker
still has a back door into your system.
> But at least it's a step in the right direction.
No, probably a step backwards.
- --
David Hopwood <[EMAIL PROTECTED]>
PGP public key: http://www.users.zetnet.co.uk/hopwood/public.asc
RSA 2048-bit; fingerprint 71 8E A6 23 0E D3 4C E5 0F 69 8C D4 FA 66 15 01
"Attempts to control the use of encryption technology are wrong in principle,
unworkable in practice, and damaging to the long-term economic value of the
information networks." -- UK Labour Party pre-election policy document
=====BEGIN PGP SIGNATURE=====
Version: 2.6.3i
Charset: noconv
iQEVAwUBOKOgUTkCAxeYt5gVAQHw4QgAi+0qAgKsRqZ+9Gy/Hq3WzUwf6j4Qg6X6
yEr9xbXUzYNaLP5Fd8qOD0EGh1Ixm7AafrxKX1WN2rhxK03wRnkWLtqyB8G1Z7Zy
rG5kYISUdhvzQfP9Pt9BaX6WocN4hZ7keRU70uN3LfuQ8d6XfBZjhPb+lMjnJsh5
OkfpPdSlSlPL7F9s11QTmq3oz/4admrM1BgOdBN5urixMiJx3CuVx02m1H7dBa+0
BP8XhOYvh6hMrcA2VerzZ6AFdV4nbhpDHULDU2RHLSRIGjEOc4H02dif9GCpITux
/2mqWayKGbcQ1xZZ8Ot5Mouwp2DU/lnxDIWyN48tbGd8zUM4/vAIdA==
=r5cG
=====END PGP SIGNATURE=====
------------------------------
Date: Fri, 11 Feb 2000 05:05:57 +0000
From: David Hopwood <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: Using Gray Codes to help crack DES
=====BEGIN PGP SIGNED MESSAGE=====
Dan Day wrote:
>
> On 8 Feb 2000 23:24:52 GMT, [EMAIL PROTECTED] wrote:
> >
> > count = count + 1
> > greycode = count ^ (count >> 1)
>
> Hey, that's really slick. But I'll be damned if I can figure
> out why it works...
When you add one to a number n, the effect is always to invert the
rightmost k+1 bits, where k is the position of the rightmost zero bit
in n (positions are numbered from right to left starting at zero for
the LSB).
E.g. 10+1 = 11 (k = 0; 1 bit changes)
1011+1 = 1100 (k = 2; 3 bits change)
Now consider the effect this has on n ^ (n >> 1): only the single bit
at position k is inverted, because the changes in the bits to the
right of that in n and (n >> 1) cancel.
When counting in Gray code, the bit that changes is the leftmost bit
that would change if you were counting normally, and that's why this
method works.
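A quick sketch for anyone who wants to verify the identity numerically (illustrative Python, mirroring the snippet quoted above):

```python
def gray(n: int) -> int:
    # Binary-reflected Gray code of n, as in the quoted snippet.
    return n ^ (n >> 1)

# Consecutive Gray codes differ in exactly one bit: the changes that
# incrementing makes to the low bits of n and of n >> 1 cancel in the
# XOR, leaving only the bit at position k (the rightmost zero bit of
# the previous count) inverted.
for count in range(1, 1024):
    changed = gray(count - 1) ^ gray(count)
    assert bin(changed).count("1") == 1
```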
- --
David Hopwood <[EMAIL PROTECTED]>
PGP public key: http://www.users.zetnet.co.uk/hopwood/public.asc
RSA 2048-bit; fingerprint 71 8E A6 23 0E D3 4C E5 0F 69 8C D4 FA 66 15 01
"Attempts to control the use of encryption technology are wrong in principle,
unworkable in practice, and damaging to the long-term economic value of the
information networks." -- UK Labour Party pre-election policy document
=====BEGIN PGP SIGNATURE=====
Version: 2.6.3i
Charset: noconv
iQEVAwUBOKOYSjkCAxeYt5gVAQFjkAgAo1ZQzeIyQI8lycEqR9M1FgkB9H1Z+4Oy
5P5jUOVR7+EjLS4C4DJiigiTF+apyxQUOpufut4WcVeRbDBFi387F0ztBKwhUgZt
GwvGuOwyQaL2rOXqwUDn3QhdqPLLss4jfaAmc9GH3HzmX0dV6cGHeDeBkRGzSc/e
8F1GvimwzPDv0UClwybh3vDrugbP3O5X5lep0T+dMa7YqR03XsYvfp52yiFvpojF
urlHG6ChTlpnKIzjKbTRXk6lX7ZvKjwa3zIZutO6Xx4kddxIoB/1XIUdAOCXdpcd
YkDFeQOrYW68Qft85hSDcmHWCx6AgLtHbcU5OsAm6hSxOFNohmkjmA==
=Ql6E
=====END PGP SIGNATURE=====
------------------------------
From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Advanced Network Authentication
Date: Fri, 11 Feb 2000 15:25:55 -0000
Crossposted-To: comp.protocols.kerberos
A few others and I are working on a network
authentication mechanism that is not subject to the minor
failings of Kerberos V5. If anyone is interested, please stop
by http://www.onelist.com/community/AdvanedNetworkAuth where
you can sign up for the list that will be working on it.
Joseph Ashwood
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Using Gray Codes to help crack DES
Date: Sat, 12 Feb 2000 00:43:15 +0100
Errata:
In case 1,
(c+1) >> 1 0bbbb111
should read:
(c+1) >> 1 0bbbb100
M. K. Shen
------------------------------
From: Jerry Coffin <[EMAIL PROTECTED]>
Subject: Re: RFC: Reconstruction of XORd data
Date: Fri, 11 Feb 2000 16:52:09 -0700
In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED]
says...
[ ... ]
> You are right. A correct auto-key encipherment should have a
> secret 'key'. (In what was described by the original poster, however,
> the 'key' is known!! cf. my follow-up.)
You could produce something like a key by adding several random bytes
to the beginning of the data, roughly like an IV. After doing the
XOR'ing, remove them again and you've got a _little_ bit of a key.
I'm pretty sure the basic method would still be fairly weak, but at
least with something on this order you'd have _something_.
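A minimal sketch of what that might look like (my reading of the suggestion, using a forward ciphertext-autokey XOR; the exact chaining in the original thread isn't spelled out here):

```python
import os

def autokey_xor(data: bytes, prefix_len: int = 4) -> bytes:
    """Prepend random bytes, XOR-chain forward, then drop the prefix."""
    buf = bytearray(os.urandom(prefix_len) + data)
    for i in range(1, len(buf)):
        buf[i] ^= buf[i - 1]           # c[i] = p[i] XOR c[i-1]
    return bytes(buf[prefix_len:])     # remove the transformed prefix
```

Only the last dropped chained byte is actually secret: for every kept position after the first, p[i] = c[i] ^ c[i-1] can be read straight off the ciphertext, which is why the result is still only "a _little_ bit of a key" and fairly weak.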
--
Later,
Jerry.
The universe is a figment of its own imagination.
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Which compression is best?
Reply-To: [EMAIL PROTECTED]
Date: Fri, 11 Feb 2000 23:27:07 GMT
Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
: John Chandler wrote:
:> I just noticed a post crediting "DS" with suggesting using
:> adaptive compression in _both_ directions before encrypting.
: If the forward pass does a good compression job, I wonder how much
: additional compression could be expected from the backward pass.
The backward pass is not intended to compress. AFAICS, its purpose is
primarily to make sure that information necessary to decrypt plaintext is
not present in any fragment of the message.
This means (for one thing) that an analyst is effectively forced to
decrypt the whole message in order to recover any decompressed text at all
- which you will probably need to do before you can reject a key.
He can't just decrypt a block, decompress it and look for plaintext.
--
__________
|im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
At the party, first we were making Mary, then we jumped for Joy.
------------------------------
From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: Bounding (p-1)(q-1) given only knowledge of pq
Date: Fri, 11 Feb 2000 19:20:51 -0500
Joseph Ashwood wrote:
> Actually in general there will be 2^1024, and in particular
> it is the general assumption that the best way to do RSA is
> to use primes of the same length, so for a 1024 bit modulus,
> each of the primes cannot be less than 2^512+1 and so p-1
> cannot be less than 2^512, (2^512)^2 is 2^1024. You also
> seem to have ignored the upper bound of pq which puts a
> severe cap on the possible values.
What???
What difference do you see between 2^1024 possibilities and
2^1022 possibilities? What upper bound on pq is so important?
You are searching a space of roughly 2^(1024 + X) values, 0 <= X <= 4;
why not just try to factor?
Anton
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Which compression is best?
Reply-To: [EMAIL PROTECTED]
Date: Fri, 11 Feb 2000 23:34:07 GMT
Thomas Pornin <[EMAIL PROTECTED]> wrote:
: According to <[EMAIL PROTECTED]>:
:> 1) From a security perspective, how important is compression?
: Rather unimportant. If your cipher is weak when used without
: compression, you should not use it anyway. And if your cipher is not
: weak, there is no security implication in the choice of one algorithm
: over another (otherwise this would be considered as a weakness of the
: algorithm).
So, the question arises: do you *know* your entire system is not weak?
*If* you can answer this question in the affirmative, you can forget
about compression improving your security, as obviously you're already
happy with it as it stands.
Other people should obviously concern themselves with compression,
given that very good compression can in principle make an analyst's task
well-nigh impossible.
--
__________
|im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
Make new friends - but keep the old ones in case they ever come in handy.
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Which compression is best?
Reply-To: [EMAIL PROTECTED]
Date: Fri, 11 Feb 2000 23:59:48 GMT
Jerry Coffin <[EMAIL PROTECTED]> wrote:
: [EMAIL PROTECTED] says...
:> As I've read here, it's good to compress before you encrypt the data. Now
:> I've got 2 questions about this:
:>
:> 1) From a security perspective, how important is compression? Is prior
:> compression just a kind of "weak enhancement" or is it considered an
:> integral part of the encryption process as a whole?
: A few people consider it extremely important, but most of us realize
: otherwise. It's of no use at all against a known-plaintext attack.
...though it can help no end with partial plaintexts.
In practice, /most/ known plaintexts are known /partial/ plaintexts.
The "complete" known plaintext is a relatively unusual case.
Protecting against known /partial/ plaintexts may well be worth doing.
Besides - it's not true that it's "no use at all" against known-plaintext
attacks. This is true, *even* if we're talking about *complete* known
plaintexts! ;-)
: If it's designed with encryption in mind, it can be of minimal help in
: slowing a ciphertext-only attack. [...]
It can make the difference between having a broken cypher, and a
completely unviolated one. Isn't this what security is all about?
: The most fundamental problem is that it doesn't really make enough
: difference to care about: just for example, David Scott makes much
: of the compression method he uses, and the fact that if you decompress
: the wrong text, it produces output that's statistically similar to
: normal plaintext.
Does he? Frankly, I don't recall the occasion.
*I* bang on about this point - but I /never/ claim it as a property of
David's current Huffman scheme.
I've examined its compression ratios myself and - in the version I
looked at - they were nothing to write home about.
*Where* did David claim this for his scheme? Can you quote from the post?
: Mr. Scott also uses a method of compressing once from beginning to
: end, and then again in the opposite direction (using dynamic
: compression in both cases). This basically means you have to
: decompress the entire message before you can collect statistics on it,
: rather than decompressing a block or two at a time like you can with
: most others. This is a theoretical help, but of an extremely limited
: degree: it slows an attack down by a factor roughly proportional to
: the length of the message. If you send small messages, it makes
: almost no difference at all; with larger messages the effect grows.
: IMO, to be considered secure, an algorithm needs to be at least
: sufficiently secure regardless of how small a message may be. If you
: make the algorithm sufficient with small messages, then it's still
: sufficient with large messages.
This is *one* of the functions of the technique. You don't even mention
its *other* benefits.
In case you are unaware of them, I'll give an example:
Compressing in one direction only diffuses plaintext in one direction.
This means any header to the file translates into a header in the
compressed file.
Consequently any known header information can be used as fuel for a
known-plaintext attack on the subsequent encryption of the compressed file
- since this too has a known header.
Compressing in *both* directions can - and generally will - destroy this
plaintext header in the compressed file - *provided* the files differ at
some subsequent location.
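To make the point concrete, here is a toy two-pass transform (a simple chained byte coder standing in for adaptive compression; this illustrates the diffusion argument only and is not David Scott's actual algorithm). After one forward pass, two messages sharing a header still share an output prefix; after the reverse pass they do not:

```python
def adapt(data: bytes) -> bytes:
    # Toy stand-in for an adaptive coder: each output byte depends on
    # all preceding input bytes through a running state.
    state, out = 0, bytearray()
    for b in data:
        out.append(b ^ state)
        state = (state * 31 + b + 1) & 0xFF
    return bytes(out)

def two_pass(data: bytes) -> bytes:
    # Forward pass, then the same transform over the reversed output,
    # so every output byte depends on every input byte.
    return adapt(adapt(data)[::-1])

m1, m2 = b"HEADER: payload one", b"HEADER: payload two"
assert adapt(m1)[:16] == adapt(m2)[:16]     # one pass: header leaks through
assert two_pass(m1)[0] != two_pass(m2)[0]   # two passes: header diffused
```

The transform is invertible (each step XORs with a state computed from already-recovered bytes), so nothing stops a legitimate receiver; an attacker, however, cannot check any fragment without processing the whole message.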
Since you don't seem to appreciate the benefits of the technique in
resisting such attacks, I'm surprised that you feel that you are in a
position to criticise it :-(
: For these purposes, special forms of compression make little
: difference. Instead, you're simply looking for whatever produces the
: smallest output.
If you don't care about compressors systematically adding information to
the files they compress, that's your lookout.
Others should be advised that there are other criteria that a good
compressor should possess besides generating relatively small output files.
Most compressors allow automated rejection of keys based on the compressed
file alone. For non-arithmetic/Huffman techniques this is often possible
given only a small fragment of the file. It seems clear enough to me that
as this is something that can be avoided, it *should* be avoided.
--
__________
|im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
Computers unite: you have nothing to lose but your operators.
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: need help with a basic C++ algorithm
Date: Sat, 12 Feb 2000 00:46:08 GMT
What are Rot-47, Rot-13 and Rot-5? How do they work?
Dinesh
In article <[EMAIL PROTECTED]>,
"Trevor Jackson, III" <[EMAIL PROTECTED]> wrote:
> Adrian DuChant wrote:
>
> > Greetings,
> > I am working on a program which will be using clear text files for
> > basic data storage. I would like to encrypt them and decrypt them at
> > runtime for reading into the program, so as to not allow someone to
> > tamper with the data held within.
> > This only needs to be basic, nothing really intense.
> > If someone could please give me a hand (or a snippet of code) to make
> > this algorithm, it would be most appreciated.
> > TIA
> > Adrian DuChant.
>
> How proficient are the people who might tamper with the data? There is
> no mechanism that can prevent all tampering.
>
> First, as opposed to obscuring the contents of the data, you will need
> to verify the integrity of the data -- that it has not been tampered
> with. If this is the sum total of your interest you do not need to
> encrypt the data, but simply add an integrity check.
>
> Checksums are simple integrity checks. Message Authentication Codes
> (MACs) are more sophisticated integrity checks.
>
> If you want something really simple, just Rot-13 the text (works within
> the 26 letters of the alphabet). If you want to be ambitious, Rot-47
> the text (works within the 94 characters of printable ASCII minus
> tilde). If the text is mostly numeric data, Rot-5 it within the
> decimal digits.
>
>
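Answering the Rot question above: each Rot-N scheme rotates characters N places within a fixed alphabet, and is its own inverse because N is half the alphabet size. A minimal sketch (standard constructions, not taken from the quoted post):

```python
def rot13(text: str) -> str:
    # Rotate letters 13 places within the 26-letter alphabet.
    def shift(ch, base):
        return chr((ord(ch) - ord(base) + 13) % 26 + ord(base))
    return "".join(shift(c, "a") if "a" <= c <= "z" else
                   shift(c, "A") if "A" <= c <= "Z" else c for c in text)

def rot47(text: str) -> str:
    # Rotate within the 94 printable ASCII characters '!' (33) .. '~' (126).
    return "".join(chr((ord(c) - 33 + 47) % 94 + 33)
                   if "!" <= c <= "~" else c for c in text)

def rot5(text: str) -> str:
    # Rotate digits 5 places within 0-9; other characters pass through.
    return "".join(chr((ord(c) - 48 + 5) % 10 + 48)
                   if c.isdigit() else c for c in text)

assert rot13("Hello") == "Uryyb"
assert rot47(rot47("Any printable text!")) == "Any printable text!"
assert rot5("2000") == "7555"
```

None of these provides real security; as the reply above implies, they only deter casual inspection.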
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Message to SCOTT19U.ZIP_GUY
Reply-To: [EMAIL PROTECTED]
Date: Sat, 12 Feb 2000 00:07:55 GMT
[EMAIL PROTECTED] wrote:
:> To my ears, the description quoted at the top sounds like an extremely
:> garbled version of DS's recommendation of a method to get diffusion of
:> plaintext information through the entire message by applying adaptive
:> compression programs "in both directions" through the file [...]
: I POSTED DAvid's Original Message in this thread above. Look at it
: ...and respond accordingly...
I'll do *exactly* as I choose, thank you.
Unfortunately, I have already done as you commanded - and am thus unable
to disobey your "request" ;-|
--
__________
|im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
Never hit a man with glasses - hit him with your fist.
------------------------------
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Which compression is best?
Date: Sat, 12 Feb 2000 01:18:58 GMT
[EMAIL PROTECTED] wrote:
> 1) From a security perspective, how important is compression? Is
> prior compression just a kind of "weak enhancement" or is it
> considered an integral part of the encryption process as a whole?
If you don't know for sure that the enemy cannot crack your
encryption, then precompression at least interferes with attacks
based on statistical characteristics of the plaintext source
language, which *might* reduce the chances of the enemy reading
your message.
On the other hand, if you have justifiable confidence in your
cryptosystem, precompression would be a waste of resources.
> 2) Are there special compression algorithms that are specifically
> well-suited in combination with block cyphers? Is any of the
> standard algorithms as good as the other?
When the purpose of compression is to remove redundancy in order
to suppress clues from the statistical characteristics of the
plaintext source language, you simply want the highest degree of
compression you can get (subject to your resource limitations).
D.Scott has been promoting "one-on-one" compression, for reasons
you can read about in the sci.crypt archives of the past year.
It seems to offer an advantage only if the cryptosystem already
has some weaknesses that would normally be considered unacceptable.
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Message to SCOTT19U.ZIP_GUY
Date: Sat, 12 Feb 2000 02:17:18 GMT
In article <[EMAIL PROTECTED]>, "Peter K. Boucher"
<[EMAIL PROTECTED]> wrote:
>Tim Tyler wrote:
>[snip]
>> To my ears, the description quoted at the top sounds like an extremely
>> garbled version of DS's recommendation of a method to get diffusion of
>> plaintext information through the entire message by applying adaptive
>> compression programs "in both directions" through the file - in the
>> absence of any better whole-message diffusion scheme.
>
>Carl Ellison proposed a scheme like that, 8 or 9 years ago.
>If I recall correctly, it went like this:
>Encryption:
> 1) Encrypt file with DES and key 1
> 2) Sum all the bytes in the file, and use the sum as a key to
>transpose all the bytes in the file.
> 3) Encrypt file with DES and key 2
> 4) Sum all the bytes in the file, and use the sum as a key to
>transpose all the bytes in the file.
> 5) Encrypt file with DES and key 3
>Decryption:
> 1) Decrypt file with DES and key 3
> 2) Sum all the bytes in the file, and use the sum as a key to
>transpose all the bytes in the file.
> 3) Decrypt file with DES and key 2
> 4) Sum all the bytes in the file, and use the sum as a key to
>transpose all the bytes in the file.
> 5) Decrypt file with DES and key 1
>
>Does Scott give any credit to Ellison?
No, for several reasons. I didn't use DES. Who is Ellison? What does this
have to do with either all-or-nothing encryption or one-on-one compression?
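The sum-keyed transposition step in the scheme quoted above could be sketched like this (a hedged reconstruction; Ellison's actual permutation method isn't specified in the post, so the seeded shuffle is my assumption):

```python
import random

def sum_keyed_transpose(data: bytes) -> bytes:
    # The byte sum is invariant under permutation, so the receiver can
    # recompute the same key from the transposed file and invert the
    # shuffle -- which is what lets decryption steps 2 and 4 work.
    key = sum(data)
    perm = list(range(len(data)))
    random.Random(key).shuffle(perm)
    return bytes(data[i] for i in perm)
```

Because the sum survives the transposition, both sides derive the identical permutation from the data itself; no extra key material is transmitted for this step.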
>
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
Scott famous encryption website NOT FOR WIMPS
http://members.xoom.com/ecil/index.htm
Scott rejected paper for the ACM
http://members.xoom.com/ecil/dspaper.htm
Scott famous Compression Page WIMPS allowed
http://members.xoom.com/ecil/compress.htm
**NOTE EMAIL address is for SPAMERS***
I leave you with this final thought from President Bill Clinton:
"The road to tyranny, we must never forget, begins with the destruction of the
truth."
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Which compression is best?
Date: Sat, 12 Feb 2000 02:20:37 GMT
In article <881da2$[EMAIL PROTECTED]>, [EMAIL PROTECTED] (John Chandler)
wrote:
>In article <881con$[EMAIL PROTECTED]>,
>John Chandler <[EMAIL PROTECTED]> wrote:
>>
>>How about the following possibilities:
>>
>>1) Use adaptive Huffman compression instead of any static scheme.
>> With adaptive compression, the Huffman frequency table changes as
>> compression proceeds, and at any given point the table depends on the
>> initial table _and_ on the preceding characters in the message.
>> This seems like a good idea, cryptographically speaking,
>> a little reminiscent of chaining a block code.
>>
>> It should be kept in mind that static Huffman decompression
>> almost always resynchronizes itself automatically after
>> a bit has been dropped or flipped by noise,
>> but adaptive Huffman does not.
>>
>>2) The initial frequency table for adaptive Huffman compression
>> could be very simple (containing many 1's, for example)
>> and the non-simple part of the table could be part of the key.
>> While not cryptographically strong, this could be quite
>> inconvenient for a cryptanalyst.
>>
>>3) Recommendations have been made to prepend a segment of random or
>> pseudorandom garbage to any message before encryption,
>> to make it harder to find the header. (Another recommendation
>> was to break the message in two and swap the halves,
>> putting the header in the middle of the message.)
>> Would this be a particularly good thing to do before
>> carrying out an adaptive compression? Possibly.
>
>I just noticed a post crediting "DS" with suggesting using
>adaptive compression in _both_ directions before encrypting.
>
>Great idea!
>I hadn't thought of that.
The part that you may not have noticed is to use adaptive Huffman
compression that does not add info to the file, which is entirely different
from most forms of adaptive Huffman compression, which unfortunately add
info to a file that can be exploited.
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
Scott famous encryption website NOT FOR WIMPS
http://members.xoom.com/ecil/index.htm
Scott rejected paper for the ACM
http://members.xoom.com/ecil/dspaper.htm
Scott famous Compression Page WIMPS allowed
http://members.xoom.com/ecil/compress.htm
**NOTE EMAIL address is for SPAMERS***
I leave you with this final thought from President Bill Clinton:
"The road to tyranny, we must never forget, begins with the destruction of the
truth."
------------------------------
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: help DES encryption
Date: Sat, 12 Feb 2000 01:35:02 GMT
Paul Koning wrote:
> NIST publishes a book that spells out a detailed set of validation
> procedures, including some that will help isolate problems.
> It's NIST Special Publication 800-17, "Modes of Operation Validation
> System (MOVS): Requirements and Procedures". ...
Which unfortunately is not available in on-line format;
it can be ordered in printed form.
A similar document for 3DES *is* available on line:
http://csrc.nist.gov/nistpubs/800-20.pdf
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Which compression is best?
Date: Sat, 12 Feb 2000 02:30:14 GMT
In article <[EMAIL PROTECTED]>, Jerry Coffin
<[EMAIL PROTECTED]> wrote:
>In article <880n6m$u2u$[EMAIL PROTECTED]>, [EMAIL PROTECTED]
>says...
>> As I've read here, it's good to compress before you encrypt the data. Now
>> I've got 2 questions about this:
>>
>> 1) From a security perspective, how important is compression? Is prior
>> compression just a kind of "weak enhancement" or is considered it an
>> integral part of the encryption process as a whole?
>
>A few people consider it extremely important, but most of us realize
>otherwise. It's of no use at all against a known-plaintext attack.
What people like Jerry seem to lack the intelligence to understand
is that if you use someone's inferior adulterated compression, so much
information is added to the file that, even if one is encrypting a purely
random file, if the file is long enough other compression methods can
easily lead to a case where there may exist only one encryption key
that can lead to a file that was compressed with the method used.
Either people like Jerry or Mr BS don't really understand the problem,
or they do and want to stay part of the NSA coverup to get people to
use piss-poor compression methods.
True, if a known-plaintext attack is used where the attacker knows the
whole message, compression is of no help. But my compression prevents
the most common type of plaintext attack, where there is a known fragment
of plain text buried in the middle of a message, while bad compression
allows for ciphertext-only attacks.
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
Scott famous encryption website NOT FOR WIMPS
http://members.xoom.com/ecil/index.htm
Scott rejected paper for the ACM
http://members.xoom.com/ecil/dspaper.htm
Scott famous Compression Page WIMPS allowed
http://members.xoom.com/ecil/compress.htm
**NOTE EMAIL address is for SPAMERS***
I leave you with this final thought from President Bill Clinton:
"The road to tyranny, we must never forget, begins with the destruction of the
truth."
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Which compression is best?
Date: Sat, 12 Feb 2000 02:35:32 GMT
In article <[EMAIL PROTECTED]>, "Douglas A. Gwyn" <[EMAIL PROTECTED]> wrote:
>[EMAIL PROTECTED] wrote:
>> 1) From a security perspective, how important is compression? Is
>> prior compression just a kind of "weak enhancement" or is it
>> considered an integral part of the encryption process as a whole?
>
>If you don't know for sure that the enemy cannot crack your
>encryption, then precompression at least interferes with attacks
>based on statistical characteristics of the plaintext source
>language, which *might* reduce the chances of the enemy reading
>your message.
>
>On the other hand, if you have justifiable confidence in your
>cryptosystem, precompression would be a waste of resources.
>
>> 2) Are there special compression algorithms that are specifically
>> well-suited in combination with block cyphers? Is any of the
>> standard algorithms as good as the other?
>
>When the purpose of compression is to remove redundancy in order
>to suppress clues from the statistical characteristics of the
>plaintext source language, you simply want the highest degree of
>compression you can get (subject to your resource limitations).
But the compression method itself can add information. If, for
the files you care about, compress(decompress(X)) = X and
decompress(compress(X)) = X for all X, then use the one with the highest
compression ratio that satisfies the above.
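That round-trip condition can be checked mechanically for any candidate compressor. A small harness (illustrative only; zlib here simply demonstrates a compressor that fails the second identity, since arbitrary byte strings are rarely valid compressed streams):

```python
import zlib

def roundtrips(f, g, x: bytes) -> bool:
    # True iff g(f(x)) == x without raising an error.
    try:
        return g(f(x)) == x
    except Exception:
        return False

def is_one_on_one(compress, decompress, samples) -> bool:
    # "One-on-one" compression requires BOTH identities to hold on the
    # domain of interest, so decompression adds no detectable information.
    return all(roundtrips(compress, decompress, x) and
               roundtrips(decompress, compress, x) for x in samples)

assert is_one_on_one(lambda x: x, lambda x: x, [b"", b"abc"])
assert not is_one_on_one(zlib.compress, zlib.decompress, [b"\x00\x01"])
```

The zlib case fails because b"\x00\x01" is not a valid zlib stream: a wrong key that "decompresses" to garbage is therefore detectable, which is exactly the leak one-on-one compression is meant to close.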
>
>D.Scott has been promoting "one-on-one" compression, for reasons
>you can read about in the sci.crypt archives of the past year.
>It seems to offer an advantage only if the cryptosystem already
>has some weaknesses that would normally be considered unacceptable.
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
Scott famous encryption website NOT FOR WIMPS
http://members.xoom.com/ecil/index.htm
Scott rejected paper for the ACM
http://members.xoom.com/ecil/dspaper.htm
Scott famous Compression Page WIMPS allowed
http://members.xoom.com/ecil/compress.htm
**NOTE EMAIL address is for SPAMERS***
I leave you with this final thought from President Bill Clinton:
"The road to tyranny, we must never forget, begins with the destruction of the
truth."
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************