Cryptography-Digest Digest #92, Volume #12 Fri, 23 Jun 00 10:13:00 EDT
Contents:
Re: Variability of chaining modes of block ciphers (Mark Wooding)
Re: Variability of chaining modes of block ciphers (Runu Knips)
Re: Missing Info in the crypto-gram of MR BS (Tim Tyler)
security problem with Win 2000 Encryption File System ("james")
Re: MD5 Expansion (Mark Wooding)
Re: libdes: des_SPtrans (Eric Young)
Re: How encryption works (Mark Wooding)
Re: security problem with Win 2000 Encryption File System (Runu Knips)
Re: security problem with Win 2000 Encryption File System (Sébastien SAUVAGE)
Re: Encryption on missing hard-drives ("Trevor L. Jackson, III")
Re: Variability of chaining modes of block ciphers ("Trevor L. Jackson, III")
does 3des use only 2 keys? (dexMilano)
Re: Compression and known plaintext in brute force analysis (restatements caused by
the missing info .... thread) (Tim Tyler)
Re: how to compare the security between ECC and RSA (DJohn37050)
Re: how to compare the security between ECC and RSA (DJohn37050)
Re: Try it. (John)
Re: Try it. (John)
Public key algorithm conversion - is it possible? (acoola)
Re: Try it. (John)
----------------------------------------------------------------------------
From: [EMAIL PROTECTED] (Mark Wooding)
Subject: Re: Variability of chaining modes of block ciphers
Date: 23 Jun 2000 10:17:04 GMT
Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
> It is valuable information (and I personally consider it good news)
> that Rijndael's key can be adjusted at will.
This is an improvement mentioned in the new version of the paper
submitted in Round 2. It defines the number of rounds Nr as a function
of the number of key words Nk and the block size (also in words) Nb:
Nr = 6 + max(Nk, Nb)
Perhaps Mr Hopwood would like to amend his table of key sizes in the
light of this information?
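For concreteness, here is a minimal C sketch of that rule (the word counts
Nk and Nb are 4, 6 or 8 for the 128-, 192- and 256-bit sizes; nothing else
here is taken from the paper):

/* Nr = 6 + max(Nk, Nb), with Nk and Nb counted in 32-bit words. */
#include <stdio.h>

static int rijndael_rounds(int nk, int nb)
{
    return 6 + (nk > nb ? nk : nb);
}

int main(void)
{
    int nk, nb;

    for (nk = 4; nk <= 8; nk += 2)
        for (nb = 4; nb <= 8; nb += 2)
            printf("Nk=%d Nb=%d -> Nr=%d\n", nk, nb, rijndael_rounds(nk, nb));
    return 0;
}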
-- [mdw]
------------------------------
Date: Fri, 23 Jun 2000 12:18:45 +0200
From: Runu Knips <[EMAIL PROTECTED]>
Subject: Re: Variability of chaining modes of block ciphers
Mok-Kong Shen wrote:
> Runu Knips wrote:
> > If at all, I would use some kind of register which always stores
> > the last plaintext sum, i.e. use a form such as:
> >
> > R(x) is register value in step x (x in [0..n])
> > C(x) is ciphertext x (x in [0..n])
> > P(x) is plaintext x (x in [1..n])
> > IV is the initialization vector
> >
> > Encryption:
> > R(0) := IV
> > R(n) := R(n-1) xor P(n)
> > C(0) := ENC(R(0))
> > C(n) := ENC(R(n))
> >
> > Decryption:
> > R(0) := DEC(C(0))
> > R(n) := DEC(C(n))
> > P(n) := R(n) xor R(n-1)
> >
> > This has some of the good properties of CBC. However, for
> > a message of all zeros, or all constant values, it still works
> > far worse than CBC.
>
> Your accumulation of plaintext was mentioned as a possibility in my
> original post that initiated the present thread.
Sorry. I didn't read that posting carefully enough. I apologize.
> I mentioned that there are at least 8 chaining variants, though I
> didn't discuss comparisons of these. In one of my own designs
> (WEAK3-EX), I used accumulated plaintext and ciphertext to do the
> chaining.
If you combine the above with CBC, I would agree with you that
the resulting cipher is conceptually better than CBC alone,
because even if the attacker correctly guesses message block N,
the guess is of no use: he or she doesn't know the plaintext
sum of all blocks before N.
So combining accumulated PBC with the original CBC results in a
cipher which can only be attacked by guessing the first block.
By sending a second, cryptographically random IV in the first
encrypted block, plaintext attacks by guessing the plaintext
are impossible.
On the other hand, a good cipher should not be vulnerable to
plaintext attacks anyway, so this is mainly additional
overhead, isn't it?
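For reference, here is a minimal C sketch of the register scheme quoted
above. Single DES (via OpenSSL's <openssl/des.h>) stands in for ENC/DEC,
and the key, IV and three plaintext blocks are arbitrary demo values; none
of this is taken from the thread itself.

/* Sketch of the quoted plaintext-accumulating chaining; single DES is an
 * arbitrary stand-in for the block cipher. */
#include <stdio.h>
#include <string.h>
#include <openssl/des.h>

#define NBLK 3

static void xor8(unsigned char *out, const unsigned char *a,
                 const unsigned char *b)
{
    int i;
    for (i = 0; i < 8; i++)
        out[i] = a[i] ^ b[i];
}

int main(void)
{
    DES_cblock key = {0x01,0x23,0x45,0x67,0x89,0xab,0xcd,0xef};
    DES_cblock iv  = {0xa5,0xa5,0xa5,0xa5,0x5a,0x5a,0x5a,0x5a};
    DES_cblock P[NBLK] = { "block 1", "block 2", "block 3" };
    DES_cblock R[NBLK + 1], C[NBLK + 1], D[NBLK + 1], out[NBLK];
    DES_key_schedule ks;
    int i;

    DES_set_key_unchecked(&key, &ks);

    /* Encryption: R(0) = IV, R(n) = R(n-1) xor P(n), C(n) = ENC(R(n)). */
    memcpy(R[0], iv, 8);
    DES_ecb_encrypt(&R[0], &C[0], &ks, DES_ENCRYPT);
    for (i = 1; i <= NBLK; i++) {
        xor8(R[i], R[i - 1], P[i - 1]);
        DES_ecb_encrypt(&R[i], &C[i], &ks, DES_ENCRYPT);
    }

    /* Decryption: R(n) = DEC(C(n)), P(n) = R(n) xor R(n-1). */
    for (i = 0; i <= NBLK; i++)
        DES_ecb_encrypt(&C[i], &D[i], &ks, DES_DECRYPT);
    for (i = 1; i <= NBLK; i++)
        xor8(out[i - 1], D[i], D[i - 1]);

    for (i = 0; i < NBLK; i++)
        printf("recovered block %d: %.8s\n", i + 1, (char *)out[i]);
    return 0;
}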
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Missing Info in the crypto-gram of MR BS
Reply-To: [EMAIL PROTECTED]
Date: Fri, 23 Jun 2000 09:43:30 GMT
James Felling <[EMAIL PROTECTED]> wrote:
: Tim Tyler wrote:
:> James Felling <[EMAIL PROTECTED]> wrote:
:> Here, valid plaintext messages are normally easily distinguished from
:> random files. They are (after all) ASCII text. However, for most messages
:> decrypting with the wrong key will result in a message that decompresses
:> to something very plausible looking :-(
: A "random file" is not ascii text as I have been using it. However if
: we call it a random snippet of ASCII text I will accept the statement.
I don't think I said anything controversial. The "They are (after all)
ASCII text" was intended to refer to the "valid plaintext messages" - not
the "random files".
:> How "easy" the original message plaintext is to recognise is not
:> relevant. Whether this is ASCII text or not doesn't come into it.
[reformatted]
: Wrong (I think). Here is my logic. Given we have a compressor C and
: its inverse C', and an encryption E(P,K) and its inverse E'(P,K).
: I am assuming that the adversary knows whether compression is being
: used or not, and also knows in a general sense the type of file that
: the plaintext is. Assuming that he receives a coded transmission X,
: then he will compute E'(X, guess) for his guessed keys. I claim that
: unicity distance is maximised when the set G = {E'(X, guess), for all
: possible guesses} has the largest possible intersection with the
: set P of possible pre-encrypted data.
: Since we assume G is randomly distributed throughout the space of all
: possible inputs (it is a good code), this boils down to attempting to
: maximise the size of the set P.
: This will have the effect of increasing unicity distance. This has
: nothing to do with how the data was generated, merely with how much of
: the space of potential inputs is used. If you encrypt a file of totally
: random numbers, the unicity distance is infinite, as P is the space of
: all possible inputs, and G intersect P is always exactly equal to G. As
: P shrinks away from filling the space of all possible inputs, it becomes
: possible to have finite unicity distances: each time a guessed key falls
: outside of P we discard that guess, until eventually we are left with
: but a single possible guess. The unicity distance is the amount of data
: necessary for this to be likely. Therefore the larger P is, the larger
: the expected unicity distance.
: If P1 is the set of all possible ASCII text files of N bits, and P2 is
: the set of all possible compressed files of N bits, I claim that P2 is
: larger than or equal to P1. This is why I claim that, given a function
: F, if F(P) fills the space of possible inputs more fully than P does,
: you are best off (as far as unicity distance goes) applying F to all
: type-P inputs. If on the other hand F(P) fills less of the space than P
: does, then you are worse off.
I don't think this conclusion goes against anything I've said.
In fact, AFAICS, I agree with all of it.
I'll have a quick attempt to restate my position in this immediate area
more clearly, so you can see if there's anything there to be objected to.
An attacker's test decrypts are compressed files.
To make the attacker's job as hard as possible, it is desirable to
increase the proportion of these that represent valid messages, at
any given message size (this is equivalent to increasing the unicity
distance).
It doesn't much matter what form the original plaintexts take (provided
you have an algorithm that can distinguish a valid plaintext from junk).
*If* you have a compression program that deals with such plaintexts
by compressing them efficiently, the difficulty the attacker will
experience in testing keys will be very great.
In summary, if you have a compressor which means that most decompressions
(at expected message sizes) result in plausible looking messages, the
attacker is likely to be stumped - he won't be able to tell a correct
decrypt from an incorrect one.
This is true regardless of the format of your original messages (be they
ASCII text, or whatever). This is assuming that the attacker can
distinguish a valid plaintext from random garbage - and that a suitable
compressor exists.
--
__________ Lotus Artificial Life http://alife.co.uk/ [EMAIL PROTECTED]
|im |yler The Mandala Centre http://mandala.co.uk/ UART what UEAT.
------------------------------
From: "james" <[EMAIL PROTECTED]>
Subject: security problem with Win 2000 Encryption File System
Date: Fri, 23 Jun 2000 12:44:01 +0200
When I create a file and afterwards activate encryption (via the file's
Properties), the original file is only deleted (not wiped from the disk).
The original plaintext therefore remains on the disk, and can be found
with a sector editor !!!
------------------------------
From: [EMAIL PROTECTED] (Mark Wooding)
Subject: Re: MD5 Expansion
Date: 23 Jun 2000 11:09:13 GMT
David Hopwood <[EMAIL PROTECTED]> wrote:
>
> Simon Johnson wrote:
> >
> > A much better way is to divide the message in two. Then hash
> > each part individually and concatenate each hash.
> >
> > a = 1/2 of M
> > b = other 1/2 of M
> > Hash output = H(a) & H(b)
>
> No, this is a disastrous idea, because one half of the message can be
> held constant in a collision attack. That means it is only as collision-
> resistant as H.
>
> The modification with bytes of the message interleaved also doesn't work,
> because odd or even-numbered bytes can still be held constant.
Indeed. Some more complicated mixing needs to be done. Let H be a hash
function, and let E be a cipher. Let M = (a, b) be a message split into
two halves (it doesn't really matter how). Compute:
a' = E_{H(b)}(a)
b' = E_{H(a)}(b)
Then let the extended hash be
H'(M) = H(a', b') || H(b', a')
Any analysis? (Yes, it's much slower than just hashing the data twice.)
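By way of illustration only, here is a rough C sketch of that construction.
MD5 is used for H, and RC4 keyed with the hash of the opposite half stands
in for E (the post above names neither), both via OpenSSL's old low-level
interfaces; no padding or error handling.

/* a' = E_{H(b)}(a), b' = E_{H(a)}(b), H'(M) = H(a',b') || H(b',a').
 * MD5 and RC4 are stand-ins chosen for this sketch, not part of the
 * construction as posted. */
#include <stdio.h>
#include <string.h>
#include <openssl/md5.h>
#include <openssl/rc4.h>

/* Key RC4 with MD5(key_half), then encrypt the other half. */
static void enc_half(const unsigned char *key_half, size_t key_len,
                     const unsigned char *in, size_t in_len,
                     unsigned char *out)
{
    unsigned char k[MD5_DIGEST_LENGTH];
    RC4_KEY rk;

    MD5(key_half, key_len, k);
    RC4_set_key(&rk, sizeof(k), k);
    RC4(&rk, in_len, in, out);
}

/* md = H(x || y) */
static void hash_pair(const unsigned char *x, size_t xlen,
                      const unsigned char *y, size_t ylen,
                      unsigned char md[MD5_DIGEST_LENGTH])
{
    MD5_CTX ctx;

    MD5_Init(&ctx);
    MD5_Update(&ctx, x, xlen);
    MD5_Update(&ctx, y, ylen);
    MD5_Final(md, &ctx);
}

int main(void)
{
    const unsigned char a[] = "first half of the message";
    const unsigned char b[] = "second half of the message";
    unsigned char ap[sizeof(a)], bp[sizeof(b)];
    unsigned char h1[MD5_DIGEST_LENGTH], h2[MD5_DIGEST_LENGTH];
    int i;

    enc_half(b, sizeof(b) - 1, a, sizeof(a) - 1, ap);       /* a' */
    enc_half(a, sizeof(a) - 1, b, sizeof(b) - 1, bp);       /* b' */

    hash_pair(ap, sizeof(a) - 1, bp, sizeof(b) - 1, h1);    /* H(a',b') */
    hash_pair(bp, sizeof(b) - 1, ap, sizeof(a) - 1, h2);    /* H(b',a') */

    for (i = 0; i < MD5_DIGEST_LENGTH; i++) printf("%02x", h1[i]);
    for (i = 0; i < MD5_DIGEST_LENGTH; i++) printf("%02x", h2[i]);
    putchar('\n');
    return 0;
}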
-- [mdw]
------------------------------
From: Eric Young <[EMAIL PROTECTED]>
Subject: Re: libdes: des_SPtrans
Date: Fri, 23 Jun 2000 11:25:06 GMT
Mark Wooding wrote:
> It's a combination of a bunch of small substitution tables and a bit
> permutation. My version of DES, in Catacomb, provides a program which
> generates such tables from the original tables provided in the
> definition of DES. Interestingly, libdes's table isn't actually the same
> as mine.
I used to have a perl script that generated the table; unfortunately I
lost it years ago (simple enough to reconstruct, though).
I believe one reason the libdes table is different is that I went
'little-endian' in the table conversions. Most other people tend
to go big-endian.
By 'little-endian' I mean that the conversion from bytes into expanded
32-bit words could be done in one of two ways. We have probably picked
differently :-)
eric
------------------------------
From: [EMAIL PROTECTED] (Mark Wooding)
Subject: Re: How encryption works
Date: 23 Jun 2000 11:26:04 GMT
infamis.at.programmer.net <[EMAIL PROTECTED]> wrote:
> How does DH encrypt the block-cipher key?
It actually uses ElGamal, which is based on Diffie-Hellman. There are a
bunch of parameters chosen: a prime p, and a generator g of an
appropriate cyclic group. The private key is an integer x and the
public key is y = g^x mod p. You encrypt a message M by inventing a random
number k and sending the pair of values g^k mod p, y^k M mod p. The
recipient decrypts by computing y^k mod p = (g^k)^x mod p and doing the
division.
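As a rough illustration of the arithmetic (not of how PGP itself does it),
here is a toy C sketch using the GNU GMP library. The tiny prime, generator,
message and fixed seed are demo assumptions only; real parameters are far
larger and chosen much more carefully.

#include <stdio.h>
#include <gmp.h>

int main(void)
{
    mpz_t p, g, x, y, k, m, c1, c2, s, rec;
    gmp_randstate_t rng;

    mpz_inits(p, g, x, y, k, m, c1, c2, s, rec, NULL);
    gmp_randinit_default(rng);
    gmp_randseed_ui(rng, 42);           /* fixed seed: demo only */

    mpz_set_ui(p, 2357);                /* toy prime p */
    mpz_set_ui(g, 2);                   /* toy generator g */
    mpz_set_ui(m, 1001);                /* message M, with M < p */

    /* Key generation: private x, public y = g^x mod p. */
    mpz_urandomm(x, rng, p);
    mpz_powm(y, g, x, p);

    /* Encryption: random k, send (c1, c2) = (g^k mod p, y^k * M mod p). */
    mpz_urandomm(k, rng, p);
    mpz_powm(c1, g, k, p);
    mpz_powm(c2, y, k, p);
    mpz_mul(c2, c2, m);
    mpz_mod(c2, c2, p);

    /* Decryption: s = c1^x mod p = y^k mod p, then divide it out. */
    mpz_powm(s, c1, x, p);
    mpz_invert(s, s, p);
    mpz_mul(rec, c2, s);
    mpz_mod(rec, rec, p);

    gmp_printf("message %Zd, recovered %Zd\n", m, rec);

    mpz_clears(p, g, x, y, k, m, c1, c2, s, rec, NULL);
    gmp_randclear(rng);
    return 0;
}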
> Are there any other encryption programs or simulations that I can study,
> because I need something other than PGP.
Loads.
> Where can I learn about memory management and processing of keys, i.e.
> greater than 256 bits?
> For instance, a 4096-bit key would take up 512 bytes of memory!
Yes. That's not a lot.
> How does a program compute large values like this?
Using integer arithmetic and clever algorithms.
> Any source code that displays this? I know x86 assembly language best,
> so any examples in asm would be great, but C examples are also
> welcome.
See http://www.excessus.demon.co.uk/misc-hacks/#catacomb for example
code. I've tried to make things as clear as I can. Most of the
algorithms are explained in the Handbook of Applied Cryptography,
available http://www.cacr.math.uwaterloo.ca/hac/ or as a very nice
hardback book.
-- [mdw]
------------------------------
Date: Fri, 23 Jun 2000 13:31:56 +0200
From: Runu Knips <[EMAIL PROTECTED]>
Subject: Re: security problem with Win 2000 Encryption File System
james wrote:
> When I create a file and afterwards activate
> encryption (via the file's Properties), the
> original file is only deleted (not wiped from
> the disk). The original plaintext therefore
> remains on the disk, and can be found with a
> sector editor !!!
I don't know how that system works, but if one
wants to create an encrypted file system, any
data which is ever stored to that disk should
NEVER be written unencrypted, or else that data
can be recovered, no matter whether it was wiped
or not.
------------------------------
Subject: Re: security problem with Win 2000 Encryption File System
From: [EMAIL PROTECTED] (Sébastien SAUVAGE)
Date: Fri, 23 Jun 2000 12:51:24 GMT
[EMAIL PROTECTED] (Runu Knips) wrote in <[EMAIL PROTECTED]>:
>james wrote:
>> When I create a file and afterwards activate
>> encryption (via the file's Properties), the
>> original file is only deleted (not wiped from
>> the disk). The original plaintext therefore
>> remains on the disk, and can be found with a
>> sector editor !!!
>
>I don't know how that system works, but if one
>wants to create an encrypted file system, any
>data which is ever stored to that disk should
>NEVER be written unencrypted, or else that data
>can be recovered, no matter whether it was wiped
>or not.
>
True indeed.
I recommend Encryption for the Masses (E4M).
It works under Windows 95/98/NT and 2000 and provides a true fully
encrypted filesystem (in the form of encrypted volume files which can be
mounted and unmounted from the system).
EVERYTHING on an E4M disk is encrypted: boot sector, FATs, directories,
files and empty sectors.
It's very simple to use.
And it's free.
And the source code is included.
And it's Blowfish, IDEA, DES, TripleDES, CAST. You choose.
http://www.e4m.net
--
Sébastien SAUVAGE - [EMAIL PROTECTED]
http://www.bigfoot.com/~sebsauvage
------------------------------
Date: Fri, 23 Jun 2000 09:31:11 -0400
From: "Trevor L. Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Encryption on missing hard-drives
Guy Macon wrote:
> Trevor L. Jackson, III wrote:
> >
> >
> >JimD wrote:
> >
> >> On 22 Jun 2000 07:11:39 GMT, [EMAIL PROTECTED] (Mack) wrote:
> >>
> >> >But I can't fathom why you would put such a device on a
> >> >WIN based machine where it will just dump content to the
> >> >unencrypted swap file where anyone can read it.
> >>
> >> Loads of memory and swapfile disabled?
> >
> >On the 32-bit versions of Microsoft(tm) Windows(!tm) you cannot disable the
> >swapfile because the memory management subsystem goes through the disk
> >subsystem. If you disable swapping the system becomes (ahem) "unstable".
>
> Is not Windows Embedded NT a 32-bit version of Microsoft Windows?
I have no information regarding Windows Embedded NT.
------------------------------
Date: Fri, 23 Jun 2000 09:36:51 -0400
From: "Trevor L. Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Variability of chaining modes of block ciphers
Guy Macon wrote:
> Eric Lee Green wrote:
> >
> >Implementing *A* chaining mode is not the point. Been there, done that. The
> >point is that every additional line of code is a line of code that could
> >represent a bug or a security problem with your implementation. Past a certain
> >point, the added security of globbing yet more code on top of code is
> >illusory.
> >
> >In other words, complexity is to be avoided whenever possible because it is a
> >security problem. My opinion is that adjustable chaining modes (as opposed to
> >one chaining mode) are added complexity whose cost-benefit margin is dubious.
> >
>
> (This is a general comment about adding complexity, not about the particular
> methods discussed in this thread) It seems to me that, if a cipher is strong,
> then it can protect any plaintext. Thus, if the plaintext to cipher B is
> the output of cipher A, any bugs or security problems in cipher A won't
> hurt you. Also, if a cipher is strong, then all known transformations of
> its output do not reduce the strength.
All but one. Decryption does "reduce the strength".
In fact, one could conceive of transforms related to decryption, such as
performing an incomplete decryption of fewer than the normal number of rounds.
> Thus, if the plaintext to cipher B
> is the output of cipher A, any bugs or security problems in cipher B won't
> hurt you. I conclude that, in this limited case, adding complexity does
> not increase the chance of a bug or a security problem in your implementation
> hurting you, but rather decreases it. I can make a similar case for adding
> complexity in stream ciphers by using XOR to combine your plaintext with
> the output of multiple ciphers.
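To make the last quoted paragraph concrete, here is a small C sketch that
XORs a plaintext with the keystreams of two independently keyed stream
ciphers. RC4 (via OpenSSL) is used purely as a convenient stand-in; the
thread does not name a particular cipher.

#include <stdio.h>
#include <string.h>
#include <openssl/rc4.h>

/* Produce len bytes of raw RC4 keystream for the given key. */
static void keystream(const unsigned char *key, size_t keylen,
                      unsigned char *out, size_t len)
{
    RC4_KEY k;

    memset(out, 0, len);                /* RC4 applied to zeros = keystream */
    RC4_set_key(&k, (int)keylen, key);
    RC4(&k, len, out, out);
}

int main(void)
{
    const unsigned char keyA[] = "first independent key";
    const unsigned char keyB[] = "second independent key";
    const unsigned char pt[] = "attack at dawn";
    unsigned char ksA[sizeof(pt)], ksB[sizeof(pt)];
    unsigned char ct[sizeof(pt)], back[sizeof(pt)];
    size_t i, n = sizeof(pt) - 1;

    keystream(keyA, sizeof(keyA) - 1, ksA, n);
    keystream(keyB, sizeof(keyB) - 1, ksB, n);

    /* C = P xor ksA xor ksB. */
    for (i = 0; i < n; i++)
        ct[i] = pt[i] ^ ksA[i] ^ ksB[i];

    /* Decryption applies the same two keystreams again. */
    for (i = 0; i < n; i++)
        back[i] = ct[i] ^ ksA[i] ^ ksB[i];

    printf("recovered: %.*s\n", (int)n, (char *)back);
    return 0;
}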
------------------------------
From: dexMilano <[EMAIL PROTECTED]>
Subject: does 3des use only 2 keys?
Date: Fri, 23 Jun 2000 13:23:42 GMT
I've found an implementation of 3DES that uses the first key twice and
the second key once.
I thought it would be better to use 3 different keys.
Any idea of the strength?
thx
dex
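As an illustration of the two variants (using OpenSSL's libdes-descended DES
interface, with arbitrary demo keys): two-key EDE simply reuses the first key
schedule in the third slot, while three-key EDE uses an independent third key.

/* Two-key vs. three-key triple DES (EDE).  Keys and block are demo data. */
#include <stdio.h>
#include <string.h>
#include <openssl/des.h>

int main(void)
{
    DES_cblock k1 = {0x01,0x23,0x45,0x67,0x89,0xab,0xcd,0xef};
    DES_cblock k2 = {0xfe,0xdc,0xba,0x98,0x76,0x54,0x32,0x10};
    DES_cblock k3 = {0x13,0x57,0x9b,0xdf,0x02,0x46,0x8a,0xce};
    DES_key_schedule ks1, ks2, ks3;
    DES_cblock in = {'p','l','a','i','n','t','x','t'};
    DES_cblock ct2, ct3, back;

    DES_set_key_unchecked(&k1, &ks1);
    DES_set_key_unchecked(&k2, &ks2);
    DES_set_key_unchecked(&k3, &ks3);

    /* Two-key EDE: K1, K2, K1 (the first schedule is reused). */
    DES_ecb3_encrypt(&in, &ct2, &ks1, &ks2, &ks1, DES_ENCRYPT);
    DES_ecb3_encrypt(&ct2, &back, &ks1, &ks2, &ks1, DES_DECRYPT);
    printf("two-key round trip ok:   %d\n", memcmp(in, back, 8) == 0);

    /* Three-key EDE: K1, K2, K3 (an independent third key). */
    DES_ecb3_encrypt(&in, &ct3, &ks1, &ks2, &ks3, DES_ENCRYPT);
    DES_ecb3_encrypt(&ct3, &back, &ks1, &ks2, &ks3, DES_DECRYPT);
    printf("three-key round trip ok: %d\n", memcmp(in, back, 8) == 0);

    printf("ciphertexts differ:      %d\n", memcmp(ct2, ct3, 8) != 0);
    return 0;
}

The strength comparison itself (meet-in-the-middle and related attacks on the
two-key and three-key variants) is covered in the Handbook of Applied
Cryptography mentioned elsewhere in this digest.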
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Compression and known plaintext in brute force analysis (restatements
caused by the missing info .... thread)
Reply-To: [EMAIL PROTECTED]
Date: Fri, 23 Jun 2000 12:44:58 GMT
Joseph Ashwood <[EMAIL PROTECTED]> wrote:
: Compression is certainly not a cure-all for cryptographic protection,
: especially against a known plaintext attack. For this I am assuming that the
: compression algorithm uses a fixed table known to the attacker. According to
: results that were posted here, certainty for a DES-sized cipher should take 2
: blocks of attempts with ASCII text. This is due to the amount of entropy,
: the unicity distance being at about 2 blocks.
: But with compression a slightly different result can occur: we can end up
: with deflation down to significantly smaller sizes. A quick test with
: winzip on a large text file gave a size reduction of 65%; more usefully for
: this, that means the compression packed roughly 3 bytes of plaintext into
: each output byte. Moving outward this leads to 3 blocks per block, so with
: a (semi-)known plaintext, under compression, it is entirely possible that
: the unicity distance could actually be reduced.
Compression generally results in *increases* in the unicity distance -
you need a larger volume of original plaintext to be sent before you can
uniquely identify a correct decrypt.
The *reduction* you mention does not seem to follow from your argument.
To mention a specific issue, *chosen* plaintexts /could/ produce
reductions in unicity distance when passing through the compressor
- the attacker could pick plaintexts which increase in length under
the compressor. This seems to offer him no particular advantage
over choosing to send longer plaintexts, though.
As an aside, using a fixed table known to the attacker does not offer
one of the advantages of using adaptive compression schemes - namely,
increased difficulty in using attacks based on partial plaintexts.
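To put rough numbers on this, here is a back-of-the-envelope C sketch of
Shannon's unicity distance U = H(K)/D. The 1.5 bits-per-character entropy
figure for English and the 65% winzip reduction quoted above are treated
here as ballpark assumptions, nothing more.

#include <stdio.h>

int main(void)
{
    double key_bits = 56.0;           /* DES-sized key */
    double char_bits = 8.0;           /* 8-bit ASCII characters */
    double entropy_per_char = 1.5;    /* rough entropy of English text */
    double ratio = 0.35;              /* 65% size reduction */

    /* Uncompressed: most of every byte is redundancy. */
    double d_plain = char_bits - entropy_per_char;

    /* Compressed: the same entropy is packed into fewer bytes, so the
     * redundancy per transmitted byte shrinks. */
    double d_comp = char_bits - entropy_per_char / ratio;

    printf("uncompressed: D = %.2f bits/char, U = %.1f chars\n",
           d_plain, key_bits / d_plain);
    if (d_comp > 0.0)
        printf("compressed:   D = %.2f bits/char, U = %.1f chars\n",
               d_comp, key_bits / d_comp);
    else
        printf("compressed:   redundancy near zero; unicity distance "
               "grows without bound\n");
    return 0;
}

With these figures the unicity distance roughly doubles; as the compressor
approaches the true entropy of the source, D tends to zero and the unicity
distance grows without bound.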
--
__________ Lotus Artificial Life http://alife.co.uk/ [EMAIL PROTECTED]
|im |yler The Mandala Centre http://mandala.co.uk/ Namaste.
------------------------------
From: [EMAIL PROTECTED] (DJohn37050)
Subject: Re: how to compare the security between ECC and RSA
Date: 23 Jun 2000 13:46:32 GMT
Roger says he would use a 410-bit DSA/DH key. ANSI X9 requires use of a
1024-bit DSA/DH key. Roger is at one extreme: he picks the smallest key size
that has not yet been broken.
Don Johnson
------------------------------
From: [EMAIL PROTECTED] (DJohn37050)
Subject: Re: how to compare the security between ECC and RSA
Date: 23 Jun 2000 13:48:48 GMT
As I keep saying, the TIME to break a symmetric key is difficult to map to the
SPACE needed to break an asymmetric key. It is straightforward to map time to
time. This is an advantage in the analysis.
Don Johnson
------------------------------
Subject: Re: Try it.
From: John <[EMAIL PROTECTED]>
Date: Fri, 23 Jun 2000 06:53:04 -0700
I can understand, but don't fully agree with your argument.
1. Public availability, as you say, I think, won't guarantee
security, nor will an NDA. A better way to be sure whether an
algorithm or its source code is good and secure would be some courses in
computer programming.
2. I am confused: what part of the source would one get an NDA
for? If you can't "protect" your encryption, the rest hardly
seems worth protecting.
------------------------------
Subject: Re: Try it.
From: John <[EMAIL PROTECTED]>
Date: Fri, 23 Jun 2000 06:55:19 -0700
Sounds reasonable. I wonder where the NDA discussion started. I
recall in another thread, though I may be wrong, that someone
said to use an NDA to protect your source. Isn't a copyright
sufficient?
------------------------------
From: acoola <[EMAIL PROTECTED]>
Subject: Public key algorithm conversion - is it possible?
Date: Fri, 23 Jun 2000 13:49:33 GMT
Contrary to the conventional use of a public key algorithm, I'd like to
conceal the public key from everybody and make the private key available to
anyone, so that only I am able to encrypt a message. May I use ElGamal for
that? How difficult is it to deduce ElGamal's public key knowing the private
key, ciphertext and plaintext?
------------------------------
Subject: Re: Try it.
From: John <[EMAIL PROTECTED]>
Date: Fri, 23 Jun 2000 07:00:54 -0700
Reverse engineering, yes, but there is no logical link between
secrecy and insecurity, unless I don't share your definition of
security. I mean, if this None guy has the source code and it is
secure, we'll never know it, since we can't prove or disprove
it. If I understand correctly, most people here think the source
should be public. I regularly talk to None via e-mail. I think I
can get the manufacturer to have someone like you, as you are in
computer science, look at the source code and drop the NDA.
Marketing is a tough job. Many people can write good stuff,
but have no clue how to sell it. Even 20 years ago, source code
was considered a very private affair.
Some people think encryption = secrets?
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************