Cryptography-Digest Digest #293, Volume #12 Wed, 26 Jul 00 20:13:01 EDT
Contents:
Re: purpose of final swap in Twofish? (Eric Smith)
Re: MD5 algorithm questions (Bill Unruh)
Skipjack (David A. Wagner)
Re: purpose of final swap in Twofish? (David A. Wagner)
how to circumvent the rip.... (AskAlice)
Re: AESC-stream cipher ("Trevor L. Jackson, III")
Re: "Symmetric" RSA encryption (Bill Unruh)
Re: MD5 algorithm questions ("Joseph Ashwood")
The Purple Cipher (World War II) (Charles Petersen)
Re: Random Appearance (Future Beacon)
What is DES3mCBCMACi64pPKCS5? (Paul Rubin)
Re: Get Free Software (jungle)
Re: MD5 algorithm questions (Bill Unruh)
Re: RC4-- repetition length? ([EMAIL PROTECTED])
Re: Cursory Comments on Recursion (wtshaw)
Re: "Symmetric" RSA encryption (John Myre)
Re: "Symmetric" RSA encryption (John Myre)
Re: MD5 algorithm questions ("Joseph Ashwood")
Re: Elliptic Curves encryption ("Joseph Ashwood")
----------------------------------------------------------------------------
From: Eric Smith <[EMAIL PROTECTED]>
Subject: Re: purpose of final swap in Twofish?
Date: 26 Jul 2000 14:48:58 -0700
[EMAIL PROTECTED] (David A. Wagner) writes:
> I would have expected execution time to be affected by
> 1% and code size to be increased by a few instructions,
On the 6805, the added code size is about five bytes each for
the decrypt and encrypt routines. Admittedly this is not very much.
However, on low-end microcontrollers there may be less than 2 Kbytes of
ROM available. I'm currently trying to implement this on PIC
microcontrollers with 1 Kword (by 14 bits) of EPROM/Flash.
> If they do, it seems that changing your
> algorithm to one designed specifically for constrained environments might
> make much more of a difference than any tweaking of the swaps in Twofish.
Twofish is reportedly designed to be suitable for constrained environments.
Choosing another algorithm is only practical if you are in control of both
endpoints of the communications. If you want to interoperate with a host
that does Twofish, you don't implement DES at your end.
Anyhow, I was just wondering what the supposed benefit of the
extra/missing swap was, and it seems to be essentially nothing. I'm
obviously not arguing that it should be changed. I just thought that
there might have been some deeper significance that I'd overlooked.
------------------------------
From: [EMAIL PROTECTED] (Bill Unruh)
Crossposted-To: comp.security.unix,alt.computer.security
Subject: Re: MD5 algorithm questions
Date: 26 Jul 2000 21:52:02 GMT
In <[EMAIL PROTECTED]> Scott Long <[EMAIL PROTECTED]> writes:
>Are all unique messages which are SHORTER than the MD5 code length (128
>bits) guaranteed to have different MD5 hashes? In other words, does
No.
>there exist a 64-bit message that has the SAME MD5 hash as a different
>64-bit message? Or is there only a possibility of hash collisions for
Unknown for your specific example of exactly 64 bits (probably they do
not have a collision). As far as I have seen, the best estimate is to
compare it to a random output of 128 bits. I.e., any input, no matter
what its length, has an output which is a uniform random selection from
the numbers less than 2^128. Thus, if you chose all numbers of less than
say 64 bits, there would be about a 50% probability that two would have
the same hash (birthday paradox -- which actually does not apply to
birthdays because the probability of a given date being a birthday is
not the same for all days of the year).
>messages that exceed 128 bits in length? I realize that the probability,
>if it exists, is very small.
>Also, if I want to fold the 128-bit hash into 64-bits, is it sufficient
>to exclusive-or the top 64 bits with the bottom 64 bits? Please note
>that I'm NOT using this for any security purposes, and therefore any
>security arguments against doing this do not apply; I'm only using MD5
>in order to get a good hash code of some data.
Or just throw away the top 64 bits, or the bottom 64 bits, or every
second bit, take a 64 bit CRC of the 128 bits, or ....
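A minimal Python sketch of the first two options above, the XOR-fold and plain truncation (purely illustrative; neither is a standardized construction):

```python
import hashlib

def md5_fold_64(data: bytes) -> bytes:
    """Fold the 128-bit MD5 digest to 64 bits by XORing its halves."""
    digest = hashlib.md5(data).digest()            # 16 bytes
    return bytes(a ^ b for a, b in zip(digest[:8], digest[8:]))

def md5_truncate_64(data: bytes) -> bytes:
    """Alternative: simply keep the first 64 bits of the digest."""
    return hashlib.md5(data).digest()[:8]
```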
If truncating MD5 in any of those ways gave a bad hash (ie one which
made it easy to find collisions, or was not uniformly distributed), then
this would make the whole of MD5 weak. Of course we do not know that it
is not weak, just that no such simple weakness has been found.
(Note I am not an expert-- the above is based on readings and thinking I
have done. If I have made an error I assume I will get corrected by an
expert)
------------------------------
From: [EMAIL PROTECTED] (David A. Wagner)
Subject: Skipjack
Date: 26 Jul 2000 14:49:54 -0700
Has anyone looked at the 256-element Skipjack S-box to see if it can
be expressed with less memory? For instance, is there any chance it
is itself expressible as a 4-round Feistel network with some choice of
4-bit S-boxes? Does anyone know of any way to determine whether such
a representation exists, for this size table?
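To make the question concrete, here is a sketch of what such a representation would look like, with four 4-bit S-boxes invented purely for illustration (they are not Skipjack's, and no claim is made that Skipjack's table has this form):

```python
# An 8-bit permutation built as a 4-round Feistel network over 4-bit
# halves.  The round S-boxes below are arbitrary placeholders.
ROUND_SBOXES = [
    [0x3, 0xE, 0x1, 0xA, 0x4, 0x9, 0x5, 0x6,
     0x8, 0xB, 0xF, 0x2, 0xD, 0xC, 0x0, 0x7],
    [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
     0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2],
    [0x7, 0xD, 0xE, 0x3, 0x0, 0x6, 0x9, 0xA,
     0x1, 0x2, 0x8, 0x5, 0xB, 0xC, 0x4, 0xF],
    [0x2, 0xC, 0x4, 0x1, 0x7, 0xA, 0xB, 0x6,
     0x8, 0x5, 0x3, 0xF, 0xD, 0x0, 0xE, 0x9],
]

def feistel_sbox(x: int) -> int:
    """Map one byte through the 4-round Feistel construction."""
    left, right = x >> 4, x & 0xF
    for sbox in ROUND_SBOXES:
        left, right = right, left ^ sbox[right]
    return (left << 4) | right

# The resulting 256-entry table is always a permutation, whatever the
# round functions are -- a necessary but far from sufficient property
# shared with the real F-table.
TABLE = [feistel_sbox(x) for x in range(256)]
```

With four 16-entry 4-bit tables the storage is 32 bytes instead of 256; the open question is whether any choice of round S-boxes reproduces Skipjack's actual table, and the search space looks far too large for naive brute force.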
Apart from the factual issue about Skipjack, there is an interesting
theoretical question here: whether it is possible to build a cipher
with an S-box which has a small representation that can be kept secret
from even a somewhat determined adversary. We might imagine that an
agency such as the NSA might have the expertise to build a secure cipher
with an extremely small footprint, perhaps considerably smaller than
many of their enemies know how to do, and yet might occasionally
deploy that cipher in an environment where there is a high risk that
it will be reverse-engineered. In such a case, they might want to use
the large representation in those high-risk environments to prevent
certain other parties from acquiring the ability to use strong crypto
in highly-constrained environments. I know these are somewhat silly
premises, but as a purely academic question, I wonder whether it is
possible to build such a cipher. Any ideas?
------------------------------
From: [EMAIL PROTECTED] (David A. Wagner)
Subject: Re: purpose of final swap in Twofish?
Date: 26 Jul 2000 14:55:58 -0700
Eric Smith <[EMAIL PROTECTED]> wrote:
> Twofish is reportedly designed to be suitable for constrained environments.
It is intended to be good enough, and fairly close to the best in all
contexts, but not necessarily optimal in any single environment. If
things are so tight that 5--10 bytes of ROM matter a lot, I don't know,
the difference between ``the best'' and ``close to the best'' might be
substantial.
> Choosing another algorithm is only practical if you are in control of both
> endpoints of the communications. If you want to interoperate with a host
> that does Twofish, you don't implement DES at your end.
Yup, exactly so. In this case, my advice to consider another algorithm
doesn't help one bit! Sorry for the useless comments.
> Anyhow, I was just wondering what the supposed benefit of the
> extra/missing swap was, and it seems to be essentially nothing. I'm
> obviously not arguing that it should be changed. I just thought that
> there might have been some deeper significance that I'd overlooked.
Nope. Nothing deep here at all. Just epsilon-improvements on some
platforms (and, apparently, epsilon-unimprovement on others).
------------------------------
From: AskAlice <[EMAIL PROTECTED]>
Subject: how to circumvent the rip....
Date: Wed, 26 Jul 2000 21:50:52 GMT
read about it here...
http://www.thirsty.com/Common/StoryReview/1,2351,7~19277,00.html?
--
They also take user submissions and post daily tech news...
--
Gimble in the wabe,
A.L.
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
Date: Wed, 26 Jul 2000 18:10:21 -0400
From: "Trevor L. Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: AESC-stream cipher
"David A. Wagner" wrote:
> In principle, you might be right that it is best to do end-to-end
> measurements of time to encrypt a large file rather than microbenchmarks
> of time to encrypt memory repeatedly, but in practice, I disagree.
>
> End-to-end measurements can give us a very good understanding of
> performance in one very limited niche -- e.g., file encryption -- but
> it is hard to generalize from them to other settings where the cost of
> I/O may differ dramatically. Therefore, the most useful measurements
> seem to be raw encryption time (from memory, caches warm, etc.).
By this I assume you mean the cipher code and the supporting data structures
are in cache, but not the text stream. The OP mentioned repeatedly encrypting
a 256-byte block, which is hardly representative of normal usage. On a
reasonably recent CPU 256 bytes fits in 4 or 5 cache lines, so access to
it is unreasonably fast compared to main memory.
>
>
> We should all remember the caveat that, in some important cases,
> I/O time may dominate encryption time, in which case the speed of the
> cipher doesn't matter; but in other cases, encryption time dominates,
> and then speed does matter. Nonetheless, this judgement must be made
> on an application-by-application basis, whereas it is useful to compare
> the performance of various ciphers in a less limited context.
>
> Also, remember that cryptographers rarely have time or skill to optimize
> their I/O subsystem (you're lucky if they have the time and skill to
> optimize their cipher implementation!), which means that any end-to-end
> measurements you see are likely to be unrepresentative of how the cipher
> would be used in any case where speed truly mattered.
>
> You may well be right that the AESC cipher is far too slow to be of any
> real interest, and/or that the originally posted performance measurements
> for AESC were truly lacking. I just wanted to take exception to your
> general point, not to specifically defend AESC.
Benchmarks are hard to do well. Even when done well they generate
considerable heat. ;-)
------------------------------
From: [EMAIL PROTECTED] (Bill Unruh)
Subject: Re: "Symmetric" RSA encryption
Date: 26 Jul 2000 22:18:39 GMT
In <[EMAIL PROTECTED]> John Myre <[EMAIL PROTECTED]> writes:
>Let n be the modulus, and e and d be the exponents, as
>usual. The idea is to let one party (say, Alice) know
>n and e, and the other party (Bob) knows n and d, and that
>is it. In particular, the factors of n are gone. Now Alice
And can be easily regenerated. n is the product of p and q. ed-1 is a
multiple of (p-1)(q-1), and thus one can easily find (p-1)(q-1), and
hence p+q-1 = n - (p-1)(q-1) as well. At which point solving for p and
q is not hard.
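A sketch of that recovery, assuming (as is common) that d was computed modulo (p-1)(q-1) so that ed-1 is an exact multiple of it; the toy key below is the familiar small textbook example, used here only for illustration:

```python
from math import isqrt

def recover_factors(n: int, e: int, d: int):
    """Recover p and q from an RSA modulus and both exponents,
    assuming e*d - 1 is a multiple of phi = (p-1)*(q-1)."""
    k = e * d - 1
    # phi is close to n, so the multiplier k/phi is a small integer.
    for m in range(1, 100000):
        if k % m:
            continue
        phi = k // m
        # n - phi = p + q - 1, so p and q are the roots of
        # x^2 - (n - phi + 1)*x + n = 0.
        s = n - phi + 1
        disc = s * s - 4 * n
        if disc >= 0:
            r = isqrt(disc)
            if r * r == disc and (s + r) % 2 == 0:
                p, q = (s + r) // 2, (s - r) // 2
                if p > 1 and q > 1 and p * q == n:
                    return p, q
    return None
```

Real keys sometimes compute d modulo lcm(p-1, q-1) instead, in which case a slightly different (probabilistic) method is needed; either way, discarding p and q does not keep them secret from anyone who holds n, e, and d together.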
------------------------------
From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: MD5 algorithm questions
Date: Wed, 26 Jul 2000 15:19:22 -0700
Crossposted-To: comp.security.unix,alt.computer.security
> numbers less than 2^128. Thus, if you chose all numbers less than say 64
> bits, there would be about a 50% probability that two would have the
> same hash (Birthday paradox-- which actually does not apply to birthdays
> because the probablity of a certain date being a birthday is not the
> same for all days of the year).
Not quite correct. If you choose messages of length exactly 64 bits there
will be a 50% chance, but including messages of smaller size is a very
different sum, the correct sum being f(n) = 2^n + f(n-1); f(0) = 1, or if
I did the math correctly 36893488147419103231, almost exactly double 2^64
(technically the solution is also 2^(n+1) - 1), so the odds are
significantly higher. But if you reduce the size to 63 bits or smaller
the odds are 50% - epsilon.
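The recurrence is easy to check against the closed form directly:

```python
def f(n: int) -> int:
    """Count of distinct messages of length at most n bits
    (including the empty message): f(n) = 2^n + f(n-1), f(0) = 1."""
    return 1 if n == 0 else 2**n + f(n - 1)

# Closed form: f(n) = 2^(n+1) - 1, so f(64) is just under 2^65.
assert f(64) == 2**65 - 1 == 36893488147419103231
```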
Joe
------------------------------
From: Charles Petersen <[EMAIL PROTECTED]>
Subject: The Purple Cipher (World War II)
Date: Wed, 26 Jul 2000 15:49:29 -0700
I'm building a series of applets emulating famous ciphers from world war
II. I've already finished the Enigma and Sigaba and am now working on
the purple. I have not been able to find much detailed technical
information about the purple. I know it used a plug board to divide the
alphabet into 20 and 6. The 20 were then encrypted through banks of 4
telephone switches. Each switch contained six wipers that were
connected to its six inputs. Now here's where I start getting confused:
I *think* that each wiper has 25 outputs, each of which has its own
separate wiring. From the output it goes through a separate scrambled
wire set until it arrives at the next bank of switches. From what I can
tell there are 4 banks in the main system and just one bank of one
switch in the six letter system.
Correct anything of the above that's wrong; and *please* direct me to
any detailed source. I'm pretty sure all the wirings have been
destroyed so I'll have to make up my own but I'd at least like to get
the concept right.
thanks,
Charles Petersen
------------------------------
From: Future Beacon <[EMAIL PROTECTED]>
Subject: Re: Random Appearance
Date: Wed, 26 Jul 2000 18:50:19 -0400
On Wed, 26 Jul 2000, Mok-Kong Shen wrote:
> ........ You analyze a sentence and get its structure. For example,
> the sentence is 'John plays football'. The structure is noun-verb-object.
> What do you do in this case?
Compress and encrypt at this point.
> In the scheme I indicated, 'John' is
> mapped to a corresponding set of other nouns, say, {Mary, cat, .....},
> and similarly for the other two words. So the transform could turn out
> to be e.g. 'Mary hears music'.
The connection between "John plays football" and "Mary hears music"
must be obliterated by the encryption and disguise mechanism that
intervenes.
> In order to obtain such sufficiently
> natural transforms, the domain of discourse has obviously to be very
> limited and even then it is not a easy task. That is what I meant in
> my previous follow-up. Now could you show how you would
> operate with my example sentence to get a transform that looks
> natural according to your scheme?
>
> M. K. Shen
There have been mistakes in my attempt to give actual algorithms.
That has confused matters. I should give the overall scheme first:
To send, write the plain text, compress it, encrypt it, disguise it,
and send it.
The receiver must undo the disguise, decode the random-like cipher
text, decompress it, and read it.
Under the assumption that the disguise is apparent text, I believe
that the disguise phase will make the message longer. This is what
motivates the compression phase. I believe that common compression
schemes do not compress enough. This motivates using the same sort
of grammatical tools that seem needed for the disguise.
pt --> compressed pt --> ct --> disguised ct
In the phase that converts ciphertext to disguised ciphertext, an
algorithm must be found that makes all of the choices necessary to
constructing correct sentences (let's tackle the relatedness later).
The sentence structure (Si) must be chosen on the basis of some of
the bits in the ciphertext. This might produce a sentence's worth of
parts of speech. More of the ciphertext bits must be used to select
words that fit the parts of speech.
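A toy sketch of that last step, with invented (and far too small) word lists and a single hypothetical noun-verb-object template; every name here is an assumption, since the scheme is only an outline:

```python
# Hypothetical word lists shared by sender and receiver; a real scheme
# would need far larger lists, organized by part of speech.
NOUNS = ["John", "Mary", "music", "football", "the cat", "the dog",
         "tea", "rain"]
VERBS = ["plays", "hears", "likes", "sees", "makes", "wants",
         "finds", "avoids"]

def disguise(ciphertext: bytes) -> str:
    """Spend ciphertext bits choosing words for a fixed
    noun-verb-object template: 9 bits per sentence
    (3 bits per word slot, since each list has 8 entries)."""
    bits = "".join(f"{byte:08b}" for byte in ciphertext)
    sentences = []
    while len(bits) >= 9:
        chunk, bits = bits[:9], bits[9:]
        subj = NOUNS[int(chunk[0:3], 2)]
        verb = VERBS[int(chunk[3:6], 2)]
        obj = NOUNS[int(chunk[6:9], 2)]
        sentences.append(f"{subj} {verb} {obj}.")
    return " ".join(sentences)
```

Note the expansion: 9 ciphertext bits become roughly 20 characters of apparent text, which is exactly why the compression phase matters.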
I have not taken the matter much farther than this. I am merely
optimistic.
I believe that the sender and the receiver need to share a list of
words that are organized, in some way, by part of speech. I believe
that common grammar needs to be modified in order to solve all of
the problems. I believe that the sentence structures in the plain
text should have no relationship to the sentence structures in the
disguise. I don't have much more.
Best regards,
Jim Trek
Future Beacon Technology
http://eznet.net/~progress
[EMAIL PROTECTED]
------------------------------
From: [EMAIL PROTECTED] (Paul Rubin)
Subject: What is DES3mCBCMACi64pPKCS5?
Date: 26 Jul 2000 22:53:39 GMT
I'm using a cryptography module that supports an encryption mode
called Mech_DES3mCBCMACi64pPKCS5. You give it a plaintext buffer and
it pads it to a multiple of 8 bytes using PKCS5, encrypts it in 3DES
CBC mode and adds a MAC. I understand about 3des/CBC and PKCS5 but
am not sure of how the MAC is supposed to be computed, and I'm trying
to code an interoperable implementation.
Does anyone know if there's a standard way to do this? I don't think
the PKCS standards say anything about it.
Thanks.
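For reference, the textbook CBC-MAC shape (zero IV, keep the last CBC block over the padded message) is one plausible starting guess. A generic sketch -- the block_encrypt argument stands in for keyed 3DES, and nothing here is guaranteed to match what that module actually does, or what the "i64" in the name denotes:

```python
def pkcs5_pad(data: bytes, block: int = 8) -> bytes:
    """PKCS#5 padding: append n bytes, each of value n (1..block)."""
    n = block - len(data) % block
    return data + bytes([n]) * n

def cbc_mac(block_encrypt, msg: bytes, block: int = 8) -> bytes:
    """Generic CBC-MAC: CBC-encrypt the padded message with a zero IV
    and keep only the last ciphertext block.  block_encrypt is any
    keyed 8-byte -> 8-byte permutation (3DES in the module's case)."""
    msg = pkcs5_pad(msg, block)
    state = bytes(block)                      # zero IV
    for i in range(0, len(msg), block):
        xored = bytes(a ^ b for a, b in zip(state, msg[i:i + block]))
        state = block_encrypt(xored)
    return state
```

Plain CBC-MAC is only secure for fixed-length messages, and a real product may add a length prefix, a final transformation, or a separate MAC key; so treat this as a starting point for interop testing, not a specification.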
------------------------------
From: jungle <[EMAIL PROTECTED]>
Subject: Re: Get Free Software
Date: Wed, 26 Jul 2000 18:57:23 -0400
some people are so absorbed designing useless things that they can't see the
good ones ...
Joseph Ashwood wrote:
> "George Peters" <[EMAIL PROTECTED]> wrote in message
> news:[EMAIL PROTECTED]...
> > Perhaps you didn't notice the source.zip contained within. Since
> > you make so many assumptions without first checking them out, I
> > would not take any of the other comments seriously either.
> Perhaps you didn't notice
> that we already have free software that
> doesn't require us to trust you with our e-mails, doesn't require us
> to pay for the service, doesn't require us to deal with the
> distributor at all after downloading, it's called GPG, and there's
> also PGP-freeware.
------------------------------
From: [EMAIL PROTECTED] (Bill Unruh)
Crossposted-To: comp.security.unix,alt.computer.security
Subject: Re: MD5 algorithm questions
Date: 26 Jul 2000 23:10:47 GMT
In <uy0ql#09$GA.369@cpmsnbbsa07> "Joseph Ashwood" <[EMAIL PROTECTED]> writes:
]> numbers less than 2^128. Thus, if you chose all numbers less than say 64
]> bits, there would be about a 50% probability that two would have the
]> same hash (Birthday paradox-- which actually does not apply to birthdays
]> because the probablity of a certain date being a birthday is not the
]> same for all days of the year).
]Not quite correct. If you choose messages of length 64-bits there will be a
]50% chance, but including numbers of smaller size is a very different sum,
]the correct sum being f(n) = 2^n + f(n-1); f(0) = 1, or if I did the math
]correctly 36893488147419103231 almost exactly double 2^64 (technically the
]solution is also 2^(n+1) -1) so the odds are significantly higher. But if
]you reduce the size to 63 bits or smaller the odds are 50% - epsilon.
] Joe
<Pedantic on>
The number of numbers of length n bits is 2^(n+1)-1. However, the
"birthday" probability is
exp(- (r*(r+1)/2N) where is the number of trials, and N is the total
number of numbers (2^128 in his case). This equals 1/2 when
(r(r+1)/2N=.69, or r=-1 +sqrt(1+4(.69)2N)) \approx 2.34 sqrt(N) \approx
1.17*2^65
<Pedantic off>
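For what it's worth, solving that quadratic numerically puts the 50% point just under 1.18*2^64 trials for N = 2^128:

```python
from math import log, sqrt

def birthday_r(N: float, p: float = 0.5) -> float:
    """Number of random draws from N values at which the collision
    probability reaches p, from 1 - exp(-r(r+1)/2N) = p, i.e.
    r = (-1 + sqrt(1 - 8*N*ln(1-p))) / 2."""
    return (-1 + sqrt(1 - 8 * N * log(1 - p))) / 2

r = birthday_r(2.0**128)
# r / 2^64 ~ sqrt(2 ln 2) ~ 1.177
```

The same function applied to N = 365 gives about 22 draws, close to the familiar "23 people" answer (the approximation is slightly optimistic).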
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: RC4-- repetition length?
Date: Wed, 26 Jul 2000 23:14:32 GMT
In article <e$hEvrS9$GA.318@cpmsnbbsa07>,
"Joseph Ashwood" <[EMAIL PROTECTED]> wrote:
> Actually, it is difficult (if not impossible) to prove but
> it is quite intuitive that once you can distinguish
> something like RC4 from random it can be broken, whether or
> not the algorithm is known to us is a matter far removed
> from it's existance. My statements about how much data to
> trust to a given (or closely related) key descended from
> that assumption. If you'd care to refute it, perhaps you'd
> care to place what you believe to be a more reasonable
> assumption in it's place.
> Joe
>
If by "broken" you mean you can eventually predict the sequence,
here's a sequence that can be distinguished from random but which
cannot be fully predicted:
Choose a probability b, 0 < b < 1.
Take two truly random sequences, X and Y.
Use Y to replace each bit in X with 0 with probability b.
You should be able to detect that it's not random in about 1/b^2 bits,
but you'll never be able to predict an unknown bit with probability
better than (1+b)/2.
If by "broken" you mean it's not safe to use the sequence as
a one-time pad for some messages, you're right.
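A quick simulation of that construction (parameters chosen arbitrarily for illustration):

```python
import random

def biased_sequence(n_bits: int, b: float, seed: int = 0) -> list:
    """X is a truly random bit sequence; each bit is then forced to 0
    with independent probability b (the role played by sequence Y)."""
    rng = random.Random(seed)
    return [0 if rng.random() < b else rng.getrandbits(1)
            for _ in range(n_bits)]

def zero_fraction(bits) -> float:
    return bits.count(0) / len(bits)

# With bias b, P(bit = 0) = b + (1 - b)/2 = (1 + b)/2, so the zero
# fraction drifts measurably above 1/2 after on the order of 1/b^2
# bits -- yet no single unknown bit can be predicted with probability
# better than (1 + b)/2.
```

The simulation bears out the point: the bias is detectable long before any individual bit becomes predictable.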
------------------------------
From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: Cursory Comments on Recursion
Date: Wed, 26 Jul 2000 16:32:41 -0600
In article <[EMAIL PROTECTED]>, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:
> wtshaw wrote:
>
> > [snip]
> >
> > I was surprised to see the full results, that the Best Bases in recursion
> > are 66, 78, 93, and 94, and that the bottom five categories include the
> > most popular bases recursively used in so-called modern algorithms.
> >
> > The common wisdom is that prime numbers are best for encryptive bases, but
> > the data seems to say, if anything, that bases of all integer powers of
> > low primes, 2, 3, 5, and 6, with a very few unpredicted others, are
> > ill-suited as compared with others for recursive usage.
>
> I don't remember having seen your definition of recursion of bases.
> Do you mean successive changes of a number of number bases?
> I suppose that using a pair of larger relatively prime bases results in
> a larger mapping and hence may be more advantageous in general.
>
> M. K. Shen
Base Translation deals with two or more bases, but the current topic deals
with one at a time. In the trials, seeds, or primers, are sensed as
recurring in a string.
I received a few days ago a back issue of Cryptologia, July, 1989. I
found that the author presented much of the same data that I had derived.
He and I agree on some of the same speculations, and perhaps conjured the
same error regarding binaries, which the current effort seems to
redefine. Crux offers his assistance.
As Marshen indicated the other night, the number theory folks probably
have other ways to deal with the problems. I wonder if they have really
been snookered on such an important subject. Surely, if I ask embarrassing
questions about what goes, someone will have to fess up to why there is
such a vacuum in functional crypto, but perhaps they believe that Napster
can only pilfer music as well.
The beauty of working in a vacuum has always been that you do not become
distracted with the ways of another, but it is refreshing to find
mathematical beauties that others also have derived. Looking at others'
contributions at some point can be important. Remaining vacuum packed
would have meant you are doing your stuff more for yourself than to share.
Beyond the weakness of the series generation method for simple combination
with plaintext, some strength appears to be gained with multiple encodings
with different length seeds. **However, some of the bases are off the
wall, where a longer seed can produce a shorter series, loop, string,
whatever you want to call it.
Bases can live in strange worlds, and what I see seems to support that
view. It is an old caution to make few assumptions when treading on
virgin ground.
--
Pat B. reminds us that he served in the Nixon Administration for
six years. How can he be proud of that?
------------------------------
From: John Myre <[EMAIL PROTECTED]>
Subject: Re: "Symmetric" RSA encryption
Date: Wed, 26 Jul 2000 17:20:26 -0600
Bill Unruh wrote:
> In <[EMAIL PROTECTED]> John Myre <[EMAIL PROTECTED]> writes:
<snip>
> > In particular, the factors of n are gone.
>
> And can be easily regenerated.
Only if Alice and Bob conspire by sharing e and d.
The thought was that e was Alice's secret (that she will not share
with Bob), and d was Bob's secret (that he will not share with
Alice). Of course, the initial generation and distribution of
n, e, d is fraught with peril.
JM
------------------------------
From: John Myre <[EMAIL PROTECTED]>
Subject: Re: "Symmetric" RSA encryption
Date: Wed, 26 Jul 2000 17:34:08 -0600
"David A. Wagner" wrote:
<snip>
> What you have suggested is very similar to the Pohlig-Hellman
> cipher, which also uses number theory for symmetric crypto.
Right, but then A and B know the same secret. I suppose my
subject line is misleading.
> Note that if you know e, d, and n, you can factor n.
The idea is that no one person knows all three of these.
Alice and Bob do not intend to share their secrets. Both
know n, but only Alice knows e and only Bob knows d. This
is assuming (!) that we can set it up that way in the first
place. Somehow. Pretend, say, that a trusted party set
up a pair of crypto tokens (and, for *some* reason I
can't think up right now, a simple symmetric cipher is
not the right solution).
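A toy sketch of the intended split, using the familiar small textbook RSA parameters; this only illustrates who holds what, not a secure instantiation:

```python
# A trusted party picks the RSA parameters, hands (n, e) to Alice and
# (n, d) to Bob, and discards p and q.  Numbers are tiny, for
# illustration only.
p, q = 61, 53
n = p * q                      # 3233
phi = (p - 1) * (q - 1)        # 3120
e = 17
d = pow(e, -1, phi)            # 2753

def alice_encrypt(m: int) -> int:
    return pow(m, e, n)        # Alice knows only (n, e)

def bob_decrypt(c: int) -> int:
    return pow(c, d, n)        # Bob knows only (n, d)
```

As the thread points out, anyone holding n, e, and d together can refactor n, so the scheme depends entirely on Alice and Bob never pooling their exponents.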
> By the way, it's not such a good idea to encrypt both directions in
> a related way. You should pick entirely independent keys for both
> directions. (For instance, if you can convince Bob to encrypt chosen
> messages for you, you can also read anything encrypted by Bob, with your
> above idea.)
Isn't that a basic caveat with RSA? I mean, that's why we have
OAEP; and you never sign something exactly as anyone else gave
it to you. Assuming Alice and Bob randomize their messages
appropriately, is there still a difficulty?
(I still don't see any reason to try any of this - any
ideas from anybody would be appreciated).
JM
------------------------------
From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: MD5 algorithm questions
Date: Wed, 26 Jul 2000 16:35:47 -0700
Crossposted-To: comp.security.unix,alt.computer.security
I was mostly pointing out that by including all the sizes less than 64 bit
he was effectively doubling the samples. But if your equation is correct (I
haven't taken the time to remember it yet), then I should change my
statement from "the odds are significantly higher" to "you're gonna have a
collision." That's certainly not a pedantic observation (IMO).
Joe
> <Pedantic on>
> The number of numbers of length at most n bits is 2^(n+1)-1. However, the
> "birthday" probability of no collision is approximately
> exp(-r(r+1)/2N), where r is the number of trials, and N is the total
> number of numbers (2^128 in his case). This equals 1/2 when
> r(r+1)/2N = .69 (= ln 2), or r = (-1 + sqrt(1+8(.69)N))/2 \approx
> 1.18 sqrt(N) \approx 1.18*2^64
> <Pedantic off>
------------------------------
From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: Elliptic Curves encryption
Date: Wed, 26 Jul 2000 17:00:59 -0700
It sure seems to be. OTOH the RSA patent expires soon, and that would
certainly put it on even ground with ECC. ECC takes much smaller keys,
but RSA transfers actual data. They really seem to be best for
different purposes.
Joe
"diana" <[EMAIL PROTECTED]> wrote in message
news:8lnldu$ja6$[EMAIL PROTECTED]...
> I'm wondering if elliptic curves is the next trend in encryption.... is
it?
> or can anyone think of something else?
>
>
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************