Cryptography-Digest Digest #299, Volume #9 Mon, 29 Mar 99 22:13:04 EST
Contents:
New Encryption Algorithm ([EMAIL PROTECTED])
Re: Live from the Second AES Conference (Bruce Schneier)
Re: Triple DES key length. (DJohn37050)
strong brain-embedded decryption algorithm (DuBose8)
Re: strong brain-embedded decryption algorithm ([EMAIL PROTECTED])
Re: What did Gilbert Vernam look like? (Jim Gillogly)
Re: GOOD PRIME GENERATOR (GPG) ([EMAIL PROTECTED])
Re: How do I determine if an encryption algorithm is good enough? (Jim Gillogly)
Re: Is initial permutation in DES necessary? (Jim Gillogly)
Raw statistical data on computer security - in particular encryption (smustafa)
Re: New Encryption Algorithm (Scott Fluhrer)
Twofish Book Published by Wiley (Bruce Schneier)
Re: True Randomness & The Law Of Large Numbers ("Tony T. Warnock")
Re: Live from the Second AES Conference (Jim Gillogly)
Re: How do I determine if an encryption algorithm is good enough? (Patrick Juola)
Re: How do I determine if an encryption algorithm is good enough? (Patrick Juola)
Re: Twofish Book Published by Wiley (Paul Rubin)
Re: True Randomness & The Law Of Large Numbers (R. Knauer)
----------------------------------------------------------------------------
From: [EMAIL PROTECTED]
Subject: New Encryption Algorithm
Date: 29 Mar 1999 22:09:54 GMT
I've written a new encryption algorithm with a 64-bit key and a 64-bit
block. (It's unnamed, any ideas?) I would like to submit it for public
discussion. Here's the C source:
#include <stdio.h>

int crypt( unsigned long K[2], unsigned long block[2] )
{
    /* Mixing word S plus four running constants A-D. */
    unsigned long S = 0x14159265;
    unsigned long A = 0xabcdef01;
    unsigned long B = 0x10fedcba;
    unsigned long C = 0x12345678;
    unsigned long D = 0x87654321;
    int i;
    char *p;    /* unused */

    for(i=0;i<256;i++)
    {
        /* Stir the key into S, then update the running constants. */
        S = (((((K[0]<<i%16)^(K[1]>>i%16))) + K[0]) - K[1]) ^ S;
        S ^= (A | (B+S) | (C^S) | (D&S));
        A += S;
        B -= S;
        C ^= S;
        D |= S;

        /* Mix S into both halves of the block. */
        block[0] += S;
        block[0] ^= S;
        block[0] -= S;
        block[1] -= S;
        block[1] ^= S;
        block[1] += S;
    }

    /* Final key-dependent output transformation. */
    block[0] *= ( K[0] % 32 );
    block[1] *= ( K[1] % 32 );
    block[0] ^= K[0];
    block[1] ^= K[1];
    block[0] += K[0];
    block[1] += K[1];
    block[0] += (K[0] - K[1]);
    block[1] += (K[1] - K[0]);
    block[0] ^= K[0] ^ K[1];
    block[1] ^= (K[0]>>1) ^ (K[1]<<1);
    block[0] ^= K[0] << (K[1]%16);
    block[1] ^= K[1] >> (K[0]%16);
    block[0] -= K[1];
    block[1] -= K[0];
    return 0;
}

int main()
{
    unsigned long key[2]  = {0x12345678,0x87654321};
    unsigned long block[2]= {0x01abcdef,0xfedcba10};

    printf("KEY: %.8lx %.8lx\nPLAINTEXT: %.8lx %.8lx\n",key[0],key[1],
           block[0],block[1]);
    crypt( key, block );
    printf("CIPHERTEXT: %.8lx %.8lx\n",
           block[0],block[1]);
    return 0;
}
***********************************************************
GNU GENERATION FOUNDATION ANONYMOUS REMAILER
Please report misuse of this system to [EMAIL PROTECTED]
The GGF takes no responsibility for the content of this
post.
***********************************************************
------------------------------
From: [EMAIL PROTECTED] (Bruce Schneier)
Subject: Re: Live from the Second AES Conference
Date: Mon, 29 Mar 1999 20:08:31 GMT
On Mon, 29 Mar 1999 16:15:16 GMT, [EMAIL PROTECTED]
(John Savard) wrote:
>quoting me:
>>>If you compare all the ciphers on the same compiler ... and I remember at
>>>a computer chess competition, someone suggested that it wouldn't be a bad
>>>idea to require the entrants all to use C, so that it wouldn't be a
>>>competition of assembly-language coding skills.
>
>>Then you're comparing compiler efficiency. What we did in our
>>comparison is assume best possible ASM. We didn't implement
>>everything, but we gave all algorithms the benefit of the doubt.
>
>If everybody's C code is compiled on the same compiler, one may be
>comparing optimizations or something, but one isn't comparing compilers.
One is comparing both how well the coder optimized his code, and how
well the compiler optimizes the particular algorithm. For example,
the Borland C compiler can't do rotates well. Any algorithm using
rotates will look relatively worse than an algorithm that does not, if
compared using a Borland compiler. This relative difference won't
exist if the algorithms are compared using a Microsoft compiler.
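To make the rotate point concrete: the usual portable way to write a
32-bit rotate in C is a pair of shifts and an OR, and whether that
collapses to a single rotate instruction is entirely up to the compiler.
A minimal sketch:

#include <stdio.h>

/* Portable 32-bit left rotate.  A good optimizer turns this into a
   single rotate instruction; a weaker one emits two shifts and an OR,
   which is what makes rotate-heavy ciphers look slow on such compilers. */
static unsigned long rotl32(unsigned long x, unsigned int n)
{
    n &= 31;
    return ((x << n) | (x >> ((32 - n) & 31))) & 0xffffffffUL;
}

int main(void)
{
    printf("%.8lx\n", rotl32(0x80000001UL, 3));   /* prints 0000000c */
    return 0;
}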
Bruce
**********************************************************************
Bruce Schneier, President, Counterpane Systems Phone: 612-823-1098
101 E Minnehaha Parkway, Minneapolis, MN 55419 Fax: 612-823-1590
Free crypto newsletter. See: http://www.counterpane.com
------------------------------
From: [EMAIL PROTECTED] (DJohn37050)
Subject: Re: Triple DES key length.
Date: 29 Mar 1999 22:23:23 GMT
A few comments:
1. This is the best known attack. Whether it is practical or not is left for
the reader to decide. It all depends on one's assumptions and paranoia.
2. I believe it is a mistake to "count on" the amount of storage an attack
needs. There are many tricks that can be used to reduce storage (hashing,
cycles, etc.). I think it is much safer to count on the number of operations
needed.
Don Johnson
------------------------------
From: [EMAIL PROTECTED] (DuBose8)
Subject: strong brain-embedded decryption algorithm
Date: 29 Mar 1999 22:12:27 GMT
Is there such a thing as a brain-embedded decryption algorithm? Someone
mentioned this on a mailing list but I've never heard the term.
Scott
Scott
Senior - Computer Information Systems
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: strong brain-embedded decryption algorithm
Date: 29 Mar 1999 22:49:08 GMT
GRRRRRR!
Typical specimen of AOL trash which escaped ... god, they opened the
flood gates back in '95 ...
>
> strong brain-embedded decryption algorithm
>
> From: [EMAIL PROTECTED] (DuBose8)
> Reply to: DuBose8
> Date: 29 Mar 1999 22:12:27 GMT
> Organization: AOL http://www.aol.com
> Newsgroups:
> sci.crypt
> Followup to: newsgroup(s)
>Is there such a thing as a brain-embedded decryption algorithm? Someone
>mentioned this on a mailing list but I've never heard the term.
>
>Scott
>Scott
>Senior - Computer Information Systems
------------------------------
From: Jim Gillogly <[EMAIL PROTECTED]>
Subject: Re: What did Gilbert Vernam look like?
Date: Mon, 29 Mar 1999 15:55:35 -0800
Reply-To: [EMAIL PROTECTED]
John Savard wrote:
>
> monterey <[EMAIL PROTECTED]> wrote, in part:
>
> >I am trying to find an image file of Gilbert Vernam and/or Joseph
> >Mauborgne without any success. Has anyone ever run across a picture of
> >Vernam? If so could you refer me to the book/website?
>
> I'm going to have to look at my copy of Kahn's _The Codebreakers_: I
> _thought_ there was a picture of Vernam in it.
Yes, good call. It's in the picture section following page 558, next
to the picture of Mauborgne. (2nd edition, 1996, which I think is
paginated the same as the 1st edition in that region.)
--
Jim Gillogly
Highday, 7 Astron S.R. 1999, 23:53
12.19.6.1.2, 4 Ik 15 Cumku, Fourth Lord of Night
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: GOOD PRIME GENERATOR (GPG)
Date: Mon, 29 Mar 1999 23:46:13 GMT
To calculate the mean of a set of 5 values, we could divide each value by 5,
then find the sum of the results. We could also find the sum of the values,
then divide the sum by 5. Both methods produce the same results, both are
based on the same mathematical principles, but the algorithms are different.
Computationally, the difference is in the number of division versus addition
operations performed. Would you have me conclude that the second method
is uninteresting just because it is isomorphic to the first method? (By the
way, "bobs" might have you define "uninteresting").
The theorem you state applies to interest in proofs, precisely because proofs
are boolean: either something is proved, or it is not. You cannot generalize
that theorem to interest in algorithms, precisely because efficiency and other
factors are crucial in algorithms. I understand that there may be no
*mathematical* interest in two isomorphic algorithms. But you should
understand that in the English language the algorithms can still be referred
to as different, and that in the real world their differences could still be
of interest to others.
By the way, I never made claims of speed for my original Sieve algorithm.
torres
In article <7dodop$bsr$[EMAIL PROTECTED]>,
[EMAIL PROTECTED] (Thomas W. Kellar) wrote:
> [EMAIL PROTECTED] wrote:
> >Issue 1) You claim that I discovered nothing. Nothing is a strong word.
> >I'll grant you that I came up with nothing *mathematically* new, but I did
> >come up with a new way of doing something old. You don't have to like
> >it.
> >It doesn't even have to be better in any way. It just has to be
> >algorithmically *different*, which it is, unless you can point to someone's
> >work performing the same steps. If you can't produce such past work, and
> >continue to claim I discovered nothing, you are then just as guilty of that
> >which you claim cranks are guilty of.
> >
>
> There is a theorem that states if two proofs A and B prove
> the same theorem T, (and presuming they are correct), then
> proofs A and B are isomorphic. I.e., identical.
>
> Now if your algorithm produces the same output as another
> algorithm then this is uninteresting. While you might
> give different while/gotos/ifthenelse/case/begin/end or
> whatever you happen to use in your language, the algorithms
> would be isomorphic. What mathematicians are looking for
> are New and Different things. While, perhaps, your
> algorithm might run faster, that is not material.
>
> Thomas
> --
> Freelance System Programming. http://www.fsp.com
>
>
============= Posted via Deja News, The Discussion Network ============
http://www.dejanews.com/ Search, Read, Discuss, or Start Your Own
------------------------------
From: Jim Gillogly <[EMAIL PROTECTED]>
Subject: Re: How do I determine if an encryption algorithm is good enough?
Date: Mon, 29 Mar 1999 16:05:05 -0800
Reply-To: [EMAIL PROTECTED]
Ludvig Strigeus wrote:
> I've created my own encryption algorithm, but how do I test if it's
> secure?
Nobody else seems to have mentioned this yet, so I will. Create a
"fair" challenge (complete program source, test data, and challenge
of suitable length with ciphertext and mostly known plaintext), and
offer a fat monetary reward for the rest of the plaintext. Given
the AES competition, there are enough important new ciphers out
there right now for people to spend their unpaid time on without
taking on randomly-created ones.
Given the source, a fair challenge and enough money, people will
help you test it.
--
Jim Gillogly
Sterday, 8 Astron S.R. 1999, 00:00
12.19.6.1.2, 4 Ik 15 Cumku, Fourth Lord of Night
------------------------------
From: Jim Gillogly <[EMAIL PROTECTED]>
Subject: Re: Is initial permutation in DES necessary?
Date: Mon, 29 Mar 1999 15:37:46 -0800
Reply-To: [EMAIL PROTECTED]
Christoph Haenle wrote:
> However, when
> using 3DES, it could make a difference (in security) whether EDE or
> EEE is used, even when using three different keys (when using EDE, the
> permutations between E and D and between D and E won't disappear as
> opposed to the EEE scheme).
Both encryption and decryption start with IP and end with FP, its
inverse; so in both EEE and EDE the interior permutations cancel.
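To spell out the cancellation, write a single DES pass as FP o rounds o IP
(with o denoting composition and FP = IP^-1). Then

  EEE(x) = FP o R3 o IP o FP o R2 o IP o FP o R1 o IP (x)
         = FP o R3 o R2 o R1 o IP (x)

since each interior IP o FP pair is the identity. Exactly the same thing
happens with EDE, because decryption is the same construction with the
round keys reversed, so it too begins with IP and ends with FP.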
--
Jim Gillogly
Highday, 7 Astron S.R. 1999, 23:35
12.19.6.1.2, 4 Ik 15 Cumku, Fourth Lord of Night
------------------------------
From: smustafa <[EMAIL PROTECTED]>
Subject: Raw statistical data on computer security - in particular encryption
Date: Mon, 29 Mar 1999 19:19:38 -0500
I am a Ph.D. student and need raw statistical data such as the number of
encrypted transactions per year, the most popular key sizes, commonly used
algorithms, etc.
The data need not be technical; a summary of statistics collected by some
organization would be fine.
Please e-mail me at <[EMAIL PROTECTED]>
------------------------------
From: Scott Fluhrer <[EMAIL PROTECTED]>
Subject: Re: New Encryption Algorithm
Date: Tue, 30 Mar 1999 00:36:57 GMT
In article <7dotni$74c$[EMAIL PROTECTED]>,
[EMAIL PROTECTED] wrote:
>I've written a new encryption algorithm with a 64-bit key and a 64-bit
>block. (It's unnamed, any ideas?) I would like to submit it for public
>discussion. Here's the C source:
OK, I spent approximately 5 minutes. Let's see how it fared:
>
>#include <stdio.h>
>
>int crypt( unsigned long K[2], unsigned long block[2] )
>{
> unsigned long S = 0x14159265;
> unsigned long A = 0xabcdef01;
> unsigned long B = 0x10fedcba;
> unsigned long C = 0x12345678;
> unsigned long D = 0x87654321;
> int i;
> char *p;
>
> for(i=0;i<256;i++)
^^^
GAAK!!! 256 rounds??? How slow is this cipher anyway? Is it as
fast as RSA?
> {
> S = (((((K[0]<<i%16)^(K[1]>>i%16))) + K[0]) - K[1]) ^ S;
>
> S ^= (A | (B+S) | (C^S) | (D&S));
>
> A += S;
> B -= S;
> C ^= S;
> D |= S;
>
> block[0] += S;
> block[0] ^= S;
> block[0] -= S;
>
> block[1] -= S;
> block[1] ^= S;
> block[1] += S;
> }
Weakness here: either the LSBits of block[0] and block[1] will both be
flipped, or neither will be...
>
> block[0] *= ( K[0] % 32 );
> block[1] *= ( K[1] % 32 );
^
Unless you know that K[0] and K[1] are odd, this might not be
invertible.
> block[0] ^= K[0];
> block[1] ^= K[1];
> block[0] += K[0];
> block[1] += K[1];
> block[0] += (K[0] - K[1]);
> block[1] += (K[1] - K[0]);
> block[0] ^= K[0] ^ K[1];
> block[1] ^= (K[0]>>1) ^ (K[1]<<1);
> block[0] ^= K[0] << (K[1]%16);
> block[1] ^= K[1] >> (K[0]%16);
> block[0] -= K[1];
> block[1] -= K[0];
>}
In addition, any bit of the ciphertext (say bit 7 of block[0])
depends only on the bits at or below that position in the same
plaintext word (bits 7-0 of block[0]). In the extreme, bit 0 of
block[0] in the ciphertext depends only on bit 0 of block[0] of the
plaintext! This is an extremely serious problem, which gives the
cryptanalyst access to many of the plaintext bits given only a
little known plaintext and ciphertext.
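If you want to see that last point concretely, here is a quick throwaway
harness (compile it together with the crypt() from the original post); it
flips only the top bit of the first plaintext word:

#include <stdio.h>

int crypt( unsigned long K[2], unsigned long block[2] );   /* from the post above */

int main(void)
{
    unsigned long key[2] = {0x12345678UL, 0x87654321UL};
    unsigned long a[2]   = {0x01abcdefUL, 0xfedcba10UL};
    unsigned long b[2]   = {0x81abcdefUL, 0xfedcba10UL};   /* bit 31 of block[0] flipped */

    crypt(key, a);
    crypt(key, b);
    /* Every ciphertext bit below the flipped plaintext bit is identical. */
    printf("a: %.8lx %.8lx\nb: %.8lx %.8lx\n", a[0], a[1], b[0], b[1]);
    return 0;
}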
--
poncho
------------------------------
From: [EMAIL PROTECTED] (Bruce Schneier)
Subject: Twofish Book Published by Wiley
Date: Tue, 30 Mar 1999 00:57:54 GMT
A book about Twofish has been published by John Wiley & Sons. This
book contains all the information in the initial Twofish submission
and the first three Twofish tech reports, expanded and corrected.
The book costs $50, but I am offering it at a 20% discount. See:
http://www.counterpane.com/twofish-book.html
Bruce
**********************************************************************
Bruce Schneier, President, Counterpane Systems Phone: 612-823-1098
101 E Minnehaha Parkway, Minneapolis, MN 55419 Fax: 612-823-1590
Free crypto newsletter. See: http://www.counterpane.com
------------------------------
From: "Tony T. Warnock" <[EMAIL PROTECTED]>
Subject: Re: True Randomness & The Law Of Large Numbers
Date: Mon, 29 Mar 1999 14:50:20 -0700
Reply-To: [EMAIL PROTECTED]
"R. Knauer" wrote:
> Without doing actual calculations, consider the fraction of sequences
> that are close to the "normal", namely close to the mean expectation
> value. Let's say that you calculate the fraction of sequences that are
> within +- 5% of the mean for a large number of steps. That fraction
> is very small compared to the other 95% of sequences outside +- 5%.
>
> Therefore the TRNG has a significant likelihood of outputting a
> sequence that deviates from the norm. For sequences just outside +- 5%
> of the norm, the 1-bit bias is greater than 55 to 45, which is a
> significant amount of bias. All other sequences farther outside +- 5%
> of the norm have a 1-bit bias much greater than that.
>
You are not clear what you are measuring here. Do you mean those having 45% to
55% the expected number of ones?
Your next statement about 95% outside of the +-5% level is wrong in either
case. The number would be 90% and you are treating it as if it were a
confidence limit.
Tony
------------------------------
From: Jim Gillogly <[EMAIL PROTECTED]>
Subject: Re: Live from the Second AES Conference
Date: Mon, 29 Mar 1999 15:48:28 -0800
Reply-To: [EMAIL PROTECTED]
David Wagner wrote:
> In article <[EMAIL PROTECTED]>, Jim Gillogly <[EMAIL PROTECTED]> wrote:
> > I don't know whose standard answer this is, but it has to be
> > totally bogus for immediately obvious reasons: if A.B is
> > E2.RC6 and it's weaker than either of them, then you
> > can mount an effective attack on E2 by encrypting it with
> > RC6, and vice versa. This can be done by the cryptanalyst
> > in possession of A-ciphertext, and doesn't need to have been
> > done by the sender.
>
> You might want to check out
> ``Cascade Ciphers: The Importance of Being First''
> in J. Cryptology vol. 6 1993 pp. 55--61.
> The authors show that A.B (encrypting with A first, then B second)
> is at least as secure as A, but not necessarily as secure as B (under
> some threat models, anyway -- in particular, under ciphertext-only
> probable-plaintext attack, A.B might be weaker than B, if you are
> especially unlucky).
I'll check the reference. However, unless the two use related keys and
the ciphertext of the first is distinguishable from random, I have yet to
be convinced that applying B to A in this case doesn't constitute an
attack on A that could be applied by the cryptanalyst who simply obtains
A ciphertext. But I'll read the paper...
> Also, one requires the assumption that A and B are keyed independently,
> which raises the question: what key schedule should we use for A.B?
> Should we use A's key schedule or B's key schedule, or something totally
> new (and of unknown security)? The best answer isn't entirely clear.
The obvious suggestion is to use each component's native key schedule.
Are you suggesting merging the E2 and RC6 key schedules in some way, to
use my example above? The advantage to keeping them separate is that
this is the version that has been tested. Further, if you merge them
you have related keys, which (if the ciphertext is distinguishable from
random) would be a serious hazard. What's the disadvantage of keeping
their native key schedules?
--
Jim Gillogly
Highday, 7 Astron S.R. 1999, 23:38
12.19.6.1.2, 4 Ik 15 Cumku, Fourth Lord of Night
------------------------------
From: [EMAIL PROTECTED] (Patrick Juola)
Subject: Re: How do I determine if an encryption algorithm is good enough?
Date: 29 Mar 1999 14:21:41 -0500
In article <[EMAIL PROTECTED]>,
Trevor Jackson, III <[EMAIL PROTECTED]> wrote:
>Patrick Juola wrote:
>>
>> In article <[EMAIL PROTECTED]>,
>> R. Knauer <[EMAIL PROTECTED]> wrote:
>> >On Fri, 26 Mar 1999 15:44:37 -0800, Andrew Carol <[EMAIL PROTECTED]>
>> >wrote:
>> >
>> >>The odds greatly favor that people with experience in a field will
>> >>typically do better than those who have none.
>> >
>> >I agree with you in principle, but disagree with you in practice.
>> >
>> >Science has progressed only on the basis of disagreement with those
>> >who claim experience.
>>
>> Re-read your Kuhn. The vast majority of scientific development
>> occurs in periods of "normal science" by skilled people refining
>> the existing paradigm. And for every successful new paradigm
>> proposed, there are literally thousands that don't stand up because
>> they can't account for the facts.
>>
>> The gadflies lose. Every so often, one gets lucky and is remembered.
>
>Ben Franklin's observation is germane here: Reasonable men adapt themselves
>to their circumstances. Unreasonable men adapt their circumstances to
>themselves. Thus all progress is due to unreasonable men.
Germane, but factually incorrect.
Penicillin, for example, was not due to "unreasonable men"; the
discovery of penicillin was a fortuitous accident that happened to
someone who had been spending his career in the search for antibacterial
and antibiotic substances for medical purposes.
>I agree that the vast majority of scientific *activity* takes place as
>refinements and incremental improvements of existing concepts. But the
>vast majority of *advancement* takes place in leaps. For example,
>Maxwell's equations unifying a vast collection of observations cannot
>really be characterized as a refinement.
True. But just as surely, the careful measurements of electric
charge, voltage, current, and so forth that gave him the *data* to
create his equations were equally essential to the progress. Without
Brahe and his observations, there would have been no Kepler or
Copernicus.
Unless your definition of "advancement" is *equivalent* to "paradigm
shifting", in which case you're guilty of circular reasoning using
a personal definition that few other people know.
>As for being remembered, "when it's time to railroad, people will
>railroad". For example, Leibnitz and Newton both developed calculus
>based on infinitessimals because "it was time to
>integrate/differentiate".
What, the Zeitgeist fairy tickled their thought patterns? I doubt it.
No one needed calculus at the time. More bluntly, I call "bullshit"
on this assertion.
-kitten
------------------------------
From: [EMAIL PROTECTED] (Patrick Juola)
Subject: Re: How do I determine if an encryption algorithm is good enough?
Date: 29 Mar 1999 15:27:31 -0500
In article <[EMAIL PROTECTED]>,
R. Knauer <[EMAIL PROTECTED]> wrote:
>On Mon, 29 Mar 1999 11:43:14 -0800, "Harv" <[EMAIL PROTECTED]>
>wrote:
>
>>Extraordinary claims require extraordinary evidence.
>
>Occam's Razor states otherwise. It is usually the less extraordinary
>claims that require the more extraordinary evidence.
What?
Add William of Occam to your suggested re-reading list.
-kitten
------------------------------
From: [EMAIL PROTECTED] (Paul Rubin)
Subject: Re: Twofish Book Published by Wiley
Date: Tue, 30 Mar 1999 02:28:19 GMT
In article <[EMAIL PROTECTED]>,
Bruce Schneier <[EMAIL PROTECTED]> wrote:
>A book about Twofish has been published by John Wiley & Sons. This
>book contains all the information in the initial Twofish submission
>and the first three Twofish tech reports, expanded and corrected.
>The book costs $50, but I am offering it at a 20% discount. See:
>
> http://www.counterpane.com/twofish-book.html
Is there anything in the book that's not on your web site?
One exception might be code listings, due to export controls.
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: True Randomness & The Law Of Large Numbers
Date: Tue, 30 Mar 1999 02:25:16 GMT
Reply-To: [EMAIL PROTECTED]
On Mon, 29 Mar 1999 14:50:20 -0700, "Tony T. Warnock"
<[EMAIL PROTECTED]> wrote:
>You are not clear what you are measuring here. Do you mean those having 45% to
>55% the expected number of ones?
>Your next statement about 95% outside of the +-5% level is wrong in either
>case. The number would be 90% and you are treating it as if it were a
>confidence limit.
Calculate the frequency of particles that are within +- 5% of the mean
in n steps for the uniform Bernoulli process. The mean for the random
walk is zero, namely the origin. And the extremes are at +- n.
Compare that frequency with that for all the rest of the sequences.
For reasonably large n, the frequency of sequences that are within +-
5% of the mean (the origin) is small. Most of the sequences are
outside +- 5%. The reason is that the Gaussian profile has flattened
out considerably. Appeal to the physical intuition of diffusion. Most
of the ink particles have diffused to a location outside +- 5% of the
origin.
For example, let's say that we watch the ink particles diffuse for 1
million steps. The extremes are at +- 1 million. Therefore the inner
range we are considering (+- 5%) is at +- 50,000 from the origin,
which is not all that small.
Even a particle at the boundary of that narrow range has a 1-bit bias
of 50,000 bits, which is not trivial. And only a small fraction have
that property - all the rest have a 1-bit bias in excess of 50,000
bits.
IOW, the vast majority of sequences have a very large bias. It is
little consolation that the percentage bias is small. It is still very
large in each individual case.
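For anyone who wants to put numbers on this, here is a quick Monte Carlo
sketch (nothing rigorous; rand() is a crude generator and the constants
can be adjusted to taste):

#include <stdio.h>
#include <stdlib.h>

/* Simulate N_WALKS unbiased +-1 random walks of N_STEPS steps each and
   count how many end within +-5% of the origin, i.e. with
   |#ones - #zeros| <= 50,000 for million-step walks. */
#define N_WALKS 1000
#define N_STEPS 1000000L

int main(void)
{
    long walk, step, inside = 0;

    srand(12345);
    for (walk = 0; walk < N_WALKS; walk++) {
        long pos = 0;
        for (step = 0; step < N_STEPS; step++)
            pos += (rand() > RAND_MAX / 2) ? 1 : -1;
        if (pos >= -N_STEPS / 20 && pos <= N_STEPS / 20)
            inside++;
    }
    printf("%ld of %d walks ended within +-5%% of the origin\n",
           inside, N_WALKS);
    return 0;
}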
Bob Knauer
"What right does Congress have to go around making laws just because
they deem it necessary?"
- Marion Barry, Mayor of Washington DC
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************