Cryptography-Digest Digest #426, Volume #9 Tue, 20 Apr 99 02:13:03 EDT
Contents:
Re: RC6 new key standard from AES conference? (Paul Rubin)
Re: NSA not so bad after all? (was Re: RC6 new key standard from AES conference?)
("Steven Alexander")
Re: Question on confidence derived from cryptanalysis. (David A Molnar)
Re: Thought question: why do public ciphers use only simple ops like shift and XOR?
(Jerry Coffin)
Re: SSL implementation? (Phil Howard)
Crypto Burglary (was: FSE-6 Report: Slide Attack) (Boris Kazak)
Re: Adequacy of FIPS-140 (R. Knauer)
Re: Adequacy of FIPS-140 (R. Knauer)
Re: True Randomness & The Law Of Large Numbers (R. Knauer)
S/Key over radius (Steve Droz)
Re: Adequacy of FIPS-140 (kurt wismer)
Re: Thought question: why do public ciphers use only simple ops like shift and XOR?
(Terry Ritter)
Re: SSL implementation? (Paul Rubin)
Re: Thought question: why do public ciphers use only simple ops like shift and XOR?
(Paul Rubin)
----------------------------------------------------------------------------
From: [EMAIL PROTECTED] (Paul Rubin)
Subject: Re: RC6 new key standard from AES conference?
Date: Tue, 20 Apr 1999 00:54:12 GMT
In article <7ffuc7$q6l$[EMAIL PROTECTED]>,
<[EMAIL PROTECTED]> wrote:
>
>> I think it is important that the AES perform reasonably well on smartcards.
>
>But essentially any algorithm can be made into a smartcard, just build a
>custom ASIC for it, instead of using a 6805 or 8051 (why would you use an
>8051 anyways?)
You'd use a 6805, 8051, etc. because you don't want to spend $1+ million
making a custom ASIC. Unless you're making stupendous quantities,
it's best to use commodity parts whenever you can.
------------------------------
From: "Steven Alexander" <[EMAIL PROTECTED]>
Subject: Re: NSA not so bad after all? (was Re: RC6 new key standard from AES
conference?)
Date: Mon, 19 Apr 1999 16:44:25 -0700
The NSA is very much ahead of the rest of the world in terms of
cryptographic and computer technology. They were researching crypto long
before the rest of the world. Setting aside their motivation for key escrow,
their algorithms are going to be very strong. Also, the only problem with
DES was its short key space. DES has held up amazingly well to the attacks
it has undergone in the 20+ years since its invention. Also, the NSA
was rumored to have backdoored DES because they made changes to its S-boxes.
However, when differential cryptanalysis was first discovered in the public
sector, DES was found to be very much optimized against differential
cryptanalysis. This was specifically because of the changes that they made
to the S-boxes. While you may question their motives/intents/etc., it is
generally going to be pretty safe to trust their abilities at cryptanalysis
and cryptographic design.
my $.02
-steven
------------------------------
From: David A Molnar <[EMAIL PROTECTED]>
Subject: Re: Question on confidence derived from cryptanalysis.
Date: 19 Apr 1999 23:21:54 GMT
Douglas A. Gwyn <[EMAIL PROTECTED]> wrote:
> For example, several "significant" results in academic papers say
> that certain systems can be broken with an inordinate amount of
> resources, if 2^24 chosen plaintexts are used. It's hard to justify
> such work when your job performance is measured by practical results
> "in the field".
I agree with you about 85%. The other 15% comes from refinements of those
attacks which make them more practical, and the cases where bad design of
a system makes them relevant. For example, it was known that knapsack
public-key systems leaked bits of information long before any specific
catastrophic results were known. The single bit doesn't help much, but it
acts as a warning sign that something is wrong. Now knapsacks are the
prime example of crypto that seemed 'pretty secure' and wasn't.
Adaptive chosen ciphertext attack is a very strong attack, requiring that
the adversary decrypt values of its choice on your equipment, and perhaps
lots of them. It is not obvious how someone would apply it to a real
world system. Yet Daniel Bleichenbacher found that some implementations
of SSL aren't secure against it. Even though that attack is just barely
on the edge of practicality, we now have a new RSA PKCS standard.
Then you can improve attacks by gaining more information about what, exactly,
it is that you're attacking. There's a paper in Crypto '98 (for the life
of me I can't find it now, I'm sorry ) on "From Differential
Cryptanalysis To Ciphertext Only Attacks." It uses the assumption that
the cryptanalyst is dealing with English text to turn chosen-plaintext
attacks into ciphertext-only attacks. I can't find it, or else I'd report
how efficient the new attacks are -- but this is a qualitative difference
in utility. It wouldn't be possible without the earlier work.
My point is that there's enough precedent that I can imagine a boss
with foresight not being too dismayed by the "2^24 chosen plaintexts,
needs 2^42 operations and 2^56 blocks of memory" sort of result.
I can imagine that she wouldn't be thrilled, but I can also imagine
that she'd try to follow it up, too.
> Generally speaking, Terry is right to be concerned over the unknown,
> but some risks are greater than others. The specific algorithms you
> mentioned previously are among the better risks. If the stakes are
> really high, thoroughly studied systems are better bets than untested
> ones. That's not to say that we don't need new, better systems, but
> it takes *time* to subject them to enough testing and analysis to
> develop confidence in them. Maybe some day we'll all understand that
> Terry's approach (or David's) is a better way to go -- or maybe not.
Thank you for referring to it that way, but I'm rather new to the approach.
I suspect my enthusiasm comes from its novelty, as well as the prospect
of finally being able to "measure" security. :-)
So far provable security doesn't seem to do much for block ciphers, though...
at least that I've seen (and I've heard about DFC but haven't looked at it
much yet), or indeed quick bulk ciphers of any kind. That leaves
Terry's approach and whatever you want to call the other.
Honestly, I need to read more about ciphers by Ritter and see what this
'scaled down to experimental size' means, along with everything else.
Thanks,
-David Molnar
------------------------------
From: [EMAIL PROTECTED] (Jerry Coffin)
Subject: Re: Thought question: why do public ciphers use only simple ops like shift
and XOR?
Date: Mon, 19 Apr 1999 15:38:34 -0600
In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED]
says...
[ ... ]
> BTW, I also experimented with multiplication mod 2^64+1 and 2^64-1.
> Unfortunately, I am not so great a programmer, and my computer has
> no 64-bit registers. So beyond some basic knowledge, nothing yet did
> come into practice (but the ciphers could be terrific!).
...or they might not be. 2^32-1 happens to be a prime number. In
many cases, the smallest factor of your modulus has a large effect on
the security of encryption using that modulus.
By contrast, 2^64-1 is what you might call extremely composite -- its
prime factorization is (3 5 17 257 641 65537 6700417). This large
number of relatively small factors will often make this a particularly
bad choice of modulus.
Depending on what you're doing, 2^64+1 is likely to be a MUCH better
choice -- it's still not a prime, but its prime factorization is
(274177 67280421310721). In many cases, the largest prime factor is
what matters, and in this case, it's MUCH larger -- 14 digits instead
of 7 (which is also considerably larger than 2^32-1). Unfortunately,
using 2^64+1 as a modulus is likely to be fairly difficult even if you
have a 64-bit type available.
I obviously haven't studied your encryption method in detail (or at
all) so I don't _know_ that this will make a difference in your
particular case, but it's definitely something to keep in mind. Many,
many forms of encryption that work quite well in 32-bit arithmetic
basically fall to pieces when converted to use 64-bit arithmetic
instead.
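As an aside, the factorizations discussed in this thread are easy to check directly. A throwaway sketch using naive trial division (slow but adequate at these sizes, a few seconds at worst):

```python
def factor(n):
    """Naive trial division; returns the prime factors in ascending order."""
    out, d = [], 2
    while d * d <= n:
        while n % d == 0:
            out.append(d)
            n //= d
        d += 1 if d == 2 else 2   # after 2, only try odd divisors
    if n > 1:
        out.append(n)             # whatever is left is prime
    return out

print(factor(2**32 - 1))  # [3, 5, 17, 257, 65537]
print(factor(2**64 - 1))  # [3, 5, 17, 257, 641, 65537, 6700417]
print(factor(2**64 + 1))  # [274177, 67280421310721]
```

The last call is the slowest, since it has to trial-divide up to the square root of the 14-digit cofactor to confirm it is prime.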
------------------------------
From: [EMAIL PROTECTED] (Phil Howard)
Subject: Re: SSL implementation?
Date: Tue, 20 Apr 1999 02:34:13 GMT
On Mon, 19 Apr 1999 18:49:18 +0200 denis bider ([EMAIL PROTECTED]) wrote:
| It's got to be commercial because of our client's "if it's free it can't
| be any good" attitude. Also, BSAFE SSL is out of the question - I've
| emailed them two simple questions 10 days ago and they said they would
| answer, but they still haven't.
It's not free if you charge them money to support it.
Ask them if they like code that is released on the deadline date whether
it is ready or not.
--
Phil Howard KA9WGN
[EMAIL PROTECTED] [EMAIL PROTECTED]
------------------------------
From: Boris Kazak <[EMAIL PROTECTED]>
Subject: Crypto Burglary (was: FSE-6 Report: Slide Attack)
Date: Mon, 19 Apr 1999 20:00:14 -0400
Reply-To: [EMAIL PROTECTED]
James Frey wrote:
>
> The Slide Attack
>
> Summarized from the FSE-6 Conference in March, 1999, Rome
>
> The authors were Alex Biryukov and David Wagner at the
> Fast Software Encryption Workshop #6. The Slide Attack
> is done using known plaintexts and known
> ciphertexts. Two encryptions are put side by side,
> with a one round offset. F is the round function:
>
> P0
> F
> P1 P0'
> F F
> P2 P1'
> F F
> P3 P2'
> F F
> P4 P3'
> F F
> C P4'
> F
> C'
>
> Pn is a plaintext in round n, C is a ciphertext.
>
> In each round, a guessed subkey is used. Non-Feistel ciphers require
> about O(2^(n/2)) work using known plaintexts with known
> ciphertexts. For Feistel ciphers which split blocks in half,
> the work and number of Chosen plaintexts and ciphertexts
> needed are about O(2^(n/4)). Ciphers that do not split the block
> in half are more secure against this attack.
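The slid-pair idea summarized above can be made concrete on a deliberately weak toy cipher. Everything below is invented for illustration: a 16-bit block, a single add-and-rotate round F repeated with the same subkey every round, and one slid pair planted in the known-plaintext pool so the sketch is deterministic rather than relying on birthday luck. Because this F is weak, P' = F(P, k) can be solved directly for k, and the slid ciphertexts confirm the guess:

```python
import random

MASK = 0xFFFF
def rotr(x, n): return ((x >> n) | (x << (16 - n))) & MASK
def rotl(x, n): return rotr(x, 16 - n)

def F(x, k):                        # one (deliberately weak) round
    return rotr((x + k) & MASK, 7)

def encrypt(x, k, rounds=20):       # every round uses the same subkey
    for _ in range(rounds):
        x = F(x, k)
    return x

random.seed(7)
key = random.getrandbits(16)

# Known plaintexts, plus one planted slid pair (p, F(p, key)):
pts = [random.getrandbits(16) for _ in range(300)]
pts.append(F(pts[0], key))
pairs = [(p, encrypt(p, key)) for p in pts]

# Slide attack: guess that (P2, C2) is (P1, C1) slid by one round.
# Solve P2 = F(P1, k) for k, then check that C2 = F(C1, k) as well.
candidates = set()
for p1, c1 in pairs:
    for p2, c2 in pairs:
        k = (rotl(p2, 7) - p1) & MASK
        if F(c1, k) == c2:
            candidates.add(k)

assert key in candidates   # a few false positives may appear alongside it
```

The key property being exploited is that E = F∘F∘…∘F commutes with F, so a slid pair at the input stays slid at the output.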
=========================
Whoever will read this, please in advance forgive
my ignorance. I am not a cryptologist, even less a
cryptanalyst, and I simply want to understand something
which does not add up properly in my mind.
My question relates to "Slide Attack",
which was posted in this group along with the reference
to the original paper by D. Wagner.
Both the post and the paper describe an attack where
either a chosen plaintext is compared (matched) to the
intermediate product of known-plaintext encryption, or a
final ciphertext is compared (matched) to the intermediate
product of chosen-plaintext encryption.
In both cases it seems that in order to perform such a
comparison it is necessary to have access to the intermediate
products of encryption on a round-by-round basis. Maybe I am
missing something here, but please explain to me: does
Kerckhoffs' principle imply the availability of such access?
*******************
If it really does, then read the rest very carefully!
I am going to show you, how with this assumption our cherished
BLOWFISH will be completely broken after only 1 (one) trial
encryption, 18 XOR-ings and 512 encryptions using the BLOWFISH
itself as a self-breaking tool.
*******************
Ready? Go...
In each of the BLOWFISH rounds the L half is XOR-ed with a
number from the P-box, then the R half is XOR-ed with a number
derived from the 4 S-boxes via the F-function. Last, the two
resulting halves are swapped.
Now, if we take the result of the 1-st round and compute
Z = R(1)^L(0)
this will yield pure and simple the value of P(0). Enough for
this round, store this value somewhere.
Let us proceed. In the 2-nd round L(1) is XOR-ed with
the subsequent number from the P-box and R(1) is XOR-ed
with a number derived from the 4 S-boxes via the F-function.
Last, the two resulting halves are swapped.
Now, if we take the result of the 2-nd round and compute
Z = R(2)^L(1)
this will yield pure and simple the value of P(1). Enough for
this round, store this value somewhere.
Probably, you already got the idea. Proceeding like this,
we can recover all 18 numbers contained in the P-box (the
last 2 will be recovered by XOR-ing the final ciphertext
with the result of 16-th round).
Now, the final coup! Load the numbers thus recovered into
the P-box, load the digits of Pi into the S-boxes (obviously
starting with the 73rd byte and on), and launch the BLOWFISH
key setup from the moment when it starts computing the S-boxes
(just bypass the 9 encryptions needed for the P-box). The
silly program will faithfully perform 512 encryptions, et voila!
You have all the keys properly set up, you can encrypt and
decrypt whatever you wish...
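The XOR recovery described above is easy to demonstrate on a toy model. Everything below is a stand-in (a random P-array, and an arbitrary mixing function in place of the real S-box-based F); only the round structure matches Blowfish, which is all the recovery uses:

```python
import random

random.seed(1)
MASK = 0xFFFFFFFF

# Hypothetical stand-ins for the real key-dependent tables:
P = [random.getrandbits(32) for _ in range(18)]

def F(x):
    """Placeholder for the S-box-based round function."""
    return ((x * 2654435761) ^ (x >> 16)) & MASK

def encrypt_with_trace(L, R):
    """Blowfish-shaped encryption that records (L, R) after every step."""
    trace = [(L, R)]
    for i in range(16):
        L ^= P[i]
        R ^= F(L)
        L, R = R, L                      # swap halves
        trace.append((L, R))
    L, R = R, L                          # undo the final swap
    R ^= P[16]
    L ^= P[17]
    trace.append((L, R))
    return trace

trace = encrypt_with_trace(0x01234567, 0x89ABCDEF)

# After round i, the new R is just the old L XOR P[i]: one XOR per subkey.
recovered = [trace[i][0] ^ trace[i + 1][1] for i in range(16)]
# The last two subkeys fall out of the final whitening step:
recovered.append(trace[16][0] ^ trace[17][1])   # P[16]
recovered.append(trace[16][1] ^ trace[17][0])   # P[17]

assert recovered == P   # the whole P-array, recovered with 18 XORs
```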
This is only true, however, if one has access to all
intermediate products on a round-by-round basis. This is why I
ask again - is this assumption implied in Kerckhoffs' principle?
BTW, I understand that this can be a real-life situation,
that an encryption program can be launched under some debugging
monitor with breakpoints and options to investigate registers
and variables, but still...
I'd rather call it not Crypt-Analysis, but Crypto-Burglary.
Best wishes BNK
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Adequacy of FIPS-140
Date: Tue, 20 Apr 1999 01:26:51 GMT
Reply-To: [EMAIL PROTECTED]
On Mon, 19 Apr 1999 09:14:06 -0500, Jim Felling
<[EMAIL PROTECTED]> wrote:
>> That statement is true for classical crypto, but not for quantum
>> crypto.
>Why is this? What property of quantum phenomena frees it from external
>coupling and resonance phenomena?
Good question. To get the answer you will have to read the book on
quantum computing by Williams & Clearwater.
The short answer is that any external influences will immediately
decohere the quantum computer, so means to prevent that from happening
must be built into the design.
Either the quantum computer works perfectly as designed - in
which case external influences are prevented from having an effect on
it - or it fails to work at all.
Bob Knauer
"Our revels are now ended. These are actors, as I foretold you were
all spirits, and then are melted into air, into thin air. And like
the baseless fabric of this vision, the cloud-capped towers, the
gorgeous palaces, the solemn temples, the great globe itself, yea,
and all that it inherits, shall dissolve. And like this insubstantial
pageant faded, leave not a rack behind. We are such stuff as dreams
are made on, and our little life is rounded with a sleep."
-- The Tempest
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Adequacy of FIPS-140
Date: Tue, 20 Apr 1999 01:30:22 GMT
Reply-To: [EMAIL PROTECTED]
On Mon, 19 Apr 1999 09:16:09 -0500, Jim Felling
<[EMAIL PROTECTED]> wrote:
>I can make a lucky guess at the message and get it right. (It's not likely, but
>the attack can work against an OTP)
How do you know that the plaintext you guess is the actual message?
Remember that all possible plaintexts can be decrypted with an equal
probability of being the correct message.
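That equal-probability property can be shown in a few lines: for any candidate plaintext of the right length there exists a pad under which it encrypts to the observed ciphertext, so the ciphertext alone cannot favor one guess over another. (The ciphertext bytes below are made up for illustration.)

```python
# An intercepted 3-byte OTP ciphertext (bytes invented for the example):
ciphertext = bytes([0x6B, 0x10, 0x9F])

def pad_explaining(plaintext, ct):
    """The unique pad under which this plaintext encrypts to the ciphertext."""
    return bytes(p ^ c for p, c in zip(plaintext, ct))

# Every same-length plaintext is consistent with the ciphertext
# under some pad, so no guess can be confirmed from the ciphertext:
for guess in (b"YES", b"NO!", b"CAT"):
    pad = pad_explaining(guess, ciphertext)
    assert bytes(g ^ k for g, k in zip(guess, pad)) == ciphertext
```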
Bob Knauer
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: True Randomness & The Law Of Large Numbers
Date: Tue, 20 Apr 1999 01:45:37 GMT
Reply-To: [EMAIL PROTECTED]
On Mon, 19 Apr 1999 22:00:55 GMT, "Douglas A. Gwyn" <[EMAIL PROTECTED]>
wrote:
>If I buy a bit-stream generator that has been advertised as generating
>a "truly random" (uniform equiprobable) bit stream, and the acceptance
>test shows the likelihood of its meeting its advertised specification
>is less than 1 in 1,000,000, I am justified in rejecting it and finding
>another vendor.
And you could not be faulted. Just realize that you did that to be on
the safe side, not because you made a reasonably correct
determination. The TRNG could have been a perfectly good device.
In an infinite sequence, all subsequences are possible. There is
nothing to prevent a TRNG from generating one of them when you decide
to test it.
The false assumption that a time average is the same as an ensemble
average leads to the bigotry we see in ordinary life. In this case,
because the TRNG did not output a sequence that had the appearance of
randomness, you were bigoted against it because you could not take the
chance that it was not truly random.
Of course, you could be correct - the TRNG could be non-random. In
fact, it is quite possible that it is indeed non-random because it
failed to give the appearance of being random. But notice that's the
same criterion people use to discriminate against certain classes of
other people - that woman who is kept away from the board room because
she does not have the appearance of leadership needed to run a
company, or that black man who is kept away from a position of trust
because he does not have the appearance of trustworthiness, etc.
If someone does not have the appearances we associate with a certain
character trait, we discriminate against them using false statistical
arguments - just the same as we discriminate against a TRNG because it
happens to output a perfectly ordinary sequence that does not look
random.
What are we gonna do when a quantum computer programmed to calculate
true random numbers generates a sequence which fails the FIPS-140
Monobit Test? Throw out quantum mechanics?
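For reference, the FIPS 140-1 monobit test is tiny: count the ones in a 20,000-bit sample and pass iff the count lies strictly between 9654 and 10346. A sketch, including exactly the failure mode being argued about - an output a perfect TRNG can legitimately produce:

```python
def monobit_pass(bits):
    """FIPS 140-1 monobit test: a 20,000-bit sample passes iff the
    number of ones lies strictly between 9654 and 10346."""
    assert len(bits) == 20000
    ones = sum(bits)
    return 9654 < ones < 10346

print(monobit_pass([0, 1] * 10000))   # True: exactly 10,000 ones
print(monobit_pass([0] * 20000))      # False: yet a true RNG *can*
                                      # emit 20,000 zeros in a row
```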
Bob Knauer
------------------------------
From: Steve Droz <[EMAIL PROTECTED]>
Subject: S/Key over radius
Date: Mon, 19 Apr 1999 15:50:20 +0200
Hi everyone,
I have a radius server working with an ACE/Server SecurID.
I am looking for another solution: I want to use my radius server with an
S/Key server, but I haven't found an S/Key server that works with a radius
server.
Does anyone have a solution, or the name of such a product?
Kind regards.
S. Droz
------------------------------
From: kurt wismer <[EMAIL PROTECTED]>
Subject: Re: Adequacy of FIPS-140
Date: Tue, 20 Apr 1999 01:30:21 GMT
wtshaw wrote:
>
> In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] (Kurt
> Wismer) wrote:
> >
> > why can't you take the minimum cost of all the attacks... it doesn't give
> you a value that will remain constant in spite of new developments in
> > cryptanalysis but i don't see how anything could... calculating the cost for
> > all possible attacks on the algorithm and taking the minimum would at
> > least be formal and the set could be added to/updated as techniques are
> > refined or new ones are discovered...
>
> There are a few details to be cleaned up: One is this Shannon Unicity
> idea. For many of the algorithms you don't need much to work on to think
> of brute force.
i don't know... my understanding of unicity suggests there's some
underlying assumptions about what the plain text looks like, and those
assumptions don't really give me a warm fuzzy feeling...
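For what it's worth, the plaintext assumptions at issue are exactly what Shannon's unicity-distance formula bakes in. A minimal sketch, using the usual textbook redundancy estimate for English (both numbers below are conventional estimates, not measurements):

```python
import math

def unicity_distance(key_bits, redundancy_per_char):
    """Shannon's estimate of how much ciphertext is needed before the key
    is (in principle) uniquely determined. It rests entirely on an
    assumed model of the plaintext's redundancy."""
    return key_bits / redundancy_per_char

# Usual textbook numbers for English text: log2(26) ~ 4.7 bits/char raw,
# ~1.5 bits/char actual entropy, hence ~3.2 bits/char of redundancy.
D = math.log2(26) - 1.5
print(round(unicity_distance(56, D), 1))    # DES-sized key: ~17.5 characters
print(round(unicity_distance(128, D), 1))   # 128-bit key: ~40 characters
```

Change the redundancy model (compressed data, random-looking plaintext) and the estimate changes completely, which is precisely the "warm fuzzy" problem.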
> For a few, it looks like you can't begin to even think in
> terms of brute force, or getting enough ciphertext to reasonably mount
> some sort of other attack.
>
> You are only going to be able to be formal with the algorithms most likely
> to fail to pass your attack. The others, you need not even talk about
> because they are in a better league.
i wasn't thinking in terms of just one person's attack, more along the
lines of the entire cryptographic community's efforts....
nor was i suggesting that the parameters those efforts produce would be
static, if it could be shown that attack X on algorithm Y cost less than
was previously considered then the set of costs for attacking algorithm Y
would have to be updated...
it's not a very elegant idea, though...
> > it seems to me any metric is going to have to take all possible attacks
> > into account since no single one is best for all cryptosystems...
>
> Yet there are systems that seem to resist all known attacks; if there is a
and for the period of time when it can't be shown to be less resistant
to any of those attacks the parameters may well simply be said to be
greater than the cost of a brute force attack (or however close to that
cost it may be if it isn't greater)...
> > i know i'm stating the obvious here, but since you need to take all that
> > information into account, and you're looking for the lower bound on the
> cost of a successful attack, i don't see any other obvious alternatives...
>
> Again I say, make the burden absurdly difficult to consider.
once upon a time people did calculations with an abacus... long and
complex ones too, because at the time they had nothing better...
> Today, we drove into strange areas without a map and discovered the best
> chicken fried steak that we had ever eaten, and somehow got to Mary of
> Puddin' Hill sometime later. Sometimes it is best to just not to try so
> hard to reap the experience of freedom, just do it. Explore some strange
> idea in crypto for the fun of it and see if it leads anywhere, no
> footnotes involved.
well, that's kinda what i was doing, actually... thinking out loud... i
think i kinda get why it won't work...
--
"when the truth walks away everybody stays
cause the truth about the world is that crime does pay
so if you walk away who is gonna stay
cause i'd like to make the world be a better place"
------------------------------
From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Thought question: why do public ciphers use only simple ops like shift
and XOR?
Date: Tue, 20 Apr 1999 04:24:33 GMT
On Mon, 19 Apr 1999 20:15:32 GMT, in
<[EMAIL PROTECTED]>, in sci.crypt
[EMAIL PROTECTED] (John Savard) wrote:
>[EMAIL PROTECTED] (Terry Ritter) wrote, in part:
>[...]
>>The truth is that we *never* know the "real" strength of a cipher. No
>>matter how much review or cryptanalysis a cipher gets, we only have
>>the latest "upper bound" for strength. The lower bound is zero: Any
>>cipher can fail at any time.
>
>I agree with you that we don't have a way to prove that a cipher really is
>strong. But cryptanalysis still gives the best confidence currently
>available.
I guess I dispute "confidence." Confidence and Trust and Reliability
are exactly what we do not have. I cannot say it more clearly:
cryptanalysis gives us no lower bound to strength.
As an engineer growing up with an engineer dad, I have lived with
bounded specifications most of my life. These bounds are what we pay
for in products; this is the performance the manufacturer guarantees.
I suppose like me most buyers have been caught at least once by the
consequences of getting the cheapest part on the basis of "typical" specs
instead of "worst case." But "typical" is all cryptanalysis tells us.
Depending on that will sink us, sooner or later.
>[...]
>>On the other hand, I have been pioneering the use of scalable
>>technology which, presumably, can be scaled down to a level which can
>>be investigated experimentally. The last I heard, experimentation was
>>still considered a rational basis for the understanding of reality.
>>Indeed, one might argue that in the absence of theoretical strength
>>for *any* cipher, experimentation is about all we have. But note how
>>little of it we see.
>
>Are you drawing a distinction between "experimental investigation" and
>"cryptanalysis"? If so, it would appear you are saying that there is an
>additional method for obtaining some additional, though still imperfect,
>confidence in a cipher design.
We were OK up to the "c" word: I assert that we *can* have no
confidence in a cipher. We have no way to prove strength. Any
strength we assume is based upon the conceit that all others are just
as limited in their capabilities as we are. Drawing conclusions by
wishing and hoping the other guy is at least as dumb as us is not my
idea of good cryptography.
I do make a distinction (which probably should not exist) between
"theoretical" or "equation-based" or "academic" cryptography and
experimental investigation. I suppose this is really much like the
difference between math and applied math, with much of the same
theoretically friendly antagonism.
It is clear that we may never have a provable theory of strength.
This may mean that our only possible avenue toward certainty is some
sort of exhaustive test. Surely we cannot imagine such testing of a
full-size cipher. But if we can scale that same design down, in the
same way that small integers work like large ones, maybe we can work
with large enough samples of the full population to be able to draw
reasonable experimental conclusions.
>>>Plus, the risk that one's adversary is a hacker of the future with a very
>>>powerful desktop computer seems much greater than the risk that one's
>>>adversary will be an accomplished cryptanalyst, able to exploit the most
>>>subtle flaws in an over-elaborate design.
>
>>But we don't know our Opponents! If we have to estimate their
>>capabilities, I think we are necessarily forced into assuming that
>>they are more experienced, better equipped, have more time, are better
>>motivated, and -- yes -- are even smarter than we are. There is
>>ample opportunity for them to exploit attacks of which we have no
>>inkling at all.
>
>Most cipher users are more worried about their communications being read by
>the typical computer hacker than by the NSA.
>
>I suppose it's possible that one day a giant EFT heist will be pulled off
>by retired NSA personnel, but that's the sort of thing which happens far
>more often as the plot for a movie than in real life.
>
>The problem is, of course, that if one has data that should remain secret
>for 100 years, one does have to face advances in cryptanalytic
>knowledge...as well as _unimaginable_ advances in computer power.
I wrote in a post which I did not send that if *only* NSA could read
my mail, the way it is now, I would not much care. Of course things
change in politics, and my view could change as well. But for me, NSA
is really just an illustration of the abstract threat.
As I understand security, one of the worst things we can do is to make
assumptions about our Opponents which do not represent their full
threat capabilities. ("Never underestimate your opponent.") Because
of this I am not interested in identifying a cipher Opponent, unless
in the process I can identify them as the absolute worst threat and
know their capabilities as well. This is obviously impossible. So if
we are to enforce our security despite the actions and intents of
others, we must assume our Opponents are far more powerful than we
know, then learn to deal with that threat.
---
Terry Ritter [EMAIL PROTECTED] http://www.io.com/~ritter/
Crypto Glossary http://www.io.com/~ritter/GLOSSARY.HTM
------------------------------
From: [EMAIL PROTECTED] (Paul Rubin)
Subject: Re: SSL implementation?
Date: Tue, 20 Apr 1999 02:29:26 GMT
In article <[EMAIL PROTECTED]>,
denis bider <[EMAIL PROTECTED]> wrote:
>Hello folks,
>
>does anyone know of a good commercial C/C++ SSL implementation?
Try www.consensus.com.
>It's got to be commercial because of our client's "if it's free it can't
>be any good" attitude.
Make sure to charge them plenty so they'll know you're doing a good job :-).
------------------------------
From: [EMAIL PROTECTED] (Paul Rubin)
Subject: Re: Thought question: why do public ciphers use only simple ops like shift
and XOR?
Date: Tue, 20 Apr 1999 02:50:00 GMT
In article <[EMAIL PROTECTED]>,
Jerry Coffin <[EMAIL PROTECTED]> wrote:
>...or they might not be. 2^32-1 happens to be a prime number.
2^32-1 = (2^16)^2-1
= (2^16+1)*(2^16-1)
= (2^16+1)*(2^8+1)*(2^8-1)
= (2^16+1)*(2^8+1)*(2^4+1)*(2^4-1)
= (2^16+1)*(2^8+1)*(2^4+1)*(2^2+1)*(2^2-1)
= 65537 *257 *17 *5 *3
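The telescoping identity (repeated difference of squares) checks out numerically:

```python
n = 2**32 - 1
assert n == (2**16 + 1) * (2**16 - 1)                  # difference of squares
assert n == (2**16 + 1) * (2**8 + 1) * (2**8 - 1)      # ...applied again
assert n == 65537 * 257 * 17 * 5 * 3                   # fully telescoped
print(n)  # 4294967295
```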
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************