Cryptography-Digest Digest #466, Volume #14 Mon, 28 May 01 20:13:00 EDT
Contents:
Turbo Small Public Key Cryptosystem (Tom St Denis)
Re: Fractionated Morse - Help (John Savard)
Re: Medical data confidentiality on network comms (Tom McCune)
Re: Essay on "The need for a look at real life crypto" ("M.S. Bob")
Random number generation. (Benjamin Johnston)
Re: Random number generation. (Tom St Denis)
Re: European commission will recommend all citizens to use encryption in (Mok-Kong Shen)
Re: Medical data confidentiality on network comms ("Harris Georgiou")
Re: Turbo Small Public Key Cryptosystem ("Jack Lindso")
Re: Turbo Small Public Key Cryptosystem ("Tom St Denis")
Re: Random number generation. (Benjamin Johnston)
Re: Turbo Small Public Key Cryptosystem (Tony L. Svanstrom)
Re: Good crypto or just good enough? (David Hopwood)
Re: Quantum Computers with relation to factoring and BBS (Charles Blair)
Re: Quantum Computers with relation to factoring and BBS ("AY")
Re: Turbo Small Public Key Cryptosystem (Tom St Denis)
Re: question on Luby's book (Charles Blair)
----------------------------------------------------------------------------
From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Turbo Small Public Key Cryptosystem
Date: Mon, 28 May 2001 21:44:18 GMT
Ok this is just for fun (so keep that in mind)
I wrote a really super small PK system for *NIX today (in about 1.5 hrs)
It uses DH and RC4 to encode/decode messages. It doesn't do signatures
(what's the point?). It's very compact... in linux it builds to about
32kb in size and is very fast.
It doesn't do ascii armor or silly headers etc. (So it's probably very
vulnerable if you try to decode /dev/urandom as a message). I've barely
tested it (it does encrypt/decrypt afaik).
http://tomstdenis.home.dhs.org/tc.zip
This is by no means a replacement for PGP since it lacks a lot (say
signatures, key IDs and such, but I feel those are meaningless anyway!)
The source is not commented but it's very easy to follow. It's mostly
modular and somewhat organized (comes with a makefile that installs it
in /usr/local/bin/)
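For readers curious about the general shape of such a DH + RC4 hybrid, here is a rough sketch (the toy modulus, the SHA-1 key derivation, and all function names below are illustration only, not the actual tc.zip code):

```python
import hashlib, secrets

def rc4(key: bytes, data: bytes) -> bytes:
    """Plain RC4: the same call encrypts and decrypts."""
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out, i, j = [], 0, 0
    for b in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(b ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

P = 2**127 - 1  # toy prime modulus; a real system needs a vetted large prime
G = 5

def dh_keypair():
    x = secrets.randbelow(P - 3) + 2
    return x, pow(G, x, P)

def encrypt(their_pub: int, msg: bytes):
    """Ephemeral DH to derive a shared key, then RC4 the message."""
    eph_priv, eph_pub = dh_keypair()
    key = hashlib.sha1(pow(their_pub, eph_priv, P).to_bytes(16, "big")).digest()
    return eph_pub, rc4(key, msg)

def decrypt(priv: int, eph_pub: int, ct: bytes) -> bytes:
    key = hashlib.sha1(pow(eph_pub, priv, P).to_bytes(16, "big")).digest()
    return rc4(key, ct)
```

The recipient only needs the sender's ephemeral public value and the ciphertext, which matches the "no headers, no armor" spirit of the program.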
Tom
------------------------------
From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Fractionated Morse - Help
Date: Mon, 28 May 2001 21:43:31 GMT
On 28 May 2001 12:53:26 -0700, [EMAIL PROTECTED] (Charles
Eastwood) wrote, in part:
>Can anyone point me to references on breaking fractionated Morse
>messages?
>I think there are at least two types. In one the Morse string,
>including "X" for letter breaks, is divided into groups of three
>characters. A table is then used to turn each group into one letter of
>the CT. In the other system a string of numbers is formed based on
>the number of dots and dashes in each letter. That string of numbers
>is transposed, and then used to generate a new Morse string which is
>then converted to letters.
>I am interested in attacks on both systems.
Well, the book "Cryptanalysis for Microcomputers" from TAB Books talks
about the first system. The second system is discussed in Gaines, and
since as illustrated the only key is the length of the group over
which the lengths are reversed, it can be solved by brute force by
hand.
Both systems are described on my web site, but their cryptanalysis is
not discussed.
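For reference, the first system (a keyed trigram table over a Morse stream with "x" separators) can be sketched as follows; the identity key and the naive padding are simplifications for illustration, not ACA-exact practice:

```python
from itertools import product

MORSE = {c: m for c, m in zip("abcdefghijklmnopqrstuvwxyz",
    [".-", "-...", "-.-.", "-..", ".", "..-.", "--.", "....", "..",
     ".---", "-.-", ".-..", "--", "-.", "---", ".--.", "--.-", ".-.",
     "...", "-", "..-", "...-", ".--", "-..-", "-.--", "--.."])}

# The 26 usable trigrams over {., -, x} ("xxx" cannot occur in traffic).
TRIGRAMS = ["".join(t) for t in product(".-x", repeat=3) if "".join(t) != "xxx"]

def encode(plaintext, key="abcdefghijklmnopqrstuvwxyz"):
    stream = "x".join(MORSE[c] for c in plaintext.lower() if c in MORSE)
    while len(stream) % 3:
        stream += "."  # naive padding; can distort the final letter, fine for a sketch
    return "".join(key[TRIGRAMS.index(stream[i:i + 3])]
                   for i in range(0, len(stream), 3))

def decode(ciphertext, key="abcdefghijklmnopqrstuvwxyz"):
    stream = "".join(TRIGRAMS[key.index(c)] for c in ciphertext)
    inverse = {m: c for c, m in MORSE.items()}
    return "".join(inverse.get(code, "?") for code in stream.strip("x").split("x"))
```

The only secret is the 26-letter key assigning ciphertext letters to trigrams, which is why the letter-frequency behaviour of the trigram stream is the usual entry point for attacks.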
John Savard
http://home.ecn.ab.ca/~jsavard/frhome.htm
------------------------------
Crossposted-To: comp.security.misc
From: Tom McCune <[EMAIL PROTECTED]>
Subject: Re: Medical data confidentiality on network comms
Date: Mon, 28 May 2001 21:47:58 GMT
In article <V7yQ6.46$[EMAIL PROTECTED]>, "Roger Schlafly"
<[EMAIL PROTECTED]> wrote:
>You could use N bits for some large value of N, and those keys will
>still be accessible by human beings. And usually low-level employees.
>The general public won't have access to the encrypted data anyway.
Not really - this HCFA ruling applies to patient data transferred over the
internet. Once it's on the internet, it may never disappear, and can
possibly wind up anywhere, including such broad government tracking as
Carnivore or Echelon.
>So what is the scenario in which 2048-bit RSA is better than 1024
>bit RSA (or DH)? That maybe 30 years from now an employee gains
>access to the encrypted files and a lot of supercomputer time, but
>not to the keys? And then breaks the RSA key for the purpose of
>exposing someone as having had some childhood disease? It seems
>very farfetched to me.
I wouldn't place any bets on how secure a 1024-bit key will be in 20 years.
And if one of these individuals is later running for US President or the US
Senate, or is in a sensitive government or corporate position, there could
be very significant damage. But regardless of what damage is seen as
possible, we are required to protect this data legally and ethically. For
what it is or isn't worth, I wouldn't trust a 1024-bit key for such
long-term needs. For internal purposes it is a different situation, but
these rules apply to the use of the Internet.
Again, I come from a mental health view (currently working with youth), and
much of what we deal with is seriously aggressive behavior, including
illegal activity. But our admissions are through hospital emergency rooms,
so those records are directly applicable to our patients.
>Meanwhile, the next time the target checks into a hospital, he will
>sign a blanket release that gives hundreds of clerks access to all
>his medical files.
Actually, my understanding is that we are not to accept blanket releases.
But regardless, the potential carelessness of what a patient does, does not
justify the same level of carelessness on the part of a health care
provider.
Tom McCune
http://www.McCune.cc
Please use PGP for Privacy & Authenticity
------------------------------
From: "M.S. Bob" <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: Essay on "The need for a look at real life crypto"
Date: Mon, 28 May 2001 22:57:27 +0100
Tom St Denis wrote:
>
> Again you're missing the point. The essay is about trusting buzzwords.
> I picked on PGP because it's a buzzword. Nobody says "I use the
> AcornSoft PK to sign my message".
If that is the point of the essay, I'd suggest re-writing it. I came
away with a different conclusion, that cryptography isn't perfect.
And I'd like to know what you recommend for non-experts to evaluate
cryptographic products for their real-world needs?
I think the premise of non-experts not being able to rely on using
standard products, i.e. "buzzwords", is flawed. I think that is one of
the most sensible things a non-expert can do to help produce more
secure products. I'd rather trust a CSPRNG produced by RSA Security or
Baltimore than Jane application programmer. That doesn't mean the
product will be perfect, but it is a starting point to use standard
components and standard algorithms and protocols and is better than
nothing at all or home brewed "Bass-O-Matic" ciphers.
Of course this leaves the correct and secure implementation of the
entire application to be dealt with. Security cannot be bolted on as a
mere afterthought. Numerous attacks on systems have been successful
because the security didn't solve the right problem[1].
I think I would suggest that using standard cryptography toolkits
(PGPsdk, BSAFE, OpenSSL, cryptlib) is a good starting point. It does not
mean the entire application or system is secure from the various
threats, but it does eliminate some problems we have already dealt with,
such as flawed protocols, no encryption, or no digital signatures;
using standard cryptography toolkits is an improvement.
For a variety of reasons[2] we often get the problem wrong or try to
solve the wrong problem. Cryptography is one part of computer security,
it has to be considered in the larger scale rather than just encrypting
bits and doing modular arithmetic. But trying to achieve "impossible to
fail" secure computer networks is like trying to build an unsinkable
ship, only much harder.
[1] Anderson, Ross _Why Cryptosystems Fail_
<http://www.cl.cam.ac.uk/users/rja14/wcf.html>
[2] his new book <http://www.cl.cam.ac.uk/~rja14/book.html>
------------------------------
From: Benjamin Johnston <[EMAIL PROTECTED]>
Subject: Random number generation.
Date: Tue, 29 May 2001 08:04:36 +1000
Good Day,
I've been considering the need for generating random numbers in my "toy"
crypto project.
One approach I was considering is amassing a heap of semi-random data like
time stamps and key strokes (ie. as suggested in "all the books")...
But, then I need to turn this data into actual "random" data (well, close
enough for cryptographic purposes).
To do this, I figured there are two approaches:
1. have plenty of data and then hash, say, 1Kb (or whatever I estimate is
necessary given the entropy of the semi-random data) into 160 bits at a
time (and not use the same 1Kb of data, again, for any other purpose).
2. have plenty of data, and prepend an integer to all of the data, and
hash the whole lot to 160 bits. If I need more "random" data, I just
increment the integer and hash it all again.
I think approach 2 is easier, and it means that I don't have to worry
about "wasting" the semi-random data if I start discarding the hashed
values. And I figure it should be just as secure as approach 1, if we
assume that SHA, for example, is "good" - so the two hash values should be
completely unrelated.
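In code, the two approaches might look like this (SHA-1 standing in for "SHA"; the pool contents and sizes are placeholders):

```python
import hashlib

def approach1(pool: bytes, n_outputs: int):
    """Consume the pool in disjoint 1 KB chunks, hashing each to 160 bits;
    a chunk is never reused for anything else."""
    return [hashlib.sha1(pool[i * 1024:(i + 1) * 1024]).digest()
            for i in range(n_outputs)]

def approach2(pool: bytes, n_outputs: int):
    """Prepend a counter to the whole pool and rehash on demand."""
    return [hashlib.sha1(str(i).encode() + pool).digest()
            for i in range(n_outputs)]
```

Approach 2 never partitions the pool, so a low-entropy region of the collected data is always mixed with the rest of it.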
So, what I'm asking, is:
1. is my approach (ie. the second one) silly?
2. what is standard practice for this kind of problem?
Thanks a lot,
-Benjamin Johnston
[EMAIL PROTECTED]
------------------------------
From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: Random number generation.
Date: Mon, 28 May 2001 22:16:54 GMT
Benjamin Johnston wrote:
>
> Good Day,
>
> I've been considering the need for generating random numbers in my "toy"
> crypto project.
>
> One approach I was considering is amassing a heap of semi-random data like
> time stamps and key strokes (ie. as suggested in "all the books")...
>
> But, then I need to turn this data into actual "random" data (well, close
> enough for cryptographic purposes).
>
> To do this, I figured there are two approaches:
>
> 1. have plenty of data and then hash, say, 1Kb (or whatever I estimate is
> necessary given the entropy of the semi-random data) into 160 bits at a
> time (and not use the same 1Kb of data, again, for any other purpose).
>
> 2. have plenty of data, and prepend an integer to all of the data, and
> hash the whole lot to 160 bits. If I need more "random" data, I just
> increment the integer and hash it all again.
>
> I think approach 2 is easier, and it means that I don't have to worry
> about "wasting" the semi-random data if I start discarding the hashed
> values. And I figure it should be just as secure as approach 1, if we
> assume that SHA, for example, is "good" - so the two hash values should be
> completely unrelated.
>
> So, what I'm asking, is:
>
> 1. is my approach (ie. the second one) silly?
Generally yes. The idea is not to reuse seed material. Why not
consider this instead?
Hash the lump of seed material into a 160 bit symmetric key and use a
cipher in CTR mode. (Ahem, afaik that's Yarrow)
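That suggestion might look like the following sketch, with SHA-1 over key || counter standing in for a block cipher in CTR mode (real Yarrow also specifies reseeding and a concrete block cipher, both omitted here):

```python
import hashlib

class CtrGenerator:
    """Hash the seed pool into a 160-bit key once, then generate output
    by hashing key || counter -- a hash-based stand-in for CTR mode."""

    def __init__(self, seed_pool: bytes):
        self.key = hashlib.sha1(seed_pool).digest()  # 160-bit key
        self.counter = 0

    def read(self, n: int) -> bytes:
        out = b""
        while len(out) < n:
            block = hashlib.sha1(
                self.key + self.counter.to_bytes(8, "big")).digest()
            out += block
            self.counter += 1
        return out[:n]
```

Once seeded, arbitrary amounts of output can be drawn without ever touching the seed pool again, which addresses the "wasting seed material" worry directly.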
> 2. what is standard practice for this kind of problem?
I actually don't know off hand.
Tom
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: European commission will recommend all citizens to use encryption in
Date: Tue, 29 May 2001 00:16:55 +0200
Vincent Quesnoit wrote:
>
> Mok-Kong Shen wrote:
>
> > France had a very restrictive crypto law. But that has
> > been discarded (presumably due to the Echelon affairs), or
> > am I ill-informed in that?
>
> Correct, the French law used to mandate that authorisation be requested prior to
> using encryption with keys bigger than some limit (I guess 64 bits, but I am not
> sure).
> This law was discarded, but I do not think this decision was related to Echelon.
I just found a URL that gives more details about the encryption
policies of various nations, including France:
http://www.gilc.org/crypto/crypto-survey-99.html#_Toc450793155
M. K. Shen
------------------------------
From: "Harris Georgiou" <[EMAIL PROTECTED]>
Crossposted-To: comp.security.misc
Subject: Re: Medical data confidentiality on network comms
Date: Tue, 29 May 2001 01:26:41 +0300
Roger Schlafly <[EMAIL PROTECTED]> wrote in message
VvtQ6.29$[EMAIL PROTECTED]
> "Larry Kilgallen" <[EMAIL PROTECTED]> wrote
> > > Personally, I'd be quite happy with 56-bit ciphers and 512-bit PK keys
> > > if the various other privacy leaks were plugged. The real medical
> privacy
> > > threats have almost nothing to do with crypto.
> > But some of them are susceptible to cryptographic controls.
> > ........
>
> There would have to be a culture change among physicians if anything
> like that were to take effect. US physicians just don't believe
> in those sorts of limits. And patients just don't have enough
> power to ask for limits like that.
>
Yes I agree. In fact, I think the big problem in medical info security is
how to apply a basic scheme for authentication and encryption of sensitive
data in practical terms. It is much like the situation with ATMs for
money exchanges. That system has succeeded because all people have to do
is go over to the machine, insert a card and type in the PIN. The
critical link here is how to ensure that the person who uses the card is
the owner (i.e. knows the PIN) and how to prevent anyone else from
getting access to it (most people still have their PIN written on some
paper inside the same wallet that keeps the card, or even on the card
itself).
The problem with secure/authorized access to medical data is that the person
is not always available to give permission or even provide (in some way)
his/her private access key. Having the key placed somewhere else
defeats its own purpose, because there is always the possibility of
unauthorized access and/or access denial when it is really needed by
doctors. And the problem keeps growing with all the genetic
information used today.
--
Harris
- 'Malo e lelei ki he pongipongi!'
------------------------------
From: "Jack Lindso" <[EMAIL PROTECTED]>
Subject: Re: Turbo Small Public Key Cryptosystem
Date: Tue, 29 May 2001 01:38:32 +0200
Ok, so we take user input as a password --> H() --> 256 bit key.
Where is the security here? Today's dictionary attacks are pretty powerful.
Our user, let's call her Jane, enters the password "JackRobinson" (her
boyfriend's name). And we call this stuff secure ???
It's not an attack on your prog (I didn't look at it); it's a kind of
general case. What can we do about it? Salt, but it's got to be stored on
the computer, which let's say is quite reachable. It seems we don't need a
super NSA-driven quantum chugger; a short trip to Jane's computer will tell
us the salt. Is there something I'm missing? Please correct me...
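One standard mitigation (not anything from the program under discussion; the function choice and iteration count below are purely illustrative) is to combine the salt with an iterated hash, so each dictionary guess costs thousands of hash calls even when the attacker already has the salt:

```python
import hashlib, os

def derive_key(password, salt=None):
    """Derive a 256-bit key from a password. The salt blocks precomputed
    dictionaries; the iteration count makes every guess expensive."""
    if salt is None:
        salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, key
```

This doesn't rescue "JackRobinson" against a determined attacker, but it turns a cheap offline dictionary run into a far slower one.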
--
Anticipating the future is all about envisioning the Infinity.
http://www.atstep.com
====================================================
"Tom St Denis" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> Ok this is just for fun (so keep that in mind)
>
> I wrote a really super small PK system for *NIX today (in about 1.5 hrs)
>
> It uses DH and RC4 to encode/decode messages. It doesn't do signatures
> (what's the point?). It's very compact... in linux it builds to about
> 32kb in size and is very fast.
>
> It doesn't do ascii armor or silly headers etc. (So it's probably very
> vulnerable if you try to decode /dev/urandom as a message). I've barely
> tested it (it does encrypt/decrypt afaik).
>
> http://tomstdenis.home.dhs.org/tc.zip
>
> This is by no means a replacement for PGP since it lacks a lot (say
> signatures, key IDs and such, but I feel those are meaningless anyway!)
>
> The source is not commented but it's very easy to follow. It's mostly
> modular and somewhat organized (comes with a makefile that installs it
> in /usr/local/bin/)
>
> Tom
------------------------------
From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: Turbo Small Public Key Cryptosystem
Date: Mon, 28 May 2001 22:38:51 GMT
"Jack Lindso" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> Ok, so we take user input as a password --> H() --> 256 bit key.
> Where is the security here? Today's dictionary attacks are pretty
> powerful.
> Our user, let's call her Jane, enters the password "JackRobinson" (her
> boyfriend's name). And we call this stuff secure ???
> It's not an attack on your prog (I didn't look at it); it's a kind of
> general case. What can we do about it? Salt, but it's got to be stored
> on the computer, which let's say is quite reachable. It seems we don't
> need a super NSA-driven quantum chugger; a short trip to Jane's
> computer will tell us the salt. Is there something I'm missing? Please
> correct me...
Solution: don't use weak passwords.
Or do what I've been saying for a long time. Get a keyboard with a mag
reader and put your super random password on a credit card.
Tom
------------------------------
From: Benjamin Johnston <[EMAIL PROTECTED]>
Subject: Re: Random number generation.
Date: Tue, 29 May 2001 08:43:50 +1000
> > 2. have plenty of data, and prepend an integer to all of the data, and
> > hash the whole lot to 160 bits. If I need more "random" data, I just
> > increment the integer and hash it all again.
> >
/snip/
> > So, what I'm asking, is:
> >
> > 1. is my approach (ie. the second one) silly?
>
> Generally yes. The idea is not to reuse seed material. Why not
> consider this instead?
>
> Hash the lump of seed material into a 160 bit symmetric key and use a
> cipher in CTR mode. (Ahem, afaik that's Yarrow)
The idea was to use it as a "general purpose" random number source across
my entire project, which includes prime number generation.
For this reason I need a more general-purpose source of random data -- so
that I can retrieve variable amounts of data.
I didn't want to have to, for example, "use up" lots of random data to
generate some value (say, a candidate prime) only to discover that I have
to discard it (sure -- for things to be random, you'd want this event to
be partially "random"... but it seems a bit wasteful).
I figured that with a large semi-random source, I could keep hashing the
same data over and over again (making small, linear changes), and this
would be fine provided that I didn't try to "extract" more "randomness"
than was present in the original, collected data.
Hmmm... I can already see a slight cause for concern... but part of the
reason I suggested the second variation was that if I break up my
semi-random data into "chunks"... I may be in danger of having some
"chunks" which have very low entropy... but by hashing the whole file at
once, then I reduce the danger of encountering a "bad block".
> > 2. what is standard practice for this kind of problem?
>
> I actually don't know off hand.
-Benjamin Johnston
[EMAIL PROTECTED]
------------------------------
Subject: Re: Turbo Small Public Key Cryptosystem
From: [EMAIL PROTECTED] (Tony L. Svanstrom)
Date: Mon, 28 May 2001 23:47:37 GMT
Tom St Denis <[EMAIL PROTECTED]> wrote:
> I wrote a really super small PK system for *NIX today (in about 1.5 hrs)
>
> It uses DH and RC4 to encode/decode messages. It doesn't do signatures
> (what's the point?). It's very compact... in linux it builds to about
> 32kb in size and is very fast.
You call that "super small", wanna hear the size of the one I wrote in
perl...?! hehe
/Tony
--
########################################################################
I'm sorry, I'm sorry; actually, what I said was:
HOW WOULD YOU LIKE TO SUCK MY BALLS?
- South Park -
------------------------------
Date: Mon, 28 May 2001 22:14:30 +0100
From: David Hopwood <[EMAIL PROTECTED]>
Subject: Re: Good crypto or just good enough?
=====BEGIN PGP SIGNED MESSAGE=====
Sam Simpson wrote:
> "Mark Wooding" <[EMAIL PROTECTED]> wrote in message
> news:[EMAIL PROTECTED]...
> > Sam Simpson <[EMAIL PROTECTED]> wrote:
> >
> > > Is this a suspicion or is there a proof for this somewhere?
> >
> > Suppose that A is an adversary which distinguishes triple-DES from a
> > random permutation within some given resource bounds (say, less than
> > 2^{56} time). Then construct an adversary B which distinguishes single-
> > DES by choosing two more DES keys at random, and giving A the provided
> > single-DES-or-random-permutation oracle and two further rounds of DES as
> > its oracle. B will distinguish single-DES from a random permutation in
> > the same resources as A except for three times as many DES encryptions.
> >
> > Will that do you?
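Concretely, the quoted reduction can be sketched with small random permutations standing in for DES under random keys (the wrapping step, not the cipher, is the point; N and the helper names are illustrative):

```python
import random

N = 256  # toy block space standing in for DES's 64-bit blocks

def random_perm(rng):
    """A random permutation table plays the role of DES under a random key."""
    p = list(range(N))
    rng.shuffle(p)
    return p

def wrap_oracle(inner_oracle, rng):
    """B's step: pick two more independent 'keys' and put two further
    encryption layers around the given oracle. If the inner oracle is
    single-'DES', A sees triple-'DES'; if it is a random permutation,
    A still sees a random permutation, so A's advantage carries over."""
    k2, k3 = random_perm(rng), random_perm(rng)
    return lambda x: k3[k2[inner_oracle(x)]]
```

Since composing with fixed permutations maps a random permutation to a random permutation, whatever A outputs on the wrapped oracle is exactly B's answer about the inner one.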
>
> It's certainly interesting!
>
> I'm thinking back to a paper (probably Schneier) where they found an attack
> where 2-key 3DES was stronger than 3-key 3DES.
That was a related key attack, assuming you're referring to
John Kelsey, Bruce Schneier, David Wagner,
"Key-Schedule Cryptanalysis of IDEA, G-DES, GOST, SAFER, and Triple-DES".
http://www.counterpane.com/key_schedule.html
> You seem to provide a proof
> for a specific class of attack, but may there be other attacks where the
> security [of DES and 3DES] could be found to be equivalent?
It's unlikely that they would be found to be *equivalent* under any attack.
Kelsey, Schneier, and Wagner's attack is still more difficult than breaking
single-DES (since it requires a meet-in-the-middle attack on double-DES),
even if the related-key conditions hold.
> I guess I'm asking: does your proof imply that, for all attacks, 3DES is
> stronger that DES?
No, Mark Wooding's proof assumes random keys (which implies that the 3
single-DES keys are independent), so it does not apply to related key
attacks. However, you can make a strong argument that such attacks should
be prevented at the protocol level, so if the algorithm itself prevents
them, that is just a bonus. If you look at section 3.3 of the Kelsey,
Schneier, and Wagner paper, which tries to give a motivation for why
related key attacks on ciphers are important, all the examples given
rely on serious protocol flaws.
The other case where related-key attacks are significant is when constructing
a hash function from a cipher. However, that's because the constructions that
attempt to do this rely on much more than the standard security definitions
for ciphers; it's very easy to create examples of ciphers that are secure
when used for symmetric encryption, but insecure when used in a hashing mode
such as Davies-Meyer, etc. Personally I consider that to be a flaw in these
hashing modes; I prefer dedicated hashes.
- --
David Hopwood <[EMAIL PROTECTED]>
Home page & PGP public key: http://www.users.zetnet.co.uk/hopwood/
RSA 2048-bit; fingerprint 71 8E A6 23 0E D3 4C E5 0F 69 8C D4 FA 66 15 01
Nothing in this message is intended to be legally binding. If I revoke a
public key but refuse to specify why, it is because the private key has been
seized under the Regulation of Investigatory Powers Act; see www.fipr.org/rip
=====BEGIN PGP SIGNATURE=====
Version: 2.6.3i
Charset: noconv
iQEVAwUBOxK/ojkCAxeYt5gVAQHR0AgAngEwlyIdolxTmEAoj6NblB3qPpFO5ZjL
teywm2XcW/UgktoaOsSlYGD9ft/Nwobi0alpJWCHV9bCbPWkTwwLN2XjNw2Gosac
du492wH5vZ++7cRrIad62VCeDWr7KgOOEjtCVDQMcU7KK1uRlKuWFPJd44+lCYLv
p2DADYiENQBx0dB6U9IGY0kZszcyQLeeR7NoXa+MWtNZZq5t6r8aAWr3aB8gG58H
E6F6CaAsDKKpZTDh616/ZHSlck6sTQGSAxTxbeGl+6qx9D/9DB8S0lnRyeq4Ejob
KDNa3OMBS1eRF/nFyUJ9Ek5b+bfXpgaNdpfxmbBPRmFCkuifhsMxfg==
=9bKD
=====END PGP SIGNATURE=====
------------------------------
Subject: Re: Quantum Computers with relation to factoring and BBS
From: [EMAIL PROTECTED] (Charles Blair)
Date: Mon, 28 May 2001 23:48:14 GMT
"Roger Schlafly" <[EMAIL PROTECTED]> writes:
>"Simon Johnson" <[EMAIL PROTECTED]> wrote
>> I now want to establish a few facts based on the assumption
>> that p!=np...
>> 1. Do we know factoring is NP for certain?
>Yes. That means that the factors can be verified in polynomial time.
As far as I know, it is still open whether factoring is NP-complete.
Thus, it is possible that there is a polynomial-time algorithm for
factoring (on traditional, non-quantum computers) but that there
is no such algorithm for things like the travelling salesman problem.
------------------------------
From: "AY" <[EMAIL PROTECTED]>
Subject: Re: Quantum Computers with relation to factoring and BBS
Date: Tue, 29 May 2001 00:57:18 +0100
>1. Do we know factoring is NP for certain?
I think one should be more precise in the wording regarding such questions.
The complexity classes P and NP (and a few others) are defined for *decision*
(YES/NO) problems rather than functional ones ("What are the factors?"), even
though they are closely related. In fact there exist functional complexity
classes FP and FNP etc.
One also needs to be clear about the decision problem and its complementary
problem (also a decision problem), in our case PRIME and COMPOSITE.
For the decision problem COMPOSITE: "Given n, is it composite?" Clearly this
is in NP. Given n, guess m where 1<m<n, and one can check in polynomial time
whether m divides n.
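As a one-line illustration, the certificate check is just:

```python
def verify_composite_certificate(n: int, m: int) -> bool:
    """A guessed m with 1 < m < n and m | n is a polynomial-time
    checkable certificate that n is composite."""
    return 1 < m < n and n % m == 0
```

Note the check verifies the certificate only; finding such an m is the hard part.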
The complementary problem to COMPOSITE is PRIME: "Given n, is it prime?" By
definition of complementary complexity classes PRIME is in co-NP, but that
does not imply PRIME is in NP. By a separate proof (Pratt's theorem, 1975) it
is shown that PRIME is indeed in NP. Note that PRIME is therefore in
NP n co-NP (where 'n' denotes the intersection operator), which is a
superset of P.
Now we know PRIME is in NP. But is it in P? The answer is yes, provided the
Extended Riemann Hypothesis (ERH) is true. Basically it boils down to the
fact that if ERH is true, then the following is also true:
if n is composite then it has a witness of its being composite of magnitude
O((log n)^2), i.e. of length O(log(log(n))) bits.
So one could do an exhaustive check on the O((log n)^2) potential witness
candidates; each check takes polynomial time. If no witness is found, n is
prime, and composite otherwise. A corollary of this is that COMPOSITE is
also in P.
But in practice we rarely use the above algorithm to check for primality: even
though it's a polynomial algorithm, it is much more efficient to use a
probabilistic algorithm with a negligible probability of error (by repeating
enough times).
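A sketch of that probabilistic approach (Miller-Rabin): each random base misses a composite with probability at most 1/4, so the error after k rounds is below 4^-k; checking all bases up to the ERH bound instead gives the deterministic version described above. The round count here is illustrative:

```python
import random

def miller_rabin(n, rounds=20):
    """Probabilistic primality test; error probability < 4**-rounds."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7):  # handle small factors deterministically
        if n % p == 0:
            return n == p
    # write n - 1 = 2^s * d with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness that n is composite
    return True
```

For a true prime the test never errs; only composites can slip through, and repeating rounds drives that chance down geometrically.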
For primes and composites a family of probabilistic complexity classes, in
addition to P and NP, may also be of interest: RP, ZPP and BPP.
For further info, I think primes and related problems are discussed in HAC.
www.cacr.math.uwaterloo.ca/hac
AY
------------------------------
From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: Turbo Small Public Key Cryptosystem
Date: Mon, 28 May 2001 23:52:59 GMT
"Tony L. Svanstrom" wrote:
>
> Tom St Denis <[EMAIL PROTECTED]> wrote:
>
> > I wrote a really super small PK system for *NIX today (in about 1.5 hrs)
> >
> > It uses DH and RC4 to encode/decode messages. It doesn't do signatures
> > (what's the point?). It's very compact... in linux it builds to about
> > 32kb in size and is very fast.
>
> You call that "super small", wanna hear the size of the one I wrote in
> perl...?! hehe
Um yes but recall that most of the 32kb is in fact library and object
file overhead. My actual code is very small.
At any rate, I uploaded another copy that includes a readme :-)
Tom
------------------------------
Subject: Re: question on Luby's book
From: [EMAIL PROTECTED] (Charles Blair)
Date: Mon, 28 May 2001 23:52:27 GMT
David A Molnar <[EMAIL PROTECTED]> writes:
>I just picked up Luby's _Pseudorandomness and Cryptographic Applications_.
>My copy has what looks like a publishers' error. On page 32, the page
>terminates in the middle of a discussion of security-preserving reductions
>with the words
>"It turns out that the primary quantity that determines the strength of
>the reduction is the ratio s'(n)/s(n). The bigger the ratio the more"
>and then on p.33 where the punchline should be, my copy has "Lecture 3"
>starting.
My copy, purchased several years ago, has page 33 completing the sentence
``...the loss in security.'' Several paragraphs later, my page 33
has a short section headed ``Practice versus Asymptotics.'' Page
34 is headed ``Uniform versus Non-uniform Reductions.'' Lecture 3
starts on page 35.
I hope you can get a refund for your seriously defective copy.
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list by posting to sci.crypt.
End of Cryptography-Digest Digest
******************************