Cryptography-Digest Digest #71, Volume #12 Tue, 20 Jun 00 14:13:01 EDT
Contents:
Re: Online Text Encryption (JPeschel)
Re: Is this a HOAX or RSA is REALLY broken?!? ("Douglas A. Gwyn")
Re: Cipher design a fading field? ("Douglas A. Gwyn")
Re: small subgroups in Blum Blum Shub (Mark Wooding)
Re: Is this a HOAX or RSA is REALLY broken?!? ("Axel Lindholm")
mother PRNG - input requested ("David S. Hansen")
Re: Is this a HOAX or RSA is REALLY broken?!? ("Trevor L. Jackson, III")
Re: Double Encryption Illegal? ([EMAIL PROTECTED])
Re: Online Text Encryption ("Trevor L. Jackson, III")
Re: Variability of chaining modes of block ciphers (Paul Koning)
Re: obfuscating the RSA private key (Mike Rosing)
Re: Is this a HOAX or RSA is REALLY broken?!? (Jerry Coffin)
Re: small subgroups in Blum Blum Shub (Terry Ritter)
Re: small subgroups in Blum Blum Shub (Terry Ritter)
Re: small subgroups in Blum Blum Shub (Terry Ritter)
Re: small subgroups in Blum Blum Shub (Terry Ritter)
Re: small subgroups in Blum Blum Shub (Terry Ritter)
Re: small subgroups in Blum Blum Shub (Terry Ritter)
Re: Is this a HOAX or RSA is REALLY broken?!? (Roger Schlafly)
Re: mother PRNG - input requested (Mike Rosing)
Re: small subgroups in Blum Blum Shub (Terry Ritter)
----------------------------------------------------------------------------
From: [EMAIL PROTECTED] (JPeschel)
Subject: Re: Online Text Encryption
Date: 20 Jun 2000 16:10:46 GMT
[EMAIL PROTECTED] writes:
> So that you would get
>the same Salt value when you went to decrypt? That would kill the
>portability of the message, i.e. no good to email to someone else with a
>different machine. Another way to prevent dictionary attacks, then, would be
>to not include the Password, in an encrypted form, in the message.
Not including the password won't prevent dictionary attacks.
A brute-force attacker would look for plaintext that resembles
a specific expectation: English text, or an expected header.
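A minimal sketch of such a recognizer, for illustration only (the letter set
and thresholds below are arbitrary, not taken from any real tool):

# Crude test for likely-English trial decryptions during a brute-force
# or dictionary search.  Thresholds and letter set are illustrative.
COMMON = set("etaoinshrdlu ETAOINSHRDLU")

def looks_like_english(candidate: bytes) -> bool:
    """Mostly printable and rich in common letters/space?"""
    if not candidate:
        return False
    printable = sum(32 <= b < 127 for b in candidate)
    common = sum(chr(b) in COMMON for b in candidate)
    return (printable / len(candidate) > 0.95 and
            common / len(candidate) > 0.45)

A search loop would call this on each trial decryption and flag the few
candidates that pass for human inspection.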
Joe
__________________________________________
Joe Peschel
D.O.E. SysWorks
http://members.aol.com/jpeschel/index.htm
__________________________________________
------------------------------
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Is this a HOAX or RSA is REALLY broken?!?
Date: Tue, 20 Jun 2000 15:31:33 GMT
Arturo wrote:
> QC reminds me of the "artificial intelligence" debate back in the late
> 80s and early 90s. It was said that AI could make a computer do anything, from
> surgical operations to star-wars battle management. And now? The few surviving
> AIs are now called "expert systems", and they work fine but far from earlier
> expectations. Are we again putting too much faith in a deus ex machina?
You must be a youngster. AI has been "just around the corner" for
nearly 50 years. There *have* been some worthwhile developments in
that field, which tends to go by the name "Machine Intelligence" now,
other than expert systems, and these developments have been applied
successfully in various products. However, no system with the general
conceptual ability of a normal 10-year-old human has yet been
demonstrated.
------------------------------
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Cipher design a fading field?
Date: Tue, 20 Jun 2000 15:36:41 GMT
"John A. Malley" wrote:
> ... The cryptanalyst solving the substitution cipher uses an
> English-string-recognizing algorithm per some expected context.
It's more subtle than that -- the cryptanalyst does not usually
try *all possible keys* then examine the results to identify
which one is most probably plaintext. Rather, he uses known
properties of the source language to gradually pull the signal
out of the noise (so to speak). I might start out with the
expectation that the plaintext is a diplomatic message, but
during the process discover that it is actually about chess.
Since the same general properties of the natural language
apply (roughly) in both cases, the process works even though
the context was guessed wrong.
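As a much-simplified illustration of that first step, a solver for a
monoalphabetic substitution can begin by ranking the ciphertext letters by
frequency and mapping them to the expected ranking for the source language
(sketch only; a real attack then refines the mapping with digram and word
statistics):

from collections import Counter

# Typical English single-letter frequency ranking, most to least common.
ENGLISH_ORDER = "ETAONRISHDLFCMUGYPWBVKJXQZ"

def initial_key_guess(ciphertext: str) -> dict:
    """First-cut mapping of ciphertext letters to plaintext by frequency rank."""
    counts = Counter(c for c in ciphertext.upper() if c.isalpha())
    ranked = [letter for letter, _ in counts.most_common()]
    return {c: p for c, p in zip(ranked, ENGLISH_ORDER)}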
------------------------------
From: [EMAIL PROTECTED] (Mark Wooding)
Subject: Re: small subgroups in Blum Blum Shub
Date: 20 Jun 2000 16:16:13 GMT
Tony T. Warnock <[EMAIL PROTECTED]> wrote:
> I think people are missing the point here. It's not that RSA, etc. are
> not secure but that the BBS generator using all the BBS bells and
> whistles can be proven secure.
But the BBS generator, without the bells and whistles, can be proven
secure under exactly the same assumptions, in particular that factoring
is hard. The cycling behaviour in question contradicts the assumption,
but that doesn't invalidate the proof.
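For reference, the bare generator under discussion is just iterated squaring
modulo a Blum integer, emitting the low bit of each state.  A minimal sketch
with toy parameters (real moduli are hundreds of digits long):

# Bare Blum-Blum-Shub sketch.  Toy primes for illustration only; a real
# modulus N = p*q uses large secret primes p and q, both 3 mod 4.
p, q = 11, 23
N = p * q

def bbs_bits(seed: int, count: int):
    """Yield pseudorandom bits: x <- x^2 mod N, output the low bit of x."""
    x = (seed * seed) % N    # square once so the state is a quadratic residue
    for _ in range(count):
        x = (x * x) % N
        yield x & 1

print(list(bbs_bits(seed=37, count=16)))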
-- [mdw]
------------------------------
From: "Axel Lindholm" <[EMAIL PROTECTED]>
Subject: Re: Is this a HOAX or RSA is REALLY broken?!?
Date: Tue, 20 Jun 2000 18:39:39 +0200
"Arturo" <[EMAIL PROTECTED]=NOSPAM> wrote:
> ...
> QC reminds me of the "artificial intelligence" debate back in the late
> 80s and early 90s. It was said that AI could make a computer do anything,
> from surgical operations to star-wars battle management. And now? The few
> surviving AIs are now called "expert systems", and they work fine but far
> from earlier expectations. Are we again putting too much faith in a deus ex
> machina?
AI isn't all about creating a "personality" for a computer or making it
"think" by itself. Neural network technology is a quite good example of
AI that does extremely well in the applications it's been developed for.
I read an article in New Scientist once about a network diagnosing cancer
from x-rays with an accuracy of over 99%. A doctor's first diagnosis from
x-ray pictures has an accuracy of approx. 60%, according to the
above-mentioned article. I have no doubt about the hype created in the dawn
of AI, but results like these should point out to us that even if AI might
not be the solution to everything, it has certainly developed into something
very useful. The question of QC as the ultimate computer is yet to be
answered, but I believe that such discoveries are looked at from a more
realistic perspective today than 50 years ago. Have faith in science, my
friend!
Axel Lindholm
------------------------------
From: "David S. Hansen" <[EMAIL PROTECTED]>
Subject: mother PRNG - input requested
Date: Tue, 20 Jun 2000 16:39:37 GMT
Has anyone used/tinkered with George Marsaglia's MOTHER
generator?
What are your thoughts on a PRNG with a 32-bit seed but a
big-ass period, such as MOTHER - for the purposes of
medium-security crypto apps?
What, in your opinion, is the most cryptographically secure
PRNG you have ever seen?
*** David S. Hansen
*** [EMAIL PROTECTED]
*** http://www.haploid.com
------------------------------
Date: Tue, 20 Jun 2000 13:07:50 -0400
From: "Trevor L. Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Is this a HOAX or RSA is REALLY broken?!?
tomstd wrote:
> [EMAIL PROTECTED] (Mark Wooding) wrote:
> >S. T. L. <[EMAIL PROTECTED]> wrote:
> >> <<The optimists in the field believe that in 5 or 10 years
> >> it will be possible to build a quantum computer that can
> >> factor the number 4 as 2x2.>>
> >>
> >> I believe that 15's already been factored. Now for something big,
> >> like 77....
> >
> ><fx: hand in the air> Ooh! Ooh! I can do that one!
>
> Simple 77 = 23 * 9. I am glad I am doing goodly in math.
... only for sufficiently small values of 23 and 9. ;-)
------------------------------
From: [EMAIL PROTECTED]
Crossposted-To: comp.databases.oracle
Subject: Re: Double Encryption Illegal?
Date: Tue, 20 Jun 2000 16:48:34 GMT
In article <8hrbrf$a5b$[EMAIL PROTECTED]>,
Crypto-Boy <[EMAIL PROTECTED]> wrote:
> On page 10-10 and 10-14 of the Oracle Advanced Security Administrator's
> Guide (from release 8.1.6 December 1999), it says the following (in bold
> no less):
>
> "Warning: You can use SSL encryption in combination with another Oracle
> Advanced Security authentication method. When you do this, you must
> disable any non-SSL encryption to comply with government regulations
> prohibiting double encryption."
>
> Since when is it illegal to double encrypt in the US? I don't believe
> this is true.
>
I heard that something like encryption above 128 bits is illegal. I
read it from some reliable source, but don't remember where. So the
statement sounds right to me.
I am not sure why this law exists, but to the best of my knowledge there
is a maximum level of encryption that is legal. Maybe it's so that if
terrorists transfer messages, the government is able to use the
messages in court. If it takes a few years to break the code, then
it's OK. But if it takes a million years, as in 256-bit encryption,
then there is no way to decode the message.
Contact your lawyer if you plan to release software that is going to
use this type of encryption. If you are doing this to send messages
between yourself and your girlfriend, so that your wife can't break the
code, I don't think that you will get arrested.
------------------------------
Date: Tue, 20 Jun 2000 13:12:28 -0400
From: "Trevor L. Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Online Text Encryption
Dan Coyle wrote:
> Thank you for clarifying.
>
> Since the system assigns the Salt, wouldn't you have to decrypt the
> message on the same system that you encrypted it on, so that you would get
> the same Salt value when you went to decrypt?
Yes.
> That would kill the
> portability of the message, i.e. no good to email to someone else with a
> different machine.
No, because the salt can be sent with the message. Salt elements are small. In
communication as opposed to authentication the term is "initialization vector",
and it may vary on a message-by-message basis just as a password salt varies on
a password-by-password basis.
> Another way to prevent dictionary attacks, then, would be
> to not include the Password, in an encrypted form, in the message.
>
> Dan Coyle
>
> "Trevor L. Jackson, III" <[EMAIL PROTECTED]> wrote in message
> news:[EMAIL PROTECTED]...
> > Dan Coyle wrote:
> >
> > > Salt -
> > >
> > > "An unnecessarily cute and sadly non-descriptive name for an arbitrary
> > > value, unique to a particular computer or installation, prepended to a
> > > password before hash authentication. The "salt" acts to complicate attacks
> > > on the password user-identification process by giving the same password
> > > different hash results on different systems. Ideally, this would be a sort
> > > of keying for a secure hash." (source: Ritter's Dictionary of Technical
> > > Cryptography - http://www.io.com/~ritter/GLOSSARY.HTM#Salt)
> > >
> > > This definition states that a salt is a machine-specific algorithm for
> > > modifying password hashes stored in a message. It does not discuss using a
> > > given amount of time to encrypt the message, so if there is another
> > > definition that you would like to show me, please give me an URL or
> > > something so that I may see your sources.
> >
> > The definition you quoted leaves out a significant purpose of salts. On a
> > single system salts confound simple dictionary attacks. A simple dictionary
> > attack proceeds through the dictionary encrypting each word once, and then
> > comparing the ciphertext to each of the encrypted passwords. This works if
> > the password is encrypted "bare" because if you and I use the same password
> > the system will store the same ciphertext for each of us.
> >
> > On a salted system each password is stored with a salt value that is
> > included in the encryption. Since the system assigns each password a
> > distinct salt, if you and I use the same password we'll get distinct
> > ciphertexts. Then the simple dictionary attack will fail. In order to
> > attack a salted system one must reencrypt each word of the dictionary with
> > the salt for each password. This is a significantly larger computation
> > budget for the attack. It becomes "encrypt"-bound rather than
> > "compare"-bound.
> >
> > I'm certain Terry Ritter and most readers are familiar with this usage.
> >
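A minimal sketch of the salted scheme described in the quoted text above; the
hash function, salt length, and record layout here are illustrative only:

import hashlib, os

def store_password(password: str):
    """Return (salt, digest) to store; the salt is public but unique per entry."""
    salt = os.urandom(8)                      # system-assigned, per password
    digest = hashlib.sha1(salt + password.encode()).hexdigest()
    return salt, digest

def check_password(password: str, salt: bytes, stored_digest: str) -> bool:
    return hashlib.sha1(salt + password.encode()).hexdigest() == stored_digest

Without salts, one pass over a dictionary yields a table that matches every
user of a given password; with salts, the attacker must re-hash the whole
dictionary once per stored entry, so the attack becomes "encrypt"-bound
rather than "compare"-bound, as described above.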
------------------------------
From: Paul Koning <[EMAIL PROTECTED]>
Subject: Re: Variability of chaining modes of block ciphers
Date: Tue, 20 Jun 2000 12:07:41 -0400
Mok-Kong Shen wrote:
> ...I think that the variability
> of chaining modes could be advantageously exploited such
> that the actual chaining mode used in a message has to be
> guessed by the opponent, thus rendering his task much more
> difficult.
Make that "very slightly more difficult".
paul
------------------------------
From: Mike Rosing <[EMAIL PROTECTED]>
Subject: Re: obfuscating the RSA private key
Date: Tue, 20 Jun 2000 12:19:45 -0500
Dave Ahn wrote:
>
> I am looking for a way to obfuscate the *process* of encrypting with
> the private key as opposed to obfuscating the private key itself.
If the end user has access to the code, then they can follow the process
with a debugger. You can make it as convoluted as you want, but they'll
still be able to follow it.
If you don't use RSA, you can just give each user their own private key.
They send you their public key and all messages get encrypted with that.
When they need to decrypt, they can use *their* private key.
Maybe you can expand on the problem again, 'cause it sounds like we're
missing something.
Patience, persistence, truth,
Dr. mike
------------------------------
From: Jerry Coffin <[EMAIL PROTECTED]>
Subject: Re: Is this a HOAX or RSA is REALLY broken?!?
Date: Tue, 20 Jun 2000 11:19:11 -0600
In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED]
says...
[ ... ]
> You must be a youngster. AI has been "just around the corner" for
> nearly 50 years. There *have* been some worthwhile developments in
> that field, which tends to go by the name "Machine Intelligence" now,
> other than expert systems, and these developments have been applied
> successfully in various products. However, no system with the general
> conceptual ability of a normal 10-year-old human has yet been
> demonstrated.
I think there's another basic point to keep in mind: what's
considered AI changes over time. 50 years ago, speech recognition,
hand-writing recognition and OCR were all considered legitimate
subjects of AI research. Now, they're all just normal programs, and
nobody thinks of them as related to AI at all. I suspect the same
basic thing will continue to happen: we'll be able to model various
other behaviors and actions to the point that we can use computers to
solve particular problems, but in the process we'll reduce those
problems from being things that require intelligence to things that
merely involve carrying out algorithms.
IOW, for the most part "AI" is simply used to refer to areas we don't
know how to deal with effectively yet. As we learn to deal with any
particular area effectively, it simply gets re-categorized so it's no
longer AI.
--
Later,
Jerry.
The universe is a figment of its own imagination.
------------------------------
From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: small subgroups in Blum Blum Shub
Date: Tue, 20 Jun 2000 17:21:12 GMT
On 19 Jun 2000 19:22:32 -0700, in
<[EMAIL PROTECTED]>, in sci.crypt d g
<[EMAIL PROTECTED]> wrote:
>[EMAIL PROTECTED] (Terry Ritter) writes:
>
>> [...] As I said in a previous posting, the issue here is nothing
>> less than the meaning of "proof" itself: If you are willing to call
>> something "proven secure," when the math itself says there is a
>> small, but preventable probability of weakness, you are willing to
>> bend more than I am.
>
>Could you state what you mean by "provable security"? As an example,
>see the definition in chapter 5 of Goldreich's "Foundations of
>Cryptography":
>
>http://www.wisdom.weizmann.ac.il/home/oded/public_html/foc-book.html
>
>A preprint of chapter 5 is available online - semantic security is
>defined in : #5.2.1, #5.2.2. Semantic security as defined by
>Goldreich is essentially probabilistic. Perhaps you have a different
>model of computation.
I do have a different model, but it is not a model of computation. My
model is what I perceive as the basis for the entire field of
cryptography: The need to find a system which is secure against any
possible attack. Mathematics being presumably the way to reach such a
result, "proven secure" is well-understood to be the goal of
cryptography itself. It is not a term to be usurped and re-defined as
something less. The general unqualified use of it as a term-of-art is
deceptive to newbies, managers, and managing directors. It might even
be seen as a marketing term.
>Do you agree with his construction and analysis of Blum-Goldwasser
>(which is based on the same QR intractability assumption as BBS) in
>section 5.3 in the monograph?
I did finally download the right thing, and it looks to me like a
tremendous contribution. But, since I'm not a mathematician, I'm
probably the wrong guy to ask about it. In general, I have no problem
at all with terms being re-defined in a technical context, as long as
that context is maintained. But the use of an existing term according
to a new technical definition will be deceptive. Special care must be
taken to avoid the problem simply because the workers in that
sub-field have not been sufficiently creative to find a new phrase for
a new concept. I suggest "statistically secure."
---
Terry Ritter [EMAIL PROTECTED] http://www.io.com/~ritter/
Crypto Glossary http://www.io.com/~ritter/GLOSSARY.HTM
------------------------------
From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: small subgroups in Blum Blum Shub
Date: Tue, 20 Jun 2000 17:21:23 GMT
On 19 Jun 2000 19:28:47 -0700, in
<8imksv$nik$[EMAIL PROTECTED]>, in sci.crypt
[EMAIL PROTECTED] (David A. Wagner) wrote:
>Here's another way to see why we are willing to bend more than you are.
>[...]
>It's a steep slope, and once you get started down this path, there's
>no logical place to stop. Whatever key length you choose, it's always
>illogical to use it. We're paralyzed.
The key-guessing issue is a special case because *all* ciphers need
keys, and any key might be guessed. That is inherent in the concept
of ciphers. Because it is inherent, we learn to live with it; we have
no alternative, since key-guessing is not under our control. The use
of short cycles, however, is *not* inherent, such *use* can be
avoided, and it *is* under our control. All that is needed is the
will to do it.
Whatever key length we choose, there is always an additional,
preventable risk if we do not check for short cycles. As I have said
many times, this is -- as far as I know -- not a practical issue. But
it is a theoretical issue, and it is a theoretical weakness which can
be avoided. Avoiding that weakness is the reason for the "useless"
last 2/3 of the BB&S article. Using the BB&S prescriptions gives us
the ability to say that -- other than keyspace / brute force -- we
know of no weakness that we have not stopped.
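One simple form such a check could take (an illustration only; the BB&S
article instead gives conditions on p, q, and x0 under which long cycles are
guaranteed, avoiding per-seed work) is to reject any seed whose state recurs
within some bound:

def seed_passes_cycle_check(x0: int, N: int, min_cycle: int = 10**6) -> bool:
    """Reject seeds whose x <- x^2 mod N sequence revisits a state too soon."""
    seen = set()
    x = (x0 * x0) % N
    for _ in range(min_cycle):
        if x in seen:
            return False          # short cycle (or short arc into one) found
        seen.add(x)
        x = (x * x) % N
    return True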
Stopping every known weakness is the essence of cryptographic system
design, and that goes far beyond the cipher per se. Being willing to
allow a weakness because one *assumes* that it will not be selected is
-- in my view -- a serious flaw in a cryptographic system designer.
For one thing, it means that we cannot claim to have made the system
as secure as it can possibly be made to be. And it means that if
there were a test for weakness (beyond key-guessing), we could fail
that test and still be "in spec."
An educated technical person can understand the concepts of key,
keyspace, and key-guessing / brute force fairly quickly. They can
understand these risks and accept them. Beyond that understanding,
descriptions of a system with short cycles might be:
| * almost surely not insecure
| * secure unless one is very unlucky
But I suspect these descriptions would not be something a customer
would appreciate.
>[...]
>In the computational setting, security is inherently probabilistic.
>You don't get absolute security; you just get systems where the cost of
>breaking them is sufficiently large to prevent compromise. Similarly, you don't
>get guaranteed security; you just get systems where the probability of
>compromise is sufficiently small.
I agree. The issue is not practical security. The issue is being
able to say that we have done all we can.
It is difficult to reconcile "proven secure" with the ability to do
more, because if all holes were closed, there would be nothing left to
be done.
---
Terry Ritter [EMAIL PROTECTED] http://www.io.com/~ritter/
Crypto Glossary http://www.io.com/~ritter/GLOSSARY.HTM
------------------------------
From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: small subgroups in Blum Blum Shub
Date: Tue, 20 Jun 2000 17:21:39 GMT
On 19 Jun 2000 19:32:58 -0700, in
<8iml4q$nj8$[EMAIL PROTECTED]>, in sci.crypt
[EMAIL PROTECTED] (David A. Wagner) wrote:
>[...]
>Under standard theoretical definitions, there is no theoretical problem.
>Maybe you prefer non-standard definitions, but I haven't yet heard why
>you prefer them.
The problem is that both "proven" and "secure" are well understood by
an educated person. Math proofs start after grade school, so
everybody knows what "proof" should mean. And "security" is the
overall area which contains cryptography, so a sub-field of
cryptography hardly has the right to redefine it.
By itself, I believe "proven secure" means to most educated people "it
has been proven that insecurity does not exist." All we have to do is
to look around at the various discussions here on sci.crypt and see
how many times newbies -- and even some oldies -- have embraced the
delusion that there really are systems that are "proven secure" under
the common meaning of that phrase. I think much of this comes from
seeing the phrase in reputable technical journals where it has the
lesser technical meaning, and is not qualified as a "term of art."
Many confusing "terms of art" exist in many areas, but this one is
particularly deceptive and should be confronted head on when it
occurs. Deception is not a valid academic mode.
The goal of "proven security" is the basis for the entire field of
cryptography; it is the need to find a system which is secure against
any possible attack. Mathematics being presumably the way to reach
such a result, "proven secure" is well-understood to be the goal of
cryptography itself. It is not a term to be usurped and re-defined as
something less by an academic subfield.
---
Terry Ritter [EMAIL PROTECTED] http://www.io.com/~ritter/
Crypto Glossary http://www.io.com/~ritter/GLOSSARY.HTM
------------------------------
From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: small subgroups in Blum Blum Shub
Date: Tue, 20 Jun 2000 17:21:47 GMT
On 20 Jun 2000 10:04:09 GMT, in <[EMAIL PROTECTED]>,
in sci.crypt [EMAIL PROTECTED] (Mark Wooding) wrote:
>[...]
>Finally, I suspect Terry found an arc length of one because he started
>with a non-residue.
Sure. I covered every possible state. That, of course, establishes
the distribution encountered when choosing the starting state at
random.
In a BB&S system there are a lot of arcs of length 1 feeding into
cycles.
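The arc-and-cycle structure is easy to see by exhaustive enumeration over a
toy modulus (illustration only):

# Classify every state of x -> x^2 mod N as lying on a cycle or on an arc
# that feeds into one.  Toy modulus: p = 3, q = 7, both congruent to 3 mod 4.
N = 3 * 7
succ = {x: (x * x) % N for x in range(N)}

def on_cycle(x):
    """A state is on a cycle iff repeated squaring eventually returns to it."""
    y = succ[x]
    for _ in range(N):
        if y == x:
            return True
        y = succ[y]
    return False

cycle_states = [x for x in range(N) if on_cycle(x)]
arc_states = [x for x in range(N) if x not in cycle_states]
print("on a cycle:", cycle_states)
print("on an arc feeding into a cycle:", arc_states)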
---
Terry Ritter [EMAIL PROTECTED] http://www.io.com/~ritter/
Crypto Glossary http://www.io.com/~ritter/GLOSSARY.HTM
------------------------------
From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: small subgroups in Blum Blum Shub
Date: Tue, 20 Jun 2000 17:22:02 GMT
On 20 Jun 2000 14:33:12 GMT, in <8invb8$9n9$[EMAIL PROTECTED]>,
in sci.crypt [EMAIL PROTECTED] (Klaus Pommerening) wrote:
>In <[EMAIL PROTECTED]> Terry Ritter wrote:
>> The short cycle weakness is a weakness which does not have to exist,
>> but which is normally allowed to exist. The fact that this additional
>> weakness possibility is not eliminated is sufficient to show that any
>> resulting system cannot be "proven secure." End of story.
>>
>So what do you think of RSA?
I have enough on my plate discussing BB&S. What do *you* think of the
last 2/3 of the BB&S article?
>There you also have short cycles, and you can't avoid
>them, because they come from the plain text, not from the
>key. Remember the iterative attack?
>
>[Assume m = plain text, c = cipher text, E = public encryption function]
>
>If c = E(m), then consider the cycle E(c), E(E(c)), E(E(E(c))),
>..., until E^s(c) = c. Now m = E^{s-1}(c).
>Hey - we have broken *every* public key encryption.
>
>"The fact that this additional weakness possibility is not
>eliminated is sufficient to show" ... that asymmetric encryption
>cannot be proven secure.
>
>By the way - finding a key for a symmetric 128-bit encryption
>by pure guessing has a much higher success probability than
>running into a cycle for BBS or RSA with a 1024-bit modulus.
>
>Therefore symmetric encryption also cannot be proven secure.
>End of story.
The concept of a key is the essence of cipher-based cryptography, and
we have no ciphers which are not vulnerable to key guessing. Changing
that is not under our control, so:
* We *cannot* build a cipher which avoids key-guessing.
* We *can* build a generator which does not use short cycles.
I would say a distinction exists.
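For concreteness, the iterative (cycling) attack quoted above can be run
against a toy RSA instance; with toy numbers the cycle closes almost at once,
whereas with realistic 1024-bit parameters the expected cycle length is
astronomically large:

def cycling_attack(c: int, e: int, N: int, max_steps: int = 10**6):
    """Re-encrypt c repeatedly; when E^s(c) == c, E^(s-1)(c) is the plaintext."""
    prev, cur = c, pow(c, e, N)
    for _ in range(max_steps):
        if cur == c:
            return prev                    # m = E^(s-1)(c)
        prev, cur = cur, pow(cur, e, N)
    return None                            # gave up: the normal case in practice

# Toy example: N = 33, e = 3, message m = 5.
N, e, m = 33, 3, 5
c = pow(m, e, N)                           # c = 26
print(cycling_attack(c, e, N))             # prints 5, the recovered plaintext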
---
Terry Ritter [EMAIL PROTECTED] http://www.io.com/~ritter/
Crypto Glossary http://www.io.com/~ritter/GLOSSARY.HTM
------------------------------
From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: small subgroups in Blum Blum Shub
Date: Tue, 20 Jun 2000 17:22:54 GMT
On 20 Jun 2000 07:20:09 -0000, in
<[EMAIL PROTECTED]>, in sci.crypt lcs Mixmaster
Remailer <[EMAIL PROTECTED]> wrote:
>[...]
>Obviously Terry Ritter is not going to be convinced by these arguments.
>Years of frustrating debate have established that.
The reason I am not going to be convinced by these arguments is that
they lead to the implication that black is white, and down is up, and
when that happens, the arguments are wrong no matter how crafty they
may be.
Fundamentally here, we have the use of "proven secure" as a "term of
art" while exactly the same phrase describes the long preceding
ultimate goal of cipher-based cryptography. This one phrase thus has
different meanings depending on context, but the context is rarely
preserved, and that makes the general claim deceptive. Deceptive
claims are particularly convenient when dealing with those who are not
skilled in the art, such as newbies, managers and managing directors.
A system with known, preventable weakness, no matter how small, is
different than the same system without that weakness. Using the same
description for both is a travesty and an embarrassment.
>The hope is that other
>readers will come to a clearer understanding of exactly what proof of
>security is provided by BBS. They will then see why such authoritative
>references as RFC 1750 describe the RNG in its simple form, without
>the elaborate additional checks and precautions which Terry Ritter has
>claimed are necessary.
Weren't you just saying that *my* arguments weren't scientific? Since
slurs and deliberate mis-slants are *your* idea of argument, it is no
wonder that my arguments don't seem fair to you. It is sad, really.
I have said many times that eliminating the use of rare short cycles
is not necessary in practice. But eliminating the use of short cycles
*is* necessary if we are to claim with a straight face that the system
is as secure as it can be made to be. The existence of a known
preventable weakness is an abomination in a cipher. Trying to sweep
that under the rug of a technical definition is just hiding the truth.
---
Terry Ritter [EMAIL PROTECTED] http://www.io.com/~ritter/
Crypto Glossary http://www.io.com/~ritter/GLOSSARY.HTM
------------------------------
From: Roger Schlafly <[EMAIL PROTECTED]>
Subject: Re: Is this a HOAX or RSA is REALLY broken?!?
Date: Tue, 20 Jun 2000 10:23:35 -0700
"S. T. L." wrote:
> <<The optimists in the field believe that in 5 or 10 years
> it will be possible to build a quantum computer that can
> factor the number 4 as 2x2.>>
>
> I believe that 15's already been factored. Now for something big, like 77....
Not by a quantum computer. Maybe it will be done in 20 years.
It would take millions of dollars of R&D.
------------------------------
From: Mike Rosing <[EMAIL PROTECTED]>
Subject: Re: mother PRNG - input requested
Date: Tue, 20 Jun 2000 12:25:32 -0500
David S. Hansen wrote:
> Has anyone used/tinkered with George Marsaglia's MOTHER
> generator?
Yup!
> What are your thoughts on a PRNG with a 32-bit seed but a
> big-ass period, such as MOTHER - for the purposes of
> medium-security crypto apps?
It's perfectly fine for long-used systems. You might want to
save the internal state instead of just starting it with 32 bits
if you reboot often.
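A minimal sketch of what saving the internal state looks like for a
multiply-with-carry generator of the same general family as MOTHER (the
multiplier below is illustrative; this is not Marsaglia's actual MOTHER code):

class TinyMWC:
    """Illustrative multiply-with-carry PRNG, not Marsaglia's MOTHER itself."""
    A = 4164903690                  # illustrative multiplier, not a vetted one

    def __init__(self, seed: int):
        self.x = seed & 0xFFFFFFFF  # only 32 bits of entropy if seeded this way
        self.c = 1

    def next32(self) -> int:
        t = self.A * self.x + self.c
        self.x = t & 0xFFFFFFFF
        self.c = t >> 32
        return self.x

    # Saving and restoring (x, c) carries the full state across a reboot,
    # which preserves more than re-seeding from a fresh 32-bit value.
    def get_state(self):
        return (self.x, self.c)

    def set_state(self, state):
        self.x, self.c = state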
> What, in your opinion, is the most cryptographically secure
> PRNG you have ever seen?
This is too close to religion :-) I like hardware the best!
Patience, persistence, truth,
Dr. mike
------------------------------
From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: small subgroups in Blum Blum Shub
Date: Tue, 20 Jun 2000 17:31:30 GMT
On 20 Jun 2000 16:16:13 GMT, in <[EMAIL PROTECTED]>,
in sci.crypt [EMAIL PROTECTED] (Mark Wooding) wrote:
>Tony T. Warnock <[EMAIL PROTECTED]> wrote:
>> I think people are missing the point here. It's not that RSA, etc. are
>> not secure but that the BBS generator using all the BBS bells and
>> whistles can be proven secure.
>
>But the BBS generator, without the bells and whistles, can be proven
>secure under exactly the same assumptions, in particular that factoring
>is hard. The cycling behaviour in question contradicts the assumption,
>but that doesn't invalidate the proof.
I have no idea what that could possibly mean. If the assumption is
contradicted, surely we cannot say the proof stands. Unless, of
course, we see "proof" as a mere manipulation of symbols, as opposed
to the correct conclusion.
---
Terry Ritter [EMAIL PROTECTED] http://www.io.com/~ritter/
Crypto Glossary http://www.io.com/~ritter/GLOSSARY.HTM
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************