Cryptography-Digest Digest #648, Volume #10 Mon, 29 Nov 99 15:13:02 EST
Contents:
Re: Peekboo Ideas? (Medical Electronics Lab)
Re: Random Noise Encryption Buffs (Look Here) (Tom St Denis)
Re: Elliptic Curve Public-Key Cryptography (jerome)
Re: Random Noise Encryption Buffs (Look Here) (Guy Macon)
Re: Random Noise Encryption Buffs (Look Here) (Tim Tyler)
Re: "The Code Book" challenge update (Tim Tyler)
Re: smartcard idea? (Guy Macon)
Re: Use of two separate 40 bit encryption schemes ("tony.pattison")
Re: Use of two separate 40 bit encryption schemes ("tony.pattison")
----------------------------------------------------------------------------
From: Medical Electronics Lab <[EMAIL PROTECTED]>
Subject: Re: Peekboo Ideas?
Date: Mon, 29 Nov 1999 12:31:23 -0600
Tom St Denis wrote:
> I do have one question: How do I implement human-readable message
> signatures when things like email and usenet will reformat/add spaces?
> Do I just discount spaces or something? How does PGP do it?
Stripping "white space" is perfectly ok to compute the hash of
a message. Remove tabs, spaces and new-lines. That gives you
pure text pretty much. You may want to remove all control characters,
in case there are form feeds added or whatever.
Warn people tho, in case spacing is important, they should fill
lines with ........ to make sure things are in the right columns.
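A minimal sketch of that canonicalisation in Python (just the idea above,
not what PGP itself does):

import hashlib

def canonical_hash(text):
    # Drop all whitespace and control characters, then hash what is left,
    # so copies re-wrapped or re-spaced by a mailer still verify.
    stripped = "".join(ch for ch in text if ch.isprintable() and not ch.isspace())
    return hashlib.sha1(stripped.encode("utf-8")).hexdigest()

# The same message, reformatted in transit, hashes identically:
assert canonical_hash("Attack at dawn.\n") == canonical_hash("Attack  at\r\n dawn.\n")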
Patience, persistence, truth,
Dr. mike
------------------------------
From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: Random Noise Encryption Buffs (Look Here)
Date: Mon, 29 Nov 1999 18:46:17 GMT
In article <[EMAIL PROTECTED]>,
"Douglas A. Gwyn" <[EMAIL PROTECTED]> wrote:
> Tom St Denis wrote:
> > If I took two exact copies [leave the copying theory behind here] of
> > an atom, and placed them in two exact same environments. Would they
> > not decay the same way? If so, that's hardly random at all.
>
> The simple answer is, no, two identically prepared quantum systems,
> constrained as tightly as nature allows, need not evolve along the
> same path.
>
That's like saying each time you went back in time [to the exact same
time] you would observe a different state. Which means an atom can
never be in just one state at any time. Kind of like an omni-state.
Tom
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: [EMAIL PROTECTED] (jerome)
Subject: Re: Elliptic Curve Public-Key Cryptography
Reply-To: [EMAIL PROTECTED]
Date: Mon, 29 Nov 1999 18:57:58 GMT
So to sum up, there is no proof of difficulty on either side (ECC or
mod p, a.k.a. the multiplicative group of a finite field). ECC has been
studied for about 15 years and mod p for about 20.
On Mon, 29 Nov 1999 16:30:07 GMT, Bruce Schneier wrote:
>(This is reprinted from the November Crypto-Gram newsletter.)
>
>
>In September of this year, nearly 200 people using 740 computers
>managed to crack a message encrypted with 97-bit elliptic curve
>cryptography. The process took 16,000 MIPS-years of computing, about
>twice as much as used by the team that recently cracked a 512-bit RSA
>encryption key. Certicom, the company who sponsored this challenge,
>has offered this result as evidence that elliptic curve cryptography
>is stronger than RSA.
>
>Let's take a look at this claim a little more closely.
>
>All public-key algorithms, whether for key exchange, encryption, or
>digital signatures, are based on one of two problems: the factoring
>problem or the discrete logarithm problem. (There are other
>algorithms in academic circles, but they're too unwieldy to use in the
>real world.) The security of RSA comes from the difficulty of
>factoring large numbers. Strong RSA-based systems use 1024-bit
>numbers, or even larger.
>
>The security of most other public-key algorithms -- ElGamal, DSA, etc.
>-- is based on the discrete logarithm problem. The two problems are
>very similar, and all of the modern factoring algorithms can be used
>to calculate discrete logarithms in the multiplicative group of a
>finite field. To a rough approximation, factoring a number of a
>certain size and calculating the discrete logarithm of numbers the
>same size takes the same amount of work. This means that for a given
>key size, RSA, ElGamal, DSA, etc. are approximately equally secure.
>(This isn't strictly true, but it's a good enough approximation for
>this essay.)
>
>All of these algorithms require the use of something called an
>"algebraic group." When public-key cryptography was invented, the
>algorithms were all implemented in the simplest algebraic group: the
>numbers modulo n. For example, RSA encryption is m^e mod n, and a
>Diffie-Hellman public key is g^y mod n. As it turns out, any
>algebraic group will do. Elliptic curves are simply another algebraic
>group.
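For a concrete picture of the mod-n case, here is textbook Diffie-Hellman
in Python with deliberately tiny toy numbers (real systems use 1024-bit or
larger moduli and authenticate the exchange):

p, g = 23, 5                   # public parameters: prime modulus and generator
a, b = 6, 15                   # Alice's and Bob's secret exponents

A = pow(g, a, p)               # Alice sends g^a mod p
B = pow(g, b, p)               # Bob sends g^b mod p

# Both ends now derive the same shared secret g^(a*b) mod p:
assert pow(B, a, p) == pow(A, b, p)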
>
>In elliptic curve cryptography, public keys and private keys are
>defined as points along a mathematical object called an elliptic
>curve. (Don't worry; it doesn't really matter what that means.)
>Addition is an operation that combines two points and produces a third
>point. The algorithms look the same, but the detailed math is very
>different.
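To make "addition combines two points" concrete, here is the standard
chord-and-tangent group law over a toy prime field, written only as a
sketch (real curves use primes of roughly 160 bits and up):

def ec_add(P, Q, a, p):
    # Add points on y^2 = x^3 + a*x + b over GF(p); None is the point at
    # infinity, i.e. the group identity.
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                      # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, p - 2, p)  # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, p - 2, p)         # chord slope
    lam %= p
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

# Toy curve y^2 = x^3 + 2x + 2 over GF(17): doubling (5, 1) gives (6, 3).
assert ec_add((5, 1), (5, 1), 2, 17) == (6, 3)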
>
>But if any algebraic group will do, why is anyone bothering with
>elliptic curves? It turns out that for discrete-logarithm elliptic
>curve algorithms, perhaps we can get by with smaller keys. (This is
>not true for RSA, which is why you never see elliptic curve RSA
>variants).
>
>All of the fastest algorithms for calculating discrete logs -- the
>number field sieve and the quadratic sieve -- make use of something
>called index calculus and a property of the numbers mod n called
>smoothness. In the elliptic curve group, there is no definition of
>smoothness, and hence in order to break elliptic curve algorithms you
>have to use older methods: Pollard's rho, for example. So we only
>have to use keys long enough to be secure against these older, slower
>methods. Therefore, our keys can be shorter.
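To give a feel for those "older, slower methods": a generic square-root
attack needs on the order of sqrt(n) group operations, versus the
sub-exponential running time of index-calculus methods. Here is a minimal
baby-step giant-step sketch for discrete logs modulo a toy prime (Pollard's
rho reaches the same square-root cost with far less memory; it is just
longer to write down):

from math import isqrt

def bsgs(g, h, p):
    # Find x with g^x = h (mod p) in about sqrt(p) multiplications.
    m = isqrt(p - 1) + 1
    baby = {pow(g, j, p): j for j in range(m)}       # baby steps: g^j
    step = pow(pow(g, m, p), p - 2, p)               # g^(-m), inverse via Fermat
    gamma = h % p
    for i in range(m):                               # giant steps: h * g^(-i*m)
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = (gamma * step) % p
    return None

p, g = 1019, 2
x = bsgs(g, pow(g, 345, p), p)
assert pow(g, x, p) == pow(g, 345, p)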
>
>And they can be significantly shorter. In the wake of the recent
>break, Certicom recommends 163-bit keys. Compare this to the
>recommended key lengths for conventional discrete-logarithm
>algorithms, which are at least 1024 bits.
>
>Whether this recommendation makes sense depends on whether the faster
>algorithms can ever be made to work with elliptic curves. The
>question to ask is: "Is this lack of smoothness a fundamental
>property of elliptic curves, or is it a hole in our knowledge about
>elliptic curves?" Or, more generally: "Are elliptic curves
>inherently harder to calculate discrete logs in, or will we eventually
>figure out a way to do it as efficiently as we can in the numbers mod
>n?"
>
>If you believe the former, elliptic curves will always be more secure
>-- for the same key lengths -- than the numbers mod n. If you believe
>the latter, it's only a matter of time before they are broken.
>
>Certicom very much wants you to believe the former. They say things
>like: "Elliptic curves as algebraic/geometric entities have been
>studied extensively for the past 150 years, and from these studies has
>emerged a rich and deep theory." They conclude that because of this,
>we can gain good confidence that new algorithmic advances won't be too
>devastating.
>
>To me, this is a lot of wishful thinking. It would be nice if we had
>150 years of work on the cryptographic properties of elliptic curves.
>But we don't; instead, we have 150 years of work on the properties of
>elliptic curves that mathematicians care about, almost all of it only
>incidentally touching on what cryptographers care about. Elliptic
>curve cryptography was invented only in 1985, and has only been really
>studied seriously for a few years.
>
>Even today, most of the work on elliptic curves in the typical
>university math department is pretty irrelevant to us cryptographers.
>Sure, some of their results might occasionally help us understand the
>strength of elliptic curve algorithms; but that's almost never been
>the goal of the mathematicians' research studies. This is changing
>now, but slowly.
>
>Furthermore, work on efficient algorithms for elliptic curves is very
>new. The whole notion of efficient algorithms didn't even appear until
>about the 1960s or 1970s, and algorithmic number theory has only
>become popular in the past two decades. It just wasn't relevant
>before computers.
>
>The real answer to the question is "we don't know." We don't know if
>there are efficient ways to calculate discrete logarithms in elliptic
>curve groups. We don't know if there is a definition of smoothness
>that will allow us to apply the number field sieve to elliptic curves.
>We don't know if, in the long run, you can use shorter keys with
>elliptic curve algorithms.
>
>In the short run, Certicom's recommendations are reasonable. Today,
>we can't calculate discrete logs in elliptic curves as efficiently as
>we can in the numbers mod n. Systems can use shorter keys with
>elliptic curves. But in the long run, we don't know.
>
>There are other differences to consider, too. Checking elliptic curve
>signatures is still a big pain compared to checking RSA signatures.
>And all users of an elliptic curve system have to share the same
>curve. (If you don't do this, you lose most of the size benefits of
>the elliptic curve keys.) This has security implications: it is
>easier to break a key of a random user on a system than it is to break
>a specific user's key. I'd like to see more analysis of this aspect
>of elliptic curve systems.
>
>My recommendation is that if you're working in a constrained
>environment where longer keys just won't fit -- smart cards, some
>cellphones or pagers, etc. -- consider elliptic curves. If the choice
>is elliptic curves or no public-key algorithm at all, use elliptic
>curves. If you don't have performance constraints, use RSA. If you
>are concerned about security over the decades (almost no systems are),
>use RSA.
>
>Realize, though, that someday -- next year, in ten years, in a century
>-- someone may figure out how to define smoothness, or something even
>more useful, in elliptic curves. If that happens, you will have to
>use the same key lengths as you would with conventional discrete
>logarithm algorithms, and there will be no reason to ever use elliptic
>curves.
>
>Postscript: This same analysis applies to factoring (and the basic
>discrete log problem). RSA Security, Inc. likes to talk about the
>long mathematical history of the factoring problem, and how that gives
>us confidence about the security of RSA. Yes, it has been studied for
>centuries, but only recently has that study been even remotely related
>to cryptography. Moreover, working on factoring hasn't been a
>respectable area of study until very recently; before that, it was
>considered an eccentric hobby. And efficient algorithms for factoring
>have only been studied for the past couple of decades. We really have
>no idea how hard factoring truly is.
>
>The truth is that companies have a tendency to advertise their
>products. Before making a decision about cryptographic algorithms,
>customers should try to get a variety of independent opinions (from
>parties not financially involved in the outcome of the decision) about
>what they are buying.
>
>News on the recent elliptic curve cracking effort:
>http://www.computerworld.com/home/news.nsf/all/9909282ellip
>http://www.certicom.com/press/99/sept2899.htm
>
>An excellent mathematical introduction to elliptic curves:
>http://www.certicom.com/ecc/enter/index.htm
>
>An excellent discussion on comparative key lengths, including RSA and
>elliptic curves: http://www.cryptosavvy.com
>
>**********************************************************************
>Bruce Schneier, Counterpane Internet Security, Inc. Phone: 612-823-1098
>101 E Minnehaha Parkway, Minneapolis, MN 55419 Fax: 612-823-1590
> Free crypto newsletter. See: http://www.counterpane.com
------------------------------
From: [EMAIL PROTECTED] (Guy Macon)
Subject: Re: Random Noise Encryption Buffs (Look Here)
Date: 29 Nov 1999 11:10:53 PST
In article <[EMAIL PROTECTED]>,
[EMAIL PROTECTED] (Anti-Spam) wrote:
>
>Guy Macon wrote:
>>
>> In article <81rdc8$ovn$[EMAIL PROTECTED]>, [EMAIL PROTECTED] (Tom St Denis) wrote:
>>
>> >If things are to be randomly created, material must be randomly
>> >destroyed.
>>
>> Evidence, please.
>
>Gosh, this isn't crypto - but some things ARE randomly created and then
>randomly destroyed. The quantum vacuum qualifies as a physical example
>of material randomly created and then destroyed within a time defined by
>the Heisenberg uncertainty relation. Particle-antiparticle pairs
>materialize out of the energy of the vacuum and then recombine back into
>energy and disappear - constantly - like a seething ocean of
>"almost-ness." We are not aware of it at our level. And AFAIK there is
>no experiment demonstrating that knowledge of a fluctuation at any given
>time or position will predict the presence or absence of another
>fluctuation at another time or another position. That passes the test
>for randomness.
>
>Its effects have been measured at the macroscopic level as the Casimir
>Effect. The zero point energy of the quantum electrodynamic vacuum
>between two conducting, separated plates in a vacuum has been shown to
>decrease while the plates conduct - a decrease measured as an attractive
>force between the two plates. The larger magnitude zero-point energy
>vacuum outside the plates "pushes" them together. (Source - Lorentzian
>Wormholes, From Einstein to Hawking, ISBN 1-56396-394-9, C. 1995,
>Section 12.3.2, pg. 121 )
>
>AFAIK no one's proposed using the Casimir effect as a source of
>randomness.
I hereby propose using the Casimir effect as a source of randomness.
Details shall be left as an exercise for the student. ;)
Seriously, though, I agree that some things are randomly created and
randomly destroyed. The claim that we are examining is whether
this is a universal rule that must apply to all systems. I am not
so sure about this.
There are two logical possibilities:
[1] Sometime in the past, something was created from nothing.
[2] There is something that has been in existence for an infinite
amount of time, and had no beginning.
I haven't the slightest clue which of the above statements is true.
There is a test that we could run; the "If Tom St Denis doesn't
understand how it works it must be false" test can tell us which
statement is true. My only concern is that the world's scientists
might start competing for time on the finite resource that is Tom
St Denis, or that a terrorist could neutralize this infallible
resource by giving him a physics book.
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Random Noise Encryption Buffs (Look Here)
Reply-To: [EMAIL PROTECTED]
Date: Mon, 29 Nov 1999 19:02:18 GMT
James Felling <[EMAIL PROTECTED]> wrote:
[randomness from quantum events]
: Given that the present quantum theories are indeed accurate -- and we have
: no convincing evidence that they are not -- it is impossible to model an
: atom so as to predict a decay. (You can make statistical assertions, like
: "on average there should have been 10 decay events by time t", but you
: cannot make definite predictions.) The uncertainty principle prevents
: that. This type of phenomenon is fundamentally random.
: True, if a paradigm shift occurs and we come to understand these
: phenomena on a more fundamental level, all bets are off; but until that
: occurs (if ever it does), these events must be regarded as fundamentally
: random.
Even if one grants the thesis that quantum events may be considered
to be random, that does not help terribly much with building a secure
source of random numbers.
Anything that amplifies quantum phenomena to a usable size is itself a
macroscopic object. If you time radioactive decay emissions, you need a
timing device. You need a source whose characteristics are fixed, and
so forth.
Polarisation /looks/ promising - but you need to ensure nothing interacts
with the particle between its creation and its measurement that can affect
this in a systematic manner. This means perfect vacuums, miles of lead
shielding and worse. Then there's bias in the detector itself - with
what polariser can you be sure that exactly 50% of the particles get
through? It seems that the width of the slit cannot be made small enough
for this to happen. Consequently you're left with a biased source that
you have to fix up.
As far as I'm aware, the search for some direct physical source of
completely secure random numbers is a fundamentally hopeless one - due
to these sorts of effects.
There remains the question of whether post-processing can patch things
up by concentrating the resulting entropy. Certain types of bias /can/
- in principle - be completely removed by this type of processing.
As far as I know, the answer to this question is also negative.
If you know otherwise, feel free to present some sort of scheme you
believe would be workable.
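For what it's worth, one kind of bias that can in principle be removed is a
constant bias on otherwise independent bits -- von Neumann's pairing trick,
sketched below. It assumes independence, which is exactly what a real
physical source may fail to deliver, and it does nothing about correlations:

def von_neumann(bits):
    # Read the input two bits at a time; emit the first bit of a 01 or 10
    # pair, discard 00 and 11.  For independent bits with a fixed bias,
    # P(01) == P(10), so the output is unbiased.
    it = iter(bits)
    for a, b in zip(it, it):
        if a != b:
            yield a

# A heavily biased (but independent) source still yields balanced output:
# list(von_neumann([1, 1, 1, 0, 0, 1, 1, 1])) == [1, 0]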
--
__________
|im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
Laugh and the whole world thinks you're an idiot.
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: "The Code Book" challenge update
Reply-To: [EMAIL PROTECTED]
Date: Mon, 29 Nov 1999 19:09:13 GMT
Troed <[EMAIL PROTECTED]> wrote:
: I think there is a big hint available as to what the plaintext might
: be for stage 5, considering that the ciphertext is the same in all
: different prints/translations of the book.
Sheesh - Mr Singh must be paying you for all those French and German
editions of the book he will now sell to keen punters ;-)
--
__________
|im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
A LISP programmer knows the value of everything, but the cost of nothing.
------------------------------
From: [EMAIL PROTECTED] (Guy Macon)
Subject: Re: smartcard idea?
Date: 29 Nov 1999 11:22:08 PST
In article <81u86v$m93$[EMAIL PROTECTED]>, [EMAIL PROTECTED] ([EMAIL PROTECTED])
wrote:
>The only time an authorized user would get locked out is if their one
>time pad gets out of sync with the server.
It seems to me that in any situation where a one time pad gets out
of sync, the receiver could apply standard brute force techniques,
but instead of having to try all possible values of the one time
pad [note] he would only have to try a sliding window that traverses
the one time pad. It seems like this would be pretty easy to do.
Note: the brute force attacker, if he lives that long, will have a
copy of the original message. He will also have copies of every
other possible message of that length, many of which would seem to
be valid, and no way to tell which one is the real one.
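A sketch of that sliding-window resync; looks_valid is a stand-in (purely
hypothetical) for whatever framing or checksum the real protocol would use
to recognise a correct decryption:

def xor_decrypt(ciphertext, pad, offset):
    # XOR the ciphertext against the pad, starting at the given offset.
    return bytes(c ^ k for c, k in zip(ciphertext, pad[offset:]))

def resync(ciphertext, pad, looks_valid, max_slip=1024):
    # Slide a window over the pad until a decryption looks right.
    for offset in range(max_slip):
        plaintext = xor_decrypt(ciphertext, pad, offset)
        if looks_valid(plaintext):
            return offset, plaintext
    return None, None

# e.g., if every message were known to start with a fixed header:
# offset, msg = resync(ct, pad, lambda p: p.startswith(b"MSG:"))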
------------------------------
From: "tony.pattison" <[EMAIL PROTECTED]>
Subject: Re: Use of two separate 40 bit encryption schemes
Date: Mon, 29 Nov 1999 19:26:55 -0000
Many thanks for that.
Tony
Terje Mathisen <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> tony.pattison wrote:
> >
> > -----BEGIN PGP SIGNED MESSAGE-----
> > Hash: SHA1
> >
> > as I do not live in the land of the free, I'm not permitted to have
> > more than 40 bit DES (I don't know why not, perhaps if we had it,
> > we'd start asking for our colonies back ^_^). As this is pitifully
> > inadequate, I'm thinking of encrypting the data in my packets (again
> > 40 bit encryption) before I send them out over my 40 bit DES
> > encrypted lines.
> >
> > Would I get the equivalent of 80 bit encryption doing this, or would
> > it be less (the packet headers are not being encrypted by the first
> > encryption)?
>
> You would get up to 41 bit encryption doing this: You have to break two
> 40-bit codes to get at your data, which is equivalent to breaking a
> single 41-bit code.
>
> To double the effective number of bits, you must make it impossible to
> solve the problem by halves, but since the 40-bit encrypted line can
> clearly be decoded by itself, this doesn't work in your case.
>
> Sorry. :-(
>
> Terje
>
>
> --
> - <[EMAIL PROTECTED]>
> Using self-discipline, see http://www.eiffel.com/discipline
> "almost all programming can be viewed as an exercise in caching"
------------------------------
From: "tony.pattison" <[EMAIL PROTECTED]>
Subject: Re: Use of two separate 40 bit encryption schemes
Date: Mon, 29 Nov 1999 19:29:09 -0000
Johny,
thanks for that. Alas, the systems that I was thinking of do not have
access to reliable (new) encryption techniques.
I was trying to find a way around the problem, by using what I have at hand.
Thanks
Tony
Johnny Bravo <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> On Sun, 28 Nov 1999 20:58:56 -0000, "tony.pattison"
> <[EMAIL PROTECTED]> wrote:
>
> >-----BEGIN PGP SIGNED MESSAGE-----
> >Hash: SHA1
> >
> >as I do not live in the land of the free, I'm not permitted to have
> >more than 40 bit DES
>
> This is pretty funny coming from a person with a 128 bit encryption
> program. <grin>
>
> > As this is pitifully
> >inadequate, I'm thinking of encrypting the data in my packets (again
> >40 bit encryption) before I send them out over my 40 bit DES
> >encrypted lines.
>
> Screw DES, use something else.
>
> Best Wishes,
> Johnny Bravo
>
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************