Cryptography-Digest Digest #959, Volume #10      Sun, 23 Jan 00 12:13:01 EST

Contents:
  Re: Does RSA use real prime ? (Michael)
  Re: Intel 810 chipset Random Number Generator (Guy Macon)
  Re: McDonald's claims Nobel peace fries [off-topic] (Guy Macon)
  Re: Intel 810 chipset Random Number Generator (wetboy)
  Re: Challenge. (Paul Schlyter)
  Re: How much random needed for random (Tom St Denis)
  Re: ECC vs RSA - A.J.Menezes responds to Schneier (Bob Silverman)
  New Crypto Rules (SCOTT19U.ZIP_GUY)
  Re: MIRDEK: more fun with playing cards. (CLSV)
  Re: NIST, AES at RSA conference (CLSV)
  Re: MIRDEK: more fun with playing cards. (Rex Stewart)

----------------------------------------------------------------------------

From: Michael <[EMAIL PROTECTED]>
Subject: Re: Does RSA use real prime ?
Date: Sun, 23 Jan 2000 13:12:50 +0100

> As far as I know, the encryption/decryption of RSA is based on Fermat's theorem, which says
> 
> If p is a prime, then a^(p-1) = 1 (mod p) for a in Zn    (*)
> 
> I don't have a detailed proof of this. And I think it is not a two-way theorem. That
> means the relation might hold for some carefully chosen 'a' even when p is not a prime.

You are indeed correct. There exist numbers n that are not prime for
which the relation still holds. These numbers are called Carmichael
numbers; they have the property that n | a^n - a for every a in Zn even
though n is not a prime number. It has recently been proved that there
are infinitely many Carmichael numbers.

There are different ways of finding huge prime numbers, but here is the
one most commonly used in RSA systems:

The main idea is an observation made by Miller and Rabin:

Suppose that p is a prime number. If the square of a number x satisfies
x^2 = 1 (mod p), then x has to be either = 1 (mod p) or = -1 (mod p).
This holds because if p divides x^2 - 1 = (x-1)(x+1), then p must divide
(x-1) or (x+1) (a proof of this can be found elsewhere).

Now suppose that the exponent in Fermat's theorem is factored as

p-1 = (2^k)*q    (with q odd)

For an integer 1 <= a < p, we know that a^(p-1) = 1 (mod p), so that
a^(2^(k-1)q) has to be = 1 (mod p) or = -1 (mod p). If a^(2^(k-1)q) = 1
(mod p), then a^(2^(k-2)q) has to be = 1 (mod p) or = -1 (mod p) and so
on. We have proved that either 

a^q = 1 (mod p)

or we can find an i such that 0 <= i < k such that

a^((2^i)q) = -1 (mod p)

This can be made into an algorithm known as Rabin's primality test...

If we want to test whether an odd number N is a prime, we are sure that N
is not a prime if a^q <> 1 (mod N) and there is no 0 <= i < k such that
a^((2^i)q) = -1 (mod N), for a given integer a, where N - 1 = (2^k)q with q odd.

In case that's easier to understand, here's the algorithm in pseudocode...

choose random a, 1 < a < N
m = q
i = 0
if (a^m mod N = 1) return [PROBABLY PRIME]
TEST:
  if (i = k) return [NOT PRIME]
  if (a^m mod N = N - 1) return [PROBABLY PRIME]    (i.e. a^m = -1 (mod N))
  m = 2*m
  i = i + 1
  goto TEST
  

The reason this is efficient is that for an odd composite number N, at
most (1/4)phi(N) of the phi(N) integers less than N and relatively prime
to N are bases a that give the answer [PROBABLY PRIME]. The other at
least (3/4)phi(N) will immediately give the result that N is [NOT PRIME].

This of course requires a proof, but I won't give it here..

Bottom line: if we pick a random integer a relatively prime to N, the
probability that a composite N passes the test is at most 1/4. If we
choose 30 such a's, the chance that all 30 of them pass the test is at
most (1/4)^30. If N is a prime number this test will never discard it
as composite; if N is composite and we test it approx 50 times, we can
be pretty sure that the test will fail at some point. In real life the
Rabin test is usually invoked fewer than 50 times...
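
If working code is easier to follow than the pseudocode above, here is a
rough sketch in Java using BigInteger. It is an illustration only, not a
tuned RSA key generator: the class and method names are made up for this
post, and the Carmichael number 561 is just an example input.

import java.math.BigInteger;
import java.security.SecureRandom;

public class RabinTest {
    // One pass of the test described above for an odd N > 3 and a base a
    // with 1 < a < N.  Returns false when a proves N composite, true when
    // N is "probably prime" with respect to a.
    static boolean probablyPrime(BigInteger n, BigInteger a) {
        BigInteger one = BigInteger.ONE;
        BigInteger nMinus1 = n.subtract(one);
        int k = nMinus1.getLowestSetBit();       // N - 1 = (2^k) * q, q odd
        BigInteger q = nMinus1.shiftRight(k);
        BigInteger x = a.modPow(q, n);           // a^q mod N
        if (x.equals(one) || x.equals(nMinus1))  // a^q = 1 or -1 (mod N)
            return true;
        for (int i = 1; i < k; i++) {
            x = x.multiply(x).mod(n);            // a^((2^i)q) mod N
            if (x.equals(nMinus1))               // hit -1 (mod N)
                return true;
        }
        return false;                            // N is certainly composite
    }

    public static void main(String[] args) {
        SecureRandom rnd = new SecureRandom();
        BigInteger n = new BigInteger("561");    // a Carmichael number
        boolean allPassed = true;
        for (int t = 0; t < 30 && allPassed; t++) {
            BigInteger a;
            do {
                a = new BigInteger(n.bitLength(), rnd);
            } while (a.compareTo(BigInteger.valueOf(2)) < 0
                     || a.compareTo(n.subtract(BigInteger.ONE)) >= 0);
            allPassed = probablyPrime(n, a);
        }
        System.out.println(n + (allPassed ? " is probably prime" : " is composite"));
    }
}

Running it on 561 shows the point about Carmichael numbers: it fools
Fermat's test for bases coprime to it, but not the Rabin test.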

-- 
IMPORTANT NOTE:
To reply by e-mail you MUST include the word I-ACK in the subject

------------------------------

From: [EMAIL PROTECTED] (Guy Macon)
Subject: Re: Intel 810 chipset Random Number Generator
Date: 23 Jan 2000 07:49:44 EST

In article <86dvah$aht$[EMAIL PROTECTED]>, [EMAIL PROTECTED] (Michael 
Kagalenko) wrote:
>
>Guy Macon ([EMAIL PROTECTED]) wrote 
>]In article <86au71$m0n$[EMAIL PROTECTED]>, [EMAIL PROTECTED] (Michael 
>Kagalenko) wrote: 
>]>
>]>Guy Macon ([EMAIL PROTECTED]) wrote 
>]>]
>]>]I can see the logic behind trying, but not if you are looking for
>]>]a good RNG.  But what if you are looking for a cheap RNG?
>]>]a cheap crystal (or, better yet, ceramic) oscillator costs very
>]>]little, and hooks up to a serial or parallel port easily, and is
>]>]pretty much immune to 60 Hz electrical noise.  I agree with
>]>]everything said about the lack of randomness, though.  You really
>]>]are just measuring fine differences in the time between reads
>]>]of your serial/parallel port.  Such a circuit, if Von Neumann
>]>]compensated and exclusive-or'ed with the output of the best PRNG
>]>]you can program, would seem to be, on a practical level,  much
>]>]superior to the PRNG alone.
>]>
>]> No. You are wrong. So long as you can reliably count the number of cycles
>]> of quartz crystal, you can use this to recover thermally random numbers.
>]> Temperature dependence may indeed be a problem, but it can be accounted
>]> for either by thermostabilising or by simply measuring it and feeding
>]> it into computational process. 
>]
>]I made at least six claims in the paragraph above, and I cannot
>]tell from your response exactly what I wrote that prompted the
>]"You are wrong" comment.  Could you elaborate as to exactly what
>]you are disagreeing with?  I stand by what I wrote above.
>]
>
> It is too bad that you stand by it, because a lot of it is bogus.

Forgive me if I ignore claimed bogosity that you fail to identify.

You may wish to examine your habit of namecalling without mentioning
exactly what you disagree with or why.  Your current practice is
sure to cost you in the areas of career advancement and personal
relationships.

> The most bogus part is
>
>>  I agree with
>> everything said about the lack of randomness, though.  You really
>> are just measuring fine differences in the time between reads
>> of your serial/parallel port. 
>
>  As I said above, you can obtain truly random data by measuring clock
> drift due to thermal noise in the crystal.

And you think that the circuit I described will accomplish this?
Or are you postulating a high precision (better than the crystal)
measurement system then complaining that my comments about a cheap
crystal oscillator hooked to a parallel port don't describe your
system?  Thanks for defining "Bogus" by example.

You are also confusing "has some randomness, however small"
with "random enough to be a good RNG for crypto".


------------------------------

From: [EMAIL PROTECTED] (Guy Macon)
Subject: Re: McDonald's claims Nobel peace fries [off-topic]
Date: 23 Jan 2000 07:52:25 EST

 
Paul Schlyter wrote:
>
>Jerry Coffin wrote:
> 
>> [EMAIL PROTECTED] wrote:
>> 
>>> Either we all die, or we quit fighting :-)
>> 
>> That's not an either/or situation: if we all die, we'll surely quit 
>> fighting.
> 
>Not necessarily ... in old Norse mythology, you went to Valhalla
>after death, where you continued fighting.... :-)))

I believe that the Klingon afterlife shares this property. :)


------------------------------

From: wetboy <[EMAIL PROTECTED]>
Subject: Re: Intel 810 chipset Random Number Generator
Crossposted-To: sci.physics
Date: Sun, 23 Jan 2000 13:55:40 GMT

In sci.physics Paul Koning <[EMAIL PROTECTED]> wrote:
: Michael Kagalenko wrote:
:> 
:> Paul Koning  ([EMAIL PROTECTED]) wrote
:> ]Of course they do.  But their signal to noise ratio is high.
:> ]If you're after noise, then you want a source that has a low
:> ](preferably negative) signal to noise ratio.  Crystals fail
:> ]that criterion by a very large margin, which is why no competent
:> ]designer uses them for this purpose.
:> 
:>  That's fair comment, however note, that quartz crystals are a very common
:>  component of digital equipment, and atomic time standard is available
:>  via internet. You can produce thermally random
:>  data by measuring the clock drift against more precise clock (first
:>  you'd have to find out the crystal frequency, of course). To elaborate
:>  a bit, if t is precise time, and t' is the time measured by quartz
:>  oscillator (recalibrated by using t to avoid systematic drift),
:>  then
:>  <t-t'> = 0     (1)
:> 
:> (<> stands for math. expectation), however, that does not
:>  mean that there is no drift, but that drift in both directions is equiprobable
:>  (the recalibration I mentioned above consists in making sure that (1)
:>  holds)
:> 
:>  If the drift can be assumed to be brownian random walk,
:>  the average square drift < (t-t')^2 > grows linearly with time
:> 
:>  < (t-t')^2 > = constant * t

: I'm not sure what you mean by "thermally random data".  It seems
: that you're making a pile of assumptions that may or may not
: be valid.

: The error of a crystal oscillator comes from a number of components.
: First of all there's the manufacturing tolerance of the crystal,
: typically 0.01%.  Note that in practice this means the error is
: close to -0.01% or +0.01%, a bimodal distribution.
< snip >

Do you know that to be an actual fact, or is that a cynical
assumption you're making?

-- Wetboy

------------------------------

From: [EMAIL PROTECTED] (Paul Schlyter)
Subject: Re: Challenge.
Date: 23 Jan 2000 13:36:35 +0100

In article <86em8o$ckq$[EMAIL PROTECTED]>,  <[EMAIL PROTECTED]> wrote:
 
> In article <[EMAIL PROTECTED]>,
>   Roy Longton <[EMAIL PROTECTED]> wrote:
>
>> Second that.  For all I know, he could've used OTP.
> 
> Yeah, but I didn't.
 
So you claim -- but can you prove your claim?  <g>
 
If you don't supply additional detail, there are an extremely large
number of possible solutions to your crypto "challenge", and you
cannot expect anyone to read your mind and tell which one of these
you actually used.

In real-world cryptanalysis, cracking a crypto is often impossible
if only a small amount of ciphertext is available, even if the
crypto used is quite unsafe.  What distinguishes a safe crypto
from an unsafe one is that the safe crypto will resist cryptanalysis
even when huge amounts of ciphertext are available, and also when large
amounts of cleartext-ciphertext pairs are available.  And if one
particular key is captured, a safe crypto shouldn't make it any
easier to crack other keys.

There are a lot of amateur cryptologists who submit only a small amount
of ciphertext, asking people to crack it.  When no-one succeeds,
they jump to the conclusion that their crypto is safe, which it isn't
in the vast majority of cases.  Instead you should reveal your encryption
algorithm for others to analyze -- if you do, you can expect a lot of
responses about the safety or the weaknesses of your algorithm.  It's
really only through public review that truly safe encryption algorithms
are created (unless you hire your own staff of expert cryptologists --
however, only the military can afford that).


-- 
================================================================
Paul Schlyter,  Swedish Amateur Astronomer's Society (SAAF)
Grev Turegatan 40,  S-114 38 Stockholm,  SWEDEN
e-mail:  [EMAIL PROTECTED]    [EMAIL PROTECTED]   [EMAIL PROTECTED]
WWW:     http://hotel04.ausys.se/pausch    http://welcome.to/pausch

------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: How much random needed for random
Date: Sun, 23 Jan 2000 14:48:48 GMT

In article <[EMAIL PROTECTED]>,
  "Yoni M." <[EMAIL PROTECTED]> wrote:
> I'm using SecureRandom (java) in my SSL implementation to produce the
> random bytes needed for the challenges. My problem is performance; it
> takes too long to generate the bytes. If I'm using a simple Random
> everything is much faster.
> Can I initialize the secure random with the current time (long), or
> isn't it enough ? This way is much faster than initializing the
> SecureRandom without any seed.
> Does anyone know of a short cut or suggestion how to improve the
> performance without reducing security ?
>
> Yoni.

I don't know much about Java but I bet their super fast RNG is just a
linear congruential generator.  In which case DON'T USE IT FOR ANYTHING
REMOTELY SECURE.
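
I don't know your exact setup, but a rough sketch of what I'd try (names
made up, untested): create a single SecureRandom up front and reuse it
for every challenge, instead of constructing or reseeding one per
connection.  The expensive part is the initial self-seeding, not the
later calls to nextBytes().  Seeding with only the current time is a bad
idea in any case, since the time is easy for an attacker to guess.

import java.security.SecureRandom;

public class ChallengeSource {
    // Self-seed once (the slow part happens here), then reuse the
    // instance for every challenge.
    private static final SecureRandom RNG = new SecureRandom();

    static byte[] nextChallenge(int len) {
        byte[] b = new byte[len];
        RNG.nextBytes(b);               // cheap compared with the initial seeding
        return b;
    }

    public static void main(String[] args) {
        byte[] c = nextChallenge(16);   // length here is only an example
        System.out.println("challenge of " + c.length + " bytes generated");
    }
}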


Tom


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: Bob Silverman <[EMAIL PROTECTED]>
Subject: Re: ECC vs RSA - A.J.Menezes responds to Schneier
Date: Sun, 23 Jan 2000 15:48:47 GMT

In article <86asr6$sko$[EMAIL PROTECTED]>,
  Greg <[EMAIL PROTECTED]> wrote:
>
> > Not in practice.  The security of the key generally depends
> > only on the size of the key.
>
> So what you are saying is that given a 10kbit key, no matter
> how I construct the key (say make up of 500 20 bit primes, it
> cannot be broken because the number is just too large for anyone
> to figure out?  Is that what you are saying?

May I suggest you read a bit more carefully?  Where did I say
that it can not be broken no matter how you construct the key?

I said that ** IN PRACTICE ** the security of the key depends only
on its size.  This is because primality tests (even probabilistic
ones) will screen out deliberately constructed p, q which have a
large number of prime factors.  Your purported 10 Kbit key made from
two integers each the product of 250 primes will not get past
M-R.


> > > and with a quick check because the strength lies in the
> > > key itself.  With ECC, the strength lies in the curve, not the
> > > key,
> >
> > The security of ECC certainly does depend on the choice of
> > private key because if it is small, then it can be found
> > by direct search.
>
> But can you not say that about any key search for any cryptosystem?

No.  For RSA (e.g.) the keysize is fixed.  For EC the private keysize is
a random variable (with an exponential distribution with mean 1 bit
less than the field size).


> The strength of ECC is in the key space being too large to
> search and too complex to readily determine the factor,

What "factor"?  If you wish to discuss this subject stick to
standard terminology.


> which
> is a result of the curve and field being used.  If you build
> an ECC with field size n and you don't want to use key values
> less than n/2, then when you go to build the key, make certain
> that one of the bits between n/2 and n are set.

Existing STANDARDS permit the use of a private key selected at
random.  Note also that if you do implement your suggested scheme,
you have reduced the time needed to break the key by a factor of sqrt(2).


> Just one of them
> will solve this problem you point out, particularly if a key space
> of n/2 is unsearchable to begin with.
>
> > Further, if it can be guessed that the private key lies in
> > a restricted range,...
>
> ...then there is something wrong,

Why? Cryptographers worry all the time about partial key revelation.
It is a standard tool in cryptanalysis.


> but what ever it is that is
> wrong has nothing to do with ECC or any other cryptosystem.

WRONG.  With ECC this information is very useful in attacking the key.
With RSA it is not (unless you reveal TOO MUCH).

>
> > With RSA, if the top 20 bits of a key somehow get revealed,
> > that info is of no use to NFS. But with ECC if the top 20
> > bits get revealed, then you can speed up attacks on the key
> > by a factor of 1000.
>
> You are correct, but again this has absolutely nothing to do
> with ECC itself.  However, if this is such an important issue,
> let's talk about it for a moment.
>
> Given an elliptic curve of 500 bits and an RSA key of 500 bits,
> if you expose half the bits, which key is likely to be cracked in
> our life time?

 May I suggest that
you go read the literature on this subject? Especially in terms
of what can be done and what has been done?

512 bit RSA is breakable in a month even with no bits revealed.

And 500-bit ECDL is so far beyond what is needed that no one is
suggesting the use of keys this large.
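
(For the record, the "factor of 1000" quoted above is just the
square-root behaviour of generic discrete-log attacks: revealing the top
20 bits shrinks the search space by 2^20, so the running time drops by
sqrt(2^20) = 2^10.  A trivial back-of-the-envelope check in Java, purely
illustrative:)

public class SpeedupEstimate {
    public static void main(String[] args) {
        // Generic square-root attacks on the EC discrete log run in time
        // proportional to the square root of the search space.  Revealing
        // 20 bits of the private key shrinks the space by 2^20, so the
        // running time drops by sqrt(2^20) = 2^10.
        double speedup = Math.sqrt(Math.pow(2, 20));
        System.out.println("speedup ~ " + speedup);   // 1024.0, i.e. "a factor of 1000"
    }
}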



--
Bob Silverman
"You can lead a horse's ass to knowledge, but you can't make him think"


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: New Crypto Rules
Date: Sun, 23 Jan 2000 17:05:39 GMT

 Could someone be nice enough to state plainly what is the minimum
one legally needs to do to put one's own free source code for one's
encryption software out on the net for others to download. Please don't
say look at the BXA rules. They are of no help to those of us in the game
for fun. I do not wish to make a profit. I just want the minimum hassle of
putting my source code on the net for FREE.



David A. Scott
--

SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
                    
Scott famous encryption website NOT FOR WIMPS
http://members.xoom.com/ecil/index.htm

Scott rejected paper for the ACM
http://members.xoom.com/ecil/dspaper.htm

Scott famous Compression Page WIMPS allowed
http://members.xoom.com/ecil/compress.htm

**NOTE EMAIL address is for SPAMERS***

I leave you with this final thought from President Bill Clinton:

   "The road to tyranny, we must never forget, begins with the destruction of the 
truth." 

------------------------------

From: CLSV <[EMAIL PROTECTED]>
Subject: Re: MIRDEK: more fun with playing cards.
Date: Sun, 23 Jan 2000 16:56:37 +0000

Paul Rubin wrote:
 
> In article <[EMAIL PROTECTED]>,
> Paul Crowley  <[EMAIL PROTECTED]> wrote:

> >I can think of one way, which is simply to start the new message with
> >the state where you left off the old message, but this requires that
> >the recipient either receive all of your messages (unlikely) or at
> >least know how long they all were (OK if your recipient is decrypting
> >with a computer).
 
> Why not just do what CipherSaber does, which is combine the
> passphrase with a random salt before doing the key setup, and
> send the salt in the clear?

The problem is that you want to skip key setup because
it is very labor intensive. But I posted an alternative,
such as initializing I and J with values other than 0.
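
For reference, the CipherSaber-style setup Paul describes amounts to
appending a fresh salt to the passphrase and running the usual ARC4 key
setup over the result, roughly the following in software (a sketch only,
with made-up names; with cards, that full shuffle is exactly the
labor-intensive step one would like to avoid):

import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;

public class SaltedSetup {
    // ARC4-style key setup over (passphrase || salt).  Only the passphrase
    // is secret; the salt is fresh per message and sent in the clear.
    static int[] keySetup(byte[] key) {
        int[] s = new int[256];
        for (int i = 0; i < 256; i++) s[i] = i;
        int j = 0;
        for (int i = 0; i < 256; i++) {
            j = (j + s[i] + (key[i % key.length] & 0xff)) & 0xff;
            int t = s[i]; s[i] = s[j]; s[j] = t;   // swap
        }
        return s;
    }

    public static void main(String[] args) {
        byte[] pass = "my passphrase".getBytes(StandardCharsets.US_ASCII);
        byte[] salt = new byte[10];                // sent along with the message
        new SecureRandom().nextBytes(salt);
        byte[] key = new byte[pass.length + salt.length];
        System.arraycopy(pass, 0, key, 0, pass.length);
        System.arraycopy(salt, 0, key, pass.length, salt.length);
        int[] state = keySetup(key);               // a fresh permutation per message
        System.out.println("state[0] = " + state[0]);
    }
}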

Regards,

        CLSV

------------------------------

From: CLSV <[EMAIL PROTECTED]>
Subject: Re: NIST, AES at RSA conference
Date: Sun, 23 Jan 2000 16:53:58 +0000

John Savard wrote:

> Unfortunately, the market fails to perceive a pressing need for
> fundamentally better ciphers than the ones we already have.

I think you underestimate the market. The most pressing
need for encryption is probably in the financial sector.
They have knowledgeable people (and the money to pay them).
Example: one of the people who knows a lot about prime
numbers (A. Lenstra) works for Citicorp.

> One has the alternative of leaving things to the free market, or
> asking for some kind of intervention in the free market: such as
> having the government fund research on drugs that for some reason are
> unprofitable to pharmaceutical companies, or having the government pay
> to land men and women on Mars.

Well, governments *are* paying experts to research
cryptography. In the United States they are concentrated
in a certain Fort Meade in Maryland. What more could one
wish? ;-)

But seriously, I think this thread makes up a crisis (concerning
pure cryptographic research) where no real crisis exists.

Regards,

        CLSV

------------------------------

From: Rex Stewart <[EMAIL PROTECTED]>
Subject: Re: MIRDEK: more fun with playing cards.
Date: Sun, 23 Jan 2000 16:48:58 GMT

I have been practicing Mirdek and have noticed several things.

First, because I haven't done much with card deck cyphers
in a while, I am rusty, and it appears to take about 15 hours
to become proficient.  This is an important fact to convey to anyone
planning to use these cyphers in a "real world" situation.  The
first several hours of this will be spent getting their mind used to
seeing the letters of the alphabet on the red and black cards and
learning the rules of the encryption system.  You can't just agree
on a system and run off to a hostile area and expect to do this
cold turkey.  Also, this means if they have trouble at first they
should not give up too easily.

At this point I do not feel proficient enough to state which of
ARC4-52, Mirdek, or even Solitaire - and I haven't even begun
to look at John Savard's (unnamed?) card deck cypher - is
fastest. I expect it will require several more weeks of practice
before I will feel comfortable making that statement.  I think it
would be interesting if someone were to take the time to make
a table showing all of the operations used in each cypher, how
many of each operation are required in the keying and encryption
routines, and how long an "average" person would take (after
some practice) to execute each operation.  Using a generic
"timer tic" unit of time measurement would allow comparing
speed on one operation to another while obscuring the
comparison of the speed of one person to another.

Again, I must point out, this is something that may never be agreed
upon:  which operations take longer, or are more complex, for
a human.  The time it takes a Pentium or K6 processor to do
a particular operation can be looked up in a book - and if one
operation is quicker than another (addition is quicker than
division) on one processor, it is likely to be quicker on most
of them.   Humans are much more variable. I can do some
math functions in my head that astonish most algebra and trig
teachers - but as I stated above, trying to remember a couple
of numbers while counting to 31 befuddles me.  And I can't
manipulate cards with my hands as quickly as many of my
card playing friends.

I would expect this non-agreement to be increased because the
author of a particular cypher will likely have much more practice
with his/her own cypher than with other cyphers.

While reading the following, it should be noted:
I am not a cryptographer.  Especially every time you see
the words "should" or "seems" or "appears"   :-)

While there has been a lot of discussion about bias in the output,
I have been looking at all of these cyphers from a different point
of view.  A bias of several percent in the output in a hand cypher
is not nearly as important as insulating the factors that are known
(or can be guessed) to an adversary from those that the adversary
does not know.  While experimenting with my own variations of
card deck cyphers, I was able to create one that I am certain has
very balanced output, but when I looked at a known plaintext attack,
it was a disaster.  It leaked over two bits of state information for
every consecutive known plaintext letter.

Not that the discussions on bias are completely unwarranted,
but I wonder: in 60 separately keyed messages of 400 characters
each, with 30 consecutive known plain text bytes - how much
state information would you expect the bias to reveal?  (See the
disclaimer above, but in this case I don't think the bias would
be significantly useful.)

On output feedback cyphers based on ARC4 variants (like
Solitaire) insulating the state information (and therefore the key)
from the output seems fairly straightforward, and in fact Solitaire
seems to have two layers of such insulation. The downside of
course is the requirement to use a new key for each message.
In cyphers which incorporate the cyphertext or plaintext into the
state entropy, this can be much more difficult.  Terry Ritter
posted a message a few years back about this problem with
his Dynamic Substitution.  An explanation of the problem can
probably be found on his website.

I was first bothered by the open transmission of the IV in
the message, but on closer observation, swapping the
decks between keying and mixing seems to effectively
hide this information by reordering the IV based on the
key.  This cypher adds entropy to the state information in
much the same way as Terry Ritter's Dynamic Substitution,
and that could be a problem, although the simultaneous
adding of entropy from the right hand deck should provide
enough confusion (it adds 3.5 bits entropy each round, the
plain text adds 1.3 bits) to preclude recovery of significant
state information in the case of a known plaintext attack
(I am not so sure about a chosen plaintext attack).

The 26 mixing steps themselves seemed at first to be
aimed at insulating the key from the state information,
but because the IV is sent in the clear this is not the case
and in fact Paul Crowley, the author, clearly warns it is not
the case.  These steps appear aimed at using the key
information to hide the IV.  As such they seem excessive,
since every card in the IV should be affected by at least
most of the key information (I have not done a full statistical
analysis of this) in the first seven mixings, and should be
affected twice after only ten mixings.

In fact, in all of these card deck cyphers, it seems the keying
phase is the biggest single chunk of overhead - and it seems
in all of them the keying is somewhat excessive.  OTOH,
I don't have a degree in statistics and am not prepared to
present a universally acceptable alternative.

--
Rex Stewart
PGP Print 9526288F3D0C292D  783D3AB640C2416A


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
