Cryptography-Digest Digest #852, Volume #11      Wed, 24 May 00 14:13:01 EDT

Contents:
  Re: RSA/PK Question (Tom St Denis)
  Re: Matrix reduction ([EMAIL PROTECTED])
  Re: Yet another block cipher: Storin ([EMAIL PROTECTED])
  HTML encryption (DigiboyCiPHER)
  The Code Book / Are factor techniques really that secure? (DigiboyCiPHER)
  Re: Smooth numbers (Anton Stiglic)
  Re: Yet another block cipher: Storin (Mark Wooding)
  Re: The Code Book / Are factor techniques really that secure? (Mark Wooding)
  Re: Encryption within newsgroup postings (Anton Stiglic)
  Re: Chosen Plaintext Attack (Raphael Phan)
  Re: Patent state of Elliptic Curve PK systems? (Mike Rosing)
  Re: RSA/PK Question ("Eric Verheul")
  Re: Chosen Plaintext Attack (Raphael Phan)
  Re: Crypto patentability ("Paul Pires")
  Re: The Code Book / Are factor techniques really that secure? (Bob Silverman)
  Re: RSA/PK Question (Bob Silverman)
  Re: Smooth numbers (Bob Silverman)
  Re: Smooth numbers (Francois Grieu)

----------------------------------------------------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: RSA/PK Question
Date: Wed, 24 May 2000 15:59:38 GMT

In article <8gghmg$6ul$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] wrote:
> If Kp and Ks are the public and secret RSA keys and if Ka is a random
> key of the same length:
>
> If Ka is XORed with Kp and Ks independently, i.e.
>
> Kp` = Ka XOR Kp
> Ks` = Ka XOR Ks
>
> can one still encrypt with Kp` and decrypt with Ks` using RSA?
>
> My second question relates to long RSA keys... Is it reasonable to
> assume that a 10000-bit RSA key will take 10 times as long to
> generate as a 1000-bit RSA key?... Would appreciate some key
> generation timings... for long key pairs...

Question 1:  Why on earth would you want a 10kbit RSA key?  That's the
first sign you don't know what you are doing.

Question 2:  Why would you assume it would take linearly more time?
Because you don't know what you are talking about?

I admit I am not the god of math here, but be realistic and just ask
how RSA keys are made.  Once you figure that out, you will see why
10kbit keys could take you a day or so to make (well, it would take a
long time, to say the least).
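To make that concrete, here is a toy sketch (illustrative code of my
own, not any particular library's keygen): RSA key generation means
finding two big random primes, each Miller-Rabin test on a b-bit
candidate costs roughly b^3 bit operations, and on the order of b
candidates must be tried, so the work grows like b^4, not linearly.

import random, time

def is_probable_prime(n, rounds=20):
    # Miller-Rabin; each test costs roughly (bits of n)^3
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def random_prime(bits):
    # by the prime number theorem, expect on the order of `bits` tries
    while True:
        n = random.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(n):
            return n

for bits in (256, 512, 1024):   # halves of 512-, 1024-, 2048-bit moduli
    start = time.time()
    random_prime(bits)
    print(bits, "bit prime:", round(time.time() - start, 2), "s")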

Tom



------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Matrix reduction
Date: 24 May 2000 11:25:42 -0400

In article <8gg194$s1c$[EMAIL PROTECTED]>, [EMAIL PROTECTED] writes:
>In article <8ge93d$[EMAIL PROTECTED]>,
>  [EMAIL PROTECTED] wrote:
>> In article <[EMAIL PROTECTED]>, Chris Card
...
>> >The (half-baked) idea I had was to get a random linear
>> >combination of the vectors, score it (= number of non-zero
>> >elements) and then slightly change the combination looking for
...
>> >combination, which doesn't sound too bad. But I've no idea
>> >how likely this approach is to get a linear dependency, and how
>> >long it's likely to take.

   I agree (and so, faithfully included) that the above considerably
   clarifies the intent of the first post.  I was mostly wondering
   whether your proposal of a new(?) matrix reduction method had gotten
   past preliminary estimates;  and, if so, whether you had in mind
   problems of the size that have actually occurred (not to mention
   the sizes that will occur as we push toward the next benchmark).
   Thinking about a few K rows isn't the same as thinking about tens
   of millions, especially in the near-vacuous range of sparseness
   of these factoring matrices.
     Our friend Bob Silverman asserts that block Lanczos is near the
   theoretical best-possible runtime.  If you're saying that you haven't
   considered formulating a runtime for your method, then there's still
   some considerable effort required to reach half-baked.
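    For concreteness, the quoted hill-climbing idea might look like the
    following toy sketch (my own illustration with made-up names, not
    Chris Card's code): rows of a GF(2) matrix are stored as integer
    bitmasks, a candidate is a subset of rows, the score is the weight
    of their XOR, and single rows are toggled greedily, hoping to reach
    weight zero (a dependency).

import random

def weight(x):
    return bin(x).count("1")

def find_dependency(rows, max_steps=100000):
    # rows: list of ints, each the bitmask of one matrix row over GF(2)
    n = len(rows)
    subset = random.getrandbits(n) | 1          # random nonempty subset
    combo = 0
    for i in range(n):
        if (subset >> i) & 1:
            combo ^= rows[i]
    for _ in range(max_steps):
        if combo == 0 and subset != 0:
            return subset                       # bitmask of dependent rows
        i = random.randrange(n)
        cand = combo ^ rows[i]
        if weight(cand) <= weight(combo):       # accept non-worsening moves
            combo ^= rows[i]
            subset ^= 1 << i
    return None                                 # stuck on a plateau

# Example: 6 rows of 8 columns; prints the bitmask of some dependent
# subset, e.g. 0b111 (rows 0, 1 and 2 XOR to zero).
rows = [0b10110010, 0b01101001, 0b11011011,
        0b10010110, 0b00111100, 0b01010101]
print(find_dependency(rows))

    Whether anything like this can compete with structured elimination
    or block Lanczos on a sparse factoring matrix with tens of millions
    of rows is exactly the open question in this thread.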

>>  Since my first two attempts at replying to "Chris Card" off newsgroup
...
> ... deja.com was upgrading at the time (and let's not
> get into arguments about newsreaders - I only had web access at that
> point), and Remarq kindly protected me against spam, and against
> helpful answers too, it seems. I am very interested in gnfs and I'm
> on the learning curve at the moment (reading Murphy's thesis is one
> of the things I'm doing). ... this idea about hill-climbing ...

    The reference in Arjen's code for the Montgomery/Murphy polynomial
    selection, presumably taken directly from one of Peter's or Brian's
    codes, does use a steepest descent, but from a 1977 numerical
    analysis text rather than from recent cryptographic suggestions.
    There's something to be said in favor of not mixing things
    together unless there's a good reason to.

>>   I'm not familiar with anything other than block Lanczos, ...
...
>I thought my second posting above explained what I had in mind.
>Is it rubbish? (e.g. unlikely ever to find a dependency, or much too
>slow?)

    I don't have anything to contribute on your question (hence the
    attempts to reply off-newsgroup), but while we're waiting for a
    comment from an expert with sufficient patience to reply, perhaps
    I could append my standard advertisement for participation in the
    current Cabal project (the group that factored RSA-155).  The site
    is http://www.lehigh.edu/~bad0/cabal773.html, and since my previous
    post/solicitation we've gotten well past the range in which the
    feasibility of our (special!) nfs factorization of the 774-bit
    number 2^773+1 is still in question.
       Despite my snide comment about Microsoft monopolizing his
    attention (so to speak), we do expect that the factoring matrix
    will be harder than the one from RSA-155 (despite substantial
    strategic adjustments, such as keeping the factor bases small),
    and that Peter will be thinking about getting his Lanczos method
    to stretch to this new record-largest test case.
       B. Dodson


------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Yet another block cipher: Storin
Date: Wed, 24 May 2000 16:13:14 GMT

In article <[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] (Mark Wooding) wrote:
...

> My objective is to test the strength of the matrix multiplication as
> a primitive.  If it looks good, I may try building a faster stream
> cipher around it.  Even as it is, I don't think Storin is terribly
> slow on 24-bit DSPs.
>
> You can fetch the package from the usual place (which, if memory
> serves, is http://www.wizard.net/~echo/crypto-contest.html) if you
> like.  You can get just the paper, as compressed postscript, from my
> own web pages at http://www.excessus.demon.co.uk/crypto/storin.ps.gz.
>
> Any cryptanalysis is welcome.
>
> -- [mdw]

Mr. Wooding,

Perhaps I have found a weakness.  One weak key is present if 864-bit
keys are allowed.  The schedule sets up a fixed array, P in Blowfish
notation.  The P array is then XORed with the user key.  Since the
fixed P array is known, one key will result in an all-zero array after
the XOR.  With the zero vector as input, the output will also be the
zero vector.  This loop will repeat, causing all the round keys to be
the zero vector!  The cipher will then essentially have no key.  By
encrypting the zero vector, an attacker can tell if the weak key is
used.

Blowfish avoids this problem by setting the max key length to be 14
words instead of 18.
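A toy model shows the degeneracy (a hypothetical sketch with made-up
constants and a stand-in round function, not Storin's actual
schedule): once the XOR zeroes the array, every self-encryption of
the zero block returns zero.

# P is the fixed, public array; the weak user key is P itself.
P = [0x123456, 0x789abc, 0xdef012, 0x345678]   # made-up constants

def toy_encrypt(block, round_keys):
    # stand-in "cipher": just XOR the round keys into the block
    for k in round_keys:
        block ^= k
    return block

def toy_schedule(user_key):
    state = [p ^ k for p, k in zip(P, user_key)]   # all zero for weak key
    round_keys, block = [], 0
    for _ in range(8):
        block = toy_encrypt(block, state)          # 0 in, 0 out, forever
        round_keys.append(block)
    return round_keys

print(toy_schedule(P))   # user key == fixed array -> [0, 0, 0, 0, 0, 0, 0, 0]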

Also, different-length equivalent keys seem to be present.  The key A
will yield the same schedule as AA or AAA, etc.  This is not a
weakness, just a curiosity.  Blowfish shares this property.

--Matthew



------------------------------

From: DigiboyCiPHER <[EMAIL PROTECTED]>
Subject: HTML encryption
Date: Wed, 24 May 2000 16:33:55 GMT

Is there any easy way to encrypt HTML source while keeping the page
usable in non-JavaScript browsers?

--
======
Marcus
======
www.cybergoth.cjb.net
Techno gothic cyberculture



------------------------------

From: DigiboyCiPHER <[EMAIL PROTECTED]>
Subject: The Code Book / Are factor techniques really that secure?
Date: Wed, 24 May 2000 16:32:36 GMT

The subject line is possibly vague, but I have recently been finishing
off The Code Book by S. Singh (I spent more time trying to crack the
ciphers at the back) and eventually got to the end, where it talks
about RSA, PGP, etc.

Surely there are speedier ways to factorise large numbers than going
through _every_ possibility.

---

I mean, for example, on p276 Singh states that he came up with the
number 1709023 in ten seconds but that it would probably take a whole
afternoon to find the prime factors using only a calculator. I found
them in almost exactly 5 minutes.

(Using deduction of sorts, you can say that the last digits of the
factors must be either 1x3 or 7x9, and therefore 3 is likely as a last
digit. Given the leading digit of 1 and the size of the number, it is
most likely that the factors are two four-digit numbers beginning with
1 - assuming human choice. So you end up with a number of the form
1--3, leaving only 100 options to go through - fewer if it were done
by a computer that knew the prime numbers. The two numbers I got were
1423 x 1201, by the way.)
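For comparison, even naive trial division (a minimal sketch, no digit
tricks at all) finds the factors instantly on a computer:

def smallest_factor(n):
    # try 2, then odd numbers up to sqrt(n)
    if n % 2 == 0:
        return 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f
        f += 2
    return n                    # n itself is prime

p = smallest_factor(1709023)
print(p, 1709023 // p)          # 1201 1423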

---

Enough of that! :)  My point being that it is probably wrong to assume
that even humongously larger numbers are unfactorable within a normal
time-span... there appear to be a lot of shortcuts that can get round
it, and with powerful enough computers (in existence, excluding
quantum ones) it's surely a doddle.  E.g. the governments seem to be
_relatively_ calm about the encryption techniques that are thought to
be unbreakable.

Ah well, just thought I'd post anyway...

======
Marcus
======
www.cybergoth.cjb.net
Techno gothic cyberculture



------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: Smooth numbers
Date: Wed, 24 May 2000 12:44:37 -0400

Eric Hambuch wrote:
> 
> Does anybody know the number of primes n, where n-1 is "smooth" (n has
> only small prime factors p_i of size O(log n)) ?
> 
> Any hints (or better proofs and references) are welcome !
> 
> Eric

There was a thread about this subject here not too long ago;
you can search on deja.com or go directly to the following link:


http://x74.deja.com/viewthread.xp?AN=620610788&search=thread&svcclass=dncurrent&ST=PS&CONTEXT=959186301.720764949&HIT_CONTEXT=959186301.720764949&HIT_NUM=6&[EMAIL PROTECTED]%3e%231/1&group=sci.crypt&frpage=getdoc.xp&back=clarinet

Anton

------------------------------

From: [EMAIL PROTECTED] (Mark Wooding)
Subject: Re: Yet another block cipher: Storin
Date: 24 May 2000 16:43:56 GMT

[EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:

> Perhaps I have found a weakness.  1 weak key is present if 864 bit  keys
> are allowed.  

You're right.  That's one of the reasons I don't recommend more than
120-bit keys ;-)

Note that, given a random user key, the attack succeeds with probability
2^{-864}.  I'm not sure that, given a keyspace of 2^864, this qualifies
as `better than brute force', actually. ;-)

> Blowfish avoids this problem by setting the max key length to be 14
> words instead of 18.

Actually, it's more subtle than that.  If you've read my paper, which
details my various bits of analysis, you'll have seen that I noted a
peculiar feature of the key schedule: the second round key doesn't
depend on the first one.  To see this, note that the first round key
is equal to the plaintext used to generate the second round key,
giving a zero output from the first round.

Blowfish has a variant of this property.  Consider the second encryption
used in the key schedule: P[0] and P[1] have just been set, and we're
going to encrypt them again to find the (new) P[2] and P[3].  The first
two rounds look like this:

   P[0]          P[1]
    |             |
   (+)<- P[0]     |
    |             |
    0            P[1]
    |             |
    |-----[F]--->(+)
    |             |
    |            (+)<- P[1]
    |             |
    0           F(0)
    |             |
   (+)<---[F]-----|
    |             |
 F(F(0))        F(0)

(+) is an XOR.

I think Bruce Schneier noticed this, and required a 14-word maximum in
order to ensure that the first 64 bits of the user key affected the
encryption of the second word pair in a part of the cipher earlier than
the postwhitening.

(This particular bit of Blowfish had puzzled me for a while.  I'm glad
to be able to finally put that problem to rest.)
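The cancellation in the diagram is easy to check in code (a toy
sketch; F below is an arbitrary stand-in, not Blowfish's real F
function): whatever values P[0] and P[1] take, feeding them in as the
plaintext makes both output halves depend only on F and 0.

def F(x):                       # arbitrary stand-in round function
    return (x * 0x9e3779b9 + 0x1234567) & 0xffffffff

def two_rounds(L, R, P0, P1):
    L ^= P0                     # round 1: L becomes 0 when L == P0
    R ^= F(L)
    R ^= P1                     # round 2: R becomes F(0) when R == P1
    L ^= F(R)
    return L, R                 # (F(F(0)), F(0)), independent of P0, P1

print(two_rounds(0xdeadbeef, 0xcafebabe, 0xdeadbeef, 0xcafebabe))
print(two_rounds(0x11111111, 0x22222222, 0x11111111, 0x22222222))
# both print the same pair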

> Also, different length equivalent keys seem to be present.  The key A
> will yield the same schedule as AA or AAA etc. This is not a weakness
> just a curiousity.  Blowfish shares this property.

Yes, I know this.  I don't think it's a worry, even for hashing modes,
since in that case the input size is fixed.

-- [mdw]

------------------------------

From: [EMAIL PROTECTED] (Mark Wooding)
Subject: Re: The Code Book / Are factor techniques really that secure?
Date: 24 May 2000 16:52:13 GMT

DigiboyCiPHER <[EMAIL PROTECTED]> wrote:

> Surely there are speedier ways to factorise large numbers than going
> through _every_ possibility.

There are *much* faster methods than trying every possibility.  They
still take too long, or use too much memory, for the sizes of numbers
currently used in public-key cryptography.

-- [mdw]

------------------------------

From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: Encryption within newsgroup postings
Date: Wed, 24 May 2000 12:54:11 -0400


> Posting:
> Xuk yffi wfubsq ptiua bleow lfm fefea pl kyv ezpc!
> 
> Ncldm chla meeo rujo efeh acy uofjf
> jysh umiym kqc pwsj let tt hodk nee
> fjelk ailkn daltd sjs mlmjp ifujb big ueiub eotpev ymwk
> 

I don't think that there is a two-letter word in English
that consists of the same letter twice (nor in French...).
Thus, "tt" would not decrypt to an English word under a
simple substitution cipher, unless it's something like a
Roman numeral... ?  "nee" looks suspicious as well.
Unless the cipher function is more complex, the text
might be simple "gibberish"...

my 2 .01$

Anton

------------------------------

Date: Thu, 25 May 2000 01:08:06 +0800
From: Raphael Phan <[EMAIL PROTECTED]>
Subject: Re: Chosen Plaintext Attack

That's the problem... without a clear understanding of how to apply a
chosen- or known-plaintext attack (how to generate pairs of chosen
plaintexts), it is hard to start implementing the suggested attacks...

Mark Wooding wrote:

> David A. Wagner <[EMAIL PROTECTED]> wrote:
>
> > This is the point where it is probably best to refer you to Biham and
> > Shamir's book, Differential Cryptanalysis of the Data Encryption
> > Standard, which gives all the gory details.
>
> This is now out of print (at least, according to Amazon it is).  Where
> could I get a copy from?
>
> -- [mdw]

--

" Contentment is not the fulfilment of what you want, it is the
 realization of how much you already have.  "

"   When you were born, you cried and the world rejoiced.
  Live your life in such a manner that when you die,
         the world cries and You rejoice...          "



------------------------------

From: Mike Rosing <[EMAIL PROTECTED]>
Subject: Re: Patent state of Elliptic Curve PK systems?
Date: Wed, 24 May 2000 12:13:54 -0500

Scott Contini wrote:
> There is a patent held on elliptic curves over  GF(p)  where  p  is
> a special form.  I believe the form is  2^n +/- c  where  c  is
> a small integer.  This is a patent that should really be fought,
> since the ideas of doing fast arithmetic on these primes have been
> well known for a long time.  (I think this patent is held by Apple).

That's Crandall's patent on EC-DH over GF(p^m) for p of special form.
The form is correct; the minus sign is the choice that makes things
fit in machine words.
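The underlying speed trick is generic and well known (a sketch of the
folding idea, not the patented method as such): for p = 2^n - c with
small c, the identity 2^n = c (mod p) lets you fold the high bits back
with one small multiplication instead of a general division.

N, C = 127, 1                   # example: p = 2^127 - 1
P = (1 << N) - C
MASK = (1 << N) - 1

def reduce_mod_p(x):
    # fold: if x = hi*2^n + lo, then x = hi*c + lo (mod p)
    while x >> N:
        x = (x & MASK) + (x >> N) * C
    return x - P if x >= P else x

assert reduce_mod_p((P + 5) * (P + 7)) == (5 * 7) % P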

> However, there are other primes, on which one can do arithmetic at
> approximately the same speed, that fall outside this patent.  These
> are primes whose signed binary representation is sparse.  I believe
> I have seen such primes in some standard, but I do not know which one.

Check out "generalized Mersenne primes", described by Solinas at the 
last ECC conference.  It's probably in the P1363 archives somewhere,
or else on the cacr.waterloo.ca web pages.

Patience, persistence, truth,
Dr. mike

------------------------------

From: "Eric Verheul" <[EMAIL PROTECTED]>
Subject: Re: RSA/PK Question
Date: Wed, 24 May 2000 19:21:07 +0200


<[EMAIL PROTECTED]> wrote in message news:8gghmg$6ul$[EMAIL PROTECTED]...
> If Kp and Ks are the public and secret RSA keys and if Ka is a random
> key of the same length:
>
> If Ka is XORed with Kp and Ks independently, i.e.
>
> Kp` = Ka XOR Kp
> Ks` = Ka XOR Ks
>
> can one still encrypt with Kp` and decrypt with Ks` using RSA?
>
> My second question relates to long RSA keys... Is it reasonable to
> assume that a 10000-bit RSA key will take 10 times as long to
> generate as a 1000-bit RSA key?... Would appreciate some key
> generation timings... for long key pairs...
On our website (www.ecstr.com) you can find some software with which
you can time RSA key generation; actually, this software is meant to
illustrate the parameter and key generation of the XTR public key
system.  Making the length of your RSA primes 10 times larger will
roughly make the generation times 10^2 to 10^3 times larger!

Eric

------------------------------

Date: Thu, 25 May 2000 01:26:35 +0800
From: Raphael Phan <[EMAIL PROTECTED]>
Subject: Re: Chosen Plaintext Attack

Mark,

There is a copy at the ETH-Bibliothek, the ETH Zurich library:
http://www.ethbib.ethz.ch/

You can request to borrow from there...

Mark Wooding wrote:

> David A. Wagner <[EMAIL PROTECTED]> wrote:
>
> > This is the point where it is probably best to refer you to Biham and
> > Shamir's book, Differential Cryptanalysis of the Data Encryption
> > Standard, which gives all the gory details.
>
> This is now out of print (at least, according to Amazon it is).  Where
> could I get a copy from?
>
> -- [mdw]

--

" Contentment is not the fulfilment of what you want, it is the
 realization of how much you already have.  "

"   When you were born, you cried and the world rejoiced.
  Live your life in such a manner that when you die,
         the world cries and You rejoice...          "



------------------------------

From: "Paul Pires" <[EMAIL PROTECTED]>
Subject: Re: Crypto patentability
Date: Wed, 24 May 2000 10:23:07 -0700


Mok-Kong Shen <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...

> What do you mean by ''Patent laws and the patent process don't impact
> or effect [our] area of concern''???  If I design an algorithm using
> rotation, which I have 'really' used many times in my programs (in
> other fields) for decades, and a certain firm claims that I am
> infringing its patent rights, do you mean that that does NOT concern
> me???  If we recognize (I am not sure that many of us can do that
> well, I myself at least not fully) ''The patent examiner cannot
> process the patent on its merits'', is that THE reason that we should
> close our eyes to what is being practiced in the patent offices in
> matters of crypto??

    Eye closing isn't advocated here, just another viewpoint. I think
your eyes are closed now and you do not see what is going on in those
patent offices. My point is that I see a bunch of pointing at patents
and hear the screams of "See how messed up the patent process is".
Guilt is assumed by association. The patent office doesn't write them.
    I spoke poorly about the patent process not impacting your area of
concern. I meant that it doesn't impact this area any more or less
than any other. Each patient in the emergency room always thinks his
affliction is more serious than most of the others. The idea I wanted
to get across is that this whole rant against patents is not well
researched or well thought out. Too much "common knowledge" and
perpetuation of myths is going on here. You have a forum and an
opportunity to discuss and perhaps address the very real problems with
the system, but I see the discussion breaking up into polarized
ideological groups (and I'm starting to feel very lonely over here in
mine).

> >     How long do you think that patents have been a part of our
> > legal system?  It goes way back to old English law. This is not
> > some new social program that isn't working. This is the culmination
> > of hundreds of years of use.
>
> Mmh. Sentencing to death has been practiced since before man could
> write anything. Yet in most democratic countries of the world it has
> been eliminated from the law today.

    That's just a bit "over the top", don't you think? I point out
that this has been a working component in a free society for a long
time. My intent was to show that it probably serves some valuable
purpose for the public, and you associate it with legislated homicide?
I'm beginning to suspect that you don't appreciate my input.

> >     Prior art is the big issue. The patent must not be anticipated
> > by any prior art. Most folks think that means previous patents. It
> > does not. Any publication or offer for public sale is prior art.
> > The examiner can only search and review what he can find and what
> > the applicant supplies (he's legally obligated to fully disclose
> > any he knows of). The patent grant isn't home free; any prior art
> > found can invalidate an issued patent. If you think this stuff was
> > done before, find the publication or sale and link it to a date.
> >     You folks have been doing the single most important thing all
> > along. This is a public forum where issues described here become
> > the very prior art that will keep a bad patent from being enforced.
> > It won't keep it from being issued. I said the process was
> > beautiful, not omnipotent.
>
> What are you actually suggesting here to us?

    What I am suggesting is that some problems in the system can be
addressed now, and in fact you folks are doing a real part of it. This
public discussion of the art publicly discloses common concepts in
detail, making them prior art. Bruce Schneier's post on AES and the
Hitachi patents is a case in point. Short, concise, and to the point.
Do you think that Hitachi won't read it, verify it, and decide they
probably shouldn't sue for infringement based upon the information
revealed? This is serious stuff for a company. There are downsides to
their side of the issue too. If they go to court, the whole patent
will be suspect and redefined. Just because they have more money to
spend doesn't mean that they like to blow it on lost causes. They have
no burning desire to be the "Evil Corporate Monsters".
    What I am suggesting is that if you want to do more, study the
problem and find out what happened to cause this and why it happened.
I have, and I was surprised that my "common knowledge" was wrong. It
isn't the Office, the examiners, or the law that is the problem. It is
the volume of work, the resources that they have, and the newness of
the field as a commercial endeavor. The volume in this area has
exploded since Whit Diffie dropped his bomb, and there isn't a surplus
of cryptographers running around to work in their special art unit.

    We seem to have a different take on this subject, and that's OK. I
think you're throwing the baby out with the bath water, and you think
it's a tumor, not a baby. So we have different opinions. No big deal.

Paul

------------------------------

From: Bob Silverman <[EMAIL PROTECTED]>
Subject: Re: The Code Book / Are factor techniques really that secure?
Date: Wed, 24 May 2000 17:40:44 GMT

In article <8gh074$i2l$[EMAIL PROTECTED]>,
  DigiboyCiPHER <[EMAIL PROTECTED]> wrote:
> The subject line is possibly vague, but recently I have been finishing
> off reading The Code Book by S.Singh

You have my sympathy.
May I suggest that anyone tempted to read this book first read the
review by Jim Reeds in the March 2000 issue of the Notices of the
American Math. Soc.?

The Singh book is replete with errors. Many of them. And he does
not seem to understand the subject.

This should not be surprising.  His book about Wiles's proof of FLT
was equally bad.



>
> Surely there are speedier ways to factorise large numbers than going
> through _every_ possibility.

Of course there are.
See:

H. Riesel, Prime Numbers and Computer Methods for Factorization,
Birkhauser.
>
> ---
>
> I mean, for example, on p276 Singh states that he came up with the
> number 1709023 in ten seconds but that it would probably take a whole
> afternoon to find the prime factors using only a calculator. I found
> them in almost exactly 5 minutes.

Yup. Sounds about right.

>
> Enough of that! :)  My point being that it is probably wrong to
> assume that even humongously larger numbers are unfactorable within
> a normal time-span...

You have no basis for making such a statement. Indeed, you yourself
admit that you know nothing about factoring algorithms.
And you are grossly wrong. Read Riesel's book.




--
Bob Silverman
"You can lead a horse's ass to knowledge, but you can't make him think"



------------------------------

From: Bob Silverman <[EMAIL PROTECTED]>
Subject: Re: RSA/PK Question
Date: Wed, 24 May 2000 17:44:00 GMT

In article <[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] (Mark Wooding) wrote:
> [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
>
> > If Kp and Ks are the public and secret RSA keys and if Ka is a
> > random key of the same length:
> >
> > If Ka is XORed with Kp and Ks independently [...] Can one still
> > encrypt with Kp` and decrypt with Ks` using RSA?
>
> No.  This will hardly ever work.

Correct.


> > My second question relates to long RSA keys... Is it reasonable to
> > assume that a 10000-bit RSA key will take 10 times as long to
> > generate as a 1000-bit RSA key?
>
> No.  It'll take about 1000 times as long, I think.  Key generation
> times increase with the cube of the key length.

No.  The fourth power.  Testing an individual candidate runs in time
O((log N)^3), and one must test O(log N) candidates on average.
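A back-of-envelope check of the fourth-power rule against the original
1000-bit vs 10000-bit question (a sketch of the arithmetic, not a
benchmark):

# cost(b) ~ b^3 per primality test, times ~b candidates = b^4
for bits in (1000, 10000):
    print(bits, "bits -> relative cost", bits ** 4 // 1000 ** 4)
# 1000 bits -> relative cost 1
# 10000 bits -> relative cost 10000   (10^4 times, not 10 times)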

--
Bob Silverman
"You can lead a horse's ass to knowledge, but you can't make him think"



------------------------------

From: Bob Silverman <[EMAIL PROTECTED]>
Subject: Re: Smooth numbers
Date: Wed, 24 May 2000 17:51:09 GMT

In article <[EMAIL PROTECTED]>,
  Eric Hambuch <[EMAIL PROTECTED]> wrote:
> Does anybody know the number of primes n, where n-1 is "smooth" (n has
> only small prime factors p_i of size O(log n)) ?
>
> Any hints (or better proofs and references) are welcome !

It will be approximately  c * n * (log n/log log n)^-(log n/log log n).
The constant c is given by a singular series and is effectively
computable, but the computation has never actually been carried out
(AFAIK).

To see this result, observe that the probability that a random integer
near N has all of its prime factors less than log N is about u^-u,
where u = SIZE(N)/SIZE(log N) = log N/log log N (by a theorem of
Canfield, Erdos and Pomerance).

The result is not exactly  n * u^-u  because n-1 "isn't quite"
random.  It is always divisible by 2, is divisible by 3 with
probability 1/2, divisible by 5 with probability 1/4, etc., which
yields the singular series for c.
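Plugging a cryptographic size into that estimate (a quick sketch,
ignoring the constant c) shows how rare such primes are:

import math

n = 2.0 ** 512                           # a 512-bit number
u = math.log(n) / math.log(math.log(n))  # log n / log log n, about 60
print(u)
print(-u * math.log10(u))                # log10 of u^-u: about -108
# so roughly one integer in 10^108 near 2^512 is log(n)-smooth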



--
Bob Silverman
"You can lead a horse's ass to knowledge, but you can't make him think"



------------------------------

From: Francois Grieu <[EMAIL PROTECTED]>
Subject: Re: Smooth numbers
Date: Wed, 24 May 2000 20:04:21 +0200

Eric Hambuch <[EMAIL PROTECTED]> wrote:

> Does anybody know the number of primes n, where n-1 is "smooth"
> (n has only small prime factors p_i of size O(log n)) ?


References:

N. G. de Bruijn, On the number of positive integers <= x and free of
prime factors >= y, Indagationes Mathematicae, vol. 13, pp. 50-60,
1951, and vol. 28, pp. 236-247, 1966; I could not get at it.

Donald E. Knuth, The Art of Computer Programming, volume 2,
section 4.5.4, page 383 in the third edition.



The number of positive integers <= x and free of prime factors >= y
is denoted psi(x,y).

For any real u > 0,

    lim     psi(x, x^u) / x  =  F(u)
  x->+inf

where F is Dickman's function, which can be defined as
(view in a non-proportional font)

           /1
  F[u] = 1-|  F[t/(1-t)]/t dt   when 0<=u<=1;      F[u] = 1  when u>=1.
           /u

A useful approximation of psi(x,y)/x is F(Log(y)/Log(x)).


Computing Dickman's F function is not easy.  Here is a table
giving F[1/u] for u from 1 to 20.75:

        +0          +1/4        +1/2          +3/4
 1  1.00000e+00  7.76856e-01  5.94535e-01  4.40384e-01
 2  3.06853e-01  2.02442e-01  1.30320e-01  8.10724e-02
 3  4.86084e-02  2.84272e-02  1.62296e-02  9.02923e-03
 4  4.91093e-03  2.61995e-03  1.37012e-03  7.03061e-04
 5  3.54725e-04  1.76081e-04  8.60186e-05  4.14019e-05
 6  1.96497e-05  9.19891e-06  4.25036e-06  1.93963e-06
 7  8.74567e-07  3.89772e-07  1.71787e-07  7.49034e-08
 8  3.23207e-08  1.38064e-08  5.84057e-09  2.44749e-09
 9  1.01625e-09  4.18228e-10  1.70635e-10  6.90346e-11
10  2.77017e-11  1.10277e-11  4.35595e-12  1.70761e-12
11  6.64481e-13  2.56708e-13  9.84764e-14  3.75172e-14
12  1.41971e-14  5.33711e-15  1.99346e-15  7.39887e-16
13  2.72919e-16  1.00061e-16  3.64684e-17  1.32140e-17
14  4.76063e-18  1.70552e-18  6.07651e-19  2.15328e-19
15  7.58991e-20  2.66135e-20  9.28406e-21  3.22240e-21
16  1.11292e-21  3.82495e-22  1.30828e-22  4.45366e-23
17  1.50908e-23  5.08996e-24  1.70905e-24  5.71296e-25
18  1.90135e-25  6.30068e-26  2.07903e-26  6.83141e-27
19  2.23543e-27  7.28510e-28  2.36461e-28  7.64464e-29
20  2.46178e-29  7.89696e-30  2.52353e-30  8.03374e-31

I computed the above values using extended-precision arithmetic,
rather than the so-called de Bruijn approximation.
They match and extend the published tables of Knuth and Riesel.
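For readers content with a few digits, the defining equation can also
be integrated numerically (a rough sketch of my own, nowhere near the
extended precision used for the table; in the notation above, the
table entry F[1/u] is Dickman's rho(u) in its usual parametrization):

# rho satisfies  u * rho'(u) = -rho(u-1),  with rho(u) = 1 on [0,1].
STEP = 1e-5

def dickman_rho(u_max):
    n = int(round(u_max / STEP))
    one = int(round(1.0 / STEP))
    rho = [1.0] * (n + 1)                        # rho = 1 on [0, 1]
    for i in range(one + 1, n + 1):
        u = i * STEP
        f_prev = -rho[i - 1 - one] / (u - STEP)  # rho'(u - STEP)
        f_curr = -rho[i - one] / u               # rho'(u)
        rho[i] = rho[i - 1] + STEP * (f_prev + f_curr) / 2.0
    return rho[n]

print(dickman_rho(2.0))   # 0.30685... = 1 - ln 2, the 3.06853e-01 entry
print(dickman_rho(3.0))   # 0.04860..., the 4.86084e-02 entry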


Hope this helps,

   Francois Grieu

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
