Cryptography-Digest Digest #305, Volume #10      Thu, 23 Sep 99 21:13:05 EDT

Contents:
  Re: Mystery inc. (Beale cyphers) (jerome)
  Re: Exclusive Or (XOR) Knapsacks ("Douglas A. Gwyn")
  Re: msg for Dave Scott (SCOTT19U.ZIP_GUY)
  Peekboo on hold for now (Tom St Denis)
  Re: msg for Dave Scott (Tom St Denis)
  Re: Securing Executables ("Kasper Pedersen")
  Re: Mystery inc. (Beale cyphers) (Curt Welch)
  Re: RSA 640 bits keys factored, French banking smart card system craked! ("Joseph Ashwood")
  Re: RSA 640 bits keys factored, French banking smart card system craked! (Alex)
  Re: Another bug RE: CryptAPI (Eric Lee Green)
  Re: RSA 640 bits keys factored, French banking smart card system craked! ("Dmitriy Morozov")
  Re: RSA 640 bits keys factored, French banking smart card system craked! ("Joseph Ashwood")
  Re: RSA 640 bits keys factored, French banking smart card system craked! ("Dmitriy Morozov")
  NEW COMPRESSION (SCOTT19U.ZIP_GUY)
  Re: EAR Relaxed? Really? (Greg)
  Re: RSA 640 bits keys factored, French banking smart card system craked! ("Dmitriy Morozov")
  Re: RSA 640 bits keys factored, French banking smart card system craked! ("Joseph Ashwood")
  Re: Galois field isomorphism? ("Brian McKeever")

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (jerome)
Subject: Re: Mystery inc. (Beale cyphers)
Date: 23 Sep 1999 21:41:03 GMT
Reply-To: [EMAIL PROTECTED]

On Thu, 23 Sep 1999 19:46:44 GMT, Tim Tyler wrote:
>
>Apparently there /is/ some independent evidence supporting certain parts
>of the story.  Simon Singh covers this evidence briefly in "The Code Book"
>and concludes that the Beale cypher is probably genuine.

Ah? I read it and didn't find that.
Maybe a man with the same name as the coder existed; it doesn't
mean that the gold story is true.

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Exclusive Or (XOR) Knapsacks
Date: Thu, 23 Sep 1999 20:38:06 GMT

David Wagner wrote:
> In article <097G3.212$[EMAIL PROTECTED]>,
> Boris Kolar <[EMAIL PROTECTED]> wrote:
> > Encoding and decoding linear codes involves matrix multiplications, which
> > can't be performed with only XOR operations.
> I disagree.
> Most of the proposals I know of use GF(2), and over GF(2), everything's
> an XOR.  When the matrix (e.g., the public key) is fixed, the bits you
> XOR together are fixed.

XOR is addition in GF(2).  But there is also *multiplication* in
GF(2), which is AND, *not* XOR.

a       b       aXORb   aANDb
0       0       0       0
0       1       1       0
1       0       1       0
1       1       0       1

XOR spans a proper subalgebra of the (maximal) algebra spanned by
{XOR,AND}.  For example, with just XOR one can implement NOT:

NOT a  =df  a XOR 1

and several binary ops (a EQV b  =df  NOT[a XOR b]), but not AND.
Before WWII, Jaskowski explored the properties of this subalgebra
(in terms of {EQV,NOT}, which clearly spans the same algebra).
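
For illustration only, a minimal Python sketch of Wagner's point: with a
fixed matrix over GF(2), each output bit is just the XOR (parity) of the
input bits selected (ANDed) by one fixed row.  (The function name and matrix
values below are made up for the example.)

def gf2_matvec(rows, x):
    """Multiply bit-vector x by a GF(2) matrix given as per-row bit masks."""
    out = 0
    for i, row in enumerate(rows):
        bit = bin(row & x).count("1") & 1   # AND selects bits, parity = XOR sum
        out |= bit << i
    return out

# a fixed 3x3 "public" matrix over GF(2), rows as bit masks (LSB = column 0)
print(gf2_matvec([0b101, 0b011, 0b110], 0b111))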

However, none of this has much to do with important issues...

------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: msg for Dave Scott
Date: Thu, 23 Sep 1999 22:56:54 GMT

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] (Johnny Bravo) 
wrote:
>On Thu, 23 Sep 1999 12:51:20 GMT, [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
>wrote:
>
>>  I have looked at the 3-letter chaining modes and see that they are weak
>>because they are all "error recoverable", so an enemy needs
>>to only have a small sample of the code to see if a KEY works. This
>>is not possible in an "all or nothing cipher" like scott16u. But then
>>again you may not have understood the simple tests to see this.
>
>  Even if you could check each one of these keys by checking a single byte,
>it doesn't change the simple fact that a 256 bit symmetric key will need an
>average of 5.789e76 tests to break that key.   
>
>  Even if these so called "weak" systems made it a trillion times easier to
>brute force than a cipher that didn't have error recovery, you could get a billion,
>billion computers that can try a billion, billion keys each and every second and
>it would still take 18 billion, billion, billion years to check the 1/2 of the
>keyspace that it would take to give you a 50/50 chance of cracking that one key.
>
>
>  But then again you may not have understood the simple math it takes to see
>this.
    Yes John, I understand this. But I doubt if you do, because you may be
under the illusion Enigma is still safe. All you're doing is saying it is safe 
due to a dumb blind search, as if a blind search means everything. It doesn't,
but you can't seem to understand that.


David A. Scott
--
                    SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
                    http://www.jim.com/jamesd/Kong/scott19u.zip
                    http://members.xoom.com/ecil/index.htm
                    NOTE EMAIL address is for SPAMERS

------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Peekboo on hold for now
Date: Thu, 23 Sep 1999 21:58:31 GMT

I am kinda putting it on hold now.  If you are a peekboo user cheer me up and
send a single line email to [EMAIL PROTECTED] saying something like
'peekboo I see you' or something.  It's kinda depressing not getting much
feedback.

Anyways 1.51 is out.  I know it was quick but I try to fix things/add things
in clumps (by demand) and then release it.  In this release

- Made the message dialog bigger so it's easier to edit longer messages
- Added a reply thingy to add '>' to the sender's message (like in email).
- Added csybrandy's experimental SC4 cipher
- Successfully turned XTEA into an RC5 clone.

BTW there are REAL ciphers in peekboo as well including blowfish, twofish,
cast, rc4, rc5 and rc6.

Check it out

http://www.cell2000.net/security/peekboo/index.html

and let me know what you think.

Tom


Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.

------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: msg for Dave Scott
Date: Thu, 23 Sep 1999 21:55:04 GMT

In article <7sd48d$1ilu$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY) wrote:
>    Actually I feel that most systems are very weak, since they incorporate
> weaknesses that are easily verifiable to leak info. I actually designed the
> scott19u contest so that if the exact thing was done in AES using your
> favorite chaining mode, even David Wagner could solve the problem
> in less than a day. But you may not be bright enough to see that.
>   I have looked at the 3-letter chaining modes and see that they are weak
> because they are all "error recoverable", so an enemy needs
> to only have a small sample of the code to see if a KEY works. This
> is not possible in an "all or nothing cipher" like scott16u. But then
> again you may not have understood the simple tests to see this.
>   I looked at compression and can see that if the compression used
> is not "one to one" an attacker may have enough info to break
> a cipher knowing only the encrypted files and the compression
> method used, regardless of the file encrypted. If that is not
> enough to make people wake up to who controls and directs
> open crypto, then there is not much I can do.
>  Yes, I think scott16u and scott19u are strong because they do
> not have all the weaknesses that leak info that will be in
> the short-key weak 3-letter chaining systems that will be the
> norm in AES.
>  And yes, I am not a GOD; my code could be weak too. But at least
> it is not weak in areas where the AES methods will obviously be weak.
> I had hoped for more in the release of GnuPG but I see it
> will only use short-keyed methods with non "one to one" compression.
> But why should anyone care if the NSA can break it.

Well, if CBC is so weak and gives out so much information, get a copy of
peekboo and try to read this message:

bfaqaaGik2oVegHPVtbaaaiaaaaag6KchEVngBzQnLvprvkIhyasYqmlYkeVWUkzD71s
RfjU9AyHGrvaEryg4aZnk8Zb1ApeR{swEJ2cLxpyyycL1KVBNohancJbWzWh3Rm33fa}
Mwbaa

(I think only csybrandy and I can read this message without tampering with
us, or taking the time to solve the DL (discrete log), which has been shown to
be difficult; this was encrypted with PeekBoo 1.51)

Anyways, just admit it appears hard to solve.  I admit I can't break Scottu
and I admit I can't break RC4 (cast, etc...).  I admit I can't solve a dl nor
do I even know where to start (...).  But I don't proclaim that anything I
don't understand is weak.

Tom


Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.

------------------------------

From: "Kasper Pedersen" <[EMAIL PROTECTED]>
Subject: Re: Securing Executables
Date: Wed, 22 Sep 1999 23:02:38 +0200


Peter Johnson <[EMAIL PROTECTED]> wrote in message
news:7sb75f$f49$[EMAIL PROTECTED]...
> I'm designing a client/server application that will run in real-time.
> Assuming that the network traffic is secure by using strong encryption, a
> good random number generator for packet sequencing, compression, etc. how do
> I protect against an attack on the client executable?

You can't. Someone else could install a new spying app and monitor your exe.

> For example, if the attacker were running the executable in a debugger could

A user attacking the system is probably undetectable. He could be using an
in-circuit emulator.

You need some other form of security. Why does the client app need to be
secured? A passive listener might gather passwords; a hostile user already
has the password. In the days of Windows, passive attacks have become really
easy. Ensuring the integrity of the environment (the computer/os/app) is
impossible without physical security. That's why my bank uses one-time pads
(on paper cards) for login/transaction acknowledgement, allowing the
computer/exe to be compromised without a disaster as a result.
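
For illustration, a minimal Python sketch of the server side of such a
paper-card scheme (the codes and names here are made up, not the bank's
actual system): each code is accepted exactly once, so even a fully
compromised client cannot replay it.

used = set()
card_codes = {"7f3a9c", "d41e02", "08b6f5"}   # hypothetical codes printed on the card

def acknowledge(code):
    """Accept a transaction only if the code is on the card and not yet used."""
    if code in card_codes and code not in used:
        used.add(code)
        return True
    return False

print(acknowledge("7f3a9c"))   # True  - first use accepted
print(acknowledge("7f3a9c"))   # False - replay rejected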

IF a user is able to do something he isn't supposed to WHEN he gets access
to the data stream, you should let the server deny him that service instead
of relying on the client.

/Kasper





------------------------------

Subject: Re: Mystery inc. (Beale cyphers)
From: [EMAIL PROTECTED] (Curt Welch)
Date: 23 Sep 1999 22:32:45 GMT

Tim Tyler <[EMAIL PROTECTED]> wrote:

> FWIW, you can get hold of all three cyphertexts by getting hold of a
> copy of Simon Singh's "The Code Book".
>
> Alternatively all three codes are available at:
>
> http://treasurehunt.miningco.com/bllet.htm?pid=2804&cob=home

That site seems to have the bogus Hart version of Beale #2.

-- 
Curt Welch                                            http://CurtWelch.Com/
[EMAIL PROTECTED]                          Webmaster for http://NewsReader.Com/

------------------------------

From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: RSA 640 bits keys factored, French banking smart card system craked!
Date: Thu, 23 Sep 1999 16:09:30 -0700
Crossposted-To: alt.security.pgp

While I certainly am not an expert on what was actually done, I can draw
some conclusions about the strength based on what has been said here.
According to a previous message, it was assumed (or known) that the factors
were 320-bits each, these are quite rare, especially since the growth of the
total number of prime numbers is lg(x)/x, at 320 bits, the growth is
extremely slow, so 320 bit primes (even 320-bit pseudo-primes) are very
rare. Given modern methods, we can produce primes of a given length (say
320-bits) quite easily. Given these two facts it is quite possible that
there was a relatively small number of potential solutions, and that the
lists (two identical lists) were no longer than an estimated 600 long (just
a quick guess by me). By sorting these we could run very easily through
them, first searching to see if the 640-bit number is simply the square of 2
primes, second:
i=1
j=number_of_primes_in_list
p = primes_list[i]
q = primes_list[j]
while(j > i)
  pq = p*q
    if pq == number
        factors found successfully
    if pq > number
        j = j-1
        q = primes_list[j]
    if pq < number
        i = i+1
        p = primes_list[i]
end while
if we get here we failed, generate more primes and start again

Given that there are exactly k 320-bit assumed primes (it's more costly to
check the primality of a number that size than to simply leave it in the
list), it will take at most k iterations through the loop, a proposition that
is very fast on today's systems.
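
For what it's worth, here is the same loop as a runnable Python sketch (an
illustration only; the candidate list and names are hypothetical):

def find_factor_pair(candidates, n):
    """candidates: sorted list of candidate primes; n: the 640-bit modulus.
    Returns (p, q) with p*q == n, or None if no pair in the list works."""
    i, j = 0, len(candidates) - 1
    while i <= j:
        pq = candidates[i] * candidates[j]
        if pq == n:
            return candidates[i], candidates[j]
        elif pq > n:
            j -= 1          # product too big: move to a smaller q
        else:
            i += 1          # product too small: move to a larger p
    return None             # not in this list; generate more candidates

As in the pseudocode, the loop makes at most len(candidates) passes.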

I hope that answers the question of what special properties were exploited,
because that's a fairly major one.
                    Joseph



------------------------------

From: Alex <[EMAIL PROTECTED]>
Crossposted-To: alt.security.pgp
Subject: Re: RSA 640 bits keys factored, French banking smart card system craked!
Date: 23 Sep 1999 19:39:48 -0400


> the growth of the total number of prime numbers is lg(x)/x,

I was under the impression that it's the other way around, i.e. the
number of primes less than x is roughly x/log(x).  I don't know the
number theory, so I could be mixed up...  If I'm not mistaken, then I
think the approach you're suggesting for factoring the key would have
taken a very long time.

Alex.

------------------------------

From: Eric Lee Green <[EMAIL PROTECTED]>
Crossposted-To: talk.politics.crypto
Subject: Re: Another bug RE: CryptAPI
Date: Thu, 23 Sep 1999 16:02:13 -0700

Christopher Biow wrote:
> Back to the closed/open source comparison: Note that it was the leaking of
> source-code symbol information which allowed the identification of the "NSA
> key" vulnerability. Recovering that information from the object code (to
> the point of noting the vulnerability) would have been much more expensive.
> Recovering the semantics provided by "NSA key" from the object code would
> probably not have been possible.

Absolutely false. The vulnerability was explained in one of Bruce
Schneier's newsletters over six months before the "NSA Key" furor. It
was described as a second key that could be replaced by your own public
key in order to insert unauthorized cryptographic components. 

Also see: http://www.cs.berkeley.edu/~daw/papers/

and read the paper about vulnerabilities in an (old) version of the
Netscape browser. David Wagner successfully reverse-engineered the SSL
code in the original Netscape SSL implementation, and found a rather
surprising vulnerability. 

Regarding "open source vs. closed source", Unix or Unix-lookalike
operating systems account for approximately 75% of the operating systems
on the Internet. It is unsurprising that there have thus been more
network-related attacks directed against them. This is similar to the
fact that Microsoft software has a 90% market share in desktop
operating systems, and MS Office has 90% of the office suite market and
thus is the primary target of those who would create macro viruses. It
is quite possible to create a macro virus similar to "Melissa" that
would infect GNU Emacs (there are certain "magic" constants that will make
GNU Emacs execute arbitrary LISP code), but nobody has done it because
it's not the market leader. I have not seen significant differences
between the security alerts for closed-source Unix variants and the
security alerts for open-source Unix variants. If you have, please
direct me to the source of that information.  

-- 
Eric Lee Green    http://members.tripod.com/e_l_green
  mail: [EMAIL PROTECTED]
                   There Is No Conspiracy

------------------------------

From: "Dmitriy Morozov" <[EMAIL PROTECTED]>
Crossposted-To: alt.security.pgp
Subject: Re: RSA 640 bits keys factored, French banking smart card system craked!
Date: Thu, 23 Sep 1999 19:55:15 -0400

I'm definitely missing something here.

Joseph Ashwood <[EMAIL PROTECTED]> wrote in message
news:OI1M0ihB$GA.199@cpmsnbbsa02...
> According to a previous message, it was assumed (or known) that the factors
> were 320-bits each, these are quite rare, especially since the growth of the
> total number of prime numbers is lg(x)/x, at 320 bits, the growth is
> extremely slow, so 320 bit primes (even 320-bit pseudo-primes) are very
> rare.

Where did this come from? I know that there are approximately n/ln(n) prime
numbers for large n's. According to this formula there would be 2^312
(10^93) primes that are 320 bits long.
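
Just as a quick sanity check of that figure, a Python back-of-the-envelope
calculation (nothing more) using pi(x) ~ x/ln(x):

import math

def log2_pi(bits):
    """log2 of the PNT estimate pi(2^bits) ~ 2^bits / (bits * ln 2)."""
    return bits - math.log2(bits * math.log(2))

hi, lo = log2_pi(320), log2_pi(319)
# primes of exactly 320 bits: pi(2^320) - pi(2^319)
diff_log2 = lo + math.log2(2 ** (hi - lo) - 1)
print("about 2^%.1f primes of exactly 320 bits" % diff_log2)   # roughly 2^311

That is the same ballpark as the 2^312 figure above.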

> Given modern methods, we can produce primes of a given length (say
> 320-bits) quite easily. Given these two facts it is quite possible that
> there was a relatively small number of potential solutions, and that the
> lists (two identical lists) were no longer than an estimated 600 long (just
> a quick guess by me).

What is this 600? And how are you going to create (in your lifetime), and
moreover store, a list that has 2^312 320-bit entries in it?

>By sorting these we could run very easily through
> them, first searching to see if the 640-bit number is simply the square of 2
> primes, second:
> i=1
> j=number_of_primes_in_list
> p = primes_list[i]
> q = primes_list[j]
> while(j > i)
>   pq = p*q
>     if pq == number
>         factors found successfully
>     if pq > number
>         j = j-1
>         q = primes_list[j]
>     if pq < number
>         i = i+1
>         p = primes_list[i]
> end while

That's a pretty weird algorithm. Correct me if I'm wrong, but if you actually
composed the kind of table you mentioned above, wouldn't it be easier to
just go over the whole table (just one of the lists) and compute something
like number mod table[i]? If this is equal to 0 then you have found your
factors; if not, keep going.
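
For illustration, the scan I mean is nothing more than this in Python (the
table is hypothetical):

def scan_table(table, n):
    """Trial-divide n by each candidate in one of the lists."""
    for p in table:
        if n % p == 0:
            return p, n // p
    return None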

> given that there are exactly k 320-bit assumed primes (it's more costly to
> check the primality of a number that size than to simply leave it in the
> list, it will take at most k iterations through the loop, a proposition that
> is very fast on todays systems.

Are you saying 2^312 iterations through the loop is fast???


> I hope that answers the question of what special properties were exploited,
> because that's a fairly major one.

I definitely didn't get something. Please explain what I am missing.

--
Dmitriy Morozov
[EMAIL PROTECTED]




------------------------------

From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: RSA 640 bits keys factored, French banking smart card system craked!
Date: Thu, 23 Sep 1999 17:19:55 -0700
Crossposted-To: alt.security.pgp

I was just doing a quick approximation of the derivative (the method you
remember is the number of primes of at most a given length). It's probably
wrong; as a matter of fact I'm sure it's wrong, there needs to be a lg lg
term in there somewhere. However, as an approximation of the number of primes
of a given length (not including the ones less than that length) it
shouldn't be too far off in the range we're talking about.

The main factor in the decrease of the time is that it was known (I don't
know how) that the factors were both 320-bits, a special case where the
range of primes is massively smaller, meaning that for this special case
the derivative is far more important than the original.

            Joe

ps. Why is it that everyone missed the implications of the word "growth"?



------------------------------

From: "Dmitriy Morozov" <[EMAIL PROTECTED]>
Crossposted-To: alt.security.pgp
Subject: Re: RSA 640 bits keys factored, French banking smart card system craked!
Date: Thu, 23 Sep 1999 20:36:13 -0400

> I was guessing at an approximation of the derivative (it's not exact, but in
> the realm that we're dealing in it's probably not far off), this is

Derivative of what?

> > According to this fromula there would be 2^312
> > (10^93) primes that are 320 bits long.
> Your number includes all of the primes up to and including 320-bits, I will

No this number is the number of primes that are less than 2^321 minus the
number of primes that are less than 2^320, so that's the number of primes
that are 320 bits long. (Am I wrong?)

> admit though that the list is larger than 600, by a few orders of magnitude.

A few hundred?

> estimate of 2^8 or 256 non-prime numbers less than 2^320, I can easily count
> that many less than 514 (simply count the even numbers except 2). I can

I did not give such an estimate. My estimate would be 2^320 - 2^312 which is
not 2^8, but is (2^312)*(2^8-1) = (2^312)*255

> without much effort assert that there can be no more than 2^(n/2) primes
> less than or equal to 2^n, that's still a huge area, but if you see the
> above virtually all of the area can be ignored.

Sorry, but I disagree. From the Prime Number Theorem (that's where the
formula n/ln(n) comes from) there would be way more primes, and I gave you
actual numbers...


> As I said earlier, it would be far less than 2^312, and the viable ones are
> probably in the 2^40 range, which can be searched easily

We are obviously talking about two different things. Either you somehow
(how?) eliminate almost all prime numbers (well, most of them), or your
numbers are wrong.

> The problem with that is recieving false positives, and the fact that
> modular division is much more costly than the multiplication that I used.
> That is in fact the best algorithm I know of for finding factors of a number
> known to have 2 factors.

Though it sounds kind of strange to me, I won't argue on this one, simply
'cause I don't know.

> > Are you saying 2^312 iterations through the loop is fast???
> I'm saying that it will run in a time determined by k*(time to multiply a
> 320-bit number), which given the fact that we're dealing with a much smaller
> area than 2^312, can be completed far more quickly than you seem to predict

Well, again, why is the area much smaller than 2^312?

--
Dmitriy Morozov
[EMAIL PROTECTED]




------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: NEW COMPRESSION
Date: Fri, 24 Sep 1999 00:51:42 GMT

 In honor of another Great Scott, I thought I would put out my newest
compression stuff.
h2comaa.exe

 What this program does is solve the limited-symbol "one to one" compression problem.
It uses a condition file to preload the table with the symbols used.
The smallest number of unique 8-bit symbols it handles is 9.

Called by:  h2comaa.exe  inputfile outputfile conditionfile

The output file is an adaptive Huffman compression of the input, based on the
symbols in the conditionfile.

slim.exe  inputfile outputfile conditionfile
This program rewrites the inputfile with characters not in the conditionfile
stripped out. This is the file you get if you uncompress.

h2uncaa.exe inputfile outputfile conditionfile
This uncompresses using a preloaded start table for the adaptive
Huffman decompressor.

 If the input file is a random file, the output will only contain the
symbols in the conditionfile.

AND IT IS ONE TO ONE

One thing you can do is take something written in 20 symbols and compress it
using the 20 symbols in the condition file.  You get a binary compressed
file; you can then encrypt this file with your favorite encryption program,
and then if you like you can use the decompressor to expand it to a file with
9 to 256 symbols.
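
For example (hypothetical file names, using only the command syntax above):

h2comaa.exe  letter.txt  letter.cmp  twenty.cond    (compress with the 20-symbol table)
    ... encrypt letter.cmp with your favorite encryption program ...
h2uncaa.exe  letter.enc  letter.out  other.cond     (expand to a file over another symbol set)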

This thing changes strings in one base to another base in a unique way
that is one to one.

http://members.xoom.com/ecil/compress.htm   is the start
http://members.xoom.com/ecil/compres4.htm  is the one with this feature

P.S. I have tested it, but it is still experimental;
if you find errors let me know.
 


David A. Scott
--
                    SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
                    http://www.jim.com/jamesd/Kong/scott19u.zip
                    http://members.xoom.com/ecil/index.htm
                    NOTE EMAIL address is for SPAMERS

------------------------------

From: Greg <[EMAIL PROTECTED]>
Crossposted-To: talk.politics.crypto
Subject: Re: EAR Relaxed? Really?
Date: Fri, 24 Sep 1999 00:31:51 GMT


> Any decent security company would never sign such a restrictive
> NDA in the first place.

But we are talking about Bill Gates and the international
market place.  That changes everything, doesn't it?

--
Truth is first ridiculed, then violently opposed, and then it is
accepted as self evident ("obvious").

I love my president... I love my president... I love my president...


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: "Dmitriy Morozov" <[EMAIL PROTECTED]>
Crossposted-To: alt.security.pgp
Subject: Re: RSA 640 bits keys factored, French banking smart card system craked!
Date: Thu, 23 Sep 1999 20:53:36 -0400

> I was just doing a quick approximation of the derivative (the method you
> remember is the number of primes of at most a given length), it's probably
> wrong, as a matter of fact I'm sure it's wrong, there needs to be a lg lg
> term in there somewhere. However as an approximation of the number of primes
> of a given length (not including the ones less than that length) it
> shouldn't be too far off in the range we're talking about.

Well, he is not wrong. The prime number theorem says something like the
following: the limit of (number_of_primes_less_than_x)/(x/ln(x)) as x
approaches infinity is equal to 1. Thus for large n the formula n/ln(n) for
the number of primes less than or equal to n is a good approximation.
Actually, I just checked my Calculus textbook and that's what it says in there.

> The main factor in the decrease of the time is that it was known (I don't
> know how) that the factros were both 320-bits, a special case where the

You can make this assumption in many different cases I guess. Weren't both
prime factors of RSA-155 78 digits long?

> ps. Why is it that everyone missed the implications of the word "growth"?

We didn't miss it; we, well, I just didn't understand where it plus
derivative that you are talking about came from.

--
Dmitriy Morozov
[EMAIL PROTECTED]




------------------------------

From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: RSA 640 bits keys factored, French banking smart card system craked!
Date: Thu, 23 Sep 1999 17:15:29 -0700
Crossposted-To: alt.security.pgp

> Where did this come from. I know that there are approximately n/ln(n) prime
> numbers for large n's.
I was guessing at an approximation of the derivative (it's not exact, but in
the realm that we're dealing in it's probably not far off). This is
important because of the information that both primes were 320-bits: if one
were under 2^320 the other would have to be at least 2^321, violating the
proposition of the second being 320-bits, and this greatly reduces the
potential numbers. Add to this that we know that an overwhelming % of the
time one of two prime generation techniques was used (either 2^2^n+1 mod n,
or the strong prime method given in P1363) and we can eliminate most of the
primes even in the given category.

> According to this fromula there would be 2^312
> (10^93) primes that are 320 bits long.
Your number includes all of the primes up to and including 320-bits, I will
admit though that the list is larger than 600, by a few orders of magnitude.
The difference is still an absolutely staggering number, due to the fact that
each of the primes is in fact 320-bits, hence a number like 2, or even
2^320-1, cannot be a part of the solution. I have serious doubts about your
estimate of 2^8 or 256 non-prime numbers less than 2^320; I can easily count
that many less than 514 (simply count the even numbers except 2). I can
without much effort assert that there can be no more than 2^(n/2) primes
less than or equal to 2^n; that's still a huge area, but if you see the
above virtually all of the area can be ignored.

> What is this 600? And how are you going to create (in your life time), and
> more of that store a list that has 2^312 320 bit entries in it?
As I said earlier, it would be far less than 2^312, and the viable ones are
probably in the 2^40 range, which can be searched easily

> That's a pretty weird algorithm. Correct me if I'm wrong but if you actually
> composed the kind of table you mentioned above, wouldn't it be easier to
> just go over the whole table (just one of the lists), and find something
> like number mod table[i]. If this is equal to 0 then you found your factors
> if not then keep on going?
The problem with that is receiving false positives, and the fact that
modular division is much more costly than the multiplication that I used.
That is in fact the best algorithm I know of for finding factors of a number
known to have 2 factors.

> > given that there are exactly k 320-bit assumed primes (it's more costly to
> > check the primality of a number that size than to simply leave it in the
> > list, it will take at most k iterations through the loop, a proposition that
> > is very fast on todays systems.
>
> Are you saying 2^312 iterations through the loop is fast???
I'm saying that it will run in a time determined by k*(time to multiply a
320-bit number), which given the fact that we're dealing with a much smaller
area than 2^312, can be completed far more quickly than you seem to predict





------------------------------

From: "Brian McKeever" <[EMAIL PROTECTED]>
Subject: Re: Galois field isomorphism?
Date: Thu, 23 Sep 1999 18:16:14 -0700


Bob Silverman <[EMAIL PROTECTED]> wrote in message
news:7saflo$382$[EMAIL PROTECTED]...
> In article <[EMAIL PROTECTED]>,
>   Paul Crowley <[EMAIL PROTECTED]> wrote:

> > Are there non-trivial isomorphisms from the set onto
> > itself?
>
> Yes. The Frobenius map is one such.

Just to clarify for the OP:  this is defined as f:F->F, f(x) = x^p, where p
is the characteristic of the field F.
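
For a concrete (illustrative) check that the map is non-trivial on an
extension field, here it is in Python for GF(2^2), with elements written as
2-bit polynomials reduced modulo x^2 + x + 1:

def gf4_mul(a, b):
    """Multiply two GF(2^2) elements given as 2-bit ints."""
    r = 0
    for i in range(2):
        if (b >> i) & 1:
            r ^= a << i                 # carry-less (GF(2)) partial products
    if (r >> 2) & 1:
        r ^= 0b111                      # reduce modulo x^2 + x + 1
    return r

for x in range(4):                      # 0, 1, w, w+1
    print(x, "->", gf4_mul(x, x))       # squaring fixes 0 and 1, swaps w and w+1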

Brian



------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
