Cryptography-Digest Digest #152

2001-04-15 Thread Digestifier

Cryptography-Digest Digest #152, Volume #14  Sun, 15 Apr 01 16:13:01 EDT

Contents:
  Re: NSA-Endorsed Schools have a Mediocre Internet Presence ("Jack Lindso")
  Re: Rabin-Miller prime testing (James Davenport)
  Re: XOR_TextBox:  Doesn't write to swap file if... ("Ryan M. McConahy")
  Reusing A One Time Pad ("Mark G Wolf")
  Re: Announcing A New Rijndael Encryption Algorithm Implementation ("Ryan M. McConahy")
  Re: Reusing A One Time Pad ("Mark G Wolf")
  C Encryption ("Logan Raarup")
  Re: Password tool! (Matthew Skala)
  Re: LFSR Security (David Wagner)
  Re: Remark on multiplication mod 2^n (Mark Wooding)
  Re: C Encryption (Mark Wooding)
  Re: AES poll (SCOTT19U.ZIP_GUY)
  Re: Reusing A One Time Pad ("Tom St Denis")
  Re: MS OSs "swap" file:  total breach of computer security. (Steve K)
  Re: please comment ("Paul Pires")



From: "Jack Lindso" [EMAIL PROTECTED]
Subject: Re: NSA-Endorsed Schools have a Mediocre Internet Presence
Date: Sun, 15 Apr 2001 21:14:06 +0200

We aren't there yet (Utopia), and until we are I wouldn't touch
SELinux. Were I in the place of the NSA, I would certainly have made
sure that Linux works "exactly" as I want it to. That's life.

Anticipating the future is all about envisioning the Infinity.
http://www.atstep.com

"Mok-Kong Shen" [EMAIL PROTECTED] wrote in message
news:[EMAIL PROTECTED]...


 Frank Gerlach wrote:
 
 [snip]

  I am attributing this to the dominance of the spooks, who have no
  real interest in spreading good security.

 Human society is extremely complex and involved.
 Look e.g. at the pharma industry. Their 'ideal' would
 be selling a particular product 'forever', thus saving
 the often very high investment needed to find better
 medicaments. Were it not for the competition, I don't
 believe that there would have been substantial incentives
 to conduct R&D simply for the benefit of the ill on
 purely moral grounds, as long as the fiscal balance
 sheet of the company is excellent. So don't be surprised
 by the phenomenon you described, nor severely curse them.
 They are just humans, in fact not unlike most of us in
 'principle' (even if you would disagree and protest
 against this viewpoint), always attempting to find some
 'optimum' for themselves (alone). Other examples abound
 in the arena of politics.

 BTW, I think that the increased use of new technologies
 in wireless communications (I recently saw the term SR,
 software radio, in this connection. Could someone give
 the exact definition of it?) and the rapid expansion of
 the total message volume may one day render effective
 surveillance and intelligence gathering technically
 infeasible. At that point, the existence of the
 agencies would become economically questionable. It
 could then be the case that these would be dissolved,
 releasing their scientists to the civilian world, and
 the knowledge 'gap' between them and the academics, as
 was mentioned in a previous post in this thread, would
 then be perfectly closed. Of course, this is all still
 utopian.

 M. K. Shen



--

From: [EMAIL PROTECTED] (James Davenport)
Subject: Re: Rabin-Miller prime testing
Date: Sun, 15 Apr 2001 18:09:04 GMT

In the referenced article, "Tom St Denis" [EMAIL PROTECTED] writes:

"Benjamin Johnston" [EMAIL PROTECTED] wrote in message
news:9b9eru$t5m$[EMAIL PROTECTED]...

 I eventually managed to track down a paper ("Primality Testing
 Revisited", J.H. Davenport, 1992) which gave me the impression that it is
 standard practice to use the set of bases {3,5,7,11,13,17,19,23,29,31}.

In general I use the first N primes as my bases in MR.  If you use, say, 10
passes of MR, you are going to be very sure you have a prime if it passes all
rounds.  In practice I have never made a prime with MR that Maple couldn't
verify as prime too, so I think the method works well.

If you are generating the primes, this is OK (in practice: there are still
some theoretical objections). However, as the paper Benjamin
quoted points out, if you are verifying that primes someone else has sent
you really are primes, then any fixed list has problems.
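The fixed-base approach described above can be sketched as follows; this is a minimal illustration in Python, not code from any poster, and the function name and base list are my own choices (the bases match the set Benjamin quotes):

```python
def is_probable_prime(n, bases=(3, 5, 7, 11, 13, 17, 19, 23, 29, 31)):
    """Miller-Rabin test using a fixed list of bases.

    Fine for primes you generate yourself; as the thread notes, any
    fixed list can be fooled by an adversarially chosen composite.
    """
    if n < 2:
        return False
    # Trial division by the small primes first.
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2**s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # base a witnesses that n is composite
    return True
```

For verifying primes supplied by someone else, the bases should instead be chosen at random, for exactly the reason the Arnault paper below demonstrates.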

One should also read:
Arnault,F.,
Rabin-Miller Primality Test:
Composite Numbers which Pass it.
Math. Comp. 64(1995) pp. 355-361.

James Davenport
[EMAIL PROTECTED]
  ^^^remove

--

From: "Ryan M. McConahy" [EMAIL PROTECTED]
Crossposted-To: talk.politics.crypto,alt.hacker
Subject: Re: XOR_TextBox:  Doesn't write to swap file if...
Date: Sun, 15 Apr 2001 14:15:31 -0400


"Anthony Stephen Szopa" [EMAIL PROTECTED] wrote in message
news:[EMAIL PROTECTED]...
 "Trevor L. Jackson, III" wrote:
 
  Fair Warning (for the uninformed):  This software is garbage.  The author
  does no

Cryptography-Digest Digest #152, Volume #13  Mon, 13 Nov 00 22:13:00 EST

Contents:
  Re: On an idea of John Savard (Tom St Denis)
  Re: On an idea of John Savard (Tom St Denis)
  Re: Learning Differential and Linear Cryptanalysis? (Tom St Denis)
  Re: Book recommendation, please ("[EMAIL PROTECTED]")
  Re: LFSR's (Tom St Denis)
  Re: Chimera ciphers (WAS Re: On an idea of John Savard) (John Savard)
  Re: Security of Norton YEO (Your Eyes Only) ("A [Temporary] Dog")
  Re: voting through pgp (Greggy)
  Re: voting through pgp (Greggy)
  Re: voting through pgp (Greggy)
  Re: voting through pgp (Greggy)
  Re: Why remote electronic voting is a bad idea (was voting through pgp) (Greggy)
  Re: voting through pgp (Greggy)
  DSS/DSA and DH Parameters ([EMAIL PROTECTED])
  Re: Algorithm with minimum RAM usage? ("Scott Fluhrer")
  PGP still the no1? ("Sascha Klein")
  Re: voting through pgp ("Trevor L. Jackson, III")
  Re: LFSR's ("Trevor L. Jackson, III")
  Re: LFSR's ("Trevor L. Jackson, III")



From: Tom St Denis [EMAIL PROTECTED]
Subject: Re: On an idea of John Savard
Date: Mon, 13 Nov 2000 23:56:32 GMT

In article [EMAIL PROTECTED],
  Mok-Kong Shen [EMAIL PROTECTED] wrote:

 If you interleave two good ciphers I believe that the
 result is certainly stronger than any single one.

Why?

Tom


Sent via Deja.com http://www.deja.com/
Before you buy.

--

From: Tom St Denis [EMAIL PROTECTED]
Subject: Re: On an idea of John Savard
Date: Mon, 13 Nov 2000 23:58:46 GMT

In article [EMAIL PROTECTED],
  David Schwartz [EMAIL PROTECTED] wrote:

 Mok-Kong Shen wrote:

  If you interleave two good ciphers I believe that the
  result is certainly stronger than any single one.

   This is certainly false if you, for example, interleave DES with
 reverse DES.

Unfortunately the problem with "mixing ciphers" is less trivial than
that.  For example, in SAFER+ a three-layer PHT is used to mix up the
plaintext, whereas in Serpent a dedicated linear transform is used.  If
you arbitrarily remove the linear transform then the Serpent round is
much less secure, since we cannot easily count active sboxes, etc...

Anyways...

Tom



--

From: Tom St Denis [EMAIL PROTECTED]
Subject: Re: Learning Differential and Linear Cryptanalysis?
Date: Tue, 14 Nov 2000 00:04:39 GMT

In article 8ups0b$nap$[EMAIL PROTECTED],
  Simon Johnson [EMAIL PROTECTED] wrote:


 Where can I find reference material, books etc. with a clear and
 concise explanation of these two attacks?

 Books on the subject would be excellent, if not URL's :)

Differential attacks in theory are very easy to understand.  For some
reason I can't quite grasp linear attacks.  Well perhaps I haven't
given it enough thought.

At any rate, look at Eli Biham's page; he has papers on both types of
attacks.

In the generic differential attack you are looking for an input and
output difference that occur as a pair with high probability.  Only
certain pairs of inputs will cause the difference required.  Thus, given
a 'keyed' function such as y = F(x xor key), you can find the key by
determining what the input you sent was and what the required input was
(i.e. just xor them to get the key).  Often more than one pair of
inputs will cause the required output difference (assuming the input
pair has the same output difference), and in this case more than one key
will be suggested.
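The key-recovery idea described above can be made concrete with a single keyed 4-bit S-box; the S-box contents, the key value, and all names below are invented purely for illustration:

```python
# Toy differential key recovery on y = SBOX[x ^ key] (one keyed S-box).
SBOX = [0x6, 0x4, 0xC, 0x5, 0x0, 0x7, 0x2, 0xE,
        0x1, 0xF, 0x3, 0xD, 0x8, 0xA, 0x9, 0xB]

KEY = 0xB  # the secret we want to recover

def encrypt(x):
    return SBOX[x ^ KEY]

# Pick an input difference, encrypt one pair with that difference, and
# observe the output difference.
in_diff = 0x4
x0, x1 = 0x2, 0x2 ^ in_diff
out_diff = encrypt(x0) ^ encrypt(x1)

# Every key consistent with the observed (input, output) difference pair
# is "suggested"; the true key is always among them.
suggested = [k for k in range(16)
             if SBOX[x0 ^ k] ^ SBOX[x1 ^ k] == out_diff]
```

As the text says, more than one key is usually suggested; encrypting further pairs and intersecting the suggestion lists narrows it down to the true key.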

In the generic linear attack you have a linear expression of the input
and output bits (i.e. the parity) and of some key bits (i.e. the input
bits involved, since they are linearly dependent on the input bits).
You take some expression such as y0 xor y1 = x0 xor x3 xor k0 xor k3
xor 1, and if the approximation holds with probability more or less
than 1/2, then it can be used to suggest keys.  In the trivial case of
something like y0 xor y1 = x0 xor k0, the expression will reveal 'k0',
since you know 'y0 xor y1' and 'x0'.
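The trivial case y0 xor y1 = x0 xor k0 can be checked directly on a contrived 2-bit "cipher" built so that this relation holds exactly; the cipher and every name here are invented for illustration:

```python
def bit(v, i):
    """Extract bit i of integer v."""
    return (v >> i) & 1

def toy_encrypt(x, k):
    # y0 = x0 ^ k0 ^ x1 and y1 = x1, so y0 ^ y1 = x0 ^ k0 exactly.
    y0 = bit(x, 0) ^ bit(k, 0) ^ bit(x, 1)
    y1 = bit(x, 1)
    return y0 | (y1 << 1)

k = 0b01               # secret key
x = 0b10               # known plaintext
y = toy_encrypt(x, k)  # observed ciphertext

# One known plaintext/ciphertext pair leaks k0 = y0 ^ y1 ^ x0.
recovered_k0 = bit(y, 0) ^ bit(y, 1) ^ bit(x, 0)
```

In a real cipher the relation only holds with probability somewhat away from 1/2, so many pairs are collected and the key bit is taken by majority vote.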

I hope I have the "generic" cases right.  Obviously you should read
Biham's papers (especially "On Matsui's Linear Cryptanalysis"
and "Differential Cryptanalysis of DES-Like Cryptosystems" which he co-
authored with Adi Shamir).  They explain both attacks in some depth.

Tom



--

From: "[EMAIL PROTECTED]" [EMAIL PROTECTED]
Subject: Re: Book recommendation, please
Date: Tue, 14 Nov 2000 00:11:43 GMT

Thank you all for your suggestions.  I greatly appreciate it.


On Sun, 12 Nov 2000 19:25:51 GMT, "[EMAIL PROTECTED]"
[EMAIL PROTECTED] wrote:

My 16 year old son has become interested in cryptography (after
reading Cryptonomicon).  He is very computer literate, takes AP
Computer Science in school

Cryptography-Digest Digest #152, Volume #12   Mon, 3 Jul 00 12:13:00 EDT

Contents:
  Re: very large primes (Mark Wooding)
  Re: Tying Up Lost Ends III (SCOTT19U.ZIP_GUY)
  Re: Newbie question about factoring (Bob Silverman)
  Re: Encryption and IBM's 12 teraflop MPC.. ("Tony T. Warnock")
  Re: Newbie question about factoring (Nick Maclaren)
  Re: very large primes ("Tony T. Warnock")
  Re: Encryption and IBM's 12 teraflop MPC.. (Tom McCune)
  Re: Hashing Function (not cryptographically secure) ("Scott Fluhrer")
  Re: Tying Up Lost Ends (SCOTT19U.ZIP_GUY)
  research on cryptography ("Foo Kim Eng")
  Re: A simple all-or-nothing transform (Mok-Kong Shen)
  Re: research on cryptography (David A Molnar)
  Re: Crypto Contest: CHUTZPAH... ("Paul Pires")



From: [EMAIL PROTECTED] (Mark Wooding)
Subject: Re: very large primes
Date: 3 Jul 2000 12:24:41 GMT

Rick Pikul [EMAIL PROTECTED] wrote:

 IIRC, the proof by induction used the product of every *prime* up to
 and including the "last" prime.

It works either way.  As an example:

Let P be the `last' prime.  Then P! + 1 = 1 (mod n) for all integers
1 < n <= P, and hence no such integer n divides P! + 1.  Thus, P! + 1 has
at least one prime factor greater than P, contradicting the hypothesis.
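The argument can be checked numerically; here is a small illustration in Python (the choice P = 7 is mine, not from the original post):

```python
from math import factorial

P = 7                 # suppose, for contradiction, 7 were the "last" prime
n = factorial(P) + 1  # 7! + 1 = 5041

# P! is divisible by every d in 2..P, so P! + 1 leaves remainder 1 for each;
# hence no such d divides it.
assert all(n % d == 1 for d in range(2, P + 1))

# Therefore the smallest prime factor of P! + 1 exceeds P (here 5041 = 71 * 71).
f = next(d for d in range(2, n + 1) if n % d == 0)
assert f > P
```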

-- [mdw]

--

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Tying Up Lost Ends III
Date: 3 Jul 2000 13:21:43 GMT

[EMAIL PROTECTED] (John Savard) wrote in 
[EMAIL PROTECTED]:

On 2 Jul 2000 13:18:58 GMT, [EMAIL PROTECTED]
(SCOTT19U.ZIP_GUY) wrote, in part:

 In the previous two posts I cleaned up the lost Ends in John's work; in
the follow-up I will show what he did was a waste of time, and
unfortunately it is most likely a result of reading bad BS crypto books
that have led him astray.

Oh, before I ever saw Bruce's book, I went to University for several
years. Although I don't recall taking information theory in class,
that certainly did help me develop the mathematical background to
understand stuff I read about it on my own time.

  I guess the only good thing about that is you didn't read it on
someone else's time. I admit I would never buy the book but have
looked at various chapters in bookstores or public libraries.


So I suppose you could say that I've been "led astray" more
intensively.



  Too bad; a class on information theory would have helped you.
 



Of course, I was first exposed to the subject through a "crypto book",
but in this case, David Kahn's "The Codebreakers".

   The above book is a GOOD ONE, at least the earlier versions of
it. So I don't think the problem is from reading that one.


So one should not directly fault him
for his poor understanding of compression, encryption, and what the
definition of random means

This is polite of you, but I'm not sure you've convinced too many
people that my understanding is lacking.



  I think those that really have the ability to think about the problem
have already made up their minds. After all, is light a particle or a
wave? Or is it best to treat it as either, depending on the situation
at hand? However, the fact is my compression "adds ZERO information"
to a file, regardless of the true underlying nature of real files and
distributions as they actually exist, as even Tim Tyler has repeatedly
tried to tell you. I also doubt you considered this polite. But I am
amazed you can write so intensively on how one should compress yet
never really Tie Up Loose Ends or write code. There is no doubt in my
mind that, as much as you hate me, these last few letters may
Tie Up enough Loose Ends that you could actually write code to do
your form of compression, except for the so-called random bits you
need. That is going to be tough.
  But before you say mine is weak, at least have the balls to test
the code instead of hand waving. I feel it is a cheap shot to criticize
mine and then write about what you think is a better method without
ever Tying Up Loose Ends to the point where you can even write code.
It is easy to test my "adaptive 1-1 compression routines" to see if
they fail at adding ZERO information to a file. I wrote code, and
yes, people have discovered mistakes, but in every case where they found
an error it was fixed. It is hard to fix hand-waving crap or even test
it when you don't even have a working product, John. How much of your
coding do you want me to do? I won't do the whole thing. You have to
get off your ass and do some of the coding yourself. Especially
the adding and obtaining of the "random bits" you need for yours to even
function.



David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
Scott famous encryption website NOT FOR WIMPS **no JavaScript allowed**
http://members.xoom.com/ecil/index.htm
Scott rejected paper f

Cryptography-Digest Digest #152, Volume #11  Fri, 18 Feb 00 14:13:01 EST

Contents:
  Re: EOF in cipher??? (lordcow77)
  Re: Question about OTPs ("Dr.Gunter Abend")
  Re: OAP-L3 Encryption Software - Complete Help Files at web site (Lincoln Yeoh)
  Re: Method to break triple-DES (Mickey McInnis)
  Re: UK publishes 'impossible' decryption law ("shivers")
  Re: Processor speeds. (Mok-Kong Shen)
  Re: EOF in cipher??? (Mok-Kong Shen)
  Re: VB  Crypto (Glenn Larsson)
  Re: I stole also the diary and calendar of Markku J. Saarelainen (Mok-Kong Shen)
  Re: NTRU Speed Claims (100x faster, etc.), explained (Mike Rosing)
  Re: Basic Crypto Question 4 (Mok-Kong Shen)
  Re: Does the NSA have ALL Possible PGP keys? (Johnny Bravo)
  Re: EOF in cipher??? (Johnny Bravo)
  Re: Keys  Passwords. (Mok-Kong Shen)
  Re: Basic Crypto Question 4 (Mike Rosing)
  Re: EOF in cipher??? (Johnny Bravo)
  Re: EOF in cipher??? (Jerry Coffin)
  Re: Q: Division in GF(2^n) (Jerry Coffin)
  Re: EOF in cipher??? (Jerry Coffin)



Subject: Re: EOF in cipher???
From: lordcow77 [EMAIL PROTECTED]
Date: Fri, 18 Feb 2000 08:59:11 -0800

In article [EMAIL PROTECTED], Runu Knips
[EMAIL PROTECTED] wrote:
That's both against the standard. ISO specifies that int must be at
least 16 bit (i.e. it says short must be at least 16 bit, and
sizeof(int) >= sizeof(short), which means int is also at least 16
bit). A character value, however, is the smallest type, which has at
least 7 bit.

ISO C Standard 5.2.4.2.1:
"Their implementation-defined values shall be equal or greater in
magnitude (absolute value) to those shown, with the same sign...
number of bits for smallest object that is not a bit-field
CHAR_BIT 8"

"If the value of an object of type char is treated as a signed
integer when used in an expression, the value of CHAR_MIN shall be
the same as that of SCHAR_MIN and the value of CHAR_MAX shall be the
same as that of SCHAR_MAX.  Otherwise, the value of CHAR_MIN shall
be 0 and the value of CHAR_MAX shall be the same as that of
UCHAR_MAX.  The value UCHAR_MAX+1 shall equal 2 raised to the power
of CHAR_BIT."



* Sent from RemarQ http://www.remarq.com The Internet's Discussion Network *
The fastest and easiest way to search and participate in Usenet - Free!


--

From: "Dr.Gunter Abend" [EMAIL PROTECTED]
Subject: Re: Question about OTPs
Date: Fri, 18 Feb 2000 18:15:13 +0100

John Savard wrote:
 
Is that still true if you compress the plaintexts beforehand?
Especially if you use a compression algorithm that nearly
completely removes the redundancy of the original byte streams.
 
 That is sort of the complementary case to the OTP.
 
 Zero-redundancy plaintext, on the other hand, could be
 encrypted unbreakably even by a trivial system of encryption,
 since even if it was easy to try all possible keys, every key
 would yield a possible plaintext.

Oh, really?  If you use a home-made compression program, unknown
to the eavesdropper, it probably will compress ineffectively, so
that a statistical test gives a result; and if you use a
commercially available packer, he may try all possibilities in
order to *see* the plaintext.  I fear this trick will *not* solve
Arthur Dardia's problem.  It only makes it harder to detect a
probable plaintext, because all possible packers have to be
tested for all possible decryptions.

 The point is that the existence of such a case is not an
 argument against worrying about making a mistake that totally
 vitiates the security of an OTP.

No, of course. I only asked whether *simple* tools exist for
breaking impaired "OTP" of compressed messages.

Ciao,   Gunter

--

From: [EMAIL PROTECTED] (Lincoln Yeoh)
Crossposted-To: talk.politics.crypto,alt.privacy
Subject: Re: OAP-L3 Encryption Software - Complete Help Files at web site
Date: Fri, 18 Feb 2000 17:16:32 GMT
Reply-To: [EMAIL PROTECTED]

On Thu, 17 Feb 2000 22:43:42 -0800, Anthony Stephen Szopa
[EMAIL PROTECTED] wrote:

Convince us.  Prove your position.  Demonstrate that the software 
is "garbage."

Hmm, didn't we go through this some time back in October 1999?
(See "OAP-L3: How Do You Spell S-H-A-R-E-W-A-R-E".)

I ran your software through the LOAP-L3 (Link's Omniscient Analysis Program
- Level 3).
e.g.:

Sample output:
C:\LOAP-L3.EXE
Beginning analysis (please wait)
.
Done!

Result: "OAP-L3 is weak crypto/snake oil"

Thank you for using LOAP-L3!
C:\

As you can see, LOAP-L3 is a very useful piece of software. However you
aren't allowed to reverse engineer it to find out how it does what it does.
The source code and the algorithm used are top secret. I'm telling you it
works though! And I know what I'm talking about. Really! Honestly! Other
people are wrong; if they disagree they haven't looked at it thoroughly
enough.

Ha

Cryptography-Digest Digest #152, Volume #10   Wed, 1 Sep 99 06:13:03 EDT

Contents:
  ignore this -- testing Free Agent (Ranche)
  Protecting license information (Ranche)
  Re: public key encryption - unlicensed algorithm (David A Molnar)
  Re: public key encryption - unlicensed algorithm (Paul Rubin)
  Re: Schneier/Publsied Algorithms (Eric Lee Green)
  Ratio plain/ciphertext (Ranche)
  Re: Can we have randomness in the physical world of "Cause and Effect" ? (sb5309)
  Re: WT Shaw temporarily sidelined (don garrisan)
  Re: n-ary Huffman Template Algorithm (Alex Vinokur)
  Re: public key encryption - unlicensed algorithm (Tony L. Svanstrom)
  Re: What if RSA / factoring really breaks? (Nicolas Bray)
  Re: Correction to Uhr Box Description (Frode Weierud)
  Re: Pincodes (Volker Hetzer)
  Re: public key encryption - unlicensed algorithm (Tony L. Svanstrom)
  SHA-1 and Blowfish in portable 80x86 Assembler ("Tomas FRYDRYCH")
  Re: RC4 question ("David Bourgeois")
  Matrix Exponentiation (Gary)



From: [EMAIL PROTECTED] (Ranche)
Subject: ignore this -- testing Free Agent
Date: Wed, 01 Sep 1999 03:15:09 GMT
Reply-To: [EMAIL PROTECTED]

I told you to ignore this :o)
---Ranche


--

From: [EMAIL PROTECTED] (Ranche)
Subject: Protecting license information
Date: Wed, 01 Sep 1999 03:23:55 GMT
Reply-To: [EMAIL PROTECTED]

I'd like to provide different versions of my application using some sort of license scheme.
I want to make sure that nobody except me can create valid licenses.
What is the best way to achieve this?

--Ranche



--

From: David A Molnar [EMAIL PROTECTED]
Subject: Re: public key encryption - unlicensed algorithm
Date: 1 Sep 1999 03:16:12 GMT

Paul Rubin [EMAIL PROTECTED] wrote:
 hope the situation is similar in the UK, if a cc# does get stolen then
 the customer is not liable for any amount above 50 USD.  Some

I've been told that this is generally *not* the case in Europe and Asia.
If you lose your credit card, you are potentially liable to the credit
limit. A reference confirming or denying this would be nice. 

Then again, this came up during a presentation on SET, so perhaps it's
slanted towards impressing the audience with the need for SET. 

-David

--

From: [EMAIL PROTECTED] (Paul Rubin)
Subject: Re: public key encryption - unlicensed algorithm
Date: 1 Sep 1999 03:03:13 GMT

shivers [EMAIL PROTECTED] wrote:
This sounds useful - however I have no control over what server software
our ISP uses - sounds like fun though ;)

Most current SSL servers support SGC.  If your ISP is running IIS,
Netscape, Stronghold, etc., then it's just a matter of installing an
SGC certificate for your domain into the server instead of an ordinary
certificate.

would be good - but our provider is in the US - the only place to get decent
facilities at a nice price ;)

It sounds like you're doing a fairly small-time web retailing
operation or something of that sort.  You have to ask yourself how
important fancy cryptography really is.  Stolen credit card numbers
are only worth around 7 USD each on the black market, so even with 40
bit SSL it's not worth the computing resources for anyone to steal
them by brute force cryptanalysis of 40-bit SSL keys.  Plus, they'd
have to somehow capture the data going into your server.  

The customer is generally protected anyway.  At least in the US, and I
hope the situation is similar in the UK, if a cc# does get stolen then
the customer is not liable for any amount above 50 USD.  Some
merchants like Amazon will fully guarantee the customer (i.e. for the
50 USD) and so far I don't think anyone ever has had to pay a single
penny on this type of guarantee.  Finally, consumers for the most part
aren't aware of this issue at all.  They see the closed "lock" icon in
their browser and feel safe--and in fact they are safe (because of the
bank guarantees, not the 40 bit cryptography).  The people who care
about SGC are mostly institutions like banks, because -they-, not
consumers or merchants, are the ones who get creamed if cc#'s get
stolen.  

Customers are much more likely to be scared away by special security
applets and similar weirdness than by the exportable cryptography in
their web browser.  If they care about the cryptography they can
always upgrade their browser to 128 bits (www.replay.com,
www.fortify.net, etc.)  And newer browsers (Netscape 4.6, etc.) use 56
bits instead of 40, as permitted by recent relaxation of the export
regs.  56 bit keys are not strong enough to protect really valuable
secrets, but they're far too strong to be worth breaking to get single
credit card numbers (remember that every SSL session uses a new 56 bit
key).

Summary: I think you're worrying about a problem that just doesn't
affect you in any practical way.  Unless there's something out of the
ordinar

Cryptography-Digest Digest #152, Volume #9   Sat, 27 Feb 99 12:13:03 EST

Contents:
  Re: Quantum Computation and Cryptography (R. Knauer)
  Re: Quantum Cryptography (R. Knauer)
  Question on Pentium III unique ID ("Robert C. Paulsen, Jr.")
  Re: True Randomness (R. Knauer)
  Re: True Randomness (R. Knauer)
  Re: Any idea what this might be??? (Marc Hoeppner)
  Re: True Randomness (R. Knauer)
  Re: Testing Algorithms [moving off-topic] (R. Knauer)
  Re: Legal procedures for using third party crypto? (fungus)
  Re: Testing Algorithms (R. Knauer)
  Re: Question on Pentium III unique ID (Myself)
  Quantum Randomness (R. Knauer)
  Skipjack anyone? (Federico Ulivi)
  16Bit RC4 (iLLusIOn)
  Re: Testing Algorithms (fungus)



From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Quantum Computation and Cryptography
Date: Sat, 27 Feb 1999 12:29:48 GMT
Reply-To: [EMAIL PROTECTED]

On 26 Feb 1999 13:24:37 GMT, [EMAIL PROTECTED] (Coen Visser)
wrote:

Quantum computers can factorize faster than conventional computers. However
in general[*] they reduce the time complexity of a problem with just a square
factor. So if today a 128 bit key would be sufficiently secure, a 256 bit key
could be secure even when quantum computers are available.

Shor's quantum factoring algorithm is exponentially faster. I believe
you are referring to Grover's database-search algorithm.

Bob Knauer

"If you want to build a robust universe, one that will never go wrong, then
you don't want to build it like a clock, for the smallest bit of grit will
cause it to go awry. However, if things at the base are utterly random, nothing
can make them more disordered. Complete randomness at the heart of things is the
most stable situation imaginable - a divinely clever way to build a universe."
-- Heinz Pagels


--

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Quantum Cryptography
Date: Sat, 27 Feb 1999 12:39:26 GMT
Reply-To: [EMAIL PROTECTED]

On Sat, 27 Feb 1999 02:51:23 GMT, "Douglas A. Gwyn" [EMAIL PROTECTED]
wrote:

 "Moreover, the United States government is quietly funding research
 in code-breaking, using quantum computers".

So?  It would be criminally negligent to ignore potentially relevant
technology.

Hey, I am all for it. Better our govt than someone else.

The govt is the only entity that can afford to develop such
contraptions - what with an inexhaustible supply of other people's
money.

Actually, quantum computing is a fairly hot research topic in the
public sector.  The basic notion is to obtain massive parallelism
by encoding quantum states.

That's David Deutsch's original notion, and has been replaced by true
quantum computing, which not only takes advantage of massively
parallel superpositions (Deutsch's original notion) but also quantum
interference. It is quantum interference that makes the quantum
computer remarkable.

I think there was a Sci.Am. article on
this not too long ago, or I'm sure a Web search would turn up info.

As I have mentioned several times on sci.crypt, I recommend the book
entitled "Explorations In Quantum Computing" by Colin Williams and
Scott Clearwater. There are others, as you will discover if you visit
amazon.com.

But this book has a CD which has programs to run simulations. You can
actually see it factor (small) numbers on a probabilistic basis
running Shor's algorithm in simulation.

Bob Knauer

"If you want to build a robust universe, one that will never go wrong, then
you don't want to build it like a clock, for the smallest bit of grit will
cause it to go awry. However, if things at the base are utterly random, nothing
can make them more disordered. Complete randomness at the heart of things is the
most stable situation imaginable - a divinely clever way to build a universe."
-- Heinz Pagels


--

From: "Robert C. Paulsen, Jr." [EMAIL PROTECTED]
Subject: Question on Pentium III unique ID
Date: Sat, 27 Feb 1999 06:43:21 -0600

Perhaps I am missing something.

Many (most?) systems today already have a unique ID that is software
accessible -- the MAC address of the network card.

In what way does the Pentium III add any additional privacy concern?

-- 
Robert Paulsen http://paulsen.home.texas.net
If my return address contains "ZAP." please remove it. Sorry for the
inconvenience but the unsolicited email is getting out of control.

--

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: True Randomness
Date: Sat, 27 Feb 1999 13:03:18 GMT
Reply-To: [EMAIL PROTECTED]

On Sat, 27 Feb 1999 03:26:53 GMT, "Douglas A. Gwyn" [EMAIL PROTECTED]
wrote:

Maximum likelihood is a standard goal (there are others) for
estimation of model parameters:  determine the parameters that
maximize the likelihood of the observed training data.

I believe that