Cryptography-Digest Digest #500, Volume #11       Thu, 6 Apr 00 08:13:02 EDT

Contents:
  Re: ISO articles on sieving hardware (Francois Grieu)
  Re: Magnetic Remenance on hard drives. (Harald Milz)
  Re: Public|Private key cryptography? ([EMAIL PROTECTED])
  Re: Q: Entropy (Mok-Kong Shen)
  Re: Q: Entropy (Mok-Kong Shen)
  Re: Q: Simulation of quantum computing (Mok-Kong Shen)
  Re: Q: Entropy (Mok-Kong Shen)
  Re: Q: Entropy (Mok-Kong Shen)
  Re: Q: Entropy (Niklas Frykholm)
  Re: Stolen Enigma (Anonymous Sender)
  Re: Q: Simulation of quantum computing ("J.Wallace")
  Re: RNG based on Blum Blum Shub (Mark Wooding)
  Re: Stolen Enigma (Arturo)
  Re: Key exchange using Secret Key Encryption ([EMAIL PROTECTED])
  Re: Public|Private key cryptography? (Tom St Denis)

----------------------------------------------------------------------------

From: Francois Grieu <[EMAIL PROTECTED]>
Subject: Re: ISO articles on sieving hardware
Date: Thu, 06 Apr 2000 09:27:10 +0200

Here are the articles I found so far in my quest for existing work on 
dedicated hardware for the sieving phase of MPQS and NFS.



Carl Pomerance, J. W. Smith, Randy Tuler: A Pipeline Architecture for 
Factoring Large Integers with the Quadratic Sieve Algorithm. SIAM J. 
Comput. 17(2): 387-403 (1988)
[I couldn't get a copy of this one.]

--

Hea Joung Kim: Data-Specific Number Factoring on Context Switching 
Configurable Computers (1998)
<http://www.sanders.com/csrc/Pubs/dsnf.pdf>

related links:
<http://www.darpa.mil/ito/psum1998/E223-0.html>
<http://www.sanders.com/csrc/index.html>
[BTW these pages give clear evidence that US government agencies are 
active in developing reconfigurable hardware, with possible applications 
to factoring and cryptanalysis]

--

Adi Shamir: Factoring Large Numbers with the TWINKLE Device, Proceedings 
of the CHES conference, Lecture Notes in Computer Science 1717, 
Springer, August 1999.
<http://jya.com/twinkle.zip>

--

Arjen K. Lenstra, Adi Shamir: Analysis and Optimization of the TWINKLE 
Factoring Device, to appear at Eurocrypt 2000.

--


Additional articles are welcome.

  Francois Grieu

------------------------------

From: Harald Milz <[EMAIL PROTECTED]>
Crossposted-To: alt.security.pgp,comp.security.pgp.discuss
Subject: Re: Magnetic Remenance on hard drives.
Date: 6 Apr 2000 05:41:30 GMT
Reply-To: [EMAIL PROTECTED]

In comp.security.pgp.discuss Thor Arne Johansen <[EMAIL PROTECTED]> wrote:

> No cash price, but a huge commercial potential.
> The (data recovery) company I work for would certainly be interested in such
> technology.

AFAIK this type of service is offered by a number of data recovery shops.
Dunno if they actually _deliver_, though. 

-- 
"Is it time for another one of these already? Oh, bother."
-- Bruce Schneier posting to sci.crypt, August 8, 1997 - in response to the
Szopa quote :-)

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Public|Private key cryptography?
Date: Thu, 6 Apr 2000 07:38:12 GMT

###
Tom St Denis wrote:
> This is complete rubish.  Factoring numbers 1024+ bits is not possible.
> So your 'statistics' are moot.

maybe,
but its not *my* 'statistics',
i told the sources:
 http://www.users.globalnet.co.uk/~firstcut/pgpfaq.html
 http://www.Certicom.com/ecc/wecc4.htm

== <EOF> ==
Disastry  http://i.am/disastry/
remove .NOSPAM.NET for email reply
### end pegwit v8 signed text
cc8245695ff62064f70ad19e15233e5de7aabbf74c964c48bad1e78f527e
6bbac5206a3d64f14a424c6c698defb481ab98070f9f9b46a7da7cb3f447

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Q: Entropy
Date: Thu, 06 Apr 2000 10:16:52 +0200

Johann Hibschman wrote:
> 
> Mok-Kong Shen writes:
> 
> > Sorry that I didn't carefully read your follow-up. But it seems
> > to be disputable why you do lumping based on the number of 1's (or
> > equivalently 0's). The special sequence 01010101....... has a
> > very regular pattern, while anothers with the same number of 1's
> > may not. I am not sure that disregarding such properties wouldn't
> > mean neglecting some essential crypto-relavant aspects.
> 
> Pattern?  I care not for pattern.  Or at least entropy doesn't, a
> priori.  ;-)
> 
> > One way that I could think of (though practically not realizable)
> > is to take the Kolmogorov complexity to measure the information
> > content instead of the entropy or other concepts that could be
> > traced back to Shannon. But I don't know whether this is o.k. from
> > the standpoint of crypto.
> 
> From the little bit I know of it, Kolmogorov complexity is closer to
> what you're asking for, not entropy.  The Kolmogorov complexity will
> give you a measure of the regularity of a given bit-string, while the
> concept of entropy only applies to *ensembles* of bit strings.
> 
> You can define the entropy involved in rolling a 7 on 3 dice.  But you
> can't define the entropy of rolling 1 2 4 in an meaningful way.
> Similarly, you could define the Kolmogorov complexity of the sequence
> 1 2 4, but you can't define the complexity of rolling a 7.
> 
> In short, the question you're asking (i.e. what's the entropy of a
> given string) just doesn't make any sense.

I don't exclude the possibility that differences between the
disciplines matter essentially here. For cryptology, whether a bit
sequence being used has patterns is an important question. A sequence
of 010101.... is virtually useless for encryption purposes. You seem
to claim that a particular sequence as such (your example 1 2 4)
has no entropy. I must say that I am not yet sure one way or the
other (that's one reason why I initiated this thread). But let me
quote from Schneier's Applied Cryptography (which may not be
familiar to you physicists but is well-known in cryptology):

     Formally, the amount of information in a message M is
     measured by the entropy of a message.

This means that a single (particular) message does have entropy.
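
For reference, the entropy Schneier refers to is computed from the
probability distribution over the possible messages a source can emit,
not from the bits of one particular message; a minimal sketch in Python
(the example distributions are purely illustrative):

    import math

    def shannon_entropy(probs):
        """Entropy in bits of a source emitting messages with these
        probabilities."""
        return -sum(p * math.log(p, 2) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))   # fair coin toss: 1.0 bit
    print(shannon_entropy([0.9, 0.1]))   # biased coin: about 0.47 bits
    # A source uniform over all 2^40 strings of 40 bits has
    # -2^40 * (2^-40 * log2(2^-40)) = 40 bits of entropy, the same
    # whether a particular outcome happens to be 000...0 or not.

How (or whether) to attach a number to one specific string is exactly
what the Kolmogorov-complexity view discussed above tries to capture.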

M. K. Shen

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Q: Entropy
Date: Thu, 06 Apr 2000 10:17:32 +0200

Bryan Olson wrote:
> 

> > According to the above, the answer should be yes. Now consider
> > applying compression to both, then they are likely to get compressed
> > to different lengths.
> 
> No.  The identity function is an optimal compressor for the
> given language (all n-bit strings equally likely).  In fact,
> if shorter strings are more likely than longer strings, then
> any compression function for which some string has a shorter
> image must have a worse expected compression ratio than the
> identity function.
> 
> Arbitrary compression functions will produce differing
> relative image lengths.  For any finite string there's some
> compressor that compresses it to one bit.
> 
> > Would that fact conform to their having
> > the same entropy?
> 
> The actual situation is consistent.
> 
> My previous posts answers the question of why a conventional
> compression algorithm, which is not optimal for the given
> language, does so much better on the sample string.

Which one is your 'previous post'? I didn't see any previous post
of yours that mentioned the word 'compression'.

I don't understand your point. True, for a given particular/specific
sequence (with consequently given length!) you could compress it to
1 bit with a compressor 'specially designed' for that unique case.
But I was talking about compression in general. Just consider any of
the publicly available compression schemes. I am quite sure that
sequences of all 0s, or 010101....., etc. (in fact all sequences in
which some pattern can easily be discerned) get fairly well compressed.
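
The claim is easy to check with an off-the-shelf compressor; here is a
small sketch using Python's zlib (the choice of compressor and of the
test strings is just for illustration):

    import os, zlib

    patterned = b'01' * 500          # "0101...", 1000 bytes
    all_zero  = b'0' * 1000
    random_bytes = os.urandom(1000)  # unpredictable bytes

    for name, data in [('0101...', patterned),
                       ('all 0  ', all_zero),
                       ('random ', random_bytes)]:
        print(name, len(data), '->', len(zlib.compress(data, 9)))

    # Typically the patterned strings shrink to a handful of bytes,
    # while the random bytes do not compress at all (they even grow
    # slightly, because of the zlib header).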

M. K. Shen

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Q: Simulation of quantum computing
Date: Thu, 06 Apr 2000 10:16:44 +0200

Bill Unruh wrote:
> 

> 
> >If no, does that imply that a Turing machine can't simulate a
> >quantum computer and hence that the latter is more powerful?
> 
> A Turing machine with a random oracle can completely (though slowly)
> simulate a quantum computer, including the readout. If you do not care
> about the readout you do not need the random number generator -- i.e.
> the classical computer can also tell you what the probabilities of
> various outputs are, something a quantum computer "does not do". (Of
> course a classical computer can be simulated with a quantum computer
> as well.)

The question involved a 'Turing machine', not a 'Turing machine WITH
a random oracle'! If I had had a 'random oracle', then I could
certainly have generated my truly random bits with the help of that
oracle :-)

M. K. Shen

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Q: Entropy
Date: Thu, 06 Apr 2000 10:17:38 +0200

Tony T. Warnock wrote:
> 
> Mok-Kong Shen wrote:
> 
> > Tony T. Warnock wrote:
> > >
> > > >
> > > > > It's the process that is "random" not the sequence that comes from the
> > > > > process. If the process is the usual "fair coin toss" then everything is OK.
> > > > > Probability (in this sense) is a priori. One doesn't get 40 heads in a row
> > > > > from an unbiased coin.
> > > >
> > > > I don't understand your last sentence. That probability is 2^(-40),
> > > > isn't it?
> > > >
> > > > M. K. Shen
> > >
> > > Yes. Really small. The point is, in practice one does not see things this rare.
> > >
> > > On a related issue. Not only will a team (finite) of monkeys not
> > > type the works of Shakespeare, they won't even get the first 50
> > > digits of pi. A team of screen writers could come up with "Romeo
> > > and Ethel, the Pirates Daughter" perhaps.
> >
> > But note that any other bit pattern has exactly the same (small)
> > probability!
> 
> Exactly!
> 
> One must not confuse symmetry (or interest) with probability. All
> zeros is interesting because of the way it combines with other
> numbers, not because of its probability.

I am afraid that, since you agree that all patterns have the same 
probability (i.e. you have now withdrawn your claim that 40 zeros
were not possible), I currently don't understand what exactly your
point is in the argumentation of the present thread. Could you state
it again, so that we can discuss further? Thanks in advance.

M. K. Shen

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Q: Entropy
Date: Thu, 06 Apr 2000 10:18:11 +0200

David Hopwood wrote:
> 

> Yes - which means that if you fix any specific pattern of 40 outcomes in
> advance, you won't see that pattern from an unbiased coin, either.

In any scientific experiment one never (CAN never) 'fix' any 
results in advance. (Fortune-tellers, of course, do otherwise.)
The result of a scientific experiment comes out only AFTER the
experiment is done, as far as I am aware.

M. K. Shen

------------------------------

From: [EMAIL PROTECTED] (Niklas Frykholm)
Subject: Re: Q: Entropy
Date: 6 Apr 2000 08:16:48 GMT

>> Suppose by chance I get
>> the number 0, i.e. all n bits are 0. Should I still consider the
>> sequence to have some entropy and, in particular, the same
>> entropy as a sequence having an apparently fairly random pattern
>> of 0 and 1? Thanks.
>
>It's the process that is "random" not the sequence that comes from the
>process. If the process is the usual "fair coin toss" then everything is OK.
>Probability (in this sense) is a priori. One doesn't get 40 heads in a row
>from an unbiased coin.

Well, let's take a more practical example: suppose you order an ATM card 
from a bank, and when you get it you find that the associated (computer 
generated) PIN code is "1234". Would you be satisfied with this?

It is true that entropy is in the process and not in the sequence, but in 
practice it is usually the sequence that matters, because the attacker will 
not have any knowledge of the process. If an account has the password "1234", 
the chance that it is broken is independent of the process used to generate 
"1234", whether it has an entropy of 2 bits, 10 bits or 50 bits.

Of course, if we ask ourselves before password generation, "what is the 
chance that the attacker can guess _a_ password generated by this 
process?", the security is determined by the entropy. But if we ask 
ourselves after the generation of a password, "what is the chance that 
the attacker will guess _this_ password?", the chance will be greater 
for passwords containing some sort of "pattern" and smaller for the 
other passwords. However, I can't see any practical way of measuring 
this chance, since it is largely subjective.

The example above shows that the a posteriori probability might matter. 
The bank might reason: it doesn't matter if 1/10000 of the customers get 
the PIN code "1234", for there will always be a 1/10000 chance of 
guessing someone's PIN code. However, you may not be happy to be that 
one customer in 10000.

Does anyone know how banks actually deal with this? Do they avoid PINs such as 
"1111", "1234", etc?

// Niklas




------------------------------

Date: Thu, 6 Apr 2000 04:56:52 -0400
From: Anonymous Sender <[EMAIL PROTECTED]>
Subject: Re: Stolen Enigma

>> Apparently the stolen Enigma was an "Abwehr" Enigma, as
>> described in a recent Cryptologia article.  I can well
>> believe that there are only 3 (or maybe now, 2) Abwehr
>> Enigmas left in the world.  A garden variety Wehrmacht
>> Enigma, like my friend Fred bought a decade or so back,
>> costs as much a new car, I suppose, and is no great rarity.
>> But this one was different.
> 
> Right. The stolen machine was an "Abwehr" 3 rotor Enigma serial number
> G-312 . A full description and photographs of this machine are given
> in David Hamer's article :G-312: An Abwehr Enigma", Cryptologia, Vol.
> 24(1), January 2000."
> 
> Hopefully, with the entire community alerted, the machine can be
> recovered safe and sound.

Hmmmm, history repeats itself as an Enigma machine is stolen. But
aren't the thieves 60 years behind the curve? They can't be planning
to actually USE it unless they already have a similar unit in their
possession. And why use a system that can be so easily broken now with
the computer tools available?

Stealing an Enigma machine these days is like stealing a Rembrandt --
too tough to fence, too risky to share, only good for keeping to yourself
smugly enjoying the fact that you have it and no one else does.






------------------------------

From: "J.Wallace" <[EMAIL PROTECTED]>
Subject: Re: Q: Simulation of quantum computing
Date: Thu, 06 Apr 2000 10:43:05 +0100

Mok-Kong Shen wrote:
> 
> Many thanks for the information. But could you please also say
> whether one could through such simulation (though very inefficient)
> obtain truly random bits? (With a classical computer and classical
> programming one can only get pseudo-random bits.)

No, you could not obtain truly random bits through this type of 
simulation.  As you say yourself, with a classical computer and 
classical programming you can only get pseudo-random bits.  The 
quantum computer simulators that have been written are classical 
programs running on classical computers.
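
A toy illustration of the point: a classical simulator that "measures"
a qubit in an equal superposition has to fall back on a pseudo-random
generator, so its outcomes are completely determined by the seed. The
sketch below (plain Python, not any real simulator) makes that visible:

    import random

    def measure_plus_state(rng):
        """Simulate measuring (|0> + |1>)/sqrt(2): each outcome has
        probability 1/2, drawn from a *pseudo*-random source."""
        return 0 if rng.random() < 0.5 else 1

    rng1 = random.Random(12345)
    rng2 = random.Random(12345)
    bits1 = [measure_plus_state(rng1) for _ in range(20)]
    bits2 = [measure_plus_state(rng2) for _ in range(20)]
    print(bits1 == bits2)   # True: same seed, same "measurement" results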

Julia

------------------------------

From: [EMAIL PROTECTED] (Mark Wooding)
Subject: Re: RNG based on Blum Blum Shub
Date: 6 Apr 2000 10:01:29 GMT

Tom St Denis <[EMAIL PROTECTED]> wrote:

> Well the plus side, the period is known to be at most p-1, since a
> primitive generator mod p will make all elements mod p at some point.

This is true.  In fact, there's a lot of existing theory you can apply
to the generator, because it's an honest old-fashioned linear-
congruential.  See Knuth for its randomness properties and loads of
literature for why you shouldn't use one for cryptography.  See, for
example, `Lattice Reduction: a Toolbox for the Cryptanalyst' by Antoine
Joux and Jacques Stern, <http://www.dmi.ens.fr/~stern/data/JS94.ps>,
which explains why giving out any fixed number of bits from an LCG is a
bad idea.
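
To make the warning concrete, here is a sketch of a generic textbook
LCG (arbitrary parameters, not Tom's actual construction): anyone who
knows the parameters and sees full outputs can predict the rest of the
stream immediately, and the paper above shows that even handing out
only some fixed number of bits of each output does not save you.

    # Textbook LCG: x_{n+1} = (a*x_n + c) mod m.
    A, C, M = 1664525, 1013904223, 2**32   # illustrative parameters

    def lcg(seed):
        x = seed
        while True:
            x = (A * x + C) % M
            yield x

    g = lcg(123456789)
    outputs = [next(g) for _ in range(3)]

    # An observer who sees one full output can reproduce the stream
    # exactly -- no secret state remains:
    predicted = (A * outputs[0] + C) % M
    print(predicted == outputs[1])   # True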

-- [mdw]

------------------------------

From: [EMAIL PROTECTED]=NOSPAM (Arturo)
Subject: Re: Stolen Enigma
Date: Thu, 06 Apr 2000 10:46:54 GMT

On Thu, 6 Apr 2000 04:56:52 -0400, Anonymous Sender
<[EMAIL PROTECTED]> wrote:


>Stealing an Enigma machine these days is like stealing a Rembrandt --
>too tough to fence, too risky to share, only good for keeping to yourself
>smugly enjoying the fact that you have it and no one else does.
        ... and with a high potential value.  I'm not sure who would want to
buy it on the black market, but IIRC an encryption machine similar to Enigma
was sold last year for a good price.  Perhaps they wanted to get a ransom.


------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Key exchange using Secret Key Encryption
Date: 6 Apr 2000 11:14:13 GMT

>>Consider, for instance, the Windows 2000 terminal server. The protocol
>>used to set up a "secure" connection between a thin client and the
>>terminal server uses the Diffie-Hellman protocol or something similar
>>for the key exchange. No certificates or digital signatures.
>>It is wide open to a man-in-the-middle attack.
>>
>>-Erik Runeson
>
>What are you talking about?
>
>The Diffie-Hellman protocol is primarily used to exchange secret session
>keys amongst parties who already have each other's public Diffie-Hellman
>keys.

Exactly the problem. Citrix's Secure ICA protocol for thin clients (and
probably Microsoft's RDP, though the details are not published) uses the
Diffie-Hellman protocol without any prior exchange of information (e.g.
public keys). Key exchange messages are not signed, and my conclusion is
therefore that a man-in-the-middle attack is easy to implement.

See www.citrix.com/support/solution/sol00044.htm for details on the
implementation of Secure ICA.
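
For anyone who has not seen why unauthenticated Diffie-Hellman fails,
here is a toy sketch with tiny numbers (parameters chosen only for
illustration; this is not modelled on the actual ICA or RDP exchange):

    import secrets

    p, g = 2579, 2   # toy prime and generator, far too small for real use

    def priv():
        return secrets.randbelow(p - 2) + 1

    def pub(x):
        return pow(g, x, p)

    def shared(peer_pub, x):
        return pow(peer_pub, x, p)

    a, b, m = priv(), priv(), priv()    # Alice, Bob, Mallory

    # Alice sends pub(a); Mallory intercepts it and forwards pub(m).
    # Bob sends pub(b);   Mallory intercepts it and forwards pub(m).
    k_alice   = shared(pub(m), a)       # key Alice thinks she shares with Bob
    k_bob     = shared(pub(m), b)       # key Bob thinks he shares with Alice
    k_m_alice = shared(pub(a), m)       # Mallory's key towards Alice
    k_m_bob   = shared(pub(b), m)       # Mallory's key towards Bob

    # Mallory holds a key with each side and can decrypt, read and
    # re-encrypt all traffic; neither endpoint notices anything, which
    # is why the exchange needs signatures or certificates.
    print(k_alice == k_m_alice, k_bob == k_m_bob)   # True True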

-Erik Runeson



     -----  Posted via NewsOne.Net: Free Usenet News via the Web  -----
     -----  http://newsone.net/ --  Discussions on every subject. -----
   NewsOne.Net prohibits users from posting spam.  If this or other posts
made through NewsOne.Net violate posting guidelines, email [EMAIL PROTECTED]

------------------------------

From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: Public|Private key cryptography?
Date: Thu, 06 Apr 2000 11:52:27 GMT

What you are seriously missing is the fact that the SPACE required to
factor 1000+ bit numbers is insane [over 2^64] and is not attainable.
While you could sieve it faster, you couldn't store the information
required to solve the puzzle.

So those statistics are meaningless.  It's like saying a 300-bit
symmetric key is more secure than a 256-bit one: you can't search
either keyspace, so what's the point?

I do agree that you can get by with smaller keys using ECC, such as
160-200 bit keys, but a 200-bit ECC key is not the same as a 20000-bit
RSA key, as they would want you to believe.

Tom

[EMAIL PROTECTED] wrote:
> 
> ###
> Tom St Denis wrote:
> > This is complete rubish.  Factoring numbers 1024+ bits is not possible.
> > So your 'statistics' are moot.
> 
> maybe,
> but its not *my* 'statistics',
> i told the sources:
>  http://www.users.globalnet.co.uk/~firstcut/pgpfaq.html
>  http://www.Certicom.com/ecc/wecc4.htm
> 
> == <EOF> ==
> Disastry  http://i.am/disastry/
> remove .NOSPAM.NET for email reply
> ### end pegwit v8 signed text
> cc8245695ff62064f70ad19e15233e5de7aabbf74c964c48bad1e78f527e
> 6bbac5206a3d64f14a424c6c698defb481ab98070f9f9b46a7da7cb3f447

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************

Reply via email to