Cryptography-Digest Digest #155, Volume #12       Mon, 3 Jul 00 23:13:00 EDT

Contents:
  Re: Tying Up Lost Ends III (John Savard)
  Re: Use of EPR "paradox" in cryptography (John Savard)
  Re: TC5 Completed Paper (David A. Wagner)
  Re: A thought on OTPs ("Douglas A. Gwyn")
  Re: Tying Up Lost Ends (John Savard)
  Re: cray and time needed to attack (Mist)
  Re: cray and time needed to attack (Mist)
  Public-domain Blowfish (Future Beacon)
  Re: Quantum Computing (Was: Newbie question about factoring) (Jerry Coffin)
  Re: Encryption and IBM's 12 teraflop MPC...... (Jerry Coffin)
  Re: cray and time needed to attack (Jerry Coffin)
  Re: Newbie question about factoring (Bob Silverman)
  Re: An encryption protocol, version 2 (Dido Sevilla)
  Re: Tying Up Lost Ends III ("Douglas A. Gwyn")
  Re: Tying Up Lost Ends (SCOTT19U.ZIP_GUY)
  Re: Use of EPR "paradox" in cryptography ("Douglas A. Gwyn")
  Re: Public-domain Blowfish (Boris Kazak)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Tying Up Lost Ends III
Date: Mon, 03 Jul 2000 23:20:12 GMT

On Mon, 3 Jul 2000 18:19:49 GMT, "Douglas A. Gwyn" <[EMAIL PROTECTED]>
wrote, in part:
>"SCOTT19U.ZIP_GUY" wrote:

>> After all is light a particle or a wave.

>It seems to be a particle guided by a wave.

Don't get me started on the Copenhagen interpretation.

John Savard (teneerf <-)
http://www.ecn.ab.ca/~jsavard/crypto.htm

------------------------------

From: [EMAIL PROTECTED] (John Savard)
Crossposted-To: sci.physics
Subject: Re: Use of EPR "paradox" in cryptography
Date: Mon, 03 Jul 2000 23:18:40 GMT

On Mon, 3 Jul 2000 18:30:10 GMT, "Douglas A. Gwyn" <[EMAIL PROTECTED]>
wrote, in part:
>John Savard wrote:

>> However, the fact that one measurement affects the other,
>> yet at a speed faster than light, ...

>Not really, as witnessed by the fact that one cannot exploit
>EPR to send controlled information faster than light.

Yes, one cannot do that. However, the one measurement still does
affect the other measurement, since if both measurements are of the
same polarity component, the results always match.

But if the polarizations measured at the two detectors differ by an
intermediate angle, instead of the 90 degrees which is the other
possibility in quantum cryptography, the rate at which disagreements
(polarizations that differ by 170 degrees instead of 10 degrees)
appear increases too slowly, as the square of the angle instead of
linearly with the angle, for a local hidden-variables theory to
explain the perfect agreement at alignment. (This is the relationship
known as Bell's inequality.)
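The quadratic-versus-linear point is easy to check numerically. A minimal sketch, assuming the textbook prediction for polarization-entangled photon pairs (disagreement rate equal to sin^2 of the relative analyzer angle; the angles and function name are illustrative, not from any post in this thread):

```python
import math

def mismatch_qm(theta):
    # Textbook QM prediction for polarization-entangled photon pairs:
    # detectors whose analyzers are theta radians apart disagree with
    # probability sin^2(theta).
    return math.sin(theta) ** 2

# For small angles the disagreement rate grows as the SQUARE of the
# angle: doubling the angle quadruples the mismatches, whereas a simple
# local model matching perfect agreement at theta = 0 would grow
# linearly, i.e. only double them.
ratio = mismatch_qm(0.02) / mismatch_qm(0.01)
```

For small angles `ratio` comes out very close to 4 rather than 2, which is the quadratic behavior described above.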

Since the perfect agreement occurs for alignment at any detector
position, the photons can't all have been polarized up or down at the
given alignment to start with.

Conservation of angular momentum, therefore, has led to what should be
two separate wave function collapses, which should be independent if
they are in a spacelike relationship, influencing each other.

John Savard (teneerf <-)
http://www.ecn.ab.ca/~jsavard/crypto.htm

------------------------------

From: [EMAIL PROTECTED] (David A. Wagner)
Subject: Re: TC5 Completed Paper
Date: 3 Jul 2000 15:35:20 -0700

TC5 is very pretty in its simplicity, but alas, it can be distinguished
from random with 2^33 chosen plaintexts.

TC5 is a four-round Feistel with a bijective round function, and there
are generic attacks on such constructions known in the literature.

Here is one.  Consider the encryption of a plaintext (L,R).  Let X
denote the input to the F-function in the third round, and Y=F(X) the
output.  Then we can recover Y xor F(R) from the plaintext
and its corresponding ciphertext by just xor-ing appropriate quantities.

Consider 2^33 chosen plaintexts of the form (L_i,R), where R is fixed
and L_i varies over 2^33 values.  Let c = F(R); c is unknown but fixed.
Then X_i = R xor F(L_i xor c) is a bijective function of L_i, and hence
all the X_i's are distinct.  Consequently, all the Y_i's should be
distinct if we're using TC5, since the F-function is bijective.

Next we recover Y_i xor c from the plaintexts and their encryptions.
This allows us to verify whether all the Y_i's are distinct.  For the
cipher TC5, the Y_i's are all guaranteed to be distinct.  For an ideal
cipher (a random permutation), the inferred Y_i values will be distributed
like 2^33 independently chosen 64-bit values, so by the birthday paradox,
with good probability there will be a collision among the suggested Y_i's
if we've got a random permutation on our hands.

This lets us distinguish TC5 from an ideal cipher with 2^33 chosen texts.
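At toy scale the generic distinguisher can be sketched in a few lines. A minimal Python sketch, assuming a hypothetical 16-bit Feistel with four independent random bijective round functions standing in for TC5 (here the quantity L_i xor left-ciphertext-half plays the role of Y_i xor c):

```python
import random

random.seed(1)
W = 8                       # half-block width in bits (a toy stand-in)
SIZE = 1 << W

def random_bijection():
    # A random permutation of {0, ..., 2^W - 1}, used as a bijective F.
    p = list(range(SIZE))
    random.shuffle(p)
    return p

# Four independent bijective round functions standing in for TC5's F.
F = [random_bijection() for _ in range(4)]

def feistel(L, R):
    # Generic 4-round Feistel: each round maps (L, R) to (R, L xor F(R)).
    for f in F:
        L, R = R, L ^ f[R]
    return L, R

# A random permutation of the full 2W-bit block plays the ideal cipher.
table = list(range(SIZE * SIZE))
random.shuffle(table)

def ideal(L, R):
    y = table[(L << W) | R]
    return y >> W, y & (SIZE - 1)

def looks_like_feistel(encrypt):
    # Fix R and vary L_i.  For the Feistel, L_i xor (left ciphertext
    # half) equals Y_i xor c, so all these values are distinct; for a
    # random permutation, birthday collisions appear.
    R = 0x3c                          # arbitrary fixed right half
    seen = set()
    for Li in range(SIZE):
        t = Li ^ encrypt(Li, R)[0]
        if t in seen:
            return False              # collision: looks random
        seen.add(t)
    return True                       # all distinct: Feistel structure
```

With this toy width, `looks_like_feistel(feistel)` always returns True, while `looks_like_feistel(ideal)` almost certainly returns False, mirroring the 2^33-text distinguisher described above.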

As a possible fix, it appears that using 8 rounds at every stage instead
of 4 provides a more than adequate defense against this attack.

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: A thought on OTPs
Date: Mon, 3 Jul 2000 23:00:01 GMT

"Tony T. Warnock" wrote:
> A simple version is to note that the characters of the key are independent
> and identically uniformly distributed. Thus the probability of getting a
> particular cyphertext character is independent of the corresponding
> plaintext character and of the position in the message. This is based on
> the fact that a convolution of a uniform distribution and any other
> distribution on a circle is the uniform distribution.

Unfortunately, that argument can also be applied to less secure systems.

------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Tying Up Lost Ends
Date: Mon, 03 Jul 2000 23:26:02 GMT

On 3 Jul 2000 14:23:35 GMT, [EMAIL PROTECTED]
(SCOTT19U.ZIP_GUY) wrote, in part:
>[EMAIL PROTECTED] (John Savard) wrote in
><[EMAIL PROTECTED]>: 
>>On 30 Jun 2000 16:25:17 GMT, [EMAIL PROTECTED]
>>(SCOTT19U.ZIP_GUY) wrote, in part:

>  Using your logic, since 2157 is as likely to be sent as 2156, this
>implies that messages of N+1 bits in length are as likely as N bits;
>therefore the most common messages would be too big to be stored
>on a hard drive.

Actually, that's a fair rebuttal, since the distribution p(N+1)=p(N)
leads to infinite expected length, even if it does not diverge as fast
as the exponential p(N+1)=2p(N).

However, what I'm actually visualizing in my mind is that the length
of messages follows a normal distribution, you know, where p(n) =
K*(e^(-C((n-N)^2))) where C determines the standard deviation and K is
there to make the integral from 0 to infinity equal to 1, and N is the
average length of a message. So it looks flat for "most" messages.
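As a rough numerical check (with hypothetical values N = 1000 and C = 1e-4, chosen only for illustration), the discretized distribution really is nearly flat around the mean:

```python
import math

N, C = 1000.0, 1e-4     # hypothetical mean length and spread parameter
# Normalize so the probabilities over n = 0, 1, 2, ... sum to 1
# (truncating the negligible far tail at n = 5000).
K = 1.0 / sum(math.exp(-C * (n - N) ** 2) for n in range(5000))

def p(n):
    return K * math.exp(-C * (n - N) ** 2)

# Near the mean the distribution looks flat: shifting the length by 10
# changes the probability by only about one percent.
flatness = p(1010) / p(1000)
```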

Hey, maybe we should use my method for average-length messages, and
your method for the very short ones.

John Savard (teneerf <-)
http://www.ecn.ab.ca/~jsavard/crypto.htm

------------------------------

From: [EMAIL PROTECTED] (Mist)
Subject: Re: cray and time needed to attack
Date: Tue, 04 Jul 2000 00:14:14 GMT

My crypto knowledge isn't very good; it's actually quite bad.
To complete my question: any method, with the best Cray (I believe
a Cray is the best machine for cracking a known algorithm).

>The whole question kind of revolves around what you mean by "to crack",
>what methods are being used, etc.  Which Cray?


------------------------------

From: [EMAIL PROTECTED] (Mist)
Subject: Re: cray and time needed to attack
Date: Tue, 04 Jul 2000 00:21:06 GMT

>> 2) 1024 bit key with blowfish
>
>A Cray would NOT be the weapon of choice in either of these attacks.  
>Instead, you'd want to use specialized custom hardware.  Unless 
>somebody finds a HUGE break in the ciphers themselves it doesn't make 
>any real difference though: it would take millions of years to 
>exhaust the keyspace of either one.  Oh, and just FWIW, Blowfish only 
>accepts keys of up to 448 bits, but even at that nothing approaching 
>a practical attack is known at the present time.
>
I meant Triplefish, sorry, a little mistake.

>> 3) 128 bit key with RSA
>
>By contrast, this is so trivial that there's no real reason to use a 
>Cray at all -- with an average desktop or laptop, factoring a 128-bit 
>number will take less time than it takes you to type in the number.  
>
>If you intended to say 128 decimal digits instead of 128 bits, then 
>at least you've got something secure enough that you might consider 
>sing it.  Somebody with plenty of talent and money could still break 
>it, but the best people and computers available would still take a 
>few months to do the job.
>
It was just to give me an idea, and to see how it scales to a 4096-bit
(not byte) key, for example (a factor of 2 more per extra bit).

>> 4) 1024 bit key with DH
>
>This borders on being theoretically possible with present technology, 
>but the attacker would have to be willing to dump millions of dollars 
>into the project, and even at that it would NOT be completed very 
>quickly -- think in terms of years, not months.
>


And with the NSA's power and money? Would it be possible?



------------------------------

From: Future Beacon <[EMAIL PROTECTED]>
Subject: Public-domain Blowfish
Date: Mon, 3 Jul 2000 20:25:53 -0400




I went to these two sites:

The Blowfish Encryption Algorithm
http://www.counterpane.com/blowfish.html

1994 Blowfish Publication
http://www.counterpane.com/bfsverlag.html

Here is an interesting quote from the 1994 paper:

"Blowfish is unpatented, and will remain so in all countries. The
 algorithm is hereby placed in the public domain, and can be freely
 used by anyone".
===================================================================

This is a rather simple way of placing an algorithm in the public
domain, and it doesn't always work.  Here are my questions:

To this day, can anybody use this algorithm any way they want?
Can anybody use it as a layer in their own algorithm?  Have there
been restrictions (such as credit to the author) added?

Has anybody else patented it?  Has anybody heard of any disputes?

My interest in encryption is for e-mail privacy within the United
States.

Thank you for your help.


Jim Trek
Future Beacon Technology
http://eznet.net/~progress
[EMAIL PROTECTED]



------------------------------

From: Jerry Coffin <[EMAIL PROTECTED]>
Crossposted-To: comp.theory
Subject: Re: Quantum Computing (Was: Newbie question about factoring)
Date: Mon, 3 Jul 2000 18:36:03 -0600

In article <8jr41b$ii6$[EMAIL PROTECTED]>, [EMAIL PROTECTED] 
says...

[ ... Quantum computers ] 

> I believe that the current state of the art is 4 bits, but the
> limit may have been pushed a bit further since I heard.

It's been pushed up to 7 qubits, though of course that's still _far_ 
from being anywhere close to useful.  Furthermore, quite a few 
researchers think that the method that's been successful so far will 
ultimately turn out to be a dead-end, and an entirely different 
approach (that nobody's ever made work at all) will be necessary 
before there's any chance of producing anything useful.

-- 
    Later,
    Jerry.
 
The universe is a figment of its own imagination.

------------------------------

From: Jerry Coffin <[EMAIL PROTECTED]>
Subject: Re: Encryption and IBM's 12 teraflop MPC......
Date: Mon, 3 Jul 2000 18:35:56 -0600

In article <3961098a$[EMAIL PROTECTED]>, [EMAIL PROTECTED] 
says...
> 
> "Casper H.S. Dik - Network Security Engineer" <[EMAIL PROTECTED]>
> wrote in message news:8jlgi8$n6p$[EMAIL PROTECTED]...
> > "Harvey Rook" <[EMAIL PROTECTED]> writes:
> >
> > >If someone put this thing to work brute forcing passwords, it could break
> > >40 bit RC4 in a second.
> >
> > Only if it had an "rc4 & compare result" instruction (or could do so
> > in 8 instructions)
> >
> > (But a "very short time to crack RC4-40" would be correct)
> >
> 
> True, but the calculations are certainly off by no more than 100 orders of
> magnitude. 100 seconds is still enough to pronounce 40 bit keys as very
> insecure.

The difference between 1 and 100 is TWO orders of magnitude, not 100 
orders of magnitude.  If the estimate were off by 100 orders of 
magnitude, it would mean the machine took considerably longer than 
the estimated age of the universe to test a single key.  If the 
estimate were off by 100 orders of magnitude in the other direction, 
exhausting a 40-bit keyspace wouldn't even be a challenge -- in fact, 
even a 128-bit keyspace would be trivial to exhaust.
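For concreteness, a back-of-the-envelope sketch (the 100-second figure and the universe's age are rough illustrative assumptions, not claims from the thread):

```python
# One order of magnitude is a factor of 10, so "off by 100 orders of
# magnitude" means a factor of 10**100, while "off by a factor of 100"
# is only 10**2: two orders of magnitude.
claimed_seconds = 100.0            # rough time quoted for 40-bit RC4
universe_age_seconds = 4.4e17      # ~14 billion years, a rough figure

# Off by 100 orders of magnitude upward: vastly longer than the
# universe has existed.
too_slow = claimed_seconds * 10.0 ** 100
factor_vs_universe = too_slow / universe_age_seconds
```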

-- 
    Later,
    Jerry.
 
The universe is a figment of its own imagination.

------------------------------

From: Jerry Coffin <[EMAIL PROTECTED]>
Subject: Re: cray and time needed to attack
Date: Mon, 3 Jul 2000 18:59:13 -0600

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] 
says...

[ ... ] 

> I meant Triplefish, sorry, a little mistake.

Presumably you mean TwoFish -- if so, the same basic comments apply; 
nothing approaching a practical attack on it is known at the present 
time.

[ ... RSA ] 
 
> It was just to give me an idea, and to see how it scales to a 4096-bit
> (not byte) key, for example (a factor of 2 more per extra bit).

Using a 4096-bit key with RSA is basically just overkill unless 
you're trying to protect data with an _extremely_ long useful life.  
Data that has high value and a fairly long useful life is probably 
perfectly adequately protected with 2048-bit encryption.  For most of 
us, whose secrets aren't worth millions of dollars, 1024 bits is more 
than enough.  Keep in mind that the largest RSA key publicly known to 
have been factored is less than 500 bits.  I don't think anybody 
would presently advise using a key as small as 512 bits, but in 
reality, it's likely to be a while before even 768 bits is broken.

From the opposite viewpoint, most people don't use RSA for large 
amounts of data, so the extra time taken for a larger key doesn't 
really hurt them much either.  If you want to use a really big key 
like 4096 bits, it's probably not hurting anything even though it's 
almost certainly NOT really going to make any real difference in the 
overall safety of your data.

> >> 4� 1024 bit key with DH
> >
> >This borders on being theoretically possible with present technology, 
> >but the attacker would have to be willing to dump millions of dollars 
> >into the project, and even at that it would NOT be completed very 
> >quickly -- think in terms of years, not months.
> 
> and with the nsa power and money?
> does its possible ?

Well, as I said, it's approaching what will be theoretically possible 
some time relatively soon.  Unless the NSA has been quietly buying a 
fairly substantial portion of the world's production of RAM for a 
while, they couldn't do it right away, but if they wanted to badly 
enough, they could probably start an attack within something like a 
couple of years, and finish it within a few years after that.

I'm basing part of this estimate on things like Gigabit RAM chips.  
Some RAM vendors have designed Gigabit RAM chips, but they haven't 
put them into production due to lack of demand.  If somebody agreed 
to buy up a few hundred million dollars worth of them, that would 
change pretty quickly.

-- 
    Later,
    Jerry.
 
The universe is a figment of its own imagination.

------------------------------

From: Bob Silverman <[EMAIL PROTECTED]>
Crossposted-To: comp.theory
Subject: Re: Newbie question about factoring
Date: Tue, 04 Jul 2000 00:48:32 GMT

In article <8jq5r7$j1i$[EMAIL PROTECTED]>,
  [EMAIL PROTECTED] (Nick Maclaren) wrote:
>
> In article <8jq3vp$fvf$[EMAIL PROTECTED]>, Bob Silverman <bobs@my-deja.com> writes:
> |> In article <[EMAIL PROTECTED]>,
> |>   Dido Sevilla <[EMAIL PROTECTED]> wrote:
> |> > Bob Silverman wrote:
> |> > >
> |> > > The size of a number IS its number of digits.
> |> > > You contradict yourself.
> |> >
> |> > And you're playing semantics.  How big is a number?  It's its
> |> > magnitude.  The number of digits is its number of digits.
> |>
> |> The SIZE of a number is how many bits [or digits] it takes to write
> |> it down or store it.  This is different from its magnitude.
>
> Firstly, that depends on context and, secondly, it is only true
> even in context for integers.

The context here is clear: computational number theory. In this
context the definition of size is clear: number of bits (or digits,
they differ only by a scaling factor) needed to represent the number.
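In this sense the two size measures really do differ only by the constant log2(10); a quick Python check on an RSA-scale number (the value 2^512 + 1 is just an illustration):

```python
n = 2 ** 512 + 1            # an RSA-scale number, chosen for illustration
bits = n.bit_length()       # size in bits
digits = len(str(n))        # size in decimal digits
scale = bits / digits       # approaches log2(10) = 3.32... for large n
```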

--
Bob Silverman
"You can lead a horse's ass to knowledge, but you can't make him think"


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: Dido Sevilla <[EMAIL PROTECTED]>
Subject: Re: An encryption protocol, version 2
Date: Tue, 04 Jul 2000 09:50:26 +0800

Ichinin wrote:
> 
> Is it something like this?
> 
> http://www.patents.ibm.com/details?&pn=US05428686__
> 
> If it is, Somehow i don't think you'll be able to
> challenge it (Hint: look at the assignee)
> 

Who said anything about a pseudo-random bit sequence generator?  I'm
going to be using a *real* random bit generator, that uses physical
randomness...

--
Rafael R. Sevilla <[EMAIL PROTECTED]>         +63 (2)   4342217
ICSM-F Development Team                         +63 (917) 4458925
University of the Philippines Diliman

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Tying Up Lost Ends III
Date: Mon, 03 Jul 2000 22:08:48 -0400

John Savard wrote:
> Don't get me started on the Copenhagen interpretation.

Hopefully you realize that the Copenhagen interpretation
makes no sense.

------------------------------

From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Tying Up Lost Ends
Date: 4 Jul 2000 02:13:42 GMT

[EMAIL PROTECTED] (John Savard) wrote in 
<[EMAIL PROTECTED]>:

>On 3 Jul 2000 14:23:35 GMT, [EMAIL PROTECTED]
>(SCOTT19U.ZIP_GUY) wrote, in part:
>>[EMAIL PROTECTED] (John Savard) wrote in
>><[EMAIL PROTECTED]>: 
>>>On 30 Jun 2000 16:25:17 GMT, [EMAIL PROTECTED]
>>>(SCOTT19U.ZIP_GUY) wrote, in part:
>
>>  Using your logic, since 2157 is as likely to be sent as 2156, this
>>implies that messages of N+1 bits in length are as likely as N bits;
>>therefore the most common messages would be too big to be stored
>>on a hard drive.
>
>Actually, that's a fair rebuttal, since the distribution p(N+1)=p(N)
>leads to infinite expected length, even if it does not diverge as fast
>as the exponential p(N+1)=2p(N).
>
>However, what I'm actually visualizing in my mind is that the length
>of messages follows a normal distribution, you know, where p(n) =
>K*(e^(-C((n-N)^2))) where C determines the standard deviation and K is
>there to make the integral from 0 to infinity equal to 1, and N is the
>average length of a message. So it looks flat for "most" messages.

    Actually I doubt reality is a normal distribution. But I am sure
we each could dream up situations where each way of ending the 
compression would be better than the other. I guess my main objection
to your method is the use of random numbers. However, even in my
method I would follow with my encryption, where I support a random
padding so two identical messages don't look the same. I hate the
random padding mode and never use it in my contests, as I feel it
would be unfair. I plan to some day use straight 1-1 compression
to a finitely odd file and then encrypt only up to, but not including,
the last 1 bit. That way the compression would be 1-1 and the real 
distribution would make little difference to most kinds of files.


>
>Hey, maybe we should use my method for average-length messages, and
>your method for the very short ones.
>

  Well, that is one solution. And I am at a loss for words; are you
sure you're feeling OK? This does not seem to be the John that I have
visualized in my mind. Maybe we can meet some day if Shaw ever has
another small crypto conference. You know the kind, where one wears
T-shirts and drinks a few beers and only NSA agents would wear ties.
But I think it might be better to use yours for very long messages
while mine is used for the rest. We could meet closer to the middle
if you get a perfect random number generator.


>John Savard (teneerf <-)
>http://www.ecn.ab.ca/~jsavard/crypto.htm


David A. Scott
-- 
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
        http://www.jim.com/jamesd/Kong/scott19u.zip
Scott famous encryption website NOT FOR WIMPS **no JavaScript allowed**
        http://members.xoom.com/ecil/index.htm
Scott rejected paper for the ACM
        http://members.xoom.com/ecil/dspaper.htm
Scott famous Compression Page WIMPS allowed ** JavaScript OK**
        http://members.xoom.com/ecil/compress.htm
**NOTE EMAIL address is for SPAMERS***
I leave you with this final thought from President Bill Clinton:
   "The road to tyranny, we must never forget, begins with the destruction 
of the truth." 

------------------------------

From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Crossposted-To: sci.physics
Subject: Re: Use of EPR "paradox" in cryptography
Date: Mon, 03 Jul 2000 22:34:06 -0400

John Savard wrote:
> ... However, the one measurement still does
> affect the other measurement, since if both measurements are of the
> same polarity component, the results always match.

That does not allow one to conclude that one measurement affects
the other measurement, but rather only that events at the two
observers are correlated (which is no surprise, since that is how
the experiment is set up).  The same situation occurs classically;
suppose we all know I have one black bead and one white bead that
I have sealed up (one each) in two light-tight envelopes.  If you
take one envelope far away and open it, the color you find
"instantly" determines what color I will find when I open my
envelope.  Yet before one of us opened an envelope, either of us
had a 50% chance of finding a particular color.
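The bead-and-envelope analogy is easy to simulate; a small sketch (function and variable names are hypothetical):

```python
import random

def open_envelopes():
    # One black and one white bead are sealed, one each, into two
    # envelopes at random.
    beads = ["black", "white"]
    random.shuffle(beads)
    return beads[0], beads[1]   # (far-away envelope, my envelope)

# The colors are perfectly anti-correlated: opening one envelope
# "instantly" determines the other, yet each envelope on its own is a
# 50/50 draw.  Correlation, but no signal between the envelopes.
```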

> Since the perfect agreement occurs for alignment at any detector
> position, the photons can't all have been polarized up or down at
> the given alignment to start with.

Of course not, but a given photon pair is (are?) correlated.
(Circular polarization may be easier to work with in practice.)

> Conservation of angular momentum, therefore, has led to what should
> be two separate wave function collapses, which should be independent
> if they are in a spacelike relationship, influencing each other.

If the "wave functions" collapse upon observation, then they
cannot be entirely physical, but must contain a conventional
component as well.  It is the convention that "collapses", or
more precisely, a different context is established (which must
be consistent with constraints imposed by interaction with the
measurement device).

Lewis E. Little has an alternate point of view that you can read
about in his papers on the "Theory of Elementary Waves".  It is
not necessary to accept everything about his theory in order to
see that quantum "paradoxes" really are the historical result of
early entrenchment of some wrong ideas.  http://www.yankee.us.com/TEW/

------------------------------

From: Boris Kazak <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: Public-domain Blowfish
Date: Tue, 04 Jul 2000 02:45:02 GMT



Future Beacon wrote:
> 
> I went to these two sites:
> 
> The Blowfish Encryption Algorithm
> http://www.counterpane.com/blowfish.html
> 
> 1994 Blowfish Publication
> http://www.counterpane.com/bfsverlag.html
> 
> Here is an interesting quote from the 1994 paper:
> 
> "Blowfish is unpatented, and will remain so in all countries. The
>  algorithm is hereby placed in the public domain, and can be freely
>  used by anyone".
> -------------------------------------------------------------------
> 
> This is a rather simple way of placing an algorithm in the public
> domain and it doesn't always work.  Here is my question:
> 
> To this day, can anybody use this algorithm any way they want?
        Yes
> Can anybody use it as a layer in their own algorithm?
        Yes
>  Has there
> been restrictions (such as credit to the author) added?
        No
> 
> Has anybody else patented it?
        No 
> Has anybody heard of any disputes?
        No
> 
> My interest in encryption is for e-mail privacy within the United
> States.
        Go ahead (but don't attempt to patent Blowfish)
> 
> Thank you for your help.
> 
> Jim Trek
> Future Beacon Technology
> http://eznet.net/~progress
> [EMAIL PROTECTED]
=====================
Best wishes              BNK

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
