Re: Cruising the stacks and finding stuff

2008-04-24 Thread Sandy Harris
Jack Lloyd [EMAIL PROTECTED] wrote:

  Making a cipher that uses an N bit key but is only secure to 2^M
operations with M<N is, firstly, considered broken in many circles, as
  well as being inefficient (why generate/transmit/store 512 bit keys
  when it only provides the security of a ~300 bit (or whatever) key
  used with a perfect algorithm aka ideal cipher - why not use the
  better cipher and save the bits).

Saving bits may not matter, or may not be possible. For example,
if you are dealing with a hybrid system -- say, using RSA to transmit
the symmetric cipher key or Diffie-Hellman to construct it -- then for
any symmetric cipher key size less than the public key size, your
overheads are the same.

-- 
Sandy Harris,
Nanjing, China

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: Cruising the stacks and finding stuff

2008-04-24 Thread Ian G

Allen wrote:

Add Moore's Law, a bigger budget and a more efficient machine, how long 
before AES-128 can be decoded in less than a day?


It does make one ponder.



Wander over to http://keylength.com/ and poke at their
models.  They have 6 or so to choose from, and they have them
coded up in the web application so you can get the
appropriate comparisons.


Each model is reasonably well-founded (some work was put in 
by somebody who knows something) and they've been doing it 
for a few years now.


iang



Re: Cruising the stacks and finding stuff

2008-04-23 Thread Allen

Hi,

I find it odd that the responses all seem to focus on pure brute
force when I did mention three other factors that might be in
play: a defect in the algorithm, much like the attack on MD5 which
reduces it to an effective length of about 80 bits, if I recall
correctly; a different analytical tool/approach, much as
differential analysis has had an effect on cryptanalysis as a
whole; and a purpose-built machine.


As to using DES as a measuring stick, it was first cracked in
1997 using a software approach which was stated as not being as
fast as a hardware solution. In the process of straight brute
force, Rocke Verser came up with a faster method:


http://www.cs.cmu.edu/~dkindred/des/rocke-alg.html

which uses some of the weaknesses of DES to speed the crack.

At the end of the original challenge they were trying about 7
billion keys per second at the time the solution was found in
June 1997. Granted, this was a whole bunch of low-end machines
working in parallel.


Then there is a table at:

http://www.interhack.net/projects/deschall/what.html

Then they say, "So, while it's infeasible for DESCHALL to crack a
72 bit key, it seems that 64 might be within reach, by adding
more machines. (We probably used between 15,000 and 20,000
machines.) Consider that the RC5-32/12/5 (40 bits) key crack took
three and a half hours. The distributed computer we put together
could do it in about 78 seconds.


The RC5-32/12/6 (48 bits) key crack took 13 days. A
DESCHALL-sized effort could do it in 5 hours."
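Those figures are easy to sanity-check. A quick sketch of my own, taking the ~7 billion keys/second peak rate quoted above and counting the average search as half the keyspace:

```python
# Sanity-check the DESCHALL scaling: at ~7 billion keys/second, the
# average exhaustive search covers half the keyspace.
RATE = 7e9  # keys/second, the peak rate quoted above

for bits in (40, 48, 56):
    avg_seconds = (2 ** bits / 2) / RATE
    print(f"{bits}-bit key: ~{avg_seconds:,.0f} s (~{avg_seconds / 3600:.1f} h)")
```

This reproduces the roughly 78 seconds quoted for 40 bits and the roughly 5 hours for 48 bits, and puts a 56-bit DES key at around two months for the full DESCHALL fleet.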


They have a table that estimates that a $300 million ASIC machine
could have cracked DES in 38 seconds back in 1995!


In January 1999, DES Challenge III did it in less than 24 hours.

http://www.distributed.net/des/

The key point is: "Despite the immense power of the EFF Deep
Crack, distributed.net's thousands of deployed clients still
surpassed the EFF hardware by more than a factor of 2 in speed."


So apply Moore's Law since then. 9 years and 3 months is 111
months, so say about 64 times as powerful (actually it is more,
but let's stick with a strict Moore's Law), and now factor in the
drop in prices over the same period. If we assume the same factor
as Moore's Law and divide the price by 64 -- say 60 for
simplicity -- then, not even counting 1995 to 1999, the crack
would take about half a second on a $5 million machine today.
Probably both less time and less money.
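That arithmetic can be sketched as follows (my own sketch; the 18-month doubling period is the usual rule-of-thumb assumption, not stated in the post):

```python
# 111 months of Moore's law at one doubling per 18 months, applied to
# both speed and price, as in the estimate above.
months = 111
speedup = 2 ** (months / 18)    # ~72x; the text rounds down to 64x (2^6)
print(f"speedup: ~{speedup:.0f}x")

# Scale the 1995 $300M ASIC estimate down by the rounded factor of 64.
print(f"$300M machine scaled down: ~${300e6 / 64:,.0f}")  # ~$4.7M
```

Dividing the 38-second 1995 estimate by a 64x speedup gives about 0.6 seconds, consistent with the "about half a second" figure above.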


Today, running the distributed.net RC5-72 client (dnetc) on a
single 2.8 GHz dual-core machine that is almost three years old,
I get about 13 million keys per second with a software program,
not hardware.


Also, the largest known group of cryptanalysts is at the NSA,
with a big budget for finding weaknesses, so I would not assume
none will be found -- just that none will be made public. Sure, it
took more than 350 years to find a proof of Fermat's Last
Theorem, but we know more today and have more tools, so progress
(if we can call it that) is faster now.


Given all of this, I'm not sure of the value of arguing that 128
bit is good enough when 256 is not all that much harder to
implement, will be just as fast in processing within a couple of
years, and even now, for the size of the files being protected,
such as credit card data, adds a delay small enough that it
probably wouldn't be noticed in network latency.


I see the argument as much like the way the Titanic was built.
The double hull stopped short of the waterline and the breach was
above it. Total fluke, but if the double hull had been about 8
feet higher up the side we wouldn't have had so many stories to
tell and adventures to watch in awe on the tube. The reality is
that it was not the technology that failed, but rather human
error in not going further to meet the risk than was seen at the
time. The bizarre thing is that the same basic error was the
cause of the Exxon Valdez disaster: not protecting against a
well-known risk, drunk captains. Funny how almost all tankers
have double hulls now. But that still didn't prevent the Cosco
Busan from spilling 58,000 gallons of bunker oil into San
Francisco Bay. If it hadn't had a double hull, how much would it
have spilled?


Oh, well, given how risk averse we tend to be, it is odd the
choices we make.


Best,

Allen

Leichter, Jerry wrote:

| ...How bad is brute force here for AES? Say you have a chip that can do
| ten billion test keys a second -- far beyond what we can do now. Say
| you have a machine with 10,000 of them in it. That's 10^17 years worth
| of machine time, or about 7 million times the lifetime of the universe
| so far (about 13x10^9 years).
| 
| Don't believe me? Just get out calc or bc and try

|   ((2^128/10^14)/(60*60*24*365))
| 
| I don't think anyone will be brute force cracking AES with 128 bit

| keys any time soon, and I doubt they will ever be brute forcing AES
| with 256 bit keys unless very new and unanticipated technologies
| arise.
| 
| Now, it is entirely possible that someone will come up with a much

| smarter attack against AES than brute force. I'm just speaking of how
| bad brute force is.

no possible brute force Was: Cruising the stacks and finding stuff

2008-04-23 Thread Alexander Klimov
On Tue, 22 Apr 2008, Leichter, Jerry wrote:
 Interestingly, if you add physics to the picture, you can convert
 no practical brute force attack into no possible brute force
 attack given known physics.  Current physical theories all place a
 granularity on space and time:  There is a smallest unit of space
 beyond which you can't subdivide things, and a smallest unit of
 time.

I guess you are talking about Planck units, so let's make the
calculations:

 Planck length is L = \sqrt{hbar G / c^3} ~ 1.6E-35 m,

 Planck time T = L/c ~ 5.4E-44 s,

so a cubic-meter-computer can have

 N = (1/L)^3 ~ 2.4E104 elements

and thus

 N/T ~ 4.5E147 operations per second

that is approximately 2^{490} operations per second in a cubic meter
of Planck computer. Given a year (3.15E7 s) on a Earth-size
(1.08E21 m^3) Planck computer we can make approximately 2^{585}
operations.

OK, it was amusing to do these calculations, but do they mean
anything? I think it is more like a garbage-in-garbage-out process.

If I understand correctly, L and T are not non-divisible units of
space-time, but rather boundaries below which we understand that our
current theories do not work, because this scale requires unification
of general relativity and quantum mechanics.

Even more bizarre is to think about the above processing element as
representing a single bit: according to quantum mechanics, states of a
system are represented by unit vectors in associated Hilbert space of
the system and each observable is represented by a linear operator
acting on the state space. Even if we have only two qubits they are
not described by two complex numbers, but rather by 4:

 a_0|00> + a_1|01> + a_2|10> + a_3|11>

and thus a description of the state of a quantum register requires an
exponential number of complex numbers a_i, and each operation
simultaneously processes all those a_i. Since we cannot measure
them all, it is an open question whether quantum computations
provide exponential speed-up; all we know right now is that
they give at least a quadratic one.
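The quadratic speed-up referred to is Grover's search algorithm; its upshot for key sizes can be sketched as follows (my gloss, not part of the original post):

```python
# Grover's algorithm finds a marked item among 2^n candidates in about
# 2^(n/2) quantum steps, so the effective exhaustive-search strength
# of an n-bit key is halved.
for n in (64, 128, 256):
    print(f"{n}-bit key: ~2^{n} classical steps, ~2^{n // 2} Grover steps")
```

This is why a common rule of thumb says that surviving a quantum brute-force search at the strength of a classical 128-bit key requires a 256-bit key.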

By the way, if you have a computer with the processing power larger
than that of all the cryptanalysts in the world, it makes sense to use
your computer to think to find a better attack than a brute-force
search :-)

 One place this shows up, as an example:  It turns out that, given a volume
 of space, the configuration with the maximum entropy for that volume
 is exactly a black hole with that volume

[OT] This is a strange thesis, because all black hole solutions in
general relativity can be completely characterized by only three
externally observable parameters: mass, electric charge, and angular
momentum (the no-hair theorem), and thus in some sense they have zero
entropy (this is not necessarily a contradiction, because it takes an
infinite time for matter to reach the event horizon).

 and its entropy turns out to be the area of the black hole, in units
 of square Planck lengths.

[OT] Although Hawking showed that the total area of the event horizons
of a collection of black holes can never decrease, which is *similar*
to the second law of thermodynamics (with entropy replaced by area),
it is not clear that this allows one to conclude that area *is* entropy.

-- 
Regards,
ASK



Re: Cruising the stacks and finding stuff

2008-04-23 Thread Perry E. Metzger

Allen [EMAIL PROTECTED] writes:
 I find it odd that the responses all seem to focus on pure brute force
 when I did mention three other factors that might be in play: a defect
 in the algorithm, much like the attack on MD5 which reduces it to an
 effective length of about 80 bits, if I recall correctly; a different
 analytical tool/approach, much as differential analysis has had an
 effect on cryptanalysis as a whole; and a purpose-built machine.

I think everyone replying mentioned that possibility.

However, if your message really was centered on AES possibly being
defective, which was not obvious from the text, your calculation was
even more inaccurate. If AES is defective, all bets whatsoever are off
-- it is possible one might be able to break an arbitrary defective
algorithm even faster than, say, A5/1. Making up numbers about how
fast a crack might be is even less justified than speculation on how
fast computers will be in 100 years. The crack might take on the
order of microseconds. The crack might never come at all.

The way to (effectively) worry about AES being defective is to look
for defects. We are all adults and know there may be defects and that
we merely don't know of any defects so far.

 I see the argument as much like the way the Titanic was built.

Again, I think most people around here really do understand the issues
fairly well. We all know that AES might be defective, and many
people spend time worrying about issues like algorithm
agility. (Several of our list members had lots of work to do when MD5
started looking weak and have long memories.)

 Given all of this, I'm not sure of the value of arguing that 128 bit is
 good enough when 256 is not all that much harder to implement, will be
 just as fast in processing within a couple of years, and even now, for
 the size of the files being protected, such as credit card data, adds a
 delay small enough that it probably wouldn't be noticed in network
 latency.

There are a variety of issues. Smart cards have limited capacity. Many
key agreement protocols yield only limited amounts of key
material. I'll leave it to others to describe why a rational engineer
might use fewer key bits, but suffice it to say, there are quite
rational reasons. I'll agree that if you have no tradeoffs, you might
as well use longer keys, but if you really have no tradeoffs, you
would prefer to use a one time pad, too. All real engineering is about
tradeoffs.


Perry



Re: Cruising the stacks and finding stuff

2008-04-23 Thread Jack Lloyd
On Wed, Apr 23, 2008 at 08:20:27AM -0400, Perry E. Metzger wrote:

 There are a variety of issues. Smart cards have limited capacity. Many
 key agreement protocols yield only limited amounts of key
 material. I'll leave it to others to describe why a rational engineer
 might use fewer key bits, but suffice it to say, there are quite
 rational reasons. I'll agree that if you have no tradeoffs, you might
 as well use longer keys, but if you really have no tradeoffs, you
 would prefer to use a one time pad, too. All real engineering is about
 tradeoffs.

I think one point worth making is that we probably don't really know
how to make a cipher that is secure to, say, 2^512 operations (or
2^1024 or 2^4096 or whatever). For instance if you took Serpent or AES
or Twofish and modified it to support 512-bit keys, I don't believe
the resulting cipher would actually be secure to 2^512
operations... to guess completely at random, I'd say they would be
more like 2^300 or so. (Have any block ciphers with 256-bit
block/512-bit key been proposed/studied? I have not been following FSE
and similar conferences of late)

Making a cipher that uses an N bit key but is only secure to 2^M
operations with M<N is, firstly, considered broken in many circles, as
well as being inefficient (why generate/transmit/store 512 bit keys
when it only provides the security of a ~300 bit (or whatever) key
used with a perfect algorithm aka ideal cipher - why not use the
better cipher and save the bits).

-Jack



Re: no possible brute force Was: Cruising the stacks and finding stuff

2008-04-23 Thread Leichter, Jerry


On Wed, 23 Apr 2008, Alexander Klimov wrote:

| Date: Wed, 23 Apr 2008 12:53:56 +0300 (IDT)
| From: Alexander Klimov [EMAIL PROTECTED]
| To: Cryptography cryptography@metzdowd.com
| Subject: no possible brute force Was: Cruising the stacks and finding stuff
| 
| On Tue, 22 Apr 2008, Leichter, Jerry wrote:
|  Interestingly, if you add physics to the picture, you can convert
|  no practical brute force attack into no possible brute force
|  attack given known physics.  Current physical theories all place a
|  granularity on space and time:  There is a smallest unit of space
|  beyond which you can't subdivide things, and a smallest unit of
|  time.
| 
| I guess you are talking about Planck units, so let's make the
| calculations:
| 
|  Planck length is L = \sqrt{hbar G / c^3} ~ 1.6E-35 m,
| 
|  Planck time T = L/c ~ 5.4E-44 s,
| 
| so a cubic-meter-computer can have
| 
|  N = (1/L)^3 ~ 2.4E104 elements
|
| and thus
| 
|  N/T ~ 4.5E147 operations per second
| 
| that is approximately 2^{490} operations per second in a cubic meter
| of Planck computer. Given a year (3.15E7 s) on a Earth-size
| (1.08E21 m^3) Planck computer we can make approximately 2^{585}
| operations.
Except that this doesn't quite work.  You can't actually have anywhere
near that many distinct states in that volume of space.  The physics
does indeed get bizarre here:  The maximum number of bits you can
store in a given volume of space is determined by that space's
*surface area*, not its volume.  So you actually only get around
1E70 elements and 4.5E112 operations.  :-)
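A rough reproduction of that surface-area correction (my sketch; I use roughly one square meter of bounding surface and ignore the conventional factor of 1/4 in the Bekenstein-Hawking formula, so this is order-of-magnitude only):

```python
# Holographic bound: the maximum number of bits a region can hold
# scales with its surface area in Planck units, not its volume.
L = 1.6e-35                  # Planck length, m
T = 5.4e-44                  # Planck time, s

bits = 1.0 / L ** 2          # ~1 m^2 of surface / Planck area
print(f"bits: ~{bits:.0e}")          # ~4e69, i.e. around the 1E70 quoted
print(f"ops/s: ~{bits / T:.0e}")     # ~7e112, near the 4.5E112 quoted
```

The dropped constants account for the small mismatch with the 1E70 and 4.5E112 figures; the point is that the volume-based count above overstates the capacity by more than thirty orders of magnitude.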

| OK, it was amusing to do these calculations, but does it mean
| anything? I think it is more like garbage-in-garbage-out process.
| 
| If I understand correctly, L and T are not non-divisible units of
| space-time, but rather boundaries for which we understand that our
| current theories do not work, because this scale requires unification
| of general relativity and quantum mechanics.
Kind of.

| Even more bizarre is to think about the above processing element as
| representing a single bit: according to quantum mechanics, states of a
| system are represented by unit vectors in associated Hilbert space of
| the system and each observable is represented by a linear operator
| acting on the state space. Even if we have only two qubits they are
| not described by two complex numbers, but rather by 4:
| 
|  a_0|00> + a_1|01> + a_2|10> + a_3|11>
| 
| and thus description of the state of quantum registers requires
| exponential number of complex numbers a_i and each operation
| simultaneously process all those a_i. Since we cannot measure
| them all, it is an open question whether quantum computations
| provide exponential speed-up and all we know right now is that
| they give at least quadratic one.
For simple search problems, where there are no shortcuts and you really
can only do brute force, quadratic is the best you can do.

| By the way, if you have a computer with the processing power larger
| than that of all the cryptanalysts in the world, it makes sense to use
| your computer to think to find a better attack than a brute-force
| search :-)
Of course.  But the mice are already running that computation.

|  One place this shows up, as an example:  It turns out that, given a volume
|  of space, the configuration with the maximum entropy for that volume
|  is exactly a black hole with that volume
| 
| [OT] This is a strange thesis because all black hole solutions in
| general relativity can be completely characterized by only three
| externally observable parameters: mass, electric charge, and angular
| momentum (the no-hair theorem) and thus in some sense they have zero
| entropy (it is not necessary a contradiction with something, because
| it takes an infinite time for matter to reach the event horizon).
You're looking at this backwards.  Suppose I want to store a large
amount of data in a given volume of space.  To be specific, I have a
unit diameter sphere in which to build my memory device.  Initially, I
fill it with magnetic domains, and see how many distinct bits I can
store.  That gets me some huge number.  I replace that with
racetrack-style memory, with information on the edges of the domains.
Then with single electrons, with bits stored in multiple quantum states.
Can I keep going indefinitely?  The answer is no:  There is in fact a
limit, and it turns out to be the number of bits equivalent to the
entropy of a black hole filling the unit-diameter sphere.  The entropy
isn't given by the number of measurements I can make on my black
hole - it's given by the number of possible precursor states that
can produce the given black hole.  In fact, if you try to shove that
much information into your initial volume, you will inevitably
produce a black hole - not a good idea if what you want is a storage
device, since the data will be there in some sense, but will be
inaccessible!

|  and its entropy turns out to be the area of the black hole, in units
|  of square Planck lengths

Re: Cruising the stacks and finding stuff

2008-04-22 Thread Leichter, Jerry
| ...How bad is brute force here for AES? Say you have a chip that can do
| ten billion test keys a second -- far beyond what we can do now. Say
| you have a machine with 10,000 of them in it. That's 10^17 years worth
| of machine time, or about 7 million times the lifetime of the universe
| so far (about 13x10^9 years).
| 
| Don't believe me? Just get out calc or bc and try
|   ((2^128/10^14)/(60*60*24*365))
| 
| I don't think anyone will be brute force cracking AES with 128 bit
| keys any time soon, and I doubt they will ever be brute forcing AES
| with 256 bit keys unless very new and unanticipated technologies
| arise.
| 
| Now, it is entirely possible that someone will come up with a much
| smarter attack against AES than brute force. I'm just speaking of how
| bad brute force is. The fact that brute force is so bad is why people
| go for better attacks, and even the A5/1 attackers do not use brute
| force.
Interestingly, if you add physics to the picture, you can convert "no
practical brute force attack" into "no possible brute force attack"
given known physics.  Current physical theories all place a granularity on
space and time:  There is a smallest unit of space beyond which you
can't subdivide things, and a smallest unit of time.  One place this
shows up, as an example:  It turns out that, given a volume of space, the
configuration with the maximum entropy for that volume is exactly a
black hole with that volume, and its entropy turns out to be the area
of the black hole, in units of square Planck lengths.  So, in effect,
the smallest you can squeeze a bit into is a square one Planck length
on a side.  (Yes, even in 3-d space, the constraint is on an area - you'd
think the entropy would depend on the volume, but in fact it doesn't,
bizarre as that sounds.)

So suppose you wanted to build the ultimate computer to brute-force
DES.  Suppose you want your answer within 200 years.  Since information
can't propagate faster than light, anything further than 100 light-years
from the point where you pose the question is irrelevant - it can't
causally affect the result.  So your computer is (at most; we'll
ignore the time it takes to get parts of that space into the
computation) a sphere of 100-light-year radius that exists for 200
years.  This is a bounded piece of space-time, and can hold a huge,
but finite, number of bits which can flip at most a huge, but finite,
number of times.  If a computation requires more bit flips than that,
it cannot, even in *physical* principle, be carried out.

I ran across a paper discussing this a couple of years back, in a
different context.  The authors were the ones who made the argument
that we need to be wary of "in principle" arguments:  What's
possible "in principle" depends on what assumptions you make.  Given
an appropriate oracle, the halting problem is in principle easy to
solve.

The paper discussed something else, but I made some rough estimates
(details long forgotten) of the "in principle" limits on brute force
attacks.  As I recall, for a 100-year computation, a 128-bit key
is just barely attackable; a 256-bit key is way out of the realm of
possibility.  Given all the hand-waving in my calculation, I didn't
try to determine where in that range the cut-over occurs.  Someone
better than me at the physics should be able to compute much tighter
bounds.  Even if I'm off by quite a bit, it's certain that the key
lengths we are using today are already near fundamental physical
limits.  Brute force is simply not an interesting mode of attack
against decently engineered modern systems.

Of course, this says - and can say - absolutely nothing about the
possibility of analytic or side-channel or any of a variety of other
intelligent attacks.
-- Jerry



Re: Cruising the stacks and finding stuff

2008-04-22 Thread Sandy Harris
Perry E. Metzger [EMAIL PROTECTED] wrote:

  Now, it is entirely possible that someone will come up with a much
  smarter attack against AES than brute force. I'm just speaking of how
  bad brute force is. The fact that brute force is so bad is why people
  go for better attacks, and even the A5/1 attackers do not use brute
  force.

  I'd suggest that Allen should be a bit more careful when doing back of
  the envelope calculations...

Another back-of-the-envelope estimate would be to look at the EFF
machine that could brute-force a 56-bit DES key in a few days and
cost $200-odd thousand. That was 10 years ago and Moore's Law
applies, so it should be about 100 times faster or cheaper now.

Round numbers are nice. Overestimating the attacker a bit is
better than underestimating. So assume an attacker can brute
force a 64-bit key (256 times harder than DES) in a second
(a few hundred thousand times faster).

Brute force against a 96-bit key should take 2^32 times as long.
Since pi seconds is a nano-century, that's somewhat over a
century. For a 128-bit key, over 2^32 centuries. If brute force
is the best attack, this is obviously secure.
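Those estimates check out; here is a quick sketch of mine that spells them out:

```python
# Assume an attacker who brute-forces a 64-bit key in one second, as
# above. A 96-bit key costs 2^32 times as much work, a 128-bit key
# 2^64 times as much.
century = 100 * 365.25 * 24 * 3600      # ~3.16e9 s: "pi seconds is a nano-century"

print(f"96-bit key: {2 ** 32 / century:.2f} centuries")    # ~1.36, "somewhat over"
print(f"128-bit key: {2 ** 64 / century:.1e} centuries")   # ~5.8e9, over 2^32
```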

-- 
Sandy Harris,
Nanjing, China



Re: Cruising the stacks and finding stuff

2008-04-21 Thread Victor Duchovni
On Fri, Apr 18, 2008 at 08:02:28PM -0700, Allen wrote:

 Granted A5/1 is known to be very weak, but how much weaker than 
 AES-128? Ten orders of magnitude? I haven't a clue ...

This is usually the point where I stop reading. Of course 10 orders of
magnitude is ~33 bits, so unless the A5 attacks crack a cipher with ~95
bits security, the estimate is grossly wrong.

If (generously) A5 is 64 bits of work, AES is ~20 orders of magnitude
stronger.
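The conversion being used here, spelled out in a sketch of my own:

```python
import math

# One order of magnitude (a factor of 10) is log2(10) ~ 3.32 bits of
# work, so 10 orders of magnitude is ~33 bits, and the 64-bit work gap
# between A5/1 (generously) and AES-128 is ~19-20 orders of magnitude.
bits_per_order = math.log2(10)
print(f"10 orders of magnitude: ~{10 * bits_per_order:.0f} bits")      # ~33
print(f"64-bit gap: ~{64 / bits_per_order:.1f} orders of magnitude")   # ~19.3
```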

-- 

 /\ ASCII RIBBON   Victor Duchovni   NOTICE: If received in error,
 \ / CAMPAIGN      IT Security,      please destroy and notify
  X AGAINST        Morgan Stanley    sender. Sender does not waive
 / \ HTML MAIL                       confidentiality or privilege,
                                     and use is prohibited.



Re: Cruising the stacks and finding stuff

2008-04-21 Thread Perry E. Metzger

Victor Duchovni [EMAIL PROTECTED] writes:
 On Fri, Apr 18, 2008 at 08:02:28PM -0700, Allen wrote:

 Granted A5/1 is known to be very weak, but how much weaker than 
 AES-128? Ten orders of magnitude? I haven't a clue ...

 This is usually the point where I stop reading. Of course 10 orders of
 magnitude is ~33 bits, so unless the A5 attacks crack a cipher with ~95
 bits security, the estimate is grossly wrong.

 If (generously) A5 is 64 bits of work, AES is ~20 orders of magnitude
 stronger.

Oh, what the heck. Here's my expanded version of Victor's remark.

The effective key length of A5/1 is actually 54 bits because 10 of the
64 key bits are fixed at 0. However, the attacks that have been done
recently are not, in fact, mere brute force but are far more
sophisticated than that. Thus, the time difference between
(intelligently) attacking A5/1 and brute forcing AES with 128 bit keys
is far worse than 20 orders of magnitude.

How bad is brute force here for AES? Say you have a chip that can do
ten billion test keys a second -- far beyond what we can do now. Say
you have a machine with 10,000 of them in it. That's 10^17 years worth
of machine time, or about 7 million times the lifetime of the universe
so far (about 13x10^9 years).

Don't believe me? Just get out calc or bc and try
  ((2^128/10^14)/(60*60*24*365))
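The same check in Python, for those without calc or bc handy:

```python
# 2^128 keys at 10^14 keys/second (10,000 chips at 10 billion
# keys/second each), converted to years.
years = (2 ** 128 / 1e14) / (60 * 60 * 24 * 365)
print(f"~{years:.2e} years")   # ~1.1e17
```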

I don't think anyone will be brute force cracking AES with 128 bit
keys any time soon, and I doubt they will ever be brute forcing AES
with 256 bit keys unless very new and unanticipated technologies
arise.

Now, it is entirely possible that someone will come up with a much
smarter attack against AES than brute force. I'm just speaking of how
bad brute force is. The fact that brute force is so bad is why people
go for better attacks, and even the A5/1 attackers do not use brute
force.

I'd suggest that Allen should be a bit more careful when doing back of
the envelope calculations...


-- 
Perry E. Metzger[EMAIL PROTECTED]
