Cryptography-Digest Digest #137, Volume #9       Thu, 25 Feb 99 11:13:03 EST

Contents:
  Re: Snake Oil (from the Feb 99 Crypto-Gram) (Lutz Donnerhacke)
  Re: Testing Algorithms (Alan Braggins)
  Re: Snake Oil (from the Feb 99 Crypto-Gram) ([EMAIL PROTECTED])
  Re: What do you all think about the new cipher devised by a 16 year old? ("Vonnegut")
  Re: Take my hand, PLEASE ([EMAIL PROTECTED])
  Re: Define Randomness (R. Knauer)
  Re: Testing Algorithms (Patrick Juola)
  Re: Randomness based consciousness?. (Was: Re: *** Where Does The Randomness Come From ?!? *** ) (R. Knauer)
  Re: True Randomness - DOES NOT EXIST!!! (Coen Visser)
  Re: Define Randomness (R. Knauer)
  DSS Keys ("Nicholas Cole")
  Re: Define Randomness (R. Knauer)
  Re: Define Randomness (R. Knauer)
  Re: Testing Algorithms (Patrick Juola)
  Re: True Randomness - DOES NOT EXIST!!! (R. Knauer)
  Re: Define Randomness (R. Knauer)
  Re: RC4 40 bit compared to RC4 128 bit. (fungus)
  Re: Testing Algorithms (fungus)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (Lutz Donnerhacke)
Subject: Re: Snake Oil (from the Feb 99 Crypto-Gram)
Date: 25 Feb 1999 12:15:36 GMT

* Peter Gutmann wrote:
>[EMAIL PROTECTED] (Lutz Donnerhacke) writes:
>>>3)  1024 bits- "Military" grade, slow, highest security
>
>>That's why PGP 2.6.3(i)n changed this to:
>>  1024 bit - User grade
>>  1535 bit - SubCA and RA grade
>>  2048 bit - CA grade
>
>... which those for whom it's most important (nontechnical types) will have 
>absolutely no understanding of.

Should not. ;-)

>Although the term "military-grade security" 
>is meaningless, it seems to be one of the better ways to tell J.Random Luser 
>that this is the strongest level of security available in a program.

Which will be wrong.

------------------------------

From: Alan Braggins <[EMAIL PROTECTED]>
Subject: Re: Testing Algorithms
Date: 25 Feb 1999 10:17:49 +0000

"Trevor Jackson, III" <[EMAIL PROTECTED]> writes:
> A superstring computer is certainly conceivable with modern theory, given
> some room for TBDs in the specs.  But a computer that violates the speed
> of light is in the same class as divine inspiration.  If you assume any
> rules you want then you can get any output you want.  By tomorrow. 

Once you assume you can violate light speed, you can get the answer
not merely tomorrow, but yesterday[1]. Then you needn't bother
calculating it before sending it back in time to yourself. On the
other hand your opponent can then go back in time and decide to send a
different message in the first place, so you still haven't cracked it.

[1] unless FTL is possible in some reference frames but not others

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Snake Oil (from the Feb 99 Crypto-Gram)
Date: 25 Feb 1999 12:39:18 GMT

PGP 5.0 has resurrected the term, however ...

>
>3)  1024 bits- "Military" grade, slow, highest security
>
>--
>=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/=/
>Mark Andreas <[EMAIL PROTECTED]>     http://www.sky.net/~voyageur
>PGP key 77EF76B1 available via key server, finger or webpage
>=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\

------------------------------

From: "Vonnegut" <[EMAIL PROTECTED]>
Subject: Re: What do you all think about the new cipher devised by a 16 year old?
Date: Thu, 25 Feb 1999 08:10:19 -0500

>It seems pretty simple.  It uses a
>2x2 matrix.  I wonder how on earth something so simple could have been
>overlooked?


Even if she did find a new way to implement the matrices, I do know that I
have seen a simple public key encryption algorithm which used matrices in a
Pre-Calculus book.

The book suggested using a method for encoding letters to numbers in the
plain text.  They used A=1, B=2,.... ' '(space)=0, but ASCII or any other
method would be fine.  Anyhow, the numbers are popped into a matrix and
multiplied by a square matrix M, your public key.  Obviously, the private
key you would use is the inverse of M.  This is quite a simple problem for a
2x2 matrix, though, so I doubt this new algorithm is the same thing.
However, the method just explained is somewhat secure for matrices of order
higher than 10 or 20.  Not only does it take a reasonably long amount of
time to find the inverse of such a matrix, but your program could always
swap certain columns of the output or add some constant matrix.
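The scheme just described can be sketched in a few lines of Python. This is
only an illustration of the book's method: the key matrix, the mod-27
reduction (A=1..Z=26, space=0, chosen so that the inverse is also an integer
matrix), and all names are choices made for the sketch, not anything from
the original post.

```python
# Toy Hill-style matrix cipher, sketching the precalculus-book scheme.
# The book works over plain integers; reducing mod 27 keeps the inverse
# key an integer matrix.  M and the message are illustrative.

MOD = 27
M     = [[2, 3], [1, 4]]      # det = 5, coprime to 27, so M is invertible
M_INV = [[17, 21], [16, 22]]  # M_INV = det^-1 * adj(M) mod 27

def encode(text):
    # A=1 .. Z=26, space=0, as in the book
    return [0 if c == ' ' else ord(c) - ord('A') + 1 for c in text.upper()]

def decode(nums):
    return ''.join(' ' if n == 0 else chr(n + ord('A') - 1) for n in nums)

def apply_matrix(mat, nums):
    # multiply successive 2-vectors by mat (assumes even-length input)
    out = []
    for i in range(0, len(nums), 2):
        x, y = nums[i], nums[i + 1]
        out.append((mat[0][0] * x + mat[0][1] * y) % MOD)
        out.append((mat[1][0] * x + mat[1][1] * y) % MOD)
    return out

plain = "ATTACK AT DAWN"                    # even length, no padding needed
cipher = apply_matrix(M, encode(plain))
assert decode(apply_matrix(M_INV, cipher)) == plain
```

As the post itself notes, inverting a small M is easy, which is exactly why
this is not a serious public-key system.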

I don't know if anyone found this information useful, as I am only
beginning in this field.  (I just happen to be 16, myself.)  Please
forgive me if this is all basic stuff you all have seen before.

By the way, thanks for reminding me about that matrix idea.  I have a snow
day today, so I think I'll write a C++ implementation of it.  Anyone who
wants it can email me at :

    [EMAIL PROTECTED]

If you just want the compiled program or source code please specify,
otherwise, I'll send both.

Later,
-Vonnegut



------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Take my hand, PLEASE
Date: Wed, 24 Feb 1999 09:36:49 GMT


>
> Check out: http://www.humboldt.edu/~jrg1/zodiac/
>
The link returns "Not found"; try:
http://crimelibrary.com/zodiac/zodiac/zodiacmain.htm

============= Posted via Deja News, The Discussion Network ============
http://www.dejanews.com/       Search, Read, Discuss, or Start Your Own    

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Define Randomness
Date: Wed, 24 Feb 1999 14:05:13 GMT
Reply-To: [EMAIL PROTECTED]

On Tue, 23 Feb 1999 23:05:32 -0500, Nicol So <[EMAIL PROTECTED]>
wrote:

>Equal probability of outcome is not necessary for randomness.  Even a
>source with a very skewed distribution of outcomes can be random--it
>just has less entropy.

That all depends on what kind of randomness you are talking about.
Some kinds of random numbers must be normal in the Borel sense,
therefore there can be no skew.

But I agree with you that there can be numbers produced by a TRNG that
are highly skewed. In fact, if you filter such numbers out, then your
TRNG has lost some of its ability to produce unbreakable ciphers.

BTW, what does it mean to speak of a given number having an entropy?
Randomness and entropy both apply to the process by which numbers are
generated, not to the actual numbers themselves.

>What you called "randomness" in your first remark is properly termed
>"pseudorandomness".  Your second remark is incorrect.  A computer, as an
>example of a deterministic device, cannot produce *true* random
>numbers.  Knowledge of the computer's state will cause the supposedly
>random numbers to lose their random appearance.

Indeed!

True random numbers cannot be computable numbers, by definition. Only
uncomputable numbers can be truly random. If you can compute a number
you can determine it, and therefore it is not random.

Bob Knauer

"Democracy is the theory that the common people know what they
want, and deserve to get it good and hard."
--H.L. Mencken


------------------------------

From: [EMAIL PROTECTED] (Patrick Juola)
Subject: Re: Testing Algorithms
Date: 25 Feb 1999 09:07:12 -0500

In article <[EMAIL PROTECTED]>,
Trevor Jackson, III <[EMAIL PROTECTED]> wrote:
>Patrick Juola wrote:
>
>> >
>> >Scientific limits are a different issue.  If we project computation
>> >speeds based on the current model of reality we will hit limits such as
>> >the Planck length, speed of light, and the number of particles available
>> >in the observable universe.  Projections that stay within the known
>> >scientific limits are of a different class than those that violate those
>> >limits.
>> >
>> >A superstring computer is certainly conceivable with modern theory, given
>> >some room for TBDs in the specs.  But a computer that violates the speed
>> >of light is in the same class as divine inspiration.
>>
>> Is it?  I don't recall a single scientific experiment disproving the
>> possibility of FTL communication -- and a lot of Bell-type inequalities
>> that suggest it.
>
>OK, technically you are correct.  Neither general relativity nor quantum
>mechanics forbid FTL communication.  However, researchers have been looking
>for exactly such a mechanism for over 60 years and failed to find it.  I admit
>that they may find it in the next 60 years, or even tomorrow.  But I won't bet
>on it at any odds.

The more fool you, then.  8-)

You're assuming that the scientific limitations of today represent
hard and unarguable limitations on reality; this line of reasoning
in 1900 would have specifically excluded transmutation of elements
from the realm of the possible.  Transmutation had been sought for
what, four hundred years? before it was finally found; this doesn't
encourage me to believe that we've explored every possible avenue in
the past sixty.

If FTL communication is possible, then (according to some versions of
physics) it can be used for time travel; I seem to recall that Marvin
the Paranoid Android was some fifteen times older than the universe itself
by the end of the Hitchhiker's Guide series.  Give him a problem, allow
him to work on it for several million times the lifetime of the universe,
and then have him come back to early March, 1999 to tell you the answer.

>The universe may not be infinite.  If it is finite, then there is some number
>that we cannot count up to.

... unless we get multiple passes at or can superimpose computations on
each other or something.

Unrealistic?  Perhaps.  But my gut feeling is that transmutation of lead
into gold is also pretty unrealistic.

        -kitten

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Crossposted-To: 
sci.skeptic,sci.philosophy.meta,sci.psychology.theory,alt.hypnosis,sci.logic
Subject: Re: Randomness based consciousness?. (Was: Re: *** Where Does The Randomness 
Come From ?!? *** )
Date: Thu, 25 Feb 1999 14:16:57 GMT
Reply-To: [EMAIL PROTECTED]

On Wed, 24 Feb 1999 22:09:13 -0600, "Alex Avila" <[EMAIL PROTECTED]>
wrote:

>>There is only one real known reason for anything to exist:  You exist.
>>If you did not exist, nothing would exist for you.

>doesn't your argument beg the question ?

I do not believe so. Try using a reductio ad absurdum argument on it -
try proving that reality does not exist.

Within the framework of the worldview called Realism, you can't. You
have to invoke non-Realism systems like Mysticism or Idealism or
Phenomenalism, etc., and even then the non-reality of existence is
just an axiom.

If nothing existed, then and only then could you claim that reality
does not exist. But you would not be around to make that assertion.

This is explained in Thomas Aquinas's book "On Being and Essence".

Bob Knauer

"Democracy is the theory that the common people know what they
want, and deserve to get it good and hard."
--H.L. Mencken


------------------------------

From: [EMAIL PROTECTED] (Coen Visser)
Subject: Re: True Randomness - DOES NOT EXIST!!!
Date: 25 Feb 1999 13:34:49 GMT

[EMAIL PROTECTED] (Matthias Meixner) writes:
>BRAD KRANE ([EMAIL PROTECTED]) wrote:

>> True randomness does not exist. It always depends on some variable
>> at some *FIXED* time. FIXED times are not anywhere near random.
>> **EVERY** thing that goes on in the universe is happening because of all
>> the forces of **EVERY** thing else in the entire universe. If you were
>> to take measurements at one place in time in one universe and recorded
>> it.

[...]

>So you say everything is predetermined, since it totally depends on all 
>forces of every thing else in the universe.
>So your post and my answer are predetermined.
>If I kill you, its not my fault, since it is also predetermined.
>You can make a religion out of it and nobody can prove it to be wrong of 
>course, it is just a matter of believe. 

Hmm, well, chaotic behaviour clashes with predictability even under
determinism. To predict the positions at time T of N bodies (N > 2) that
exert a gravitational force on each other, you need to measure their
initial values to a certain precision, and the precision you need grows
exponentially with T. So I guess you'll need something better than a
Turing Machine.
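The exponential loss of precision is easier to show with a much simpler
chaotic system than the N-body problem; here is a sketch using the logistic
map (the map and the starting values are stand-ins chosen purely for the
illustration):

```python
# Chaotic divergence, sketched with the logistic map x -> 4x(1-x),
# a one-line stand-in for the gravitational N-body system.

def step(x):
    return 4.0 * x * (1.0 - x)

x, y = 0.4, 0.4 + 1e-12      # two initial states agreeing to 12 decimals
max_gap = 0.0
for _ in range(60):
    x, y = step(x), step(y)
    max_gap = max(max_gap, abs(x - y))

# The 1e-12 disagreement roughly doubles each step, so after a few
# dozen steps the two trajectories bear no resemblance to each other.
print(max_gap)
```

To predict one more doubling time ahead, you need one more bit of initial
precision, which is the exponential blow-up mentioned above.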

Regards,

        Coen Visser

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Define Randomness
Date: Thu, 25 Feb 1999 14:55:18 GMT
Reply-To: [EMAIL PROTECTED]

On 24 Feb 1999 17:33:41 -0500, [EMAIL PROTECTED] (Patrick Juola)
wrote:

>i) Algorithmic doesn't mean that they produce a pattern.

Sure it does. The very algorithm itself is the source of the pattern.
That's why algorithms cannot produce truly random numbers.

>It just
>means that they operate in a predictable fashion on the patterns
>randomly (and spuriously) found in data.

There is no such thing as a "random pattern", if by random you mean
Kolmogorov randomness. By definition randomness is patternless.
Otherwise the pattern could be exploited to make a program shorter
than the number itself, in which case it is not Kolmogorov random.

That's why pi is not random, despite the fact that it passes all
statistical tests for randomness.

>ii) The von Neuman pairwise transformation provably does not introduce
>a bias into a stream *if* the stream is composed of independent events.

I agreed with you about that earlier. Now I am asking if it (or any
other anti-skewing procedure) introduces correlations.

Bob Knauer

"Democracy is the theory that the common people know what they
want, and deserve to get it good and hard."
--H.L. Mencken


------------------------------

From: "Nicholas Cole" <[EMAIL PROTECTED]>
Subject: DSS Keys
Date: Thu, 25 Feb 1999 14:53:16 -0000

PGPi uses DH/DSS keys.  For encryption purposes, anything less than a
2048-bit key is no longer considered secure.  Does this mean that DSS
signing keys (currently limited to 1024 bits) should also be
lengthened, to avoid the forging of signatures?

Best wishes,

Nick



------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Define Randomness
Date: Thu, 25 Feb 1999 15:05:56 GMT
Reply-To: [EMAIL PROTECTED]

On Wed, 24 Feb 1999 23:53:30 -0500, "Trevor Jackson, III"
<[EMAIL PROTECTED]> wrote:

>> But, I do not trust algorithmic post processing. You will have to
>> convince me that anti-skewing does not introduce unwanted correlations
>> which can wreck the security of the TRNG significantly.

>Rather than demanding that someone prove a negative for you, consider offering
>an example of the problem you suspect exists.

I do not know - my question was posed to find out the answer one way
or the other.

I have no reason to believe that anti-skewing will introduce
correlations. But I have no reason to believe it will not, either. I
have a suspicion that it might, because it is algorithmic.

Bob Knauer

"Democracy is the theory that the common people know what they
want, and deserve to get it good and hard."
--H.L. Mencken


------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Define Randomness
Date: Thu, 25 Feb 1999 15:26:40 GMT
Reply-To: [EMAIL PROTECTED]

On Wed, 24 Feb 1999 23:05:35 GMT, [EMAIL PROTECTED] (Terry Ritter) wrote:

>It is necessary to distinguish between the generator and the output to
>be able to measure the raw generator itself.  It is only by measuring
>the raw generator that we can convince ourselves that it really is
>such a generator.  
>
>If that means you need to modify your definition, then so be it.

The point that I was trying to make is that if a generator needs its
output to be anti-skewed, then the anti-skewing procedure is contained
in the overall makeup of the generator itself.

For example, the HotBits TRNG uses circuitry to do the anti-skewing,
flip-flopping the measurements to remove bias.

>The von Neumann method does nothing about longer-range correlations.
>If you are willing to assume such do not exist, then you can use the
>simple algorithm.  But then there is "always the opportunity" for
>longer-range correlations to exist unhandled.  

That is what I was alluding to. How do you know that such longer-range
correlations will not be enhanced by the anti-skewing procedure?

After all, the bitstream is being concentrated by the removal of 00
and 11 pairs. IOW, if a certain kind of correlation, for example one
caused by 60 Hz noise, spans bits 50 positions apart, and you remove
25 bits in between, then the correlated bits are now 25 positions
apart, which makes the correlation stronger than it was originally.
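For reference, the von Neumann procedure under discussion can be sketched as
follows; the bias value, the seed, and the sample size are arbitrary choices
made for the demonstration:

```python
import random

def von_neumann(bits):
    """The von Neumann pairwise procedure: scan non-overlapping pairs,
    emit the first bit of an unequal pair (01 -> 0, 10 -> 1), and
    discard the 00 and 11 pairs."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

rng = random.Random(1)                                 # fixed, illustrative seed
raw = [1 if rng.random() < 0.8 else 0 for _ in range(100000)]  # ~80% ones
flat = von_neumann(raw)
print(sum(raw) / len(raw))     # ~0.80
print(sum(flat) / len(flat))   # ~0.50: bias removed, for independent bits
```

Note that the flattening holds only for independent pairs; longer-range
structure such as the 60 Hz example is precisely what the procedure does not
address.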

>>If you start throwing out bits like in the von Neumann method, who knows
>>what patterns are left behind.

>If they are "left behind," who cares?

If they can be exploited to break the cipher, then someone would care,
namely the person wanting to keep his messages secret.

>What "anti-skewing" procedures would you call *not* algorithmic?

I do not know of any.

Bob Knauer

"Democracy is the theory that the common people know what they
want, and deserve to get it good and hard."
--H.L. Mencken


------------------------------

From: [EMAIL PROTECTED] (Patrick Juola)
Subject: Re: Testing Algorithms
Date: 25 Feb 1999 10:30:05 -0500

In article <[EMAIL PROTECTED]>,
fungus  <[EMAIL PROTECTED]> wrote:
>
>
>Coen Visser wrote:
>> 
>> Have you done the maths with a computer running on vacuum tubes? Why not?
>> you'll reach limits far sooner! My guess is that the heat dissipation from
>> such a computer would burn the earth before you could crack DES with brute
>> force. So if you did your maths in the 50's, before the transistor you
>> would say: look we can not brute force 56 bits because the heat from a
>> computer would burn the earth and there is not enough matter in the universe
>> to make enough vacuum tubes et cetera et cetera. The same story goes for
>> silicon now. But who knows what other computing principles can be thought
>> of in the future?
>> 
>
>Beautiful argument... just what I wanted to hear.
>
>You're arguing based on technical skills of the computer makers. The
>past has indeed shown that technical difficulties *will* be overcome,
>and that new processes *will* be invented. This is the basis of Moore's
>trend[1], and history bears this out.
>
>I'm arguing using fundamentals like "the size of an atom", "the speed
>of light", "energy in a photon". These limits are real, hard limits
>which cannot be overcome.

You mean fundamentals like "atomic number" and "rate of time flow,"
don't you? 8-)

        -kitten


------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: True Randomness - DOES NOT EXIST!!!
Date: Thu, 25 Feb 1999 15:56:21 GMT
Reply-To: [EMAIL PROTECTED]

On 24 Feb 1999 21:42:33 -0500, [EMAIL PROTECTED] (Alan
DeKok) wrote:

>>Nowhere near a joke: randomness does not exist!!! Everything that happens
>>relies completely on something else.

>  Everything that happens depends on things which are to some extent
>unknown, and unknowable.  Add enough of that "unknown" up over time,
>and you've got complete unpredictability.  Which is to say, randomness.

I think it is important to distinguish between causality and
knowability. There can be causes for events which are completely
unknowable.

For example, there is a cause for a computer program halting or not
halting, yet that cause can be completely unknowable. A program could
be made to generate the digit expansion of pi and search for a
specific pattern, halting when it finds it. If the pattern is
completely random, we have no way of knowing whether it will ever be
found in the expansion of pi, so we can't know in advance whether the
program will halt. But we do know that if the program does halt, the
halt was caused by its having found that pattern.
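That search program can be sketched as below. Since the point is the search
rather than computing pi, this illustration hard-codes the first 50 decimal
digits; a real version would generate digits on demand (e.g. with a spigot
algorithm) and, for an unlucky pattern, might simply never return.

```python
# Sketch of the halting example: search the digits of pi for a target
# pattern and "halt" when it is found.

PI_DIGITS = ("14159265358979323846"
             "26433832795028841971"
             "6939937510")            # first 50 decimal digits of pi

def halts_at(pattern, digits):
    """Position at which the search 'halts', or None if it does not
    halt within the available digits.  Over an unbounded digit stream
    there is no general way to know in advance which case applies."""
    pos = digits.find(pattern)
    return None if pos == -1 else pos

print(halts_at("26433", PI_DIGITS))    # found: the program halts
print(halts_at("999999", PI_DIGITS))   # None: no halt within this prefix
```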

>"Thus we can conjecture that Special Relativity may ultimately be derived from
>a simpler and more fundamental principle of _Conservation of Computational
>Resources_." - Complexity, Entropy, and the Physics of Information, p. 315.

I recall seeing that theory touted many years ago. The idea was that
physics is an expression of calculations that can come to a
conclusion because of the regularity in those calculations. IOW,
calculations that take forever do not contribute to physical reality. 

The finite speed of light arises because of some underlying
calculation that is minimized for that value of the speed of light. If
light went any slower or any faster, the underlying calculations would
not be minimal.

That minimalness is the result of an inherent regularity in physical
reality that is enhanced by that particular value of the speed of
light. IOW, the speed of light fits into that pattern of regularity
built into physical processes which depend on light for communication
of forces, etc.

Bob Knauer

"Democracy is the theory that the common people know what they
want, and deserve to get it good and hard."
--H.L. Mencken


------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Define Randomness
Date: Thu, 25 Feb 1999 15:11:29 GMT
Reply-To: [EMAIL PROTECTED]

On Wed, 24 Feb 1999 23:05:19 GMT, [EMAIL PROTECTED] (Terry Ritter) wrote:

>Certification by committee?

OK, call it peer review. That's what is done in science.

As an engineer, you can appreciate what I am driving at.

>Cryptanalysis by committee?

Of course. How many times have you heard that such and such an
algorithm is believed to be crypto-grade secure because no one in the
crypto community has been able to break it?

>>4) They measure an information leakage which gives you an indication
>>of how much you can use the TRNG before it begins to cause a problem.

>What makes you think there is such an analysis?


It is my understanding that Bayesian inference can be used to break
stream ciphers if they leak enough information. And that is not the
only kind of inferential procedure. There is also Solomonoff
inference.

>First you have to convince all these people to help you....

My procedure is an academic exercise, not a practical suggestion.

Bob Knauer

"Democracy is the theory that the common people know what they
want, and deserve to get it good and hard."
--H.L. Mencken


------------------------------

From: fungus <[EMAIL PROTECTED]>
Subject: Re: RC4 40 bit compared to RC4 128 bit.
Date: Fri, 26 Feb 1999 01:29:06 +0100



Rats wrote:
> 
> Hi all
> 
> I've been looking through the "supposed RC4 algorithm" and I believe
> I've come to grips with how it works.
> 
> However what puzzles me is the referrence sometimes used to describe
> RC4 i.e. RC4 40 bit and 128bit. What I don't understand is the
> relevance of the bit values since the algorithm itself doesn't seem
> to make any mention of it.
> 

RC4 has a variable-sized key. You can seed the generator with a key of
any size (although some sizes don't make much sense), so the algorithm
itself doesn't mention one.

Other algorithms like DES *require* a 56-bit key in order to work.
For this reason, they will specify a definite key size.
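A sketch of the key schedule from the widely posted "alleged RC4" shows why
no key size appears in the algorithm: the key bytes are simply cycled over,
however many there are. (The function names here are mine, not from any
particular posting.)

```python
def rc4_ksa(key):
    """Key schedule of the posted 'alleged RC4': the key is indexed as
    key[i % len(key)], so any key length from 1 to 256 bytes works,
    which is why the algorithm itself never fixes a key size."""
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    return S

def rc4_stream(key, n):
    """Generate n keystream bytes (the output loop)."""
    S = rc4_ksa(key)
    i = j = 0
    out = []
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return out

# A 5-byte (40-bit) and a 16-byte (128-bit) key run through the same code:
ks40  = rc4_stream(b"40bit", 8)
ks128 = rc4_stream(b"sixteen byte key", 8)
```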


FWIW, 40 bits and 128 bits are often mentioned because:

1) 40 bits is what the US Government allows you to export freely
   (ie. it's worthless)

2) 128 bits is believed to be a safe keysize for the foreseeable future.
   We think that nobody will ever crack a 128 bit key using computers
   based on transistors and electrons. If somebody invents a new type
   of computer then we don't know what key size to use; even a million
   bits might not be enough. For this reason we stick to 128 bits.
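To put a number on point 2, here is the back-of-envelope arithmetic for
brute-forcing a 128-bit keyspace at an assumed rate of 10**18 keys per
second (a wildly optimistic figure, chosen just for the illustration):

```python
# Exhausting 2**128 keys at a billion billion keys per second.

keys_to_try  = 2 ** 128
keys_per_sec = 10 ** 18                  # assumed, far beyond any real machine
years = keys_to_try / keys_per_sec / (365 * 24 * 3600)
print(f"{years:.2e} years")              # on the order of 1e13 years
```

Even at that rate the search takes roughly a thousand times the current age
of the universe, which is the sense in which 128 bits is "safe" against
transistor-based computers.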


> R Knauer please don't bother replying to this posting!

Just killfile him and be done with it... ;-)


-- 
<\___/>
/ O O \
\_____/  FTB.


------------------------------

From: fungus <[EMAIL PROTECTED]>
Subject: Re: Testing Algorithms
Date: Fri, 26 Feb 1999 01:20:01 +0100



Coen Visser wrote:
> 
> Have you done the maths with a computer running on vacuum tubes? Why not?
> you'll reach limits far sooner! My guess is that the heat dissipation from
> such a computer would burn the earth before you could crack DES with brute
> force. So if you did your maths in the 50's, before the transistor you
> would say: look we can not brute force 56 bits because the heat from a
> computer would burn the earth and there is not enough matter in the universe
> to make enough vacuum tubes et cetera et cetera. The same story goes for
> silicon now. But who knows what other computing principles can be thought
> of in the future?
> 

Beautiful argument... just what I wanted to hear.

You're arguing based on technical skills of the computer makers. The
past has indeed shown that technical difficulties *will* be overcome,
and that new processes *will* be invented. This is the basis of Moore's
trend[1], and history bears this out.

I'm arguing using fundamentals like "the size of an atom", "the speed
of light", "energy in a photon". These limits are real, hard limits
which cannot be overcome.

Surely you must see the difference between the two ways of thinking,
and that CPU speeds will reach a plateau in the not-too-distant future.
Beyond this you'll need a revolutionary[2] new principle, and this is
*very* unlikely. Beyond classical physics we tend to find chaos, not
the order needed to build reliable computers.


-- 
<\___/>
/ O O \
\_____/  FTB.

[1] I hesitate to call it a "law"...

[2] I said "revolutionary", not "evolutionary", which is what all
advances have been so far. The transistor was an evolution of the
vacuum tube. The vacuum tube was an evolution of the relay, etc.



------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
