Cryptography-Digest Digest #890, Volume #8       Tue, 12 Jan 99 12:13:03 EST

Contents:
  Re: On the Generation of Pseudo-OTP (Mok-Kong Shen)
  Re: On the Generation of Pseudo-OTP (Mok-Kong Shen)
  Re: On the Generation of Pseudo-OTP (R. Knauer)
  Re: Practical True Random Number Generator (Mok-Kong Shen)
  Re: On the Generation of Pseudo-OTP (R. Knauer)
  Re: On the Generation of Pseudo-OTP (Patrick Juola)
  Re: On the Generation of Pseudo-OTP (R. Knauer)
  Random numbers in C: Some suggestions. (George Marsaglia)
  Re: Practical True Random Number Generator ("Tony T. Warnock")
  Re: On the Generation of Pseudo-OTP (R. Knauer)

----------------------------------------------------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: On the Generation of Pseudo-OTP
Date: Mon, 11 Jan 1999 17:35:45 +0100

R. Knauer wrote:
> 
> On Mon, 11 Jan 1999 14:55:35 +0100, Mok-Kong Shen
> <[EMAIL PROTECTED]> wrote:
> 
> >> A crypto-grade random number is one which resists a successful
> >> Bayesian Attack. There is no more to it than that, other than to point
> >> out that a properly constructed TRNG will generate such numbers.
> 
> >Let's see how you PROVE that random numbers obtained from hardware
> >noise 'resist Bayesian attacks' independently of the quality of
> >that noise (bias etc.).
> 
> I am not going to attempt to prove anything regarding hardware noise,
> since I have no idea what the source of the noise is. As I have
> already discussed on at least two previous occasions today, I
> would choose radioactive decay for the TRNG, and measure the interval
> between decays forward and backward in time to remove any bias due to
> the depletion of the isotope with time.
> 
> Specifically I would measure one interval and then a second subsequent
> interval. I would compare the times for each interval in one time
> direction, say the second interval minus the first. If the second
> interval is longer than the first I would emit a 1 and if the opposite
> a 0. Then I would measure two more consecutive intervals but do the
> comparison in reverse. That way any bias would cancel from one
> measurement to the next.
> 
> Since the decay of selected radioactive isotopes is completely random
> in time, the probability that the TRNG outputs a 1 or a 0 is the same
> for all measurements. IOW, there is no reason for the decay process
> and the method of measurement to favor one bit output over the other.
> That means that all bit sequences are possible and equiprobable. There
> are neither any excluded bit sequences nor any sequences that are more
> or less probable than any others.
> 
> Therefore an OTP cipher based on the output of such a TRNG would be
> impervious to a Bayesian Attack, because all sequences of a given
> finite length are both possible and equiprobable. There is no
> periodicity, bias or correlation present in any of the sequences, nor
> any other discernible pattern. There is no way to distinguish one
> sequence from any other. There is no way for a Bayesian Attack to take
> hold, since no information is being leaked by the TRNG.
> 
> Does that convince you that I can PROVE what I said?
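
A minimal C sketch of the interval-comparison rule described above, under
the assumption that a hypothetical routine get_interval() returns the timer
reading between two successive detected decays (ties are simply discarded
and re-measured):

/* Hypothetical hardware hook: time between two successive detected
   decays, in timer ticks. */
extern unsigned long get_interval(void);

int next_bit(void)
{
    static int reversed = 0;
    unsigned long t1, t2;
    int bit;

    for (;;) {
        t1 = get_interval();
        t2 = get_interval();
        if (t1 == t2)
            continue;                /* tie: measure two fresh intervals */
        bit = (t2 > t1) ? 1 : 0;     /* second interval longer -> 1      */
        if (reversed)
            bit = 1 - bit;           /* alternate pairs reverse the test,
                                        so a slowly drifting bias cancels */
        reversed = !reversed;
        return bit;
    }
}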

It is NOT a rigorous proof in the mathematical sense; it is a
'plausibility' argument. I never argued that hardware noise is bad.
I believe it is mostly excellent. But perfect it is NOT,
nonetheless. We never need perfection in practice, so hardware
sources (assuming their bias is sufficiently reduced) can for all
practical purposes substitute for the (practically unobtainable)
ideal OTP. But the justification for using them has ultimately to
come from the user's risk and cost analysis. Since software-made
'intended approximations of the ideal OTP' can presumably also
attain negligible correlations, they too have a chance of
substituting for hardware noise in certain applications, IF this
can be justified by a risk and cost analysis.

M. K. Shen

------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: On the Generation of Pseudo-OTP
Date: Tue, 12 Jan 1999 12:02:06 +0100

R. Knauer wrote:
> 
> On Mon, 11 Jan 1999 17:00:53 +0100, Mok-Kong Shen
> <[EMAIL PROTECTED]> wrote:
> 
> >There has been some debate, if I correctly remember, whether a proof
> >obtained with a program on a computer is a genuine proof. (I feel
> >myself too ignorant to take position on one side or the other.)
> >A fact is that software may have bugs and hardware may have faults.
> >But there were also accepted human proofs that were later found to
> >be fallacious.
> 
> If such faults prevent us from proving things experimentally, we might
> as well close shop.

If one is pedantic, yes. But we are living in a practical world,
so we can accept things that are imperfect, for example an
'intended approximation to an ideal OTP'.

> 
> >If any assertion derived from experiment, physical or computational,
> >is in terms of probability, i.e. associated with an appropriately
> >estimated confidence interval, then everything is o.k.  Absolute
> >assertions are possible if they are deduced from axioms by logical
> >rules.
> 
> Even that has been challenged on several fronts. In terms of axiomatic
> formal systems, you run into Godel's Theorem. In terms of computation,
> you run into Turing's Halting problem. And in terms of elementary
> number theory, you run into Chaitin's Mathematical Indeterminacy.

What did you mean by 'challenge' relative to what I wrote? Did you
refer to 'experimental assertions are in terms of confidence
intervals' or did you refer to 'absolute assertions are only
obtainable from logical deductions'? Where are challenges to the
one or the other to be found?

M. K. Shen

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Tue, 12 Jan 1999 14:57:30 GMT
Reply-To: [EMAIL PROTECTED]

On Tue, 12 Jan 1999 11:41:42 +0100, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:

>Is this term 'Bayesian Attack' employed in any text books on 
>cryptology?

I could not find it in Schneier's main book. But the term was used
several times a year ago in these discussions.

As I understand it, it is a probabilistic attack that makes inferences
from slight abnormalities in distributions - of the form "if this
happens, then such and such is likely to happen." A cursory check of
the Web using AltaVista came up with many references to Bayesian
statistics, Bayesian networks, etc. In fact, there seems to be a cult
following of sorts.
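
As a toy illustration (my own sketch, not from the earlier thread): suppose
the attacker weighs the hypothesis "the keystream source is biased toward 1
with probability 0.6" against "the source is fair", and updates the odds
with Bayes' rule as bits are observed. An unbiased source keeps the odds
near even; any real bias eventually gives the attacker a foothold.

#include <stdio.h>
#include <math.h>

/* Toy Bayesian update: running log-odds of "biased (p=0.6)" versus
   "fair (p=0.5)" given a stream of observed bits. */
double update_log_odds(double log_odds, int bit)
{
    double p_biased = bit ? 0.6 : 0.4;   /* likelihood under "biased" */
    double p_fair   = 0.5;               /* likelihood under "fair"   */
    return log_odds + log(p_biased / p_fair);
}

int main(void)
{
    int sample[12] = {1,1,0,1,1,1,0,1,1,0,1,1};  /* made-up observations */
    double log_odds = 0.0;                       /* even prior odds */
    int i;

    for (i = 0; i < 12; i++)
        log_odds = update_log_odds(log_odds, sample[i]);
    printf("posterior log-odds for 'biased': %f\n", log_odds);
    return 0;
}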

If you peruse amazon.com (keyword: "bayesian") you will find all sorts
of books on Bayesian forecasting with time series - so it appears to
be a method of prediction. Included in the same search are books on
information, entropy, statistics - all the kinds of topics familiar to
the cryptographer.

The man to ask about the Bayesian Attack is Patrick Juola, who graces
us with his presence on sci.crypt from time to time. He was the one
who discussed that method most a year ago.

>Another question: Using frequency analysis dates back to 
>the very beginning of cryptology. Since you are VERY sensitive to
>new terminologies (against my 'pseudo-OTP', for example), tell
>me why you have accepted that new term 'Bayesian Attack' without
>significant inner resistance.

Because it was introduced and explained by expert cryptographers a
year ago in the context of randomness in the OTP system. Patrick was
not the only person who was touting the Bayesian Attack - one person
even posted a long-winded mathematical treatise on it.
 
>From now on in this thread, in all discussions with you (only), I'll
>use the longer term 'intended approximation to an ideal OTP'.

Why not just use the terminology that is employed by the crypto
community? Why invent new and confusing terminology? It serves no
useful purpose and just obscures things.

What you are describing already has a widely accepted name - Stream
Cipher. Since nobody in his right mind would use the same stream
twice, the "one time" part is not really all relevant to an
understanding of the essence of the OTP - and is included in the
description for historical reasons, namely the fact that physical key
sheets were torn off the pad and discarded after use.

>Does
>that serve to avoid wasting time on debating about this point??
>(Tell me if you have objections to the new term.)

You are free to use whatever terms you want. My objection is that there
is no need to rename a stream cipher as a "pseudo-OTP" - nothing
significant is gained by it, and it could cause confusion.

If you want to emphasize that your techniques are new and exciting,
then come up with modifiers to "stream cipher", like "random text
stream cipher" or "digit expansion stream cipher". Relying on the
prestige of the OTP as provably secure is not going to make your
proposed cryptosystem any stronger.

The reason I am harping on this is that you could come up with some
good ideas that get lost in the confusion caused by the misapplication
of terminology. Clear exposition of complex ideas requires clear
terminology - using confusing terminology only obscures whatever you
have to present.

</end of lecture>

Bob Knauer

"Anyone that can get elected, is not qualified to serve."
--Lance Bledsoe


------------------------------

From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Practical True Random Number Generator
Date: Tue, 12 Jan 1999 15:16:51 +0100

KloroX wrote:

> 
> A simple fix comes to mind. If we write your original loop as
> (disregarding the case of t1 = t2 for simplicity)
> 
> while(true)
> {
>         // get the intervals t1 and t2 between detected fissions
>         get(t1, t2);
>         if(t1 > t2)
>                 output(0);
>         else
>                 output(1);
> }
> 
> then we could modify the loop to invert the rule at every reading:
> 
> while(true)
> {
>         get(t1, t2);
>         if(t1 > t2)
>                 output(0);
>         else
>                 output(1);
>         get(t1, t2);
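>         // note: the comparison below is inverted relative to the
>         // first pair, the intent being that a constant bias cancels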
>         if(t1 < t2)
>                 output(0);
>         else
>                 output(1);
> }

I don't see why the modified version is better than the original.

M. K. Shen

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Tue, 12 Jan 1999 15:10:15 GMT
Reply-To: [EMAIL PROTECTED]

On Tue, 12 Jan 1999 07:45:43 -0500, "Trevor Jackson, III"
<[EMAIL PROTECTED]> wrote:

>The commentary surrounding these announcements indicates that the results may
>be compressible.  If so transcendental numbers are probably not appropriate
>sources of entropy.

Why is that a big problem? The outputs of a TRNG are also compressible
- some of them anyway - and that does not stop the TRNG from being
provably secure.

Bob Knauer

"Anyone that can get elected, is not qualified to serve."
--Lance Bledsoe


------------------------------

From: [EMAIL PROTECTED] (Patrick Juola)
Subject: Re: On the Generation of Pseudo-OTP
Date: 12 Jan 1999 10:17:04 -0500

In article <[EMAIL PROTECTED]>,
Trevor Jackson, III <[EMAIL PROTECTED]> wrote:
>Patrick Juola wrote:
>
>> In article <[EMAIL PROTECTED]>,
>> R. Knauer <[EMAIL PROTECTED]> wrote:
>>
>> >Is there a reason to believe that if I choose an arbitrary offset into
>> >the digit expansion of Pi, that the sequence that is generated from
>> >that offset is not one of all possible sequences of that length, and
>> >that it is equiprobable with all of those other possible sequences? If
>> >not, what is it about the digit expansion of Pi that causes such
>> >limitations?
>>
>> There is, actually.  There's a known closed form solution for the
>> digits of pi (in base 16) that I find troublesome.  But more troublesome
>> than that is the fact that I rather doubt you'll use a truly
>> arbitrary output.
>
>That's interesting.  I'm somewhat curious about the closed form solution, but
>I'm bewildered by the base 16 clause.  What has base 16 got to do with
>it????????

The citation was posted somewhere.  In short form, some cleverdick whose
name I have forgotten found a simple technique for generating
digits of pi -- but the technique only works in base 16.  
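
Presumably this refers to the Bailey-Borwein-Plouffe series, which
expresses pi as a sum in powers of 1/16 and so permits computing
hexadecimal digits of pi without computing all the preceding ones. A crude
C sketch that merely sums the series in double precision (enough to watch
it converge to pi; the actual isolated-digit extraction uses a
modular-arithmetic variant of the same formula):

#include <stdio.h>
#include <math.h>

/* Partial sum of the Bailey-Borwein-Plouffe series:
   pi = sum_{k>=0} (1/16^k)*(4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)) */
int main(void)
{
    double sum = 0.0, p16 = 1.0;
    int k;

    for (k = 0; k < 12; k++) {
        sum += p16 * (4.0/(8*k+1) - 2.0/(8*k+4)
                      - 1.0/(8*k+5) - 1.0/(8*k+6));
        p16 /= 16.0;
    }
    printf("BBP sum     = %.15f\n", sum);
    printf("4*atan(1.0) = %.15f\n", 4.0*atan(1.0));
    return 0;
}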

        -kitten

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Tue, 12 Jan 1999 15:24:28 GMT
Reply-To: [EMAIL PROTECTED]

On Tue, 12 Jan 1999 11:52:16 +0100, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:

>A true OTP is thought to obtain the same undecidability.

"True OTP" is redundant - an OTP is either "true" or it is not an OTP.

And yes, the whole idea behind the OTP cryptosystem is that every
possible plaintext, including those that are intelligible, has the
same probability of occurrence in the cipher - and therefore the
cryptanalyst has no reason to choose one over the other.

That is the reason the OTP is provably secure.

>Even if there is a truly random physical process (I 'assume' that a
>certain definition of true randomness can be verified, though I
>doubt that),

I already discussed this a couple times earlier on this very thread.

In general, radioactive decay processes follow a first-order rate
equation precisely - that can be verified experimentally to any level
of precision you are willing to pay to get. And since scientists use
OPM, they have done it to levels of precision way beyond anything we
need to be concerned about in crypto. Give those guys a few megabucks
and they will give you ten significant figures if you want.

If the decay process follows a first-order rate equation, then the
decay process is completely, totally, 100% bona fide, no doubt about
it, unpredictably random with regard to when a particular decay will
occur. The reason is that a first-order rate equation is of the form:

dN/dt = -kN, where k is a constant.

The fact that k is a constant says that the probability that a given
nucleus decays in the interval t -> t + dt, namely k dt, is
independent of t - the decay can happen at any time, and therefore is
completely unpredictable.

That is about as random as it gets, folks.
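
A small simulation sketch along these lines (mine, not a description of any
actual hardware): the waiting time between decays implied by dN/dt = -kN is
exponentially distributed, so it can be mimicked with t = -ln(u)/k for
uniform u, and the "second interval longer" rule then produces 1s and 0s in
equal proportion regardless of k. (rand() here is only a stand-in for the
simulation; it is of course not a crypto-grade source.)

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

/* Exponential waiting time with rate k, from a uniform variate. */
double wait_time(double k)
{
    double u = (rand() + 1.0) / (RAND_MAX + 2.0);  /* u in (0,1) */
    return -log(u) / k;
}

int main(void)
{
    double k = 3.7;            /* arbitrary decay rate */
    long ones = 0, trials = 100000, i;

    for (i = 0; i < trials; i++) {
        double t1 = wait_time(k);
        double t2 = wait_time(k);
        if (t2 > t1)
            ones++;            /* "second interval longer" -> emit a 1 */
    }
    printf("fraction of 1 bits: %f\n", (double)ones / trials);
    return 0;
}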

>the actual registration of the process, i.e. the
>obtaining of data, needs a measuring apparatus. Since no man-made
>apparatus is perfect, it follows that the sequence obtained is
>subject to errors, hence not perfect. Hence one can at best accept
>the sequence as truly random with a certain statistical confidence
>level.

If your TRNG is designed properly, you can approach the randomness
inherent in the radioactive decay process to any practical degree.
What is the significance of an error that is so small that it makes
no practical difference to the security of the OTP?

Bob Knauer

"Anyone that can get elected, is not qualified to serve."
--Lance Bledsoe


------------------------------

From: George Marsaglia <[EMAIL PROTECTED]>
Crossposted-To: sci.stat.math,sci.math,sci.math.num-analysis,sci.physics
Subject: Random numbers in C: Some suggestions.
Date: Tue, 12 Jan 1999 09:37:37 -0500


This posting ends with  18  lines of
C code that provide eight different
in-line random number generators, six for
random 32-bit integers and two for uniform
reals in (0,1) and (-1,1).
Comments are interspersed with that
code. Various combinations of the six in-line
integer generators may be put in C expressions to
provide a wide variety of very fast, long period,
well-tested RNG's. I invite comments, feedback,
verifications and timings.

First, there is a narrative giving background
for this posting; you may want to skip it.

Narrative:

Having had experience in producing and
testing for randomness in computers,
I am frequently asked to suggest good
random number generators (RNG's), test
RNG's, or comment on existing RNG's.  Many
recent queries have been in two areas:
(1) requests for implementations in C and
(2) comments on generators with immense periods,
particularly the Mersenne Twister.

This posting is mainly for category (1),
for which I suggest a set of C implementations
of RNG's I have developed.  C implementations
of my DIEHARD battery of tests will be
discussed elsewhere, and Narasimhan's GUI
version is expected to be released soon.

For (2), I merely repeat what I have said
in response to various queries: the Mersenne
Twister looks good, but it seems to be essentially
a lagged Fibonacci RNG using the exclusive-or
(xor) operation, and experience has shown that
lagged Fibonacci generators using xor provide
unsatisfactory 'randomness' unless the lags are
very long, and even for those with very long lags,
(and even for those using + or - rather than xor),
many people (I among them) are inclined to be
cautious about sequences based on such a simple
operation as: each new integer is the xor, (or sum,
or difference), of two earlier ones.  To be sure,
the resulting integers can be "twisted", but not,
I think, as simply or as safely as combining, say
by addition, with members of a sequence from a
(shorter period) generator that has itself passed
extensive tests of randomness.

I also reply that it does not take an immense
program (as, for example, in posted listings
of Twister) to produce a more satisfactory RNG
with an immense period, and give this example,
on which I will expand below: Inclusion of

#define UC  (unsigned char)
#define SWB (c++, bro=(x<y), t[c]=(x=t[UC(c+34)])-(y=t[UC(c+19)]+bro))

together with suitably initialized seeds in

static unsigned long x,y,bro,t[256]; unsigned char c;

will allow you to put the string SWB in any C
expression and it will provide, in about 100 nanosecs,
a 32-bit random integer with period  2^7578. (Here
and below, ^ means exponent, except in C expressions,
where it means xor (exclusive-or).)

Now for the (1) part, in which I suggest a number
of C implementations and invite comment and feedback.
Most of these were previously developed and tested
via Fortran versions.  I list eight RNG's, each of
them implemented by means of C's powerful #define device. This
provides  fast, compact implementation, allows
insertion of the required random variable directly
into an expression, and, finally, provides a good
selection of RNG's for use individually or in
combination.  The latter makes it possible to
further confirm what empirical results suggest:
combining two or more RNG's provides better
(or no worse) randomness and, for encryption enthusiasts,
combination generators are harder to "crack".

For those wishing to try these eight RNG's:

At the top of your C program, include these
definitions and the static variables that follow.
Everything past this line is either C code or comment.
==================================================

#define UL unsigned long
#define UC (unsigned char)
#define znew  ((z=36969*(z&65535)+(z>>16))<<16)
#define wnew  ((w=18000*(w&65535)+(w>>16))&65535)
#define MWC   (znew+wnew)
#define SHR3  (jsr^=(jsr<<17), jsr^=(jsr>>13), jsr^=(jsr<<5))
#define CONG  (jcong=69069*jcong+1234567)
#define KISS  ((MWC^CONG)+SHR3)
#define LFIB4 (c++, t[c]=t[c]+t[UC(c+58)]+t[UC(c+119)]+t[UC(c+178)])
#define SWB   (c++, bro=(x<y), t[c]=(x=t[UC(c+34)])-(y=t[UC(c+19)]+bro))
#define UNI   (KISS*2.328306e-10)
#define VNI   ((long) KISS)*4.656613e-10
/*  Global static variables: */
 static UL z=362436069, w=521288629, jsr=123456789, jcong=380116160;
 static UL t[256];
 static UL x=0,y=0,bro=0; static unsigned char c=0;

/* Random seeds must be used to reset z,w,jsr,jcong and
the table t[256].  Here is an example procedure, using KISS: */

 void settable(UL i1,UL i2,UL i3,UL i4)
 { int i; z=i1; w=i2; jsr=i3; jcong=i4;
 for(i=0;i<256;i++)  t[i]=KISS;        }

/*  End of C code;  Only comments follow. Stick the above
   18 lines in your simulation programs, initialize the table,
   and have a variety of promising RNG's at your disposal.  */

/* You may want to use more elaborate names than the
   simple 1-letter variable names z,w,x,y,bro,t,c above,
   to avoid clashing with favorites in your code.    */

/* Any one of KISS, MWC, LFIB4, SWB, SHR3, or CONG
   can be used in an expression to provide a random
   32-bit integer, and UNI in an expression will
   provide a random uniform in (0,1), or VNI in (-1,1).
   For example, for int i, float v; i=(MWC>>24); will
   provide a random byte, while v=4.+3.*UNI; will
   provide a uniform v in the interval 4. to 7.
   For the super cautious, (KISS+SWB) in an expression
   would provide a random 32-bit integer from
   a sequence with period > 2^7700, and would only
   add some 300 nanoseconds to the computing
   time for that expression.                         */

/* The KISS generator, (Keep It Simple Stupid), is
   designed to combine the two multiply-with-carry
   generators in MWC with the 3-shift register SHR3
   and the congruential generator CONG, using
   addition and exclusive-or. Period about 2^123. It
   is one of my favorite generators.   */

/* The  MWC generator concatenates two 16-bit
  multiply-with-carry generators, x(n)=36969x(n-1)+carry,
  y(n)=18000y(n-1)+carry  mod 2^16, has  period  about
  2^60 and seems to pass all tests of randomness. A favorite
  stand-alone generator---faster than KISS, which contains it.*/

/* SHR3 is a 3-shift-register generator with
   period 2^32-1. It uses
   y(n)=y(n-1)(I+L^17)(I+R^13)(I+L^5), with the
   y's viewed as binary vectors, L the 32x32
   binary matrix that shifts a vector left 1, and
   R its transpose.  SHR3 seems to pass all except
   the binary rank test, since 32 successive
   values, as binary vectors, must be linearly
   independent, while 32 successive truly random
   32-bit integers, viewed as binary vectors, will
   be linearly independent only about 29% of the time.   */

/* CONG is a congruential generator with the
   widely used 69069 as multiplier:
   x(n)=69069x(n-1)+1234567.  It has period 2^32.
   The leading half of its 32 bits seem to pass
   all tests, but bits in the last half are too
   regular.                               */

/* LFIB4 is an extension of the class that I have
   previously defined as  lagged Fibonacci
   generators: x(n)=x(n-r) op x(n-s), with the x's
   in a finite set over which there is a binary
   operation op, such as +,- on integers mod 2^32,
   * on odd such integers, exclusive-or (xor) on
   binary vectors. Except for those using
   multiplication, lagged Fibonacci generators
   fail various tests of randomness, unless the
   lags are very long.  To see if more than two
   lags would serve to overcome the problems of 2-
   lag generators using +,- or xor, I have
   developed the 4-lag generator LFIB4:
   x(n)=x(n-256)+x(n-179)+x(n-119)+x(n-55) mod 2^32.
   Its period is 2^31*(2^256-1), about 2^287, and
   it seems to pass all tests---in particular,
   those of the kind for which 2-lag generators
   using +,-,xor seem to fail.  For even more
   confidence in its suitability,  LFIB4 can be
   combined with KISS, with a resulting period of
   about 2^410: just use (KISS+LFIB4) in any C
   expression.                               */

/* SWB is a subtract-with-borrow generator that I
   developed to give a simple method for producing
   extremely long periods:
   x(n)=x(n-222)-x(n-237)-borrow mod 2^32.
   The 'borrow' is 0 unless set to 1 if computing
   x(n-1) caused overflow in 32-bit integer
   arithmetic. This generator has a very long
   period, 2^7098(2^480-1), about 2^7578. It seems
   to pass all tests of randomness, but,
   suspicious of a generator so simple and fast
   (62 nanosecs at 300MHz), I would suggest
   combining SWB with KISS, MWC, SHR3, or CONG. */

/* Finally, because many simulations call for
   uniform random variables in 0<v<1 or -1<v<1, I
   use #define statements that permit inclusion of
   such variates directly in expressions:  using
   UNI will provide a uniform random real (float)
   in (0,1), while VNI will provide one in (-1,1).  */

/* All of these: MWC, SHR3, CONG, KISS, LFIB4,
   SWB, UNI and VNI, permit direct insertion of
   the desired random quantity into an expression,
   avoiding the time and space costs of a function
   call. I call these in-line-define functions.
   To use them, static variables z,w,jsr and
   jcong should be assigned seed values other than
   their initial values.  If LFIB4 or SWB are
   used, the static table t[256] must be
   initialized; the sample procedure settable above shows one way. */

/* A note on timing:  It is difficult to provide
   exact time costs for inclusion of one of these
   in-line-define functions in an expression.
   Times may differ widely for different
   compilers, as the C operations may be deeply
   nested and tricky. I suggest these rough
   comparisons, based on averaging ten runs of a
   routine that is essentially a long loop:
   for(i=1;i<10000000;i++) L=KISS; then with KISS
   replaced with SHR3, CONG,... or KISS+SWB, etc.
   The times on my home PC, a Pentium 300MHz, in
   nanoseconds: LFIB4=64; CONG=90; SWB=100;
   SHR3=110; KISS=209; KISS+LFIB4=252; KISS+SWB=310.     */
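
An illustrative harness (not part of the lines above, and only a sketch):
it assumes the #define block and settable() given earlier in this post are
in the same source file, seeds with four arbitrary values, and prints a few
draws.

#include <stdio.h>

int main(void)
{
    int i; float v; UL b;

    settable(12345UL, 65435UL, 34221UL, 12345UL);  /* arbitrary seeds */
    for (i = 0; i < 5; i++) {
        b = KISS;                 /* a random 32-bit integer          */
        v = 4. + 3.*UNI;          /* a uniform float in (4.,7.)       */
        printf("%lu  %f  byte=%lu\n", b, v, (UL)(MWC>>24));
    }
    return 0;
}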







------------------------------

From: "Tony T. Warnock" <[EMAIL PROTECTED]>
Subject: Re: Practical True Random Number Generator
Date: Tue, 12 Jan 1999 08:15:39 -0700
Reply-To: [EMAIL PROTECTED]



R. Knauer wrote:

> On Mon, 11 Jan 1999 16:48:45 -0500, Nicko van Someren
> <[EMAIL PROTECTED]> wrote:
>
> >Another method is to time the two intervals between three decay
> >events.  If the first interval is longer than the second then emit a
> >zero and if the second is longer than the first emit a one.  If your timer
> >says that they are the same do it again.  This has as little bias as
> >you can get.
>
> That is the preferred method because it cancels any slowly varying
> systematic errors, such as clock errors or detector errors -  as long
> as the interval lengths are shorter in duration than the time
> constants for the errors.
>
> It was my understanding that you needed to measure each interval
> independently, that is, you need to time the two intervals between 4
> decay events, not 3.
>
> Bob Knauer
>
> "Anyone that can get elected, is not qualified to serve."
> --Lance Bledsoe

There is also dead time of the detector. One protocol that I have seen
suggested is:

I. On the occurrence of a decay, wait t0 (micro-) seconds, then start counting.

II. A. If an event occurs in the next t1 (micro-) seconds, output "1."
    B. If no event occurs, output "0."

III. Go to I.

von Neumann's trick can be used to remove bias. The above can be done offline.
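
A rough C sketch of that protocol with the von Neumann step bolted on (the
hardware hooks wait_for_decay(), event_within() and pause_usec() are
hypothetical stand-ins; t0 and t1 are the tunable waits from steps I and II):

/* Hypothetical hardware hooks for this sketch. */
extern void wait_for_decay(void);              /* blocks until a decay   */
extern int  event_within(unsigned long usec);  /* nonzero if another     */
                                               /* decay arrives in the   */
                                               /* next 'usec' microsecs  */
extern void pause_usec(unsigned long usec);

/* One raw bit: on a decay, wait t0 microseconds to sit out the detector
   dead time, then output 1 if another event arrives within t1, else 0. */
static int raw_bit(unsigned long t0, unsigned long t1)
{
    wait_for_decay();                   /* step I  */
    pause_usec(t0);
    return event_within(t1) ? 1 : 0;    /* step II */
}

/* von Neumann's trick: take raw bits in pairs, keep the first bit of an
   unequal pair, discard equal pairs; a constant bias drops out. */
int unbiased_bit(unsigned long t0, unsigned long t1)
{
    int a, b;
    do {
        a = raw_bit(t0, t1);
        b = raw_bit(t0, t1);
    } while (a == b);
    return a;
}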

Tony



------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: On the Generation of Pseudo-OTP
Date: Tue, 12 Jan 1999 15:40:06 GMT
Reply-To: [EMAIL PROTECTED]

On Tue, 12 Jan 1999 12:02:06 +0100, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:

>>Absolute
>>assertions are possible if they are deduced from axioms by logical
>>rules.
 
>> Even that has been challenged on several fronts. In terms of axiomatic
>> formal systems, you run into Godel's Theorem. In terms of computation,
>> you run into Turing's Halting problem. And in terms of elementary
>> number theory, you run into Chaitin's Mathematical Indeterminacy.

>What did you mean by 'challenge' relative to what I wrote?

The statement you made above has been challenged many times before.

>Did you
>refer to 'experimental assertions are in terms of confidence
>intervals' or did you refer to 'absolute assertions are only
>obtainable from logical deductions'?

I do not believe I said either.

>Where are challenges to the
>one or the other to be found?

The challenges to your original statement above are to be found in
discussions of Godel's Theorem, Turing's Halting Problem and, most
recently, Chaitin's Mathematical Indeterminacy. For example, it is
not possible to prove the truth of the following statement using
formal axiomatic systems: "This statement is false."

Douglas Hofstadter's book, "Godel, Escher, Bach" (aka "GEB") is a good
place to start if you are interested in these kinds of issues. Also be
sure to see Chaitin's papers where things get really indeterminate.

He has an equation for which it is impossible to decide formally
whether there is a finite or an infinite number of solutions for a
given value of a single parameter. The answer to that question on a
case-by-case basis (as the parameter is varied) is completely random.

That goes against all our prior understanding of how mathematics
should behave.

Bob Knauer

"Anyone that can get elected, is not qualified to serve."
--Lance Bledsoe


------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
