Cryptography-Digest Digest #18, Volume #9         Tue, 2 Feb 99 00:13:03 EST

Contents:
  Re: Random numbers generator and Pentium III (R. Knauer)
  Re: *** Where Does The Randomness Come From ?!? *** ("PAC")
  Re: Sanity check take 2 (Edward Keyes)
  Re: *** Where Does The Randomness Come From ?!? *** ("Tom Norback")
  Historical basis for BLOWFISH (karl_m)
  Re: irrational idea continued (Frank)
  Re: Historical basis for BLOWFISH (Bruce Schneier)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Random numbers generator and Pentium III
Date: Tue, 02 Feb 1999 01:07:55 GMT
Reply-To: [EMAIL PROTECTED]

On Mon, 01 Feb 1999 17:44:52 -0600, Jim Felling
<[EMAIL PROTECTED]> wrote:

> On the other hand the odds of
>such a thing happening will be very low, as over a sufficiently long run a TRNG
>should have nearly perfect properties; if E is small enough it will reject very
>few TRNGs.

It is not so much rejecting properly working TRNGs that is the concern
as it is not rejecting improperly working ones.

>This is also true.  You can end up accepting an insufficient generator if it
>just happens to have really great properties in the run being examined. The
>probability of this can also be made very low by selecting an appropriate E
>before running this.

As I read further in Li and Vitanyi's book on Kolmogorov Complexity, I
am finding that random numbers have all sorts of regularity in their
substrings.

And the authors point out that characterizing numbers as random using
stochastic tests only works for algorithmic complexity randomness,
which is not suitable for provably secure crypto. Now I am finding
out that only prefix complexity, K, is strong enough to characterize
complexity randomness. The usual complexity randomness, C, has
deficiencies.

>Also agreed.    However, a physical generator is also vulnerable to the
>arguments being made.

The generator must meet the specifications for a TRNG, or it is not a
TRNG.

>How can one be certain that one's physical process based
>generator is not miscalibrated slightly,

By calibrating it carefully.

>or subject to EF noise being leaked
>into it from another part of the system,

By designing it carefully.

>or doesn't have some internal
>components being weakly coupled by some electromagnetic effect.

By testing the design carefully.

>One can only
>state, Based on what I know of equipment and design and the rigorous physical
>testing process I performed,

That you must do.

>and the rigorous statistical testing of the
>outputs generated

That is not reliable compared to the tests described above.

>that the probability of this device being flawed is less than
>some arbitrary E.

That value is not a probability which is based on statistical tests of
the output. The proper analysis is based on the instrument itself.

>All we really have for any generator are confidence limits.

That confidence must be based on something other than inappropriate
stochastic theory.

>Trusting that one has thought of all possible attacks in the testing and design
>phase for either a physical or an algorithmic device is self deception.

That's why there are standards committees and beta tests.

>( it is often the unanticipated flaw that gets you)

Then you have to make sure that such flaws do not get into your TRNG.

>and the fact that there is a 1 in
>1000000 chance that your system is flawed in way X always leaves you vulnerable
>to drawing the 'lucky' millionth system.

I'll take those odds any day.

Anyway, the TRNG can be designed to fail into a known state if it acts
out of specification.
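A sketch of such a fail-into-known-state design, loosely modelled on the
continuous test in FIPS 140-1 (each output block is compared with its
predecessor, and the generator halts on a repeat). The class name and the
`os.urandom` stand-in for the hardware source are my own illustrative
assumptions, not part of the discussion above:

```python
import os

class FailClosedRNG:
    """Wrapper that drops a generator into a known failed state.

    Loosely modelled on the FIPS 140-1 continuous test: each output
    block is compared with the previous one, and a repeat halts the
    generator permanently (fail closed).  Illustrative sketch only.
    """
    BLOCK = 16

    def __init__(self, source=os.urandom):
        self.source = source      # stand-in for the physical TRNG
        self.prev = None
        self.failed = False

    def read_block(self) -> bytes:
        if self.failed:
            raise RuntimeError("generator is in its known failed state")
        block = self.source(self.BLOCK)
        if block == self.prev:    # stuck output: fail closed
            self.failed = True
            raise RuntimeError("continuous test failed; output halted")
        self.prev = block
        return block
```

Once `failed` is set, every later read raises, so a stuck or
miscalibrated source can never silently keep feeding the OTP.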

>No system should be regarded as
>secure without a thorough exam of its components, and a thorough testing and
>evaluation process having been performed on its outputs.

Characterizing the provable security of a TRNG using statistical
tests is not part of that evaluation.

>> Unfortunately algorithmic complexity is unsuited for crypto-grade
>> randomness.

>Why? I agree that it is harder to make use of than other forms, but that still
>does not make it useless.

In a uniform distribution there are "regular" and "complex" strings.
Algorithmic complexity rejects "regular" strings - and that rejection
is not permitted in the specification for a TRNG.

The fact that the fraction of regular strings that are rejected is
very small for long strings does not change the fact that for a
provably secure system you cannot reject any strings.
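Kolmogorov complexity itself is uncomputable, but lossless compression
gives a computable upper bound on it, which makes the regular/complex
distinction easy to see. A small zlib sketch (an illustrative aside of
mine, not part of the TRNG specification under discussion):

```python
import os
import zlib

def compressed_ratio(s: bytes) -> float:
    # compressed size over original size: a computable upper bound
    # on the string's algorithmic-complexity rate
    return len(zlib.compress(s, 9)) / len(s)

regular = b"\x00" * 4096       # a highly "regular" string
typical = os.urandom(4096)     # a typical "complex" string

# the regular string compresses to a tiny fraction of its length;
# the random-looking one does not compress at all
assert compressed_ratio(regular) < 0.05
assert compressed_ratio(typical) > 0.9
```

Note that a uniform source emits `regular` with exactly the same
probability as any particular `typical` string, which is the point made
above: a complexity test that rejects the regular one rejects strings a
true TRNG is allowed to produce.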

>Depends on your filtering methodology.  Any bijection can be applied to  random
>values without destroying randomness

I do not understand the term "bijection". Is it a typo?

Whatever the term means, any filtering of strings is unsuitable for
the OTP system. The output of the TRNG must not be filtered.

>and there are information reducing
>methods (such as take n bits of output, compute the parity (m) of that n-bit
>value, output the m,  discard the n bits -- or discarding every non-prime
>numbered bit of output ) that will also preserve randomness while potentially
>strengthening your system against certain potential avenues of attack.

Any tampering with the strings is not permitted if you are to be
compliant with the TRNG specification. It is that specification which
is necessary for provable security.

>I agree that if you modify the probability of any string  showing up in the
>output  this exposes you to potential attacks, but not all filtering will do
>so.

Any filtering has the potential to do so and is therefore not
permitted.
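For concreteness, the two information-reducing filters quoted above
(n-bit parity, and discarding every non-prime-numbered bit) can be
sketched as follows; the function names are mine, and the point under
dispute is whether applying them is permissible, not how they work:

```python
def parity_filter(bits, n):
    """Take n bits at a time, output their parity, discard the n bits."""
    return [sum(bits[i:i + n]) % 2
            for i in range(0, len(bits) - len(bits) % n, n)]

def prime_position_filter(bits):
    """Discard every non-prime-numbered bit (positions counted from 1)."""
    def is_prime(k):
        return k >= 2 and all(k % d for d in range(2, int(k ** 0.5) + 1))
    return [b for pos, b in enumerate(bits, start=1) if is_prime(pos)]
```

For example, `parity_filter([1, 0, 1, 1, 0, 1], 3)` gives `[0, 0]`, and
`prime_position_filter([1, 0, 1, 1, 0, 1])` keeps positions 2, 3, 5,
giving `[0, 1, 0]`. Both maps preserve the uniformity of a uniform input
while shrinking it, which is the claim being argued over.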

Bob Knauer

"Sometimes it is said that man cannot be trusted with the government
of himself.  Can he, then, be trusted with the government of others?"
--Thomas Jefferson


------------------------------

From: "PAC" <[EMAIL PROTECTED]>
Crossposted-To: sci.philosophy.meta,sci.physics,sci.skeptic
Subject: Re: *** Where Does The Randomness Come From ?!? ***
Date: Mon, 1 Feb 1999 14:27:05 -0800


PAC wrote in message <793ihq$7jf$[EMAIL PROTECTED]>...
>
>PAC wrote in message <793hne$g5o$[EMAIL PROTECTED]>...
>>
>>Marty Fouts wrote in message ...
>>>>>>>> Tom Norback pounded silicon into:
>>>
>>>  > Marty Fouts wrote in message ...
>>>  >> >[Tom Norback suggested:] It may be that likening reality to a
>>>  >> system with discrete configurational states introduces
>>>  >> difficulties into our analysis.
>>>  >>
>>>  >> possibly.  However, if one assumes that time is quantized then it
>>>  >> should follow.
>>>  >>
>>>  > Quantized time (or epochal duration) is a feature of some
>>>  > metaphysics (Bergson's and Whitehead's in particular, vaguely in
>>>  > William James) but as far as I know it is not currently a feature
>>>  > of any scientific theory.
>>>
>>>I don't recall where I've read the formulation, but the conjecture is
>>>that the planck constant puts a boundary on the smallest possible
>>>length, and therefor, since time can be seen as a measure of l/c, upon
>>>the smallest quantity of time.
>>
>>    This would obviously be the case being that something without a
>>duration, minimally defined by Planck's constant, would also not have a
>>time factor.  Unless there is some kind of fractional leftover from
>>Planck's constant that implies its own duration, but that would
>>certainly not be a given.
>>    It very well might end up as being totally artificial (able this way
>>to differentiate between time and space) for any mapping involved in
>>coordinate space-time events, but for any specific object or event, the
>>time duration just for that object, not relating necessarily
>>relativistically to anything else, would be entirely wrapped up in its
>>duration whether that duration is spatial, energy state, or whatever.
>
>    I'm making too great a distinction between QM-type events here and
>relativistic ones, since anything with any duration must be, in some way,
>relativistic, properly speaking; but this more accentuates the difference
>between time as a mapping system and time involved intrinsically by any
>type of duration.

    One last example for those who are _really_ interested (>;

    The trick is to see instead of two separate events determined
relativistically, just one object, including the space in between, connected
to the two frames that define each other as giving relativistic time.
    Time is a self-relating system, whereby any duration relates to itself
whether separated by space or in a self-relating sub-atomic particle.  A
discrete fundamental unit such as Planck's constant would have its duration
defined differently, not being able to relate to itself as there is no
duration within itself, but only a relation to what it is not - then
existing as something instead of nothing, and more likely infinitely
related since unable to relate to itself as being a fundamental particle.
    This is a different aspect than relating to the cosmos as a whole except
when forced to re-enter that relation dealing with fundamental particles.

>
>>    Without duration, no space, time nor nothing.
>>
>>>
>>>  > My understanding is as follows (I'm no expert so I welcome
>>>  > correction on these points):
>>>  > Relativistically the notion of a discrete (and absolute) order of
>>>  > succession from C(n) to C(n+1) is problematic.  In some frames of
>>>  > reference the separation between the two configurations would be
>>>  > timelike while in others the separation would be spacelike (making
>>>  > C(n) and C(n+1) "simultaneous"). In extraordinary frames of
>>>  > reference C(n+1) would actually precede C(n).  As Roger Penrose
>>>  > likes to point out, gravity--unlike the other fundamental
>>>  > forces--influences causality.
>>
>>
>>    I would still argue the difference between a mapping coordinate
>>relativity between events, and the durational aspect that defines any
>>object, being when that duration is there then a time factor must be there
>>also, as well as a spatial factor or energy state.
>>
>>>
>>>Yes.  I've been avoiding relativistic effects because I was trying to
>>>get to a set  of definitions Ron could agree to so that we could
>>>explore the QM aspects of causality and determinism.  It is clear that
>>>in order to define the successive configuration I must impose a
>>>special frame of reference.  For the moment, let us simply consider
>>>that the sequence represents a single observer's sequencing, and
>>>ignore the issues of multiple reference frames until we've established
>>>the non-relativistic definitions.
>>>
>>>  > Relativity makes F(C(n)) = C(n+1) merely a local sort of
>>>  > "perceived determinism". QM makes F(C(n)) = C(n+1) something of a
>>>  > crap shoot.
>>>
>>>Well, yes, that's sort of what I'm trying to get to.  Formally, in
>>>this model, the ordering of C must be that from a particular reference
>>>frame, and F() is a multi-valued function.  That is, F(C_(n)) = U'
>>>where U' = { C such that C is an element of U and can be reached from
>>>C_(n)} and C_(n+1) is a member of U'. F() can not be made
>>>deterministic without the use of hidden variables, which we know is
>>>prohibited.
>>>
>>
>>
>>    One approach might be to think of multi-valued function variables in
>the
>>QM approach to be similar to multiple reference frames in the relativistic
>>approach.
>
>Something like this for making the relativistic case undetermined:
>
>"One of the easiest ways of getting out of selection would be to return
>to a Bohm-type // universe thing, but the only intent of the // universe
>thing would be that the third-party frame or perspective 'selects' the
>prior causal event, therefore the selection process itself has nothing to
>do with F(C(n)) = C(n+1), but how a floating frame from a third party
>looking from whatever aspect would 'see' a connection whereby from another
>aspect (all photons being free to be you and me) another connection would
>be experienced."
>
>>    Phil C.



------------------------------

From: [EMAIL PROTECTED] (Edward Keyes)
Subject: Re: Sanity check take 2
Date: Mon, 01 Feb 1999 22:26:01 -0500

In article <[EMAIL PROTECTED]>,
Paul Crowley <[EMAIL PROTECTED]> wrote:

> [EMAIL PROTECTED] (Edward Keyes) writes:
> >      A --> B:  A, R_A
> >      B --> A:  { R_A + K , {R_A}_K , {R_B}_K }_S
> >      A --> B:  R_B
> 
> Do A and B authenticate subsequent messages between each other using a
> MAC based on the shared secret K?  If so, wouldn't it be better to
> have different keys for the stream cipher and the MAC to avoid
> unpleasant interactions?  If not, how are those messages protected
> from tampering?

Yes, subsequent messages are suitably protected from tampering via
an encrypted checksum.  Thanks for reminding me of that.
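As a sketch of the three-message exchange quoted above -- reading
`R_A + K` in that notation as XOR, and substituting a toy SHA-256
counter-mode keystream for the real block cipher. The `label` domain
separator and all names here are my assumptions, not part of the
original protocol description:

```python
import hashlib
import os

def toy_cipher(key: bytes, data: bytes, label: bytes) -> bytes:
    """XOR data with a SHA-256 counter-mode keystream.

    A stand-in for the real block cipher; `label` domain-separates
    the keystream per field.  A toy for illustration, not secure.
    """
    stream = bytearray()
    ctr = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(label + key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

S = os.urandom(32)                     # pre-shared secret

# A -> B:  A, R_A
R_A = os.urandom(16)

# B -> A:  { R_A + K , {R_A}_K , {R_B}_K }_S
K, R_B = os.urandom(16), os.urandom(16)
body = xor(R_A, K) + toy_cipher(K, R_A, b"a") + toy_cipher(K, R_B, b"b")
msg2 = toy_cipher(S, body, b"s")

# A decrypts, recovers K via the nonce it chose, checks {R_A}_K ...
plain = toy_cipher(S, msg2, b"s")
K_recv = xor(plain[:16], R_A)
assert toy_cipher(K_recv, plain[16:32], b"a") == R_A   # A authenticates B

# A -> B:  R_B
msg3 = toy_cipher(K_recv, plain[32:48], b"b")
assert msg3 == R_B                                     # B authenticates A
```

Recovering `{R_A}_K` correctly is what convinces A that B knows S;
echoing `R_B` in the clear is what convinces B that A does.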

> I don't think it's wise to try and defend against a known plaintext
> attack to recover S.  If the block cipher is secure, and S is 256 bits
> and random, then there should be no way in here.

A known plaintext attack is not really a problem, I agree.  In the
revised protocol I mainly wanted to defend against a *chosen*
plaintext attack, since the client initially gets to choose a nonce.
I'm not at all familiar with how well various algorithms stack up
against chosen plaintext attacks as opposed to known plaintext
attacks, but I feel a lot better not giving a cryptanalyst that much
extra leverage, just in case.

>If S is based on a
> passphrase or other such not-very-random quantity, then this protocol
> is as vulnerable to a dictionary attack as any challenge/response, and
> AFAIK only public-key techniques such as SRP can really save you;
> since you can't afford the computrons for PK there may be a
> next-best-thing but it will complicate the protocol somewhat.  Key
> stretching will certainly help.

If all I have to worry about is a dictionary attack against a
passphrase, I'll be quite content... my main worry at this stage is to
not accidentally create a protocol that is easier to attack than the
shared secret itself.

Thanks for your feedback.

+------------ Edward Keyes, [EMAIL PROTECTED] -------------+
|................ http://web.mit.edu/eakeyes/www/ ................|
|.... DaggerWare: "small, sharp, and with a heck of a point!" ....|
+- "A little inaccuracy saves a world of explanation." C.E.Ayres -+


------------------------------

From: "Tom Norback" <[EMAIL PROTECTED]>
Crossposted-To: sci.skeptic,sci.philosophy.meta
Subject: Re: *** Where Does The Randomness Come From ?!? ***
Date: Tue, 02 Feb 1999 03:51:25 GMT



>
>  > When Bohr was asked for the complementary property of "truth" he
>  > thought about it for a while and then said "clarity".
>
>Cool.  Do you have a reference for that quote? That's a Bohr-ism I
>didn't know before and like a lot.
>


It's from a footnote in Steven Weinberg's "Dreams of a Final Theory".  I
don't own the book or I'd give you the page number.  (If I recall, one of
the chapters made a remarkably good case for reductionism.  Other than that
I remember thinking that Weinberg is clearly a better physicist than
philosopher.)

As far as your causality/acausality inquiry goes, I think I've already said
too much.  It just seems to me that the C(n) formalism implicitly assumes
that things can "just exist" when QT only seems to allow "existing for".

Like Feynman said, "Science hasn't found any stuff yet."

Later,

Tom Norback



------------------------------

From: karl_m <[EMAIL PROTECTED]>
Subject: Historical basis for BLOWFISH
Date: Tue, 02 Feb 1999 03:22:31 GMT

As I understand the history of BLOWFISH, Bruce_S attempted to correct
shortcomings from his earlier work, published in his 1993 book, where the key
schedule progressed via XOR accumulation from a large array of precomputed
keys by linear selection using each bit of plaintext.  The problem was the
subsequent weakness to LINEAR analysis, e.g. encrypt/normalize
0x8000000...000 to expose key 0, etc.

Why were S-BOXES chosen v. a non-linear method of key progression?  Are
S-BOXES in fact a proven non-linear means of KEY scheduling?  Where might I
fill in my understanding in this area????  Thanks, Karl M.

============= Posted via Deja News, The Discussion Network ============
http://www.dejanews.com/       Search, Read, Discuss, or Start Your Own    

------------------------------

From: Frank <[EMAIL PROTECTED]>
Subject: Re: irrational idea continued
Date: Mon, 01 Feb 1999 20:22:16 -1000

almis wrote:
> 
> As an interested amateur i don't respond as often as i should,
> but the responses to 'too simple to be safe' warrant some comment.
> 
> First: The idea of the square root of a prime is poor because the
> continued fraction of such a number is periodic. Perhaps the cube root
> or a combination of this and a transcendental.

Too Simple To Be Safe was an interesting discussion. Thank you for trying 
to improve upon it. The simplicity was attractive since it could be 
taught or memorized easily.


> 
> Second: The algorithm itself affords protection.
> As an example lets expand the cube root of 3 a little.
> k=1.44224957 ...
> The algorithm states that you throw away the leading digit and
> treat the rest as a string of numbers.
> Take each number in turn and turn it into its bit representation.
> That is 4 =>100
> Add this to the key K.
> Take the next number, and do it again. i.e. 4=>100
> Now K=100100
> Continue till the end.
> So K=100100101010010011011101
> Use K and XOR to encrypt your data.
> One can already see the problem.
> Given ciphertext and associated plaintext one easily
> derives the correct key K.
> However the algorithm gives only a partial answer to the question:
> What numbers were used to generate this key ?
> How about 204105004635  ?
> Sure, but this does not match the expansion for the correct integer.
> And there are a lot more...
> This looks enough like a one-way function to warrant further investigation.
> 
> Lastly: I await a reprint of a Lenstra article concerning the non-randomness
> of some irrational and trancendental numbers and some ideas on Lattice
> reduction.

Once you read Lenstra's article you can make an algorithm that is not 
Simple so that their math does not break your cipher. Taking the root of 
a prime and deleting every other digit might be resistant to their 
calculations.


> So the issue still remains; given that God has whispered in your ear the
> correct number sequence,
> starting probably at some offset from the beginning. Can one find the correct
> integer whose
> cube root was used to generate the sequence?
> 
> Some more ideas to mull.
> Given that the irrational space is dense. One can generate any random
> sequence of integers
> and there will exist an irrational number that will expand to the same
> sequence.
> There also exists an irrational number, A, whose expansion is the string of
> prime numbers.
> That is  A=.2357111317...
> It would be nice to know if A can be written in another way, you know,
> a+SQRT(b) or something like that. (i tried a fractional expansion, but
> it didn't look very promising.)
> 
> And when all's said and done we still achieve perfection in the one-time
> pad.
>   ...al

You should be able to succeed by adding a little complexity to your 
original Simple proposal.
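For reference, the digit-to-binary expansion quoted earlier ("throw away
the leading digit, turn each decimal digit into its bit representation,
concatenate") can be sketched as follows; the function name and the use
of Python's `decimal` module for the root are my own choices:

```python
from decimal import Decimal, getcontext

def expand_key(n: int, digits: int = 24) -> str:
    """Cube root of n; drop the integer part; map each decimal digit
    to its binary representation; concatenate into a key string."""
    getcontext().prec = digits + 5      # enough working precision
    root = Decimal(n) ** (Decimal(1) / 3)
    frac = str(root).split(".")[1][:digits]
    return "".join(format(int(d), "b") for d in frac)
```

For the worked example above, `expand_key(3, 8)` starts from the digits
44224957 of 1.44224957... and yields `10010010101001001101111` (the
variable-length digit encoding is exactly what lets many digit strings
map to the same key, the one-way behavior almis points out).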

------------------------------

From: [EMAIL PROTECTED] (Bruce Schneier)
Subject: Re: Historical basis for BLOWFISH
Date: Tue, 02 Feb 1999 05:02:47 GMT

On Tue, 02 Feb 1999 03:22:31 GMT, karl_m <[EMAIL PROTECTED]> wrote:

>As I understand the history of BLOWFISH, Bruce_S attempted to correct
>shortcomings from his earlier work, published in his 1993 book, where the key
>schedule progressed via XOR accumulation from a large array of precomputed
>keys by linear selection using each bit of plaintext.  The problem was the
>subsequent weakness to LINEAR analysis, e.g. encrypt/normalize
>0x8000000...000 to expose key 0, etc.

Wow.  Where did this Blowfish history come from?  It has nothing to do
with reality (as I know it).

Blowfish was invented after I wrote the first edition of Applied
Cryptography.  It was first presented at the 1st Fast Software
Encryption workshop in December 1993, and has never been modified.

The key schedule never had any weakness to linear analysis (at least
none that has ever been published).  There are classes of weak keys
that can be detected in reduced round variants (but not in the full 16
round Blowfish), and there is an attack against 5-round Blowfish.

>Why were S-BOXES chosen v. a non-linear method of key progression?  Are
>S-BOXES in fact a proven non-linear means of KEY scheduling?  Where might I
>fill in my understanding in this area????  Thanks, Karl M.

The S-boxes were chosen the way they were because that provides a
random (as far as cryptanalysis is concerned) set of key-dependent
S-boxes.  There are no proofs, but I don't know of any other cipher
with a provable key schedule.
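To illustrate the structure being described -- key material folded into
the P-array, then the cipher run on itself to fill the S-boxes -- here
is a structural sketch. Real Blowfish seeds its tables from the
hexadecimal digits of pi; this sketch seeds them from SHA-256 purely for
brevity, so it is NOT Blowfish itself, just the same key-schedule shape:

```python
import hashlib
import struct

def key_dependent_tables(key: bytes):
    """Blowfish-style key schedule: XOR the key into the P-array, then
    repeatedly encrypt the all-zero block, replacing P and S entries
    with the successive ciphertexts.  (Tables seeded from SHA-256 here
    instead of the digits of pi, so this is a sketch, not Blowfish.)"""
    def seed_words(tag: bytes, count: int):
        out, ctr = [], 0
        while len(out) < count:
            d = hashlib.sha256(tag + ctr.to_bytes(4, "big")).digest()
            out.extend(struct.unpack(">8I", d))
            ctr += 1
        return out[:count]

    P = seed_words(b"P", 18)
    S = [seed_words(bytes([i]), 256) for i in range(4)]

    # fold the key cyclically into the 18 P-array subkeys
    j = 0
    for i in range(18):
        w = 0
        for _ in range(4):
            w = (w << 8) | key[j % len(key)]
            j += 1
        P[i] ^= w

    def F(x):
        a, b = (x >> 24) & 0xFF, (x >> 16) & 0xFF
        c, d = (x >> 8) & 0xFF, x & 0xFF
        return ((((S[0][a] + S[1][b]) & 0xFFFFFFFF) ^ S[2][c])
                + S[3][d]) & 0xFFFFFFFF

    def encrypt(L, R):                  # 16-round Feistel network
        for i in range(16):
            L ^= P[i]
            R ^= F(L)
            L, R = R, L
        L, R = R, L                     # undo the final swap
        return L ^ P[17], R ^ P[16]

    # run the cipher on itself: evolving output overwrites the tables
    L = R = 0
    for i in range(0, 18, 2):
        L, R = encrypt(L, R)
        P[i], P[i + 1] = L, R
    for box in S:
        for i in range(0, 256, 2):
            L, R = encrypt(L, R)
            box[i], box[i + 1] = L, R
    return P, S
```

The effect is that every S-box entry ends up a (cryptanalytically)
random-looking function of the entire key, which is the property
described above.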

I'm not sure where to send you for more information, although I am
curious where your Blowfish history came from.

Bruce
**********************************************************************
Bruce Schneier, President, Counterpane Systems     Phone: 612-823-1098
101 E Minnehaha Parkway, Minneapolis, MN  55419      Fax: 612-823-1590
           Free crypto newsletter.  See:  http://www.counterpane.com

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
