Cryptography-Digest Digest #34, Volume #9         Thu, 4 Feb 99 16:13:04 EST

Contents:
  UK Postgraduate Workshop in Combinatorics and Applications (Simon Blackburn)
  Re: Implementation of SHA1 (DJohn37050)
  Re: Cracking 128bit (Jim Felling)
  Re: Who will win in AES contest ?? (Bruce Schneier)
  Re: Sanity check on authentication protocol (Paul Onions)
  Re: On a Method of Session Key Generation (revised) (Terry Ritter)
  Announce: Order Maven 1.0 (Kent Briggs)
  Re: RNG Product Feature Poll (Terry Ritter)
  Re: Who will win in AES contest ?? (wtshaw)
  Re: hardRandNumbGen (Herman Rubin)
  Re: *** Where Does The Randomness Come From ?!? *** ("PAC")

----------------------------------------------------------------------------

From: Simon Blackburn <[EMAIL PROTECTED]>
Crossposted-To: sci.crypt,sci.math.research
Subject: UK Postgraduate Workshop in Combinatorics and Applications
Date: 4 Feb 1999 07:56:11 -0800


COMBINATORICS AND COMMUNICATIONS APPLICATIONS
Royal Holloway, University of London, UK
 
A two day workshop (jointly sponsored by the EPSRC and the LMS under
the Mathfit programme) on Combinatorics and Communications
Applications will be held at Royal Holloway, University of London on
14 and 15 April 1999. The workshop is particularly aimed at PhD
students who would like to know more about the area, although any
interested participants are welcome.  Registration and accommodation
are free for EPSRC students.  The workshop is timed so
that participants can easily stay for the Postgraduate Combinatorial
and Group Theory Conferences that take place on 15, 16 and 17 April.
 
The programme:
 
Professor Fred Piper, Royal Holloway
  An introduction to cryptography
 
Professor Doug Stinson, University of Waterloo
  Recent results on key distribution patterns and related structures
 
Dr Ray Hill, University of Salford
  An introduction to error correcting codes
 
Dr Kenny Paterson, Hewlett-Packard Laboratories, Bristol
  Codes, correlations and exponential sums
 
Professor Mike Walker, Vodafone / Royal Holloway
  Authentication - Theory and practice
 
Professor Jim Massey, formerly of ETH Zurich
  The interplay between algebraic coding theory and cryptography
 
Professor Jack van Lint, Technical University of Eindhoven
  The identifiable parent property
 
Professor Dominic Welsh, University of Oxford
  Channel assignment problems
 
 
REGISTRATION DETAILS
 
The deadline for registration is March 12th 1999. For further
information please see the web page:
http://www.isg.rhbnc.ac.uk/Mathfit.htm
or contact Dr. Steven Galbraith either by email:
[EMAIL PROTECTED]
or by post at:
Mathematics Department,
Royal Holloway, University of London,
Egham TW20 0EX, United Kingdom.

------------------------------

From: [EMAIL PROTECTED] (DJohn37050)
Subject: Re: Implementation of SHA1
Date: 4 Feb 1999 16:10:46 GMT

Look on the NIST website at www.nist.gov for the SHA-1 specs.
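
A quick way to sanity-check an implementation is to compare it against
the published "abc" test vector.  A minimal sketch in Python (hashlib
here is just a stand-in reference for your own SHA-1 code):

    import hashlib

    # FIPS 180-1 "abc" test vector.
    digest = hashlib.sha1(b"abc").hexdigest()
    assert digest == "a9993e364706816aba3e25717850c26c9cd0d89d"
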
Don Johnson

------------------------------

Date: Thu, 04 Feb 1999 10:59:36 -0600
From: Jim Felling <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: Cracking 128bit



Anonymous wrote:

> I guess my point is I assumed that the 128bit algorithms like RC4
> used in the US versions of Netscape, IDEA and CAST used in PGP, and even
> 3DES were safe against brute force, and all known forms of cryptanalysis
> for, at minimum, the duration of the senders life, if not much longer.
> The 6 month time frame and half a billion dollars are nothing to groups
> such as the NSA who sweep up all data traffic for analysis.
> I may be wrong but I thought even groups like distributed.net couldn't
> tackle 128bit encryption.

They can't, really.  The thing is, most users do not use secure
password/phrase setups, and/or have other implementation weaknesses.
Those implementation weaknesses are what would allow such an attack to
work that quickly.  If the user's password/phrase has sufficient
entropy, and brute force is the only reasonable attack, then the 6
month figure is them using FUD marketing tactics.
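
A back-of-envelope sketch of the entropy point, assuming a passphrase
drawn uniformly from the 95 printable ASCII characters (about 6.6 bits
each):

    import math

    # Length at which the passphrase stops being the weakest link
    # against a 128-bit key.
    bits_per_char = math.log2(95)            # about 6.57 bits/character
    print(math.ceil(128 / bits_per_char))    # 20 characters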


------------------------------

From: [EMAIL PROTECTED] (Bruce Schneier)
Subject: Re: Who will win in AES contest ??
Date: Thu, 04 Feb 1999 16:52:47 GMT

On 4 Feb 1999 14:19:29 GMT, [EMAIL PROTECTED] (DJohn37050) wrote:

>I DID say over 25.  I just wanted to indicate that there were not just a
>handful submitted.
>Don Johnson

Sorry.  (Isn't there a discussion elsewhere in this newsgroup about a
128-bit brute force search taking "over six months"?)

Bruce
**********************************************************************
Bruce Schneier, President, Counterpane Systems     Phone: 612-823-1098
101 E Minnehaha Parkway, Minneapolis, MN  55419      Fax: 612-823-1590
           Free crypto newsletter.  See:  http://www.counterpane.com

------------------------------

From: [EMAIL PROTECTED] (Paul Onions)
Subject: Re: Sanity check on authentication protocol
Date: Thu, 04 Feb 1999 14:19:28 GMT

On Tue, 02 Feb 1999 23:21:38 -0600, Eric Norman
<[EMAIL PROTECTED]> wrote:

>The point I was trying to make is that you don't need any fancy
>back and forth protocol to get authentication when Alice and Bob
>share a secret.  Just one message from Alice to Bob will do as long
>as the message is carefully constructed and the mechanism for such
>construction is known by both.  Details about "careful construction"
>aren't specified, but additional protocol exchanges seem unnecessary.

Not sure I'd fully agree with this in the context of "on-line"
communications.

If Bob receives a "carefully constructed" authentication-message, how
does he know that Alice has just sent it?  If the message incorporates
a timestamp field then Bob can check this, but because their clocks
will not be exactly synchronised and transmission is not instantaneous
this will only stop replays outside of a specific window of
opportunity.  This may be OK for person-to-person communication, but
not for machine-to-machine communication.
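
A minimal sketch of such a timestamp check (names hypothetical); note
that a replay delivered *inside* the window still passes:

    import time

    WINDOW = 30.0   # allowed clock skew plus transit time, in seconds

    def fresh_enough(msg_timestamp):
        # Accept only messages whose timestamp is within the window.
        return abs(time.time() - msg_timestamp) <= WINDOW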

On the other hand, if the authentication-message incorporates some
kind of "counter mechanism" then Alice and Bob will both have to
maintain the state of this counter between all of their communication
sessions.  This may be impractical for large networks of "transient"
communicants, and anyway could be easily disrupted by an active
adversary removing the authentication-message and so causing Alice's
and Bob's counters to get out of sync.  (And if Alice waits to hear
from Bob before updating her counter, then we effectively have a
"ping-pong" message protocol anyway!)

Using both of the above methods together would probably allow Bob to
detect the active removal/insertion of a message, but again we face
the synchronisation problem if an adversary removes
authentication-messages before they get to Bob.

Also, building such a system would require (accurate) trusted
time-sources and the maintenance of long-term, ever-changing state
information.  To me this seems clumsy and prone to implementation
weaknesses, and is likely to degenerate into a "ping-pong" protocol
anyway.  The additional complexity may not be at all suitable for
small mobile/embedded devices.

I guess what I'm saying is that I think the "carefully constructed
message" approach is probably fine for off-line person-to-person
communications, but that for chattering little mobile agents, for
example, explicit nonce-based challenge/response type protocols are
more appropriate (and more easily scalable).
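
For comparison, a minimal sketch of the nonce-based approach (again
with hypothetical names): one round trip, no clocks, no long-term
counters, with freshness coming from Bob's random challenge:

    import os, hmac, hashlib

    def challenge():                     # Bob -> Alice
        return os.urandom(16)

    def respond(key, nonce):             # Alice -> Bob
        return hmac.new(key, nonce, hashlib.sha1).digest()

    def verify(key, nonce, response):    # Bob checks the response
        return hmac.compare_digest(respond(key, nonce), response)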

Paul(o)

------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: On a Method of Session Key Generation (revised)
Date: Thu, 04 Feb 1999 18:35:48 GMT


On Thu, 04 Feb 1999 12:19:24 GMT, in
<[EMAIL PROTECTED]>, in sci.crypt
[EMAIL PROTECTED] (R. Knauer) wrote:

>[...]
>2) There cannot be any reasons, since correlation cannot be removed
>algorithmically. Once the cryptanalyst discovers your method you are
>no better off than before you used it.

This is clearly false when the algorithm does not retain all of the
input information.
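
The classic illustration (von Neumann's debiasing trick, not
necessarily what Ritter has in mind) removes bias precisely by
discarding input:

    def von_neumann(bits):
        # Map bit pairs 01 -> 0 and 10 -> 1; discard 00 and 11.  For any
        # fixed per-bit bias the two kept pairs are equally likely, so
        # the output is unbiased, exactly because the algorithm does not
        # retain all of the input information.
        return [a for a, b in zip(bits[::2], bits[1::2]) if a != b]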

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM



------------------------------

From: Kent Briggs <[EMAIL PROTECTED]>
Subject: Announce: Order Maven 1.0
Date: Thu, 04 Feb 1999 18:45:28 GMT

Order Maven is a 32-bit Windows application that creates standalone
order form programs for online merchants.  These order forms have
built-in public key encryption and SMTP e-mail functions for securely
collecting credit card orders.

The full press release is available here:
http://www.briggsoft.com/promav1.txt

The shareware download page is here:
http://www.briggsoft.com/omaven.htm

--
Kent Briggs, [EMAIL PROTECTED]
Briggs Softworks, http://www.briggsoft.com



------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: RNG Product Feature Poll
Date: Thu, 04 Feb 1999 18:35:30 GMT


On 4 Feb 1999 09:48:52 -0000, in
<[EMAIL PROTECTED]>, in sci.crypt Paul Crowley
<[EMAIL PROTECTED]> wrote:

>[EMAIL PROTECTED] (Terry Ritter) writes:
>> On 3 Feb 1999 08:01:03 -0000, in
>> <[EMAIL PROTECTED]>, in sci.crypt Paul Crowley
>> <[EMAIL PROTECTED]> wrote:
>
>> >The Right Way to do randomness, as I understand it, is to get your
>> >device to produce output as raw as possible and use that output to
>> >drive a CPRNG.  
>
>> Leaving aside, for the moment, the idea that what we really need is a
>> hash and not a RNG, isn't it a bit silly to use a supposedly-strong
>> and thus "cryptographic" transformation on true random data?
>
>What's the actual disadvantage?  
>
>If the CPRNG works, the stream of numbers you get out is fully as
>unguessable as it can be, and entirely free from detectable bias.
>What more could you want?  You have to put about zero thought into any
>property of the input data besides its entropy, and if it gets fed a
>ton of zeroes for some reason halfway through its life it's not a
>problem.  It seems to me that using a CPRNG here is guaranteed to make
>your life easier, with no downside.  What downside do you foresee?

1.  We are trying to process biased really-random values into
unbiased really-random values.  A hash accumulates randomness; the
CPRNG does not.

2.  A CPRNG has an internal state or seed of a particular size.
Absent a hash, we must fill that state with the biased data.  This
means that we get a smaller-than-apparent state-space, one with some
internal bias.  Now we depend not upon the fundamental randomness of
the source but upon the designation "cryptographic" to hide a bias we
know exists.

3.  A CPRNG deludes us by being called "cryptographic."  Basically it
forces us to start out by *assuming* that the mechanism cannot be
broken (for, if it could, it would not be "cryptographic" so that must
not be what we were talking about).  But in most cases, strength is
not a proven property.  Indeed, we know that in practice many of the
supposedly "cryptographic" RNG's designed for stream ciphers have in
fact been broken.  Using a CPRNG thus forces us to *hope* that our
particular choice is "strong."  This fails to recognize the whole
point of using real randomness.  

4.  Even an unbroken CPRNG is not necessarily "cryptographic" under
these circumstances.  If a randomness source produces relatively few
unique values, we will have only a few possible CPRNG sequences.  This
means that some sequences will be repeated in practice, which is
extremely dangerous and hardly "cryptographic."  In contrast, with a
hash we just put more randomness data in, and eventually get a
full-randomness value out.
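
A sketch of that accumulation idea, assuming SHA-1 as the hash and a
caller-supplied entropy estimate per sample:

    import hashlib

    def pooled_value(samples, bits_per_sample, need_bits=160):
        # Keep feeding biased samples into the hash until the credited
        # entropy covers the output size, then take the digest.
        pool, credited = hashlib.sha1(), 0.0
        for s in samples:
            pool.update(s)
            credited += bits_per_sample
            if credited >= need_bits:
                return pool.digest()
        raise ValueError("not enough entropy gathered")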

5.  In comparison to other hashes, CRC is simple, fast, and
transparent to mathematical analysis.  What it does is clear, and
everything it does is captured in the basic concept of modulo.  CRC
does not have a lot of ad-hoc junk, or fancy math operations which are
"thought" to be strong, or "thought" to be complex.  CRC thus does not
confuse about what is being done, or what is needed.  
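
To make the "modulo" point concrete, a bitwise sketch of the standard
reflected CRC-32 (polynomial 0xEDB88320), whose entire state is just
the running remainder of polynomial division over GF(2):

    def crc32(data, crc=0xFFFFFFFF):
        for byte in data:                # data is a bytes object
            crc ^= byte
            for _ in range(8):
                crc = (crc >> 1) ^ (0xEDB88320 if crc & 1 else 0)
        return crc ^ 0xFFFFFFFF

    assert crc32(b"123456789") == 0xCBF43926   # standard check value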

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM



------------------------------

From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: Who will win in AES contest ??
Date: Thu, 04 Feb 1999 12:36:58 -0600

In article <79bse8$70b$[EMAIL PROTECTED]>, [EMAIL PROTECTED] wrote:

> >
> 
>   Me so happy!
>   For a while I was afraid I was posting on a "dignified forum."  I guess
> now that it is really an "undignified forum" it will actually be a
> better and more productive forum.  By the way, I won't pronounce any
> findings in the spotlight of the mainstream press in Rome.  You will
> have to read it here.
> 
> Dave Scott
> 
Some people confuse dignity with formality.  There is nothing more
undignified than to hide behind formality.
-- 
A much too common philosophy: 
It's no fun to have power....unless you can abuse it.

------------------------------

From: [EMAIL PROTECTED] (Herman Rubin)
Subject: Re: hardRandNumbGen
Date: 4 Feb 1999 14:36:11 -0500

In article <794bno$qdu$[EMAIL PROTECTED]>,
Patrick Juola <[EMAIL PROTECTED]> wrote:
>In article <[EMAIL PROTECTED]>,
>R. Knauer <[EMAIL PROTECTED]> wrote:
>>On 29 Jan 1999 08:59:41 -0500, [EMAIL PROTECTED] (Patrick Juola)
>>wrote:

>>>>You throw out the suspect bad generators and argue that such a
>>>>practice is safe. But what is your argument for the generators that
>>>>you do not throw out?

>>>That the amount of the bias *that I measured* is less than some
>>>threshold.  If I can live with that threshold and I believe that
>>>no other (untested) source of bias is likely to be present, then
>>>the cipher is safe to use.

>>>If I don't believe that -- or the threshold is too high -- then
>>>I can't place any reliance on a negative result.

>>Then you accept bad generators and reject good ones.

>Absolutely.  With a particular, known, measured probability.

>Statisticians call these Type I and Type II errors.  The trick is
>to make the cost of those errors as small as possible while still
>keeping testing costs under control.

>       -kitten

If one can assume independence between generated sets, one can make
both of these errors quite small, if the point null is not assumed.
That is, the user must specify the inaccuracy allowed, which cannot
be zero.
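
A sketch of such an interval-null test (normal approximation; the
names and tolerance are hypothetical): rather than testing bias == 0
exactly, accept a generator whose observed frequency of ones is within
a stated tolerance, so that with independent samples both error rates
shrink as n grows:

    import math

    def accept_generator(ones, n, tolerance=0.01, z=3.0):
        p_hat = ones / n
        sigma = math.sqrt(0.25 / n)   # worst-case std. dev. of p_hat
        return abs(p_hat - 0.5) <= tolerance + z * sigma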


-- 
This address is for information only.  I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
[EMAIL PROTECTED]         Phone: (765)494-6054   FAX: (765)494-0558

------------------------------

From: "PAC" <[EMAIL PROTECTED]>
Crossposted-To: sci.philosophy.meta,sci.physics,sci.physics.relativity,sci.skeptic
Subject: Re: *** Where Does The Randomness Come From ?!? ***
Date: Thu, 4 Feb 1999 12:43:29 -0800

    I didn't take nearly enough time to fully answer your complicated stuff.


R. Knauer wrote in message <[EMAIL PROTECTED]>...
>On Wed, 3 Feb 1999 20:51:21 -0800, "PAC" <[EMAIL PROTECTED]> wrote:


>
>> but dealing with the real world and
>>not just algorithms, I still think that randomness/determinism has a
>>causal factor beyond a viewer perspective, for both QM and relativity.
>
>In algorithmic complexity theory, that causal factor, namely the
>"reason" for a non-random string being simple, is that it can be
>computed by an algorithm that is substantially smaller than the
>number itself. IOW, it is "algorithmic reduction" that
>provides the reason for the simplicity of the number.
>
>That appears circular at first - "the reason numbers are simple is
>because they are simple" - but it isn't. The algorithm that is
>responsible for the reduction is not the same as the number. It is a
>separate number from the one being simplified. So the apparent
>tautology goes away. The correct statement is:
>
>"The reason a number is simple is that there is a simplifying
>algorithm that can reporduce it." That simple algorithm is the "cause"
>of the number being simple. The fascinating thing is that this cause
>is contained almost completely contained in the number itself.

    This is circular in a way, but reality might have to be viewed as
circular, so I wouldn't consider it a knock so much as something that
has to be examined.
    If all reality merges into fundamental relations that render it in such
a way, then that's just the nature of how reality must work, rather than
saying we must keep all elements separated.

>>    But Kolmogorov-Chaitin algorithms might fit perfectly with an
>>order/disorder interpretation of the universe where the additive constant
>>would be more easily seen as viewing the difference between order from
>>disorder.
>
>Not true. The additive constant is of order unity. The distinction
>between order and disorder is much larger than that.
>
>Typically numbers of size N can be reduced by an algorithm of size
>log2(N) + c, where c is that constant of order unity. That quantity is
>smaller by orders of magnitude than N itself - IOW it is exponentially
>smaller. The fact that there are very few numbers that can be
>algorithmically reduced points to the fact that order is a rare
>occurrence in reality.
>
>
>The reason that order comes from chaos is because there is an Ordering
>Principle at work, namely the algorithm which reduces the complexity
>of a number.
>
>The concept of algorithmic complexity reduction is quite simple on the
>surface, and is consistent with Occam's Razor. The reason it is
>important is because it results in many new discoveries about the
>properties of numbers that are not available anywhere else. It is
>intimately tied up with Godel's Incompleteness problem, Turing's
>halting problem, and Chaitin's undecidability problem.
>
>Chaitin claims that it goes to the very heart of mathematics and
>results in fundamental undecidability in number theory. That is,
>there are problems in mathematics which are not decidable - they have
>no reason for why they happen in terms of their solutions. That is
>completely at odds with millennia of formal mathematical theory, which
>has always held that problems in mathematics can always be solved if
>one puts enough effort into finding the solution.
>
>Chaitin exposes problems in mathematics that are formally impossible to
>solve - their solutions must be found experimentally, computationally.
>And the results of those experiments are completely random. That is,
>there is no cause or reason for why the solutions took the form they
>did.
>
>One example which Chaitin gives is his exponential diophantine
>equation, a polynomial equation in 17,000 variables and one parameter
>that takes 200 pages to write down. For each value of the parameter,
>there are either a finite number of solutions to the equation or an
>infinite number of solutions. For each parameter value, whether the
>number of solutions is finite or infinite is completely random. That
>is, there is no reason for why the solutions are finite in number or
>infinite in number - they just are.
>
>Those solutions are tied up with the Turing halting problem, in that
>they are related to whether a corresponding Turing machine will halt
>or not. If you knew how to predict whether the equation has a finite
>or infinite number of solutions for a given parameter value, you could
>then predict whether the corresponding Turing machine would halt or
>not. And you could also know the bits of Chaitin's Halting
>Probability, Omega. But none of those are formally decidable, so
>there is undecidability in pure mathematics, just like there is
>undecidability in the real world.
>
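
A rough way to see the reduction concretely: compressed length is only
an upper bound on algorithmic complexity, and zlib is only a stand-in
for a true minimal program, but even so it separates order from
randomness:

    import os, zlib

    orderly = b"0" * 100000
    chaotic = os.urandom(100000)
    print(len(zlib.compress(orderly)))   # a few hundred bytes
    print(len(zlib.compress(chaotic)))   # essentially incompressible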


    I would agree with a lot of this - that Occam's razor is as well served
by finding similarities as it is by lopping off excess steps.
    Looking at any physical process, I guess the point you're making
regarding simplifying algorithms would be that causality must be determined
by the more fundamental relations, which are across the board more
simplified and similar in nature, and therefore able to act on the other.
    But the physical reasons have to be found for this.
    Then this might go with the idea that disparate entities couldn't act
on each other unless they have common ground.  Therefore the determination
of common ground gives physical processes the mutability necessary to
determine each other, as opposed to disparate processes, which would leave
things immutable.
    So I can see that a common ground implies a fundamental simplicity, but
not yet that there has to be a functioning algorithm simplifying this
beyond the physical realities.
    But having an algorithm to express this apart from a natural process
would be against the grain of what that algorithm is expressing: keeping
things simple.  I'm not sure that an underlying algorithm would express
this any better than a simple equation by which everything goes to ground
zero due to fundamental mutability, as opposed to keeping an ever-running
equation that somehow does the same simplification more easily.
    However, with a fractal or some holographic thing it is easiest to see
how the smallest part could be a reflection of the whole, but visualizing
the process you're talking about would be different.  The best way I can
think of to interpret that type of simplifying algorithm would be to look
at the universe as a whole defining the focal point of a singular entity or
emergent structure.  Then we would have a cone-type shape consisting of the
simplified object at the point and the cone of the entirety of the universe
using a simplifying algorithm to focus on the structure.  In this case the
algorithm could be as simple as the point, because only one structure is
implied - not the multiplicity of the universe itself, but the universe as
represented by an algorithmic structure that is nothing but the simplest
structure in the Universe, a single cone, yet focusing down to simplify and
_CAUSE_ the emergence of the said object at the focal point.
    This would seem to fit what you're saying, other than that you'd have
to give me a more exact representation of how the algorithm transfers to
the real world.

>>which I'm
>>pretty sure isn't the standard determination - here proposing something
>>more radical than our causal/randomness discussion: that causality and
>>randomness are in flux according to a viewer (or other reference point)
>>position.  I accept, as talked about earlier, that the varying frames
>>would give a causal identity instead of a determined or random one, but
>>that would be a constant and not open to change.
>
>I missed the early parts of this discussion, but from what I see you
>saying above, I can comment thusly. It sounds as if you believe in the
>world view called Phenomenology, made popular by German philosophers
>and also by the French philosopher Jean Paul Sartre. That world view
>is at odds with another world view called Realism.


    Or you can say that the theory just doesn't make sense, but what I
meant here is that if an event could be defined differently from varying
frames, then the event would not be determined by specific factors, yet
would be causal because the event has occurred.

>>    We had an identity thread a while ago that dealt with this, the main
>>gist of my more philosophical viewpoint being since we are separate
>>entities and unable to merge with an object examined, unable then to find
>>absolute truth, that object examined will always be a representation:
>
>That is pure Sartre - "existential phenomenology". You are describing
>"being for itself". But Sartre is not a Realist - he denies the
>existence of order in the external world. The only thing that exists
>in the external world is "being in itself" - raw chaotic random
>primordial goo.

    Actually, Sartre thought that Einstein fit very well in with his
theories; he considered his brand of existentialism similar to a space-time
line.  So he wasn't opposed to science, just more interested in how our own
consciousness is separated from what it perceives by, basically, a
nothingness.  The application of science would then imply that the
separating element (consciousness) is itself reduced to null, with the
external world still shining through.

>
>He maintained in his book "Being and Nothingness" that order arises
>from consciousness, which in turn arises from not being a part of that
>goo. Your being conscious and not being a part of "being in itself"
>results in interpreting what is presented to your consciousness,
>namely the phenomenon of "being in itself", as ordered, namely  "being
>for itself". The Western scientist rejects the world view of Sartre,
>because it results in the denial of existence other than the purely
>random "being in itself".
>
>>     “I should rephrase this better using “consciousness” instead of
>>“knowledge”, which, in my opinion, is a more inclusive term:
>>    There’s no doubt that we can only experience what consciousness
>>presents to us, but consciousness is always conscious of something, that
>>something
>>being necessary to give actuality to this awareness.  Since the object is
>>always present when dealing with consciousness, then there must be an
>>object, physical or conceptual, within even the most minor thought process
>>that is beyond our ability to know – even though it is conditioned by
>>consciousness itself.  If there is no dichotomy then there is no awareness
>>for conscious beings, but merely pure in-itself nature.”
>
>That is pure Sartre, pure Existential Phenomenology. And it is wrong
>if you believe in Western science. If you do not believe in Western
>science, then you have a lot of explaining to do about its incredible
>predictive capabilities. May I suggest you try explaining away that
>predictive power while sitting on the top of an atomic bomb that is
>about to go off.
>
>If you defuse the bomb before it goes off, that shows that you believe
>in science, because that atomic bomb came from the predictive
>capabilities of science. I have never seen anything predictive come
>from any other world view except Realism.

    In this context, all this philosophy implies is a separation, rather
than no predictability.  If we couldn't predict events then reality would
be impossible in any way, shape, or form, but that does not mean we "know"
the object in-itself.  To know the object in-itself would mean to totally,
100%, merge with the object - something that math, for instance, can't do.
Predictability in this situation then means just the probability of an
event occurring, therefore not 100% determined, therefore the same problems
of separation.

>
>>    Math being a closed system,
>
>I do not understand what you mean. If you mean that math is a complete
>system, then that is not correct. Mathematics suffers from its own
>indeterminacy (Chaitin's Theorem), just as formal axiomatic systems
>do (Godel's Theorem) and computer programs do (Turing's Theorem).

    Never a complete system, even when dealing with simple infinity, but
whatever answers come to math will still be mathematical, resolved by math
itself and nothing beyond it.  Otherwise, leaving the ends open, one could
suppose that even the simplest variable can be non-mathematical, it being
never determined and the equations varying with the variables.  Math has
internal relations that will always resolve things mathematically.
    But this does imply that math is not the universe, in that the universe
would be considered complete and math not.  This is obvious, though, since
math is not the all of reality, yet it is the most approachable parallel to
the universe that we have.  Absolute completeness would never be a part of
math unless it encompassed all of reality.

    Phil C.




------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
