Cryptography-Digest Digest #169, Volume #9        Mon, 1 Mar 99 18:13:02 EST

Contents:
  Re: Can the quantum computer determine the truth from a lie? (Paul Kinsler)
  Re: Testing Algorithms [moving off-topic] ("Trevor Jackson, III")
  Re: compression?security
  Re: Scramdisk/DATMAN ("hapticz")
  Re: My Book "The Unknowable" (Terry Ritter)
  Re: Why Physical Randomness? (John Savard)
  Re: Randomness based consciousness?. (Was: Re: *** Where Does The Randomness Come 
From ?!? *** ) ("james d. hunter")
  Re: My Book "The Unknowable" (R. Knauer)
  Re: Why Physical Randomness? (R. Knauer)
  Hardware Random Numbers: Not an *explicit* feature (John Savard)
  Re: My Book "The Unknowable" (Bennett Standeven)
  Re: Musings on the PKZip stream-cipher (Terry Ritter)

----------------------------------------------------------------------------

Crossposted-To: alt.privacy,talk.politics.crypto
From: [EMAIL PROTECTED] (Paul Kinsler)
Subject: Re: Can the quantum computer determine the truth from a lie?
Date: Mon, 1 Mar 1999 20:05:48 +0000 (GMT)

In alt.privacy R. Knauer <[EMAIL PROTECTED]> wrote:
> On Mon, 1 Mar 1999 10:32:44 +0000 (GMT), [EMAIL PROTECTED]
> (Paul Kinsler) wrote:
> >There is nothing "spooky" or questionable about the theoretical basis 
> >of quantum computing.  
> There is when you start talking about a QC that uses quantum
> entanglement, like in teleporation, quantum compression and superdense
> codes.

> Those "ebits" (entanglement bits) are very spooky indeed.

Just so I make myself clear: no matter how "spooky" or weird you
or I might find quantum mechanics, and its subfield of quantum
computing, this has no bearing on the rigour of the theory, and
does not mean that predictions made by the theory are in any way
unreliable.

If you want to talk about flaws that you've detected in published 
qc algorithms or computing schemes, I'll see you in sci.physics.research. ;-)


#Paul.


------------------------------

Date: Mon, 01 Mar 1999 15:40:45 -0500
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Testing Algorithms [moving off-topic]

Patrick Juola wrote:

> In article <[EMAIL PROTECTED]>,
> Darren New  <[EMAIL PROTECTED]> wrote:
> >> > > > Think of it this way -- what's the minimum amount of energy necessary
> >> > > > to move a brick five feet (horizontally)?
> >> >
> >> > One photon?
> >>
> >> A photon of what energy?  Are we talking gamma rays or radio waves?
> >
> >My point is that if your key is 2^300 long, there are fewer than 2^300
> >photons in the universe, regardless of the energy involved.
>
> Yes, but if you get your photon back at the end of the process, you
> can use it over and over again.
>

Consider a photon bouncing between two mirrors.  Every reflection provides some
impulse to the mirrors.  If they were in free space they would gradually
accelerate away from each other.  This process is limited by the gradual relative
reddening of the photon as perceived by each mirror.  There may be some limiting
velocity < c, but I do not see one.

Given less than perfect mirrors there is some chance of absorption rather than
reflection at each meeting of photon and mirror.  The expected wait for absorption
may limit the terminal velocity in some way, but again, I do not see it.

Conservation of energy demands that the photon disappear when the mirrors have
gained energy matching it.  What is the mechanism?
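A crude numeric sketch of this thought experiment may help. The model below has a photon bouncing between a fixed wall and a single free mirror of mass m; each reflection off the receding mirror redshifts the photon by the first-order Doppler factor (1 - v/c)/(1 + v/c) and delivers the momentum difference to the mirror. The masses and energies are purely illustrative, and the first-order Doppler formula ignores recoil, so energy is conserved only approximately:

```python
# Illustrative numbers only; the Doppler-without-recoil formula makes
# conservation approximate (exact treatment is Compton-like).
C = 3.0e8               # speed of light, m/s

def bounce(E, v, m):
    """One reflection off the free mirror receding at speed v."""
    E_new = E * (1 - v / C) / (1 + v / C)   # redshifted photon energy
    dp = (E + E_new) / C                    # momentum given to the mirror
    return E_new, v + dp / m

E0 = 1.0e-15            # roughly a 6 keV photon, in joules
m = 1.0e-21             # a femtogram-scale mirror
E, v = E0, 0.0
for _ in range(100_000):
    E, v = bounce(E, v, m)

# The photon redshifts geometrically toward zero but never vanishes at
# any finite time; the mirror's kinetic energy accounts for the loss.
print(E, 0.5 * m * v * v, E + 0.5 * m * v * v)
```

In this picture there is no terminal velocity and no moment of disappearance: the photon's energy decays geometrically with each bounce, and the mirror's kinetic energy makes up the difference at every step.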





------------------------------

From: [EMAIL PROTECTED] ()
Subject: Re: compression?security
Date: 1 Mar 1999 20:28:45 GMT

[Posted and mailed]

In article <[EMAIL PROTECTED]>,
        Medical Electronics Lab <[EMAIL PROTECTED]> writes:
> alex wrote:
> >>   Can any experts tell me what the relationship is between data
> >> compression and data security?  I am new in security, so where can I
> >> pick up some basic material?
> 
> There are lots of books on security, cryptography (they are not the
> same thing!!) and compression.  Head to your nearest library or book
> store.
> 
> data compression removes duplicate data.  Sophisticated data compression
> squashes out duplications down to the bit level.  To get a very simple
> idea, the sequence abababab can be squashed to 4ab.  That's all there
> is to compression.  It has nothing to do with security, except that it's
> a very good first step.
 Not true - see my other post.  Real compression uses far better techniques than that. 
One measures H(X), the entropy, to decide how far down the data can be compressed. 
Typically, one does not gain much (if anything) by compressing in the manner you 
suggest.  It works by measuring entropy via the probability distributions of 
the data.  E.g., suppose an alphabet {a, b, c, d} is used in a message and has the 
following distribution:
P(X=a) = .5
P(X=b) = .25
P(X=c) = .125
P(X=d) = .125
Then a binary representation that would give savings would be
a |--> 0
b |--> 10
c |--> 110
d |--> 111
The average # of bits per symbol needed to encode the data this way would then be 
(1)(.5) + (2)(.25) + (3)(.125) + (3)(.125) = 1.75, a savings of .25 bits/symbol over 
the natural encoding.
Cryptography and compression both fall under the same part of information theory, 
namely source coding.  We use many similar tools in the study of each.
We use the entropy of the data to determine how much compression is possible using 
smart coding. See my other post for a bit on the relation to cryptography.
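The arithmetic above can be checked in a few lines. This sketch computes the Shannon entropy H(X) of the {a, b, c, d} distribution and the average length of the prefix code given; for this particular distribution the two coincide, which is why the code is optimal:

```python
# Entropy and average code length for the example distribution above.
import math

probs = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}
code  = {'a': '0', 'b': '10', 'c': '110', 'd': '111'}

# Shannon entropy: H(X) = -sum p(x) log2 p(x)
H = -sum(p * math.log2(p) for p in probs.values())

# Expected bits/symbol under the prefix code
avg = sum(probs[s] * len(code[s]) for s in probs)

print(H, avg)   # both 1.75 bits/symbol: the code meets the entropy bound
```

The 2-bit "natural" encoding of a 4-symbol alphabet costs 2 bits/symbol, so the 0.25 bits/symbol savings quoted above falls straight out of the comparison.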

Cheers,
Chris

------------------------------

From: "hapticz" <[EMAIL PROTECTED]>
Subject: Re: Scramdisk/DATMAN
Date: Mon, 1 Mar 1999 16:21:40 -0500

setup was:

create 2 megabyte .svl file on hard drive (blowfish default, 10 character
password)

copy (not moved) .svl file to datman system tape using datman interface

attempt access to .svl file on datman virtual drive

got the password entry, entered password (known correct)

wait on delay time as datman accesses the .svl file

suddenly (unexpectedly!) the scramdisk password entry window opens with
"selected" hidden password still in the box

retry, retry, retry.......

drop out of scramdisk, then have a look at the .svl file on the datman
drive.

it was zeroed on 5 separate attempts I made to use it.

my datman setup has 4 megs of cache space located on c: drive for rapid
access to datman catalogs, files, other stuff, & unknowns

perhaps extreme datman access time is timing out something in the scramdisk
process?  then somewhere the two are trying to access the same "open" file??

and maybe it gets left open and the "file handle" ends up getting lost??

--
best regards
[EMAIL PROTECTED]





------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Crossposted-To: sci.math,sci.physics,sci.logic
Subject: Re: My Book "The Unknowable"
Date: Mon, 01 Mar 1999 20:00:49 GMT


On Mon, 01 Mar 1999 19:19:00 GMT, in
<[EMAIL PROTECTED]>, in sci.crypt
[EMAIL PROTECTED] (R. Knauer) wrote:

>[...]
>The post processing consists of the usual anti-skewing and hashing to
>distill the approximately 1 bit of entropy per character of text. One
>poster on sci.crypt suggests a CRC hash, so you would have to feed it
>32 characters to get 4 back if it were CRC-32. 

I think this is too tight:  If we are willing to assume an "entropy"
of 1 bit per character, processing only 32 chars to get a 32-bit
result does not discard enough.  32 bits of entropy through linear
processing is at least theoretically reversible.  

I would like to see 2 or 3 times that amount of input for the same
output.  I want to throw away 1/2 or 2/3 of the entropy that we are
willing to guarantee, and if the guarantee is only statistical, we
probably need to throw out even more.  
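The ratio being argued for can be sketched concretely. Assuming, as above, roughly 1 bit of entropy per character of English text, the fragment below hashes 96 input characters per 32-bit CRC-32 output word, discarding about two thirds of the assumed entropy. The 96:32 ratio and the sample text are purely illustrative, and CRC-32 is used only because it is the hash suggested in the quoted post; the linearity concern raised above still applies to it.

```python
# Hedged sketch: distill ~1 bit/char text down to 32-bit words, taking
# 96 input characters per word (a 3:1 entropy margin, as argued above).
import zlib

def distill(raw: bytes, chars_per_word: int = 96) -> list:
    """Reduce each chars_per_word-byte chunk to one 32-bit CRC word."""
    return [zlib.crc32(raw[i:i + chars_per_word])
            for i in range(0, len(raw) - chars_per_word + 1, chars_per_word)]

sample = b"it was a bright cold day in april " * 6   # 204 bytes of raw text
words = distill(sample)
print(words)    # two 32-bit words from 192 of the 204 input bytes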

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------

From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Why Physical Randomness?
Date: Mon, 01 Mar 1999 21:55:04 GMT

Anthony Naggs <[EMAIL PROTECTED]> wrote, in part:

>I've been watching Intel's website, and in the last couple of days a
>number of Pentium III manuals and such have started to trickle out.
>Despite filling my hard drive with these PDF files I can't find any
>description of the hardware RNG.  Not even in the P-III datasheet; which
>fits with Terry's suggestion that the RNG could be on one of the support
>chips.

Looking at the one .pdf document that lists the instructions, I didn't
see one. Hopefully, this will sort itself out in a few days - I would
be dismayed if the new Pentium III chip were to be lacking the one
feature that made it really interesting to me.

John Savard (teneerf is spelled backwards)
http://members.xoom.com/quadibloc/index.html

------------------------------

From: "james d. hunter" <[EMAIL PROTECTED]>
Crossposted-To: 
sci.skeptic,sci.philosophy.meta,sci.psychology.theory,alt.hypnosis,sci.logic
Subject: Re: Randomness based consciousness?. (Was: Re: *** Where Does The Randomness 
Come From ?!? *** )
Date: Mon, 01 Mar 1999 17:12:49 -0500
Reply-To: [EMAIL PROTECTED]

Seisei Yamaguchi wrote:
 > 
 > Hi, this is Seisei.
 > 
 > In <[EMAIL PROTECTED]>, james d. hunter
 > <[EMAIL PROTECTED]> wrote {
 >  Seisei Yamaguchi wrote:
 >   > In article <[EMAIL PROTECTED]>,
 >   > [EMAIL PROTECTED] wrote:
 >   >
 >   > >I don't know where this idea of Random based consciousness comes
 >   > >from, Random Consciousness is an oxymoron...
 >   >
 >   > You are right. And I wrote {
 >   >
 >   >         If the link ---adaptive network--- pattern of the brain cells
 >   >          (include pattern generating routine (distributed system))
 >   >         is TRUE RANDOM,
 >   >
 >   >         it means our consciousness
 >   >           ---cells network organized from astronomical number of
 >   >         pulses come from the interface and self feedback system---
 >   >         is controled by TRUE RANDOMNESS.
 >   > }.
 >     That has to be a priori wrong.
 >     It is impossible for TRUE RANDOMNESS to CONTROL *anything*.
 > }
 > 
 > Wow, nice indication.
 > Then, I fix the bug, {
 > 
 >         s/is controled by TRUE RANDOMNESS/is TRUE RANDOMNESS/
 > 
 > }.


   That's not really the bug.  The bug fix is:
   "TRUE RANDOMNESS" is a sequence of 14 letters
   that makes cryptographers and coders think that
   they're hiding something.

   ---
   Jim

------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Crossposted-To: sci.math,sci.physics,sci.logic
Subject: Re: My Book "The Unknowable"
Date: Mon, 01 Mar 1999 22:18:50 GMT
Reply-To: [EMAIL PROTECTED]

On Mon, 01 Mar 1999 20:00:49 GMT, [EMAIL PROTECTED] (Terry Ritter) wrote:

>I would like to see 2 or 3 times that amount of input for the same
>output.  I want to throw away 1/2 or 2/3 of the entropy that we are
>willing to guarantee, and if the guarantee is only statistical, we
>probably need to throw out even more.  

Is there a quantitative reason for wanting 2 or 3 times the amount
calculated linearly?

IOW, if it is assumed that there is one bit of entropy per 8-bit
character of English text, why pick 64 or 96 characters for just 4
characters (32 bits) out of the CRC-32?

Or is that just an engineering fudge factor, like designing a bridge
to withstand 3 times the anticipated load just to be safe?


Bob Knauer

"There is much to be said in favour of modern journalism. By giving us the opinions
of the uneducated, it keeps us in touch with the ignorance of the community."
--Oscar Wilde


------------------------------

From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: Why Physical Randomness?
Date: Mon, 01 Mar 1999 22:21:45 GMT
Reply-To: [EMAIL PROTECTED]

On Mon, 01 Mar 1999 21:55:04 GMT, [EMAIL PROTECTED]
(John Savard) wrote:

>Looking at the one .pdf document that lists the instructions, I didn't
>see one. Hopefully, this will sort itself out in a few days - I would
>be dismayed if the new Pentium III chip were to be lacking the one
>feature that made it really interesting to me.

I read today that AMD has now sold more machines with its chips than
Intel has.

Bob Knauer

"There is much to be said in favour of modern journalism. By giving us the opinions
of the uneducated, it keeps us in touch with the ignorance of the community."
--Oscar Wilde


------------------------------

From: [EMAIL PROTECTED] (John Savard)
Crossposted-To: talk.politics.crypto,comp.sys.intel
Subject: Hardware Random Numbers: Not an *explicit* feature
Date: Mon, 01 Mar 1999 22:43:05 GMT

At

http://developer.intel.com/design/PentiumIII/prodbref/

and

http://developer.intel.com/procs/perf/PentiumIII/brief/summary.htm

it is noted that the Pentium III chip contains a special diode, on the
chip itself, which can be used to check that the chip is not getting
too hot.

That feature is probably the source of the claims - not echoed in the
list of chip features - that the chip has a built-in random number
generation capability.

I haven't yet, however, located the instruction to access that diode,
but hopefully, with this lead, someone will do so shortly.

John Savard (teneerf is spelled backwards)
http://members.xoom.com/quadibloc/index.html

------------------------------

From: Bennett Standeven <[EMAIL PROTECTED]>
Crossposted-To: sci.math,sci.physics,sci.logic
Subject: Re: My Book "The Unknowable"
Date: Mon, 1 Mar 1999 11:58:04 CST



On Mon, 1 Mar 1999, Neil Nelson wrote:

> In article <[EMAIL PROTECTED]>,
> 
> Neil Nelson wrote:
>  
> > On  _true_  randomness,  mathematically,  randomness  is  only  with
> > respect to some  non-random  viewpoint; it says something  _appears_
> > random as against  saying  something  _is_  random.   _True_ or pure
> > randomness  does not have an  identifying  procedure  because  there
> > would be no  non-random  vantage  point  from which to make sense of
> > anything at all including any randomness.   _True_  randomness is an
> > incoherent,  self  contradictory  notion by  definition.   Those two
> > words can be combined but no definite sense can be made of them.
>  
> Bob Knauer wrote:
>  
> [ If you try to describe randomness as that which cannot be described,
> [ you run afoul of the Berry Paradox.  Take your statement above:
>  
> [ " _True_ randomness is an incoherent, self  contradictory  notion by
> [ definition."
>  
> [ Notice that you have  described  True  Randomness  in a  paradoxical
> [ manner, by the very fact that you described it.
>  
> That's right.  _True Randomness_ is a self contradictory (paradoxical)
> notion.
>  
> [[ For  example  the  sequence  101010...10  is not prefix  complexity
> [[ random,  although  it is a true  random  number  by  virtue  of its
> [[ generation by a TRNG.
>  
> > it  needs to be asked in what  manner  this  sequence  is used  with
> > respect to cryptography.  It would clearly not be wise to use such a
> > string  of  sufficient  length  as a bit  overlay  since the  prefix
> > complexity is small and that sequence could be easily attacked.
>  
> [ Not if it were used in a properly  implemented OTP cryptosystem, one
> [ that is provably secure.
>  
> [ It is one of the  strings of the same length as the  message, so the
> [ attacker  cannot  decide if the  message he detects is the  intended
> [ message,  under the  hypothesis  that that is the  correct  key.  He
> [ needs more  information  to go on than just the  regularity  of that
> [ keystring.
>  
> [ For example, what if there are two intelligible messages:
>  
> [ A = "ATTACK AT DAWN"
>  
> [ B = "ATTACK AT DUSK"
>  
> [ and I XOR them together to get a key K:
>  
> [ K = A XOR B.
>  
> [ Now I send "cipher" A openly and send the key K by a secure channel.
> [ You intercept the cipher A, but not the key.
>  
> [ Which message is the intended  message?  My correspondent  will know
> [ unambiguously because he has the key.
>  
> I am having trouble with this sequence.   There is: (1) the message to
> be  secured,  (2)  the  key  sent  by a  secure  channel,  and (3) the
> encrypted message sent openly.  If I needed to run an XOR each time to
> get my key that would be the length of my  intended  message and could
> send the key  securely,  I should  just send my message  securely  and
> forget encryption.
>  
> Traditionally the sequence would be: A is the message to be encrypted,
> B is the  key, XOR A and B and send K in the open and B  secure.   The
> correspondent  reverses  the XOR with B to get A.  But as I  mentioned
> previously  if the key (a part of the  cryptosystem)  is at  least  as
> complex  as  the  message  then  the  message  likely  cannot  be
> decrypted.
>  
> [ To make  this  point  even  clearer  let's say my two  messages  are
> [ identical:
>  
> [ A = "ATTACK AT DAWN"
>  
> [ B = "ATTACK AT DAWN"
>  
> [ Now the key K is:
>  
> [ K = A XOR B = 000...0
>  
> [ But you don't  know that, so you still do not know what my  intended
> [ message is, not with the scant amount of  information I have allowed
> [ you to have.
>  
> [ Since  K  can  be  any   sequence,   even   000...0,  and  they  are
> [ equiprobable,  you have nothing to cause you to decide that either A
> [ or B is my intended message.
>  
> Clearly,  if you can send a key the  length of the  encrypted  message
> after the message has been  encrypted,  then the encrypted  message is
> technically  redundant as you have shown.  But  traditionally, the key
> is  relatively  small with respect to the entire  message set used for
> that key and the key is sent  before  any  particular  messages  to be
> encrypted are known, making the previous  sequence  incompatible  with
> common circumstances.
>  
> > Apparently   non-random  sequences  appear  randomly  in  relatively
> > complex (relatively random) sequences,
>  
> [ How do you know that those  sequences are  non-random?   By your own
> [ "definition",  you cannot tell if  something is random, so you can't
> [ tell if it is non-random either.
>  
> Non-random is defined as a string that can be completely defined via a
> smaller string (prefix code) within a given string  generating  system
> (language).   E.g., I could write a sequence of 1000 0's here, which I
> will not do as it would  interrupt  the reader's  comprehension,  or I
> could just say,  ``consider  a sequence  of 1000 0's", which is a much
> shorter  expression  meaning the same thing.  One  expression  is very
> long with respect to the other yet they both mean the same thing.  The
> long  expression  (the actual 1000 0's) is non-random  relative to the
> English  expression.  The English expression is non-random relative to
> the language  (English) I am using,  though it is  essentially  random
> with respect to any other string giving the same  meaning, as no
> shorter  expression  in English is expected to give the same meaning.
>  
> So initially,  there are strings, then a language on strings, and then
> a compression  of strings using  language.  The language is defined to
> be non-random,  incompressible  strings are random with respect to the
> language,  and  compressible  strings  non-random  with respect to the
> language.    Note  this  is  not  _true_   randomness  but  _apparent_
> randomness with respect to a language.
>  
> [ If you could  state that  something  is  non-random  because of some
> [ necessary  and  sufficient  property,  then  you  could  state  that
> [ something is random  because it lacks that  necessary and sufficient
> [ property.    Yet   you   have   stated   that   randomness   is  not
> [ characterizeable,  at least not  formally.   That Ol' Berry  Paradox
> [ sure is a bummer, eh.
>  
> Randomness   (apparent   randomness)  with  respect  to  a  non-random
> perspective  (a  language)  is  definable,  but  randomness  without a
> perspective  (true  randomness)  is not  definable  except in the self
> contradictory manner noted above.
>  
> [ Those  "non-random"  sequences must appear out of necessity.  Random
> [ numbers  must be  normal in the Borel  sense.   Therefore  every bit
> [ group must be present, even ones that flunk  simplistic  statistical
> [ tests for bias.
>  
> [ That means every finite sequence of numbers, even those with highest
> [ regularity,  like 000...0, must be present in a true random  number,
> [ even those  sequences  where the run length of one particular bit is
> [ nearly infinite.
>  
> First we must  define  what it is to have a random  number,  which was
> just  indicated  to  be  according  to  a  non-random  perspective  (a
> language).  If, according to the previous  discussion,  random means a
> string  sequence not  compressible in the language then a sufficiently
> long run of 0's would be compressible and hence that string not random
> wherever it might appear.

Careful; consider: let f(n) map a natural number n to the maximum
runtime over all halting programs of length less than n. Clearly f must
increase faster than any computable function of n. Now define a number x by:

x = Sum_{n=1..oo} 2^(-f(n))

x is not computable, and so is random relative to computable descriptions,
but the probability that a string of digits chosen from its expansion
consists entirely of zeroes tends to 1.



------------------------------

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Musings on the PKZip stream-cipher
Date: Mon, 01 Mar 1999 22:54:17 GMT


On Mon, 01 Mar 1999 11:25:23 -0700, in
<[EMAIL PROTECTED]>, in sci.crypt Sundial Services
<[EMAIL PROTECTED]> wrote:

>Although I gave up on the PKZip stream-cipher long ago as a basis for a
>securable software-distribution mechanism (itself now abandoned), I
>continue to be interested in stream ciphers and would very much like to
>find more on what has been published about them.  (Nearly all of the
>literature is on block ciphers.)  
>While my technical interest in this cipher is retired, I'd also like to
>stimulate any discussion I can on ways of analyzing and breaking such
>ciphers.  I'd also like to know more about exactly why this cipher was
>constructed the way that it was.
>
>The core algorithm of the cipher is Update_Keys(char):
>  Key(0) <- crc32(Key(0), char)
>  Key(1) <- Key(1) + (Key(0) & 000000ffH)
>  Key(1) <- Key(1) * 134775813 + 1
>  Key(2) <- crc32(Key(2), Key(1) >> 24)

I have several times reported here that some years ago I was able to
resolve this part of the PKzip cipher.  There is also an output step
not shown here which I did not resolve.

The core is linear.  Yes, it mixes a couple of different styles of
linearity, which is basically the strength argument used in IDEA.
Nevertheless, I was able to reconstruct the 12 bytes of internal state
given exactly 12 bytes of known output (also the known input from
feedback, of course).  In retrospect, this is what we expect from any
linear system.  And once the state is resolved, we can run it forward
or backward as desired.  
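For reference, the quoted Update_Keys core transcribes directly into code. The sketch below does all arithmetic mod 2^32 and builds the standard reflected CRC-32 table (polynomial 0xEDB88320); the key-initialization constants at the bottom are the ones commonly reported for PKZip, included only to make the fragment runnable, and the output step Terry mentions is not modeled here.

```python
# Transcription of the quoted Update_Keys step; 32-bit arithmetic throughout.
MASK = 0xFFFFFFFF

# Standard reflected CRC-32 table.
crc_tab = []
for n in range(256):
    c = n
    for _ in range(8):
        c = (c >> 1) ^ 0xEDB88320 if c & 1 else c >> 1
    crc_tab.append(c)

def crc32_byte(crc, b):
    """One byte of the standard CRC-32 update."""
    return (crc >> 8) ^ crc_tab[(crc ^ b) & 0xFF]

def update_keys(k0, k1, k2, ch):
    k0 = crc32_byte(k0, ch)
    k1 = (k1 + (k0 & 0xFF)) & MASK
    k1 = (k1 * 134775813 + 1) & MASK          # the quoted LCG step
    k2 = crc32_byte(k2, (k1 >> 24) & 0xFF)
    return k0, k1, k2

# Commonly reported PKZip initial state, then mix in a password.
keys = (0x12345678, 0x23456789, 0x34567890)
for ch in b"password":
    keys = update_keys(*keys, ch)
print([hex(k) for k in keys])
```

With the state laid out this way, Terry's observation is easy to see: each of the three updates is linear (CRC-32 over GF(2), addition and an LCG mod 2^32), so 12 bytes of known output suffice to solve for the 12 bytes of state.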

This was a long time ago, before the web, and before I was doing
somewhat formal articles.  And without the last step, it is not really
a complete solution anyway.  I'd have to get back into the work to
discuss details, and that seems unlikely now.  

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and sci.crypt) via:

    Internet: [EMAIL PROTECTED]

End of Cryptography-Digest Digest
******************************
