Cryptography-Digest Digest #380, Volume #10 Sat, 9 Oct 99 12:13:02 EDT
Contents:
Re: Ritter's paper ("Trevor Jackson, III")
Re: US Crypto Policy: free speech? (CT Franklin)
Re: US Crypto Policy: free speech? ("Trevor Jackson, III")
Re: Compression of encrypted data
Re: Compression of encrypted data ("Douglas A. Gwyn")
Re: radioactive random number generator (Hironobu SUZUKI)
Re: radioactive random number generator (Harry H Conover)
Re: David A. Huffman, "father" of compression died yesterday (Mathew Hendry)
Re: radioactive random number generator (SCOTT19U.ZIP_GUY)
Re: radioactive random number generator (Herman Rubin)
Re: Ritter's paper
Re: Ritter's paper
----------------------------------------------------------------------------
Date: Fri, 08 Oct 1999 23:56:47 -0400
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: Ritter's paper
wtshaw wrote:
> In article <[EMAIL PROTECTED]>, "Trevor Jackson, III"
> <[EMAIL PROTECTED]> wrote:
>
> > OK, there are separate application domains whose metrics will always be
> distinct. So
> > pick a single domain, without making it trivially small, and define a
> domain-specific
> > metric.
> >
> > Pitchers can be compared to pitchers along several (~10?) dimensions.
> > These comparisons are based on the premise that each pitcher's past
> > performance is indicative of his future performance. The premise does
> > not appear to apply to cipher strength. In any domain.
>
> Come on now, there are lots of ciphers of relative strengths, some
> stronger than others by any measure you might take.
I disagree. We can inventory the weaknesses of ciphers, and thus measure the
upper bounds on their strength, but I do not see any way to establish a lower
bound on strength, which is the operational meaning of "known strength".
I am not referring to key length, number of rounds, complexity of rounds,
diffusion rate, etc. I'm referring to the ratio of the work factor between
having and lacking the key. If we could rule out catastrophic algorithmic
failures induced by new attacks such as differential, linear, slide,
boomerang, etc. we could thereby establish the minimal work factor ratio. But
it does not appear that we can rule out those kinds of failure except in cases
where key size meets or exceeds message size.
PRNG-based ciphers are considered insecure for two reasons. 1) The
uncertainty about the initial state is, in principle, reduced by half for
each bit of output generated. 2) In practice, we have methods of determining
the initial state given the necessary number of bits of output, e.g.,
Berlekamp-Massey.
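To make the Berlekamp-Massey point concrete, here is a minimal sketch of the
algorithm over GF(2) (in Python, purely illustrative): it finds the length of
the shortest LFSR producing a given bit sequence, and needs only about 2L
output bits to pin down an L-bit register - an "efficiency" of 2 in the sense
defined below.

```python
def berlekamp_massey(bits):
    """Return the length of the shortest LFSR (over GF(2)) generating `bits`."""
    n = len(bits)
    c = [0] * n          # current connection polynomial
    b = [0] * n          # previous connection polynomial
    c[0] = b[0] = 1
    L, m = 0, -1
    for i in range(n):
        # discrepancy: next output of the current LFSR vs. the actual bit
        d = bits[i]
        for j in range(1, L + 1):
            d ^= c[j] & bits[i - j]
        if d:
            t = c[:]
            # c(x) += x^(i-m) * b(x)
            for j in range(n - i + m):
                c[i - m + j] ^= b[j]
            if 2 * L <= i:
                L, m, b = i + 1 - L, i, t
    return L
```

For example, a maximal-length sequence from a 3-stage LFSR is recognized as
degree 3 after only a handful of bits.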
Note that #1 above applies to all modern ciphers given a known plaintext. The
discovery of a new attack upon a cipher is the discovery of a practical method
of untangling the initial state given some amount of output > the amount of
key. By this line of reasoning we could define the "efficiency" of an attack
as the ratio of the number of bits of output required over the number of bits
of state to be discovered.
Since all modern ciphers are weak in the sense of #1 above, they may always be
vulnerable to a discovery that reduces the theory to practice. This is the
sense in which, AFAICS, we cannot prove strength (lower bound), but we can
prove weakness (upper bound).
> To deny the importance
> of looking at these dimension means that *you* will not find them.
>
> Ritter has said that there are no good measures of strength...so, that
> points to something that needs attention, not that we should adopt a
> self-fulfilling prophesy that we will never know.
Agreed. I'm not claiming that we will never know. I'm claiming that it looks
pretty bleak.
------------------------------
From: CT Franklin <[EMAIL PROTECTED]>
Subject: Re: US Crypto Policy: free speech?
Date: Sat, 09 Oct 1999 03:08:31 +0000
"Trevor Jackson, III" wrote:
> > and valuable use of GPS, but don't expect the DoD to think that way.
> > They think of missiles.
>
> Well the 100 MPH GPS limit is supposed to prevent commercial GPS from being
> used by terrorists in missiles. But it reduces the utility of the device
> for some personal security applications.
>
> Someone thought very hard about missiles!
I think there are various limits on GPS receivers. But, commercial units
perform well above 100 MPH. For example, the Trimble 8100 is specified at 0-800
knots (3g acceleration). See
http://www.trimble.com/products/specs/frm_av10.htm.
I think some GPS devices are limited to < 100 knots or so in order to prevent
their use in aviation and thereby avoid liability problems for manufacturers.
Regards,
CT
------------------------------
Date: Fri, 08 Oct 1999 23:38:06 -0400
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: US Crypto Policy: free speech?
CT Franklin wrote:
> "Trevor Jackson, III" wrote:
>
> > > and valuable use of GPS, but don't expect the DoD to think that way.
> > > They think of missiles.
> >
> > Well the 100 MPH GPS limit is supposed to prevent commercial GPS from being
> > used by terrorists in missiles. But it reduces the utility of the device
> > for some personal security applications.
> >
> > Someone thought very hard about missiles!
>
> I think there are various limits on GPS receivers. But, commercial units
> perform well above 100 MPH. For example, the Trimble 8100 is specified at 0-800
> knots (3g acceleration). See
> http://www.trimble.com/products/specs/frm_av10.htm.
>
> I think some GPS devices are limited to < 100 knots or so in order to prevent
> their use in aviation and thereby avoid liability problems for manufacturers.
I believe that frame-mounted devices, e.g., in an airplane, are exempt, but
hand-held devices are restricted. Or so we were told by several manufacturers
around 1996. I'm not familiar with the Trimble 8100. Is it hand-held?
------------------------------
From: [EMAIL PROTECTED] ()
Crossposted-To: comp.security.pgp.discuss,alt.security.pgp
Subject: Re: Compression of encrypted data
Date: 9 Oct 99 03:39:17 GMT
Joseph Ashwood ([EMAIL PROTECTED]) wrote:
: Under any conceivable language used to transfer data of any form, whether it
: is ASCII text or raw binary data that has been compressed to the nth
: degree, there are going to be patterns that can be exploited to perform
: compression; this is a fact of structure and recurrence of data.
Strictly speaking, the sentence above is true. However, without
significant qualification, it is *extremely* misleading.
Binary data that has been compressed by any currently known method, where
that data was compressible to begin with, being something like text or
images, certainly can be compressed still further. All one has to do is
have a more elaborate model of the source.
So, one can replace Huffman coding of text based on letter frequencies with
coding based on word frequencies, and then take sentence structures into
account. But as one does so, *diminishing returns* set in.
Some types of very simply encrypted data can still be compressed directly.
In general, though, the first step in efficiently compressing an encrypted
file would be to reverse - or effectively reverse - the encryption.
The fact that it may remain possible to compress - slightly - a file the
"hard" way doesn't in any way contradict the conventional wisdom that the
time to compress a file is before it is encrypted, and on the basis of a
model of its properties that has a _reasonable_ amount of detail, rather
than striving to improve compression slightly by going to a vastly more
complicated model.
And it is _definitely_ true that throwing away information about a file -
trying to compress it after it was compressed or encrypted without knowing
exactly what sort of compression or encryption was applied - is NOT the
way to achieve effective compression results.
John Savard
------------------------------
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Compression of encrypted data
Date: Sat, 09 Oct 1999 04:15:37 GMT
ashwood wrote:
> Douglas A. Gwyn (IST/CNS) <[EMAIL PROTECTED]> wrote in message
> news:[EMAIL PROTECTED]...
> > Actually, substitution of syllables, words, and/or phrases by
> > shorter code groups has been done for many centuries.
> But those all required prior knowledge of what would be encoded,
> Huffman does not, although to build the tree does require some
> knowledge.
Seems like a distinction without a difference. Huffman codes are
built from an estimation of the source statistics, just as are
codebooks.
> > > .... Now assuming that it takes a mere 50 years of time to
> > > develop a good huffman encoding for something the complexity
> > > of English ...
> > Why on Earth would we make such an erroneous assumption in the
> > first place?
> The statement I made was not erroneous, even now, ...
Sure it is. I can generate a good Huffman coding for English with
a modest amount of my own work, plus a lot of automated work, only
a tiny fraction of that "50 years". But Huffman coding of N-grams
isn't a very effective way to compress English (past N=1). The
whole approach is based on an erroneous operational assumption.
> Basically, I think what I failed to make clear is the fact that
> there is a very basic rule of compression: a single static
> compression tree can only compress 1/2 of all texts; the other
> half will be enlarged by it.
Not even close. Use any existing implementation of simple Huffman
compression on as many English texts as you wish; you'll find that
they practically all are compressed (considerably), not expanded.
The UNIX System V "pack" utility is a simple bytewise Huffman
compressor that you can use to perform the experiment.
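Gwyn's claim is easy to check even without pack(1): the sketch below (Python,
illustrative only, not the actual System V utility) builds a bytewise Huffman
code from a text and reports the compressed size in bits; ordinary English
comes out well under 8 bits per byte.

```python
import heapq
from collections import Counter

def huffman_code_lengths(data: bytes) -> dict:
    """Byte-wise Huffman: return {byte value: code length in bits}."""
    freq = Counter(data)
    if len(freq) == 1:
        return {next(iter(freq)): 1}
    # heap entries: (weight, tiebreak, {byte: depth-so-far})
    heap = [(w, i, {b: 0}) for i, (b, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        # merging two subtrees pushes every leaf one level deeper
        merged = {b: d + 1 for b, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

def compressed_bits(data: bytes) -> int:
    """Size of `data` in bits under its own bytewise Huffman code."""
    lengths = huffman_code_lengths(data)
    freq = Counter(data)
    return sum(freq[b] * lengths[b] for b in freq)
```

A code built this way satisfies the Kraft equality, and on English samples the
ratio compressed_bits(text) / (8 * len(text)) typically lands near 0.55-0.6.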
I don't know where you got the idea that it is necessary or even
useful to sample a large amount of English text; because use of
the source characteristics is necessarily statistical, you only
need a "good enough" approximation, which usually can be obtained
from a modest sample. (Apply Good-Turing smoothing in order not
to attribute excessive importance to non-occurring combinations.)
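For what it's worth, the basic Good-Turing adjustment Gwyn mentions is tiny to
implement. A hedged sketch (the practical "simple Good-Turing" method also
smooths the N_r sequence first, which is omitted here):

```python
from collections import Counter

def good_turing_adjusted_counts(counts):
    """Basic Good-Turing: r* = (r + 1) * N_{r+1} / N_r.

    `counts` maps item -> observed count; returns item -> adjusted count.
    Falls back to the raw count when N_{r+1} is zero (the gap that
    practical versions close by smoothing the N_r sequence).
    """
    freq_of_freq = Counter(counts.values())   # N_r: how many items occur r times
    adjusted = {}
    for item, r in counts.items():
        n_r, n_r1 = freq_of_freq[r], freq_of_freq.get(r + 1, 0)
        adjusted[item] = (r + 1) * n_r1 / n_r if n_r1 else float(r)
    return adjusted

def unseen_mass(counts):
    """Good-Turing estimate of the total probability of unseen items: N_1 / N."""
    n = sum(counts.values())
    n1 = sum(1 for r in counts.values() if r == 1)
    return n1 / n
```

The effect is exactly the one Gwyn describes: probability mass is shaved off
the observed combinations and reserved for the non-occurring ones.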
------------------------------
From: Hironobu SUZUKI <[EMAIL PROTECTED]>
Crossposted-To: sci.electronics.design,sci.electronics.equipment
Subject: Re: radioactive random number generator
Date: 09 Oct 1999 13:22:24 +0900
donsolo <[EMAIL PROTECTED]> writes:
> You don't need GM tubes or high voltage. A small silicon
> solar cell (or almost any diode) makes a good detector of
> background gamma.
> Shield it and use a low noise, high gain op-amp.
Good information! Thanks.
It's easy to get GM tubes and electronic parts in Japan, especially in
Akihabara, which is known as a world-famous electronics district.
Many people don't believe it, but a pocket Geiger-Muller counter DIY
kit has been sold in Akihabara for a long time. It's a kind of
electronic DIY kit for the novice, and cheap, only around US$50. I
have 2 sets of it :-)
It sounds like a joke to people who don't know Akihabara :-)
You can order the pocket GM counter DIY kit by e-mail if you read
Japanese. See
http://www1.tomakomai.or.jp/akizuki/sokutei4.htm
The pocket GM counter has a small tube, the HAMAMATSU D3372, a "compact,
halogen-quenched GM counter tube for measurement of gamma and high-energy
beta rays" (as written in its manual).
The pocket GM counter makes a 'pi' sound when the GM tube detects a gamma
or beta ray.
In my experiments, the D3372 detects 2 or 3 counts per minute in my room;
that is, it detects the background radiation which can be detected
anywhere on earth. A bigger GM tube would detect 10 times or more counts
per minute than the small one.
Because this pocket GM counter kit has no A/D converter, we need to build
an A/D converter to connect it to a computer. But that's no problem: it's
also easy to find an A/D converter DIY kit in Akihabara!
In theory, the number of detections follows a Poisson distribution, so we
can't use the whole count per minute directly.
But modulo 2 (odd/even) of the number of detections may be random(?)
We can get 1 random bit, 0/1, every minute. The random number stream
grows very slowly, and I agree that it is NOT a practical random number
generator device. But it can generate real random numbers.
--hironobu
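Hironobu's parity scheme is easy to simulate. The sketch below (Python,
illustrative; the Poisson sampler uses Knuth's product-of-uniforms method)
draws one count per "minute" and keeps only its parity:

```python
import math
import random

def poisson_sample(lam, rng):
    """Knuth's method: multiply uniforms until the product drops below e^-lam."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def parity_bits(lam, n, seed=0):
    """One random bit per interval: the parity of a Poisson(lam) count."""
    rng = random.Random(seed)
    return [poisson_sample(lam, rng) & 1 for _ in range(n)]
```

At 2-3 counts per minute the bit is already close to unbiased: P(odd) =
(1 - e^(-2*lam))/2, so the deviation from 1/2 is e^(-2*lam)/2, about 0.001
at lam = 3 - though at one bit per minute it is, as he says, not practical.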
------------------------------
From: [EMAIL PROTECTED] (Harry H Conover)
Crossposted-To: sci.electronics.design,sci.electronics.equipment
Subject: Re: radioactive random number generator
Date: 9 Oct 1999 05:00:43 GMT
Dave VanHorn ([EMAIL PROTECTED]) wrote:
:
: Ross <[EMAIL PROTECTED]> wrote in message
: news:[EMAIL PROTECTED]...
: > Some time ago, Mike Rosen put a paper on his web page which describes
: > in fair detail how to use the radioactive source from a commercial
: > smoke detector to generate true random numbers. Seemed a great
: > constructional project to me - I wish an electronics hobby magazine
: > would put it out in kit form. Mike's description is fairly detailed,
: > but if a non-engineer wants to construct it, more details are
: > required. Also, I wondered if different constructors would obtain
: > different number distributions, due to variation in dimensions of the
: > housing and other such parameters.
:
: This is an idea I put forth in circuit Cellar discussions years ago.
: Everyone freaked out over using radioactives, even though it emits only
: alpha particles, which can be stopped by paper.
I'm curious. Does the radioactive method differ substantially from the
old technique of using a light bulb in combination with a photomultiplier
to accomplish the same effect?
Harry C.
------------------------------
From: [EMAIL PROTECTED] (Mathew Hendry)
Crossposted-To: comp.compression
Subject: Re: David A. Huffman, "father" of compression died yesterday
Date: Sat, 09 Oct 1999 10:38:42 GMT
On Fri, 08 Oct 1999 11:46:41 -0700, Sundial Services
<[EMAIL PROTECTED]> wrote:
>Only the Internet would allow any of us to know that Mr. (Dr.?) Huffman
>had died, or prompt any of us to see him as -- not just a figure in a
>textbook we all read in college, but a human being who lived and
>breathed... and thought, in ways that no one else did at the time.
>
>Only the Internet would prompt us to ... offer our condolences to David
>Huffman's nephew. :-/
The obituary modestly understates the importance of Huffman coding - I
would guess that Ken's message has already been Huffman coded
thousands, perhaps millions of times all over the world.
-- Mat.
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Crossposted-To: sci.electronics.design,sci.electronics.equipment
Subject: Re: radioactive random number generator
Date: Sat, 09 Oct 1999 13:58:56 GMT
In article <[EMAIL PROTECTED]>, Hironobu SUZUKI
<[EMAIL PROTECTED]> wrote:
>
>[Hironobu's description of the Akihabara pocket GM counter kit and its
>one-random-bit-per-minute parity scheme snipped]
>
But since Japan does such a bad job of managing its nuclear resources,
as the number of spills goes up so will the background radioactivity. The
counts will go up faster and faster, so maybe you can use it after a while
as a random source of bits after all.
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
http://members.xoom.com/ecil/index.htm
NOTE EMAIL address is for SPAMMERS
------------------------------
From: [EMAIL PROTECTED] (Herman Rubin)
Crossposted-To: sci.electronics.design,sci.electronics.equipment
Subject: Re: radioactive random number generator
Date: 9 Oct 1999 08:48:49 -0500
In article <7tmi5r$[EMAIL PROTECTED]>,
Harry H Conover <[EMAIL PROTECTED]> wrote:
>Dave VanHorn ([EMAIL PROTECTED]) wrote:
>: Ross <[EMAIL PROTECTED]> wrote in message
>: news:[EMAIL PROTECTED]...
>: > Some time ago, Mike Rosen put a paper on his web page which describes
>: > in fair detail how to use the radioactive source from a commercial
>: > smoke detector to generate true random numbers. Seemed a great
>: > constructional project to me - I wish an electronics hobby magazine
>: > would put it out in kit form. Mike's description is fairly detailed,
>: > but if a non-engineer wants to construct it, more details are
>: > required. Also, I wondered if different constructors would obtain
>: > different number distributions, due to variation in dimensions of the
>: > housing and other such parameters.
>: This is an idea I put forth in circuit Cellar discussions years ago.
>: Everyone freaked out over using radioactives, even though it's only alpha
>: particles that can be stopped by paper.
>I'm curious. Does the radioactive method differ substantially from the
>old technique of using a light bulb in combination with a photomultiplier
>to accomplish the same effect?
It does not differ, IF the light bulb is dim enough that there is at
least a fair probability that no photon hits the photomultiplier during
the relaxation time of the photomultiplier or the response dead time of
the counter.
Radioactive materials give a source of particles, the precise type not
being important, whose arrivals at a detector that responds to single
particles (they do not all have to be detected, but detections should
not pile up) appear as approximately a Poisson process, at a rate slow
enough that discreteness is immediately apparent.
With radioactives, it is easy to get such a rate, and to adjust it to
what can reasonably be done given the characteristics of the
electronics. The parity of counts actually becomes better with moderate
dead time.
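Rubin's parity remark can be checked numerically: a quick sketch (Python,
illustrative) sums the odd-k terms of the Poisson distribution and compares
with the closed form (1 - e^(-2*lam))/2, whose bias term e^(-2*lam)/2 dies
off rapidly as the mean count per interval rises.

```python
import math

def p_odd(lam, terms=60):
    """P(a Poisson(lam) count is odd), by summing the odd-k terms directly."""
    term = math.exp(-lam)   # the k = 0 term
    total = 0.0
    for k in range(1, terms):
        term *= lam / k     # term is now e^-lam * lam^k / k!
        if k % 2 == 1:
            total += term
    return total
```

The closed form follows from summing the odd terms of the exponential
series: e^(-lam) * sinh(lam) = (1 - e^(-2*lam))/2.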
--
This address is for information only. I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
[EMAIL PROTECTED] Phone: (765)494-6054 FAX: (765)494-0558
------------------------------
From: [EMAIL PROTECTED] ()
Subject: Re: Ritter's paper
Date: 9 Oct 99 14:13:24 GMT
Trevor Jackson, III ([EMAIL PROTECTED]) wrote:
: wtshaw wrote:
: > Ritter has said that there are no good measures of strength...so, that
: > points to something that needs attention, not that we should adopt a
: > self-fulfilling prophesy that we will never know.
: Agreed. I'm not claiming that we will never know. I'm claiming that it looks
: pretty bleak.
While I agree that we can learn more about cipher strength - in the sense
of having better *upper* bounds on the strengths of many ciphers - as to
the question of a lower bound on the work factor of finding the key, given
theoretically adequate quantities of known plaintext, there is, I'm
afraid, a sense in which it is valid to say we will "never know". The
problem of establishing a lower bound on a cipher's strength is, I
believe, fundamentally equivalent to the famous "halting problem".
This isn't defeatism, any more than the fact that chaos theory prevents us
from giving precise local weather forecasts for a year from now is
defeatism. Knowing what is unlikely or impossible to achieve liberates us
to work on solvable problems, and ask reasonable questions.
If we can't "prove" a cipher is secure on work-factor grounds, then we
can:
- use the one-time-pad, if we're really desperate;
- include larger "safety factors" in our designs;
- use multiple layers of encryption based on completely different
principles, which at least _appear_ to make the prospects for analysis
very dim.
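As one purely illustrative reading of the "different principles" point, the
toy below layers a keystream XOR over a key-seeded transposition - in
Python, with made-up key handling, and emphatically not a vetted design; it
only shows how independently keyed layers compose and invert.

```python
import hashlib
import random

def xor_stream(key: bytes, data: bytes) -> bytes:
    """Layer 1 (toy): XOR with a SHA-256-in-counter-mode keystream."""
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], block))
    return bytes(out)

def permute(key: bytes, data: bytes, inverse=False) -> bytes:
    """Layer 2 (toy): key-seeded byte transposition."""
    order = list(range(len(data)))
    random.Random(key).shuffle(order)
    out = bytearray(len(data))
    for dst, src in enumerate(order):
        if inverse:
            out[src] = data[dst]   # undo the move
        else:
            out[dst] = data[src]   # apply the move
    return bytes(out)

def encrypt(k1: bytes, k2: bytes, pt: bytes) -> bytes:
    return permute(k2, xor_stream(k1, pt))

def decrypt(k1: bytes, k2: bytes, ct: bytes) -> bytes:
    return xor_stream(k1, permute(k2, ct, inverse=True))
```

Since the layers use unrelated keys and unrelated operations, an attack
that strips one layer still faces the other - which is the whole argument
for layering, even without a strength proof for either layer.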
Eventually, however, mathematicians working in complexity theory may make
a fundamental breakthrough that suggests possibilities, either for
proving cipher strength or for some kind of "next best thing". In the
meantime, knowing that the direct approach to the problem cannot be
fruitful simply prevents a lot of wasted time.
John Savard
------------------------------
From: [EMAIL PROTECTED] ()
Subject: Re: Ritter's paper
Date: 9 Oct 99 14:19:39 GMT
Douglas A. Gwyn ([EMAIL PROTECTED]) wrote:
: The original papers when published in the open
: literature around 1967 had all cryptologic applications
: carefully expunged, alas.
Well, that's understandable: but if one, from the mathematical literature,
understands the concepts involved, I would think that coming up with at
least some of the simpler cryptologic applications might not be too hard
(even if the most important ones might be missed for a while).
At the moment, I only recall the very simple example of a Markov model
given in Scientific American wherein one, starting with balls numbered
from 00 to 99 in one of two containers, switched a ball from one container
to the opposite one when its number was selected.
That nicely illustrated how pressure would equalize between two vessels
after the valve between them was opened, so it was nice for understanding
thermodynamics and statistical mechanics, but _that_ kind of model
certainly was far away from anything cryptological. Unless one is
enciphering cocktail party conversation which is starting to get boring.
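That urn model (the Ehrenfest model) fits on one screen; a hedged sketch in
Python, with made-up parameters:

```python
import random

def ehrenfest(n_balls=100, steps=2000, seed=0):
    """Ehrenfest urn: each step, pick a ball uniformly at random and move
    it to the other container. Returns the trajectory of container-A counts."""
    rng = random.Random(seed)
    in_a = [True] * n_balls          # start with every ball in container A
    count = n_balls
    history = [count]
    for _ in range(steps):
        i = rng.randrange(n_balls)   # "its number was selected"
        in_a[i] = not in_a[i]        # switch it to the opposite container
        count += 1 if in_a[i] else -1
        history.append(count)
    return history
```

The count drifts from 100 toward the 50-50 equilibrium and then fluctuates
around it - exactly the pressure-equalization picture, and indeed nothing
cryptological.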
John Savard
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************