Cryptography-Digest Digest #280, Volume #10 Mon, 20 Sep 99 14:13:03 EDT
Contents:
Re: Okay "experts," how do you do it? (Sundial Services)
Re: some information theory (Tim Tyler)
Re: Large number arithmetic (Anton Stiglic)
Re: Okay "experts," how do you do it? ([EMAIL PROTECTED])
Re: Okay "experts," how do you do it? ([EMAIL PROTECTED])
Re: Okay "experts," how do you do it? (Patrick Juola)
Re: unix clippers that implement strong crypto. (SCOTT19U.ZIP_GUY)
Re: Comments on ECC (DJohn37050)
Re: FPGAs (Medical Electronics Lab)
Re: Okay "experts," how do you do it? (Tom St Denis)
AES finalists AMD Athlon performance? (David Crick)
Re: Comments on ECC ("Joseph Ashwood")
----------------------------------------------------------------------------
Date: Mon, 20 Sep 1999 08:26:39 -0700
From: Sundial Services <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: Okay "experts," how do you do it?
Ahh yes, the NP-complete problem.
Okay, then, "for-GET how it works." Let's look only at the input, the
output, and the key. Let's pretend we cannot determine the algorithm.
What is it about an algorithm's input and output that enables us to say
that it is a "good" encryption algorithm? What is it about the
algorithm's dependence upon its two inputs, 'p' and 'k', as realized in
the output 'c', that makes the algorithm a "good" one or a "bad" one?
> Patrick Juola wrote:
>
> > There OUGHT to be an objective test-bed that we can plug these
> >algorithms into, to test them.
>
> Actually, it's much harder than you think to come up with an algorithm
> for analyzing other algorithms. In point of fact, that's one of the
> few things that's easy to prove *impossible* in computer science
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: some information theory
Reply-To: [EMAIL PROTECTED]
Date: Mon, 20 Sep 1999 15:46:33 GMT
Anti-Spam <[EMAIL PROTECTED]> wrote:
: Tim Tyler wrote:
:> Anti-Spam <[EMAIL PROTECTED]> wrote:
:> : Tim Tyler wrote:
:> If you have a set of N target data strings in need of compressing, the
:> optimum compression technique (in terms of size) is essentially to map
:> these strings onto the integers from 1 to N.
:>
:> Given a string taken at random from the starting set, the resulting
:> "compressed file" will be indistinguishable from random.
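A minimal sketch of this indexing scheme in Python (the function and variable names here are illustrative assumptions, not from the thread):

```python
import math

def make_codec(strings):
    """Index-based codec for a fixed, known set of N distinct strings.

    Each string maps to an integer index of ceil(log2(N)) bits -- the
    optimum when every string in the set is equally likely.
    """
    ordered = sorted(set(strings))          # any agreed-upon fixed ordering
    index = {s: i for i, s in enumerate(ordered)}
    bits = max(1, math.ceil(math.log2(len(ordered))))

    def compress(s):
        return format(index[s], "0%db" % bits)

    def decompress(b):
        return ordered[int(b, 2)]

    return compress, decompress, bits

# 4 strings -> 2-bit codes; the round trip recovers the original string.
compress, decompress, bits = make_codec(["alpha", "beta", "gamma", "delta"])
```

Note that the resulting codes carry no redundancy at all, which is why (taken over the whole set) they are indistinguishable from random bits.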
: Let me paraphrase, to make sure I understand this assertion:
: Start with a set of N data strings. Assume each data string is an
: encoding for a symbol S.
: I assume this set corresponds then to N symbols ( S1, S2, S3 ... Sn )
: that will appear in messages (compressed data/files). Call this set A.
: Conjecture: the optimum compression (here defined in terms of size of
: the resulting compression-encoded data/file) is achieved when (1) the N
: symbols are encoded as integers 1 through N in the compressed data/file,
: irrespective of the frequency of occurrence of the Nth symbol in any
: particular message.
: Furthermore, (2) the random assignment of the integer index value code
: for the resulting compression encoded data/file will pass all confidence
: tests for random bit strings of the length of the resulting compressed
: data/file.
: Part (1):
: here is the set A: { S1, S2, S3 ... Sn }
: index into set A: 0, 1 , 2 ... N-1 -> N integer values.
: Assume the symbols S1 ... Sn do not occur equiprobably. The maximal
: entropy code (and thus minimal number of bits and thus minimal sized bit
: strings for messages composed of the symbols) requires
: H = - (SUM over i = 1 to n )[ ( probability of Si ) * ( log(base2) of
: probability of Si) ]
: where probability of Si = number of occurrences of the ith symbol / total
: number of symbols that occur. Occur where, how? Look at a
: pre-compressed message M composed of Q occurrences of symbols found in
: the set A. For some messages maybe all of the symbols in A occur. For
: others, maybe only a subset of the symbols in A occur.
: The frequencies of occurrence are a function of the particular message
: M.
: So each symbol can be encoded as one out of 2 ^ H unique possible binary
: strings, with H varying from message M1, M2, M3... (Static and adaptive
: huffman codes create strings shorter and longer than H bits such that
: the *average* number of bits per symbol tends to H bits, to handle the
: fractional bit values given by this formula for maximal entropy
: encoding.)
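The entropy formula above can be evaluated with a short Python sketch, estimating each probability of Si from symbol counts as described (the helper name is illustrative):

```python
import math
from collections import Counter

def entropy_bits_per_symbol(message):
    """H = -(SUM over i)[ p(Si) * log2 p(Si) ], with p(Si) estimated as
    (occurrences of Si) / (total symbols), as in the formula above."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Equiprobable symbols reach the maximum H = log2(N):
assert abs(entropy_bits_per_symbol("abcd") - 2.0) < 1e-9
# A skewed distribution gives H < log2(N), so a fixed-width index code
# is no longer the shortest possible encoding:
assert entropy_bits_per_symbol("aaab") < 2.0
```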
: There are N symbols in the set A. An integer index into the set A
: requires ( log(base 2)of N ) bits to be represented. The use of an
: integer value (index) to code for the symbol in the compressed data/file
: results in the minimal size compressed data/file iff:
: H = ( log(base 2)of N ).
: i.e. this is only true if the probability of occurrence for each Si is the
: same for all symbols. All symbols as they appear in a specific message M
: must be equiprobable for the use of the index into the set A to give the
: minimal size for the compressed data/file.
: If the symbols are not equiprobable, then H bits/symbol < ( log(base
: 2)of N ) bits/symbol. Therefore there is another code with shorter bit
: lengths for the symbols. The use of the index values for elements in
: the set A as a code would not produce the minimal bit length
: compression of the data/file.
: But this is not the original assertion in (1). ***** So what am I
: missing? *****
Sorry if I was not clear. S1, S2, S3 ... Sn must all be different from
one another. If (for example) they are all equal it makes a nonsense
of the numbering scheme I proposed.
If S1, S2 ... Sn are all different, a compression into files of < Log2(n)
bits is completely impossible - as there would not be enough different
resulting compressed files of this size to expand into the originally
specified set of strings - so the one-to-one property would be
impossible.
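This counting (pigeonhole) argument can be made concrete with a trivial Python check:

```python
def distinct_codes(num_bits):
    """Number of distinct bit strings of a given length."""
    return 2 ** num_bits

# With n = 8 distinct source strings, codes shorter than log2(8) = 3 bits
# cannot be one-to-one: there are simply too few of them.
n = 8
assert distinct_codes(2) < n      # only 4 two-bit codes for 8 strings
assert distinct_codes(3) == n     # 3 bits is the minimum that works
```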
: Point (2):
: Given the set A: { S1, S2, S3 ... Sn } and
: index into set A: 0, 1 , 2 ... N-1 -> N integer values.
: Randomly select an index value and its ( log(base 2)of N ) bit string to
: represent (encode) the kth occurrence of a symbol Si in the "compressed"
: data/file as it appears in the original data/file.
: Assume there is a random variable IND, an index value in the range
: [0, N-1]; the pdf of IND is uniformly distributed across the integers 0
: through N-1. Thus the probability of IND taking on any specific index
: value is 1/N.
: Take the message M composed of occurrences of the symbols Si and for each
: symbol, take the ith value of IND as the index into set A and place that
: bit string code into the ith position in the "compressed" data/file:
: message M - (no spaces here really, just spreading 'em out to make it
: easier to read )
: s1 s3 s1 s2 s5 s2 s4 s2 s1 s1 s4 s7 s9 s10 ......
: index value
: 1 3 1 2 5 2 4 2 1 1 4 7 9 10
: value of IND (a random variable )
: 1 6 12 4 2 7 11 3 9 12 2 2 14 8 ..........
: so how do we get to the "compressed" data/file representation? Whatever
: the function, it must be reversible, one-to-one and onto - to ensure we
: successfully "decompress" the result. A random substitution of a new
: index value for the existing index value in the uncompressed data/file
: is not reversible. How would we know what it originally mapped to when
: we decompressed?
: Assume the substitution is equal to a modulo function (this is just one
: possible choice - there are others):
: compressed index = ( original index + IND ) mod N
: "decompression" given by original index = ( compressed index - IND ) mod
: N.
: so "compressed" values are:
: (1 + 1 )mod N, (3 + 6 ) mod N, (1 + 12) mod N, (2 + 4 ) mod N ......
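The modulo combine and its reverse described above can be sketched in Python; the key stream stands in for the sequence of IND values (function names are illustrative):

```python
import random

def combine(indices, key, n):
    """The "compression" step from the post: add key values modulo n."""
    return [(i + k) % n for i, k in zip(indices, key)]

def uncombine(coded, key, n):
    """The reverse step: subtract the same key values modulo n."""
    return [(c - k) % n for c, k in zip(coded, key)]

N = 16
message = [1, 3, 1, 2, 5, 2, 4, 2, 1, 1, 4, 7, 9, 10]  # indices from the post
key = [random.randrange(N) for _ in message]           # the IND sequence
coded = combine(message, key, N)
assert uncombine(coded, key, N) == message             # reversible, one-to-one
```

With a truly random key as long as the message this is, as the post says, a one-time-pad-style cipher rather than a compression.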
: The compressed index is a random variable.
: This isn't acting as a compression, however; it is a cipher. ( If
: compression is supposed to come about due to the use of the index value
: coding scheme, we already showed it doesn't give the shortest bit length
: total for the "compressed" data/file. ) In fact, this is a non-periodic
: polyalphabetic cipher where the key is the total sequence of random
: values of IND and that key is as long as the original message (in terms
: of number of symbols in the original message M.)
: To decode the "compressed" data/file, which in this case is actually
: enciphered, the same key or sequence of values for IND must be
: subtracted.
: Conveying that sequence of random values, as long as the message M in
: terms of symbols Q in it, to the decompressor is equivalent to conveying
: a one-time pad sequence to the recipient to use to decipher it back to
: the original message M.
: Given the key (sequence of IND values) is random, and the index code
: strings are identical lengths, the "compressed" data/file, which is
: actually an enciphered data/file, is random. The compressed index is a
: random variable.
: So, this will pass the tests for randomness since it is fed by a random
: key, the sequence of IND values.
: * AGREED.*
: However, if the "compression" scheme uses some function of the symbol
: frequencies that precede the current symbol to be "compressed":
...then it will undoubtedly fail to optimally compress some types
of file. Consequently I'm not sure the rest of your argument needs
to be addressed.
: IND(k) = function( frequencies of occurrence of some or all previous
: symbols as they appeared in M )
: then the sequence of IND values will be a pseudorandom sequence ( call
: it PIND). There will be correlations in the values of PIND due to the
: non-randomness in the occurrence of symbols in M. Therefore,
: compressed index = ( original index + PIND ) mod N
: "decompression" given by original index = ( compressed index - PIND )
: mod N.
: The compressed index is not a random variable. Each new value of PIND is
: influenced by some or all of the previous frequencies of symbol
: occurrence as represented in the un-"compressed" data/file, and thus this
: shared influence manifests as correlations across the bit sequence in
: the "compressed" data/file. The bit sequences in the "compressed"
: data/file will fail some or all statistical tests for randomness. If
: NONE of the frequencies of symbol occurrence of any of the preceding
: un-"compressed" data/file influences the code chosen for the current
: symbol to "compress" then the result would not be pseudo-random.
[...]
: A deterministic bit stream "combined" with another deterministic bit
: stream in a reversible (one-to-one and onto) process, no matter how
: complex or convoluted, produces another deterministic bit stream
: whose substrings will evidence some correlations.
To put it another way: in order to produce the appearance of randomness
in the resulting compressed file, a compression program applied to
unboundedly large source data itself needs to become unboundedly large.
Optimally compressing large files which are not themselves random becomes
more and more "tricky" the larger the file.
In general I'm talking about compressing files, rather than compressing
streams.
When compressing files you have all the data available at once.
I am far from certain that compressing streams in real time can produce
the appearance of genuine randomness.
--
__________
|im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
Mankind has an incestuous relationship with mother earth.
------------------------------
From: Anton Stiglic <[EMAIL PROTECTED]>
Subject: Re: Large number arithmetic
Date: Mon, 20 Sep 1999 11:57:54 -0400
>
Just to add... if it's Java that you are programming in, Java (2.0 at
least...) has a BigInteger class in the java.math package, which comes
with the JDK.
Anton
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: Okay "experts," how do you do it?
Date: 20 Sep 1999 15:34:29 GMT
Sundial Services <[EMAIL PROTECTED]> wrote:
>> Patrick Juola wrote:
>>
>> > There OUGHT to be an objective test-bed that we can plug these
>> >algorithms into, to test them.
>>
>> Actually, it's much harder than you think to come up with an algorithm
>> for analyzing other algorithms. In point of fact, that's one of the
>> few things that's easy to prove *impossible* in computer science
> Ahh yes, the NP-complete problem.
No, not NP-complete. All NP-complete problems are solvable, if you
have enough time (it's just that most people don't want to wait 100
millennia for the answers! :-) ). With faster computers you're able to
solve larger and larger instances of NP-complete problems.
The *impossible* problems mentioned above are impossible no matter how
much time you allow, or how fast a computer you have...
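To illustrate the distinction, here is a brute-force solver for subset sum, a classic NP-complete problem (an illustrative Python sketch, not from the post): it always terminates with an answer, it just takes exponential time.

```python
from itertools import combinations

def subset_sum(values, target):
    """Brute-force subset sum: tries all 2^n subsets, so it always
    terminates -- eventually -- unlike a genuinely undecidable problem."""
    for r in range(len(values) + 1):
        for combo in combinations(values, r):
            if sum(combo) == target:
                return combo
    return None

# A hit and a miss (illustrative numbers):
assert sum(subset_sum([3, 9, 8, 4, 5, 7], 15)) == 15
assert subset_sum([2, 4], 7) is None
```

No amount of extra time buys you an analogous procedure for an undecidable problem like the Halting problem.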
--
Steve Tate --- srt[At]cs.unt.edu | Gratuitously stolen quote:
Dept. of Computer Sciences | "The box said 'Requires Windows 95, NT,
University of North Texas | or better,' so I installed Linux."
Denton, TX 76201 |
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: Okay "experts," how do you do it?
Date: 20 Sep 1999 16:23:38 GMT
David A Molnar <[EMAIL PROTECTED]> wrote:
> jerome <[EMAIL PROTECTED]> wrote:
>> obviously, maybe you are well known because you are good and the referee
>> may not be influenced by the reputation... hard to say.
> Aren't papers supposed to be refereed without knowledge of the authors'
> names?
It depends on the area... I've refereed many, many papers, and don't
think I've ever done one for a conference or journal that did
"anonymous" refereeing. Since many of the crypto people come from my
field (theoretical computer science) I'd bet that crypto papers are
not anonymously refereed either.
On the other hand, I have a friend who worked in systems, where they
do try to do anonymous refereeing for some of the conferences.
However, he said it's almost always obvious who the authors of papers
are, even though the names are not attached (it seems like it would be
particularly hard in systems, where people establish distinctive
testbeds -- referring to your setup is almost like putting a big sign
on the paper as to where the work was done).
--
Steve Tate --- srt[At]cs.unt.edu | Gratuitously stolen quote:
Dept. of Computer Sciences | "The box said 'Requires Windows 95, NT,
University of North Texas | or better,' so I installed Linux."
Denton, TX 76201 |
------------------------------
From: [EMAIL PROTECTED] (Patrick Juola)
Subject: Re: Okay "experts," how do you do it?
Date: 20 Sep 1999 11:47:08 -0400
In article <[EMAIL PROTECTED]>,
Sundial Services <[EMAIL PROTECTED]> wrote:
>Ahh yes, the NP-complete problem.
Well, close. Actually, the Halting problem. But from a similar body
of results and probably discussed in the same textbook.
>Okay, then, "for-GET how it works." Let's look only at the input, the
>output, and the key. Let's pretend we cannot determine the algorithm.
>
>What is it about an algorithm's input and output that enables us to say
>that it is a "good" encryption algorithm? What is it about the
>algorithm's dependence upon its two inputs, 'p' and 'k', as realized in
>the output 'c', that makes the algorithm a "good" one or a "bad" one?
Formally? The ratio between the amount of work necessary to determine
p and the amount of work necessary to determine p *when given k*.
(Or at least, that's one formalization; there are others.)
Unfortunately, this will quickly degenerate into one of those nasty
unsolvable problems; "Does the ratio exceed <some constant>?" is
not a problem admitting of a trivial solution, and hence is not
solvable by a Turing machine.
Note -- if it were solvable without looking at the details of the
algorithm, then it would also be solvable while allowing looking
at the details of the algorithm (you just look but ignore), so
the Halting problem still applies. More generally, I think we've
just proven that *completely* automated cryptanalysis is impossible.
There's also the basic problem that we have no way of determining
*lower* bounds for the amount of work required; in point of fact,
the "lucky guess" algorithm will *always* work in O(1) time and
memory, but it's not (yet) an algorithm that anyone knows how to
implement. This is, of course, Mr. Ritter's point.
-kitten
>> Patrick Juola wrote:
>>
>> > There OUGHT to be an objective test-bed that we can plug these
>> >algorithms into, to test them.
>>
>> Actually, it's much harder than you think to come up with an algorithm
>> for analyzing other algorithms. In point of fact, that's one of the
>> few things that's easy to prove *impossible* in computer science
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Crossposted-To: comp.security.unix
Subject: Re: unix clippers that implement strong crypto.
Date: Mon, 20 Sep 1999 17:45:11 GMT
In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] (Terry Ritter) wrote:
>As I recall, a small data-compression company (STAC?) actually won
>their $100M patent infringement case against Microsoft, and in almost
>record time.
Was that Stacker, and did they really win, or did Microsoft just absorb
them? Where is that company today?
>
>> I do think you do care about cryptography and should be proud but I for one
>>think patents only help the rich steal from the poor.
>
>Big companies *generally* steal from little ones. That is the way it
>works. Big companies can wait for small companies to show new
>technology, then steal it, and with overwhelming marketing, dominate
>the field. Patents stand in the way of that. A big company which
>ignores a patent can be accumulating very significant damages --
>enough even to interest a third party with sufficient resources to
>prevail in court.
>
>Patents are a tool to protect *small* companies, because big companies
>don't need patents to create a monopoly.
>
>---
Since you say patents are a tool to protect "small" companies, I would
like to know just what it costs, to the nearest three figures, to hire a
lawyer and get a patent. Since you have done this several times, maybe you
can give a cash and time cost. You also did not answer how many times you
have sued and won with your patents, or whether any big company has
decided to challenge your small company in court.
Thank you,
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
http://members.xoom.com/ecil/index.htm
NOTE: EMAIL address is for SPAMMERS
------------------------------
From: [EMAIL PROTECTED] (DJohn37050)
Subject: Re: Comments on ECC
Date: 20 Sep 1999 16:58:10 GMT
Certicom has had an ECC challenge out for a few years and has had commercial
implementations of ECC before that. So far, the best attacks on the ECC
challenges align with theory very well.
The truth is that the known best attacks on IFP (RSA), DLP (DSA) and ECDLP
(ECDSA) involve sophisticated math. An expert in one area should not
necessarily be considered an expert in another, except that Pollard's name
comes up everywhere. Especially when one considers the source of the quoted
statement (the "A" in RSA), it should be taken with a grain of salt, as ECC is
an algorithmic competitor to RSA and is "stronger, shorter, faster, etc."
And there are rumors that some of the ECC naysayers are coming around.
Certainly NIST's endorsement of ECC for government use says a lot. Watch this
space.
Don Johnson
------------------------------
From: Medical Electronics Lab <[EMAIL PROTECTED]>
Subject: Re: FPGAs
Date: Mon, 20 Sep 1999 12:01:41 -0500
Kasper Pedersen wrote:
>
> Arthur Dardia <[EMAIL PROTECTED]> wrote in message
> news:[EMAIL PROTECTED]..
> > I've recently acquired 5 Xilinx FPGAs. The numbers on the chip look like
> this:
> >
> > XC4005E
> > PC84CKJ9637
> > A64196A
> > 3C
> > Anyone know how fast these things are? What if I ran them in parallel?
>
> > anyone point me to any resources on how to program these? They were given
> to me
> > by a friend who used to work at National Semiconductor. I have 2-3 test
> boards
>
> Good friend. The Xilinx Foundation Series software might work, if you are
> able to acquire it.
> What you need is either schematic entry, or a good VHDL compiler.
Check out Digi-key. They sell Xilinx parts, development systems and
sockets. The low end Foundation Series should work fine with schematic
entry, but if you have more money the VHDL compiler gives you more
control.
Patience, persistence, truth,
Dr. mike
------------------------------
From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: Okay "experts," how do you do it?
Date: Mon, 20 Sep 1999 16:50:28 GMT
In article <7rta3o$1mdk$[EMAIL PROTECTED]>,
[EMAIL PROTECTED] (SCOTT19U.ZIP_GUY) wrote:
> In article <[EMAIL PROTECTED]>, Eric
>Lee Green <[EMAIL PROTECTED]> wrote:
> >"SCOTT19U.ZIP_GUY" wrote:
> >> In article
> > <[EMAIL PROTECTED]>, Eric Lee
> > Green <[EMAIL PROTECTED]> wrote:
> >> >Whoops, I forgot part (d), which is when you pay the "real" experts real
> >> >money to cryptanalyse your product prior to its release. Depending upon
> >> >the importance of the crypto component of your product, that may be
> >> >money well spent (or maybe not). I know that Microsoft probably wishes
> >> >today that they'd hired Bruce to cryptanalyse MSCHAP-80 prior to its
> >> >release...
> >>
> >> How do you know they didn't hire him?
> >
> >Because he and Mudge tore MSCHAP-80 to shreds. See
> >http://www.counterpane.com for details. Basically, he blasted Microsoft
> >as being a bunch of amateurs, and insinuated that no real expert would
> >have put out such a piece of crud.
> Maybe he set them up and consulted under another name, doing a bad
> job so that his other identity could take the credit for finding the weakness.
> Also, even in school, lots of teachers take credit for what their pupils
> or others do. How can one be sure Mr. B.S. actually blasted them? It is possible
> his handlers attacked it for him so that his spin-doctoring factor could go
> up. Things are not always what they seem to be in the world of crypto; that's
> what makes it so fun.
> >
> >That doesn't sound like he consulted for them!
> >
> >> particular person was. But I doubt if the so-called experts would really want
> >> that. Because it's better to have a good line of BS than to really know
> >> anything.
> >
> >That, alas, IS a problem. A good line of bull is enough to snow many of
> >the pointy-haired bosses out there. But there's a way to tell whether
> >someone is relatively reputable or not. a) Does he have a lot of mention
> >in the literature? b) Does he have references willing to stand up and
> >say he did good work? c) Do your own in-house experts trust him once
> >they've quizzed him? etc. etc. etc. It's just like hiring an employee,
> >and like when hiring an employee, too many people fall for a good line
> >of bullshit, but if you're willing to work at it, you can tell the bull
> >from the beef.
>
> Having worked for the government over the years, I have met all types,
> even those with excellent credentials who knew almost nothing. I used
> to believe what you said above, but not any more. Like inertia: once
> someone with a good load of bull starts moving, it is hard for the truth
> to slow them down, and they easily get more of the credentials that you
> think mean so much.
And what are your credentials? For all we know you could be a 17-year-old
nerd kid. Why are you pursuing this? Are you jealous that Bruce knows
something and you don't?
BTW This is highly OT stop talking in this thread, try
news:alt.total.paranoid for more info.
Tom
Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.
------------------------------
From: David Crick <[EMAIL PROTECTED]>
Subject: AES finalists AMD Athlon performance?
Date: Mon, 20 Sep 1999 18:18:40 +0100
Does anyone know of actual/predicted performance of the AES
finalists on the AMD Athlon processor?
David.
--
+-------------------------------------------------------------------+
| David Crick [EMAIL PROTECTED] http://members.tripod.com/vidcad/ |
| Damon Hill WC96 Tribute: http://www.geocities.com/MotorCity/4236/ |
| M. Brundle Quotes: http://members.tripod.com/~vidcad/martin_b.htm |
| ICQ#: 46605825 PGP Public Keys: RSA 0x22D5C7A9 DH/DSS 0xBE63D7C7 |
+-------------------------------------------------------------------+
------------------------------
From: "Joseph Ashwood" <[EMAIL PROTECTED]>
Subject: Re: Comments on ECC
Date: Mon, 20 Sep 1999 11:01:45 -0700
Just a small addendum.
I will grant that ECC is stronger today than RSA is, and I will grant that
certain proofs regarding the relative ease of break between factoring and
finite fields exist, or can be done. But I find no reason not to remain
convinced that eventually factoring could very easily become
sub-exponential, and that by relation finite fields could also easily become
sub-exponential; I simply am in no position to speculate on _when_ that will
occur.
Of course, you should also consider that this is coming from someone who
remains convinced of NP-completeness, but who remains unconvinced that the
finding of such an algorithm will take a finite amount of time.
Joseph
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************