Cryptography-Digest Digest #263, Volume #13 Sun, 3 Dec 00 04:13:00 EST
Contents:
Re: Entropy paradox ("John A. Malley")
Re: I need to crack a key (Richard Heathfield)
Re: Newbie (Scott Craver)
Re: The Next Step After OTP (John Savard)
The Next Step After OTP (John Savard)
Re: I need to crack a key (JPeschel)
Re: Pentium 4 and modular exponential (Paul Rubin)
CoderPunks ? (Mark Harrop)
Re: The Next Step After OTP ("Scott Fluhrer")
Re: The Next Step After OTP (Bryan Olson)
Re: Entropy paradox (Mok-Kong Shen)
----------------------------------------------------------------------------
From: "John A. Malley" <[EMAIL PROTECTED]>
Subject: Re: Entropy paradox
Date: Sat, 02 Dec 2000 21:15:03 -0800
"r.e.s." wrote:
>
> "John A. Malley" <[EMAIL PROTECTED]> wrote ...
> [...]
> |
> | The Shannon entropy of X is the expected (average) value
> | of the summation of the -log( Probability[ X = x] ) over
> | the range of x.
> [...]
> | The Shannon entropy of Y is the expected (average) value
> | of the summation of the -log( Probability[ Y = y] ) over
> | the range of y.
> [...]
>
> Those are slips, of course.
The descriptions do sound funny, don't they? But it's just a way to
describe the formula for Shannon entropy in English, without the
frustration of trying to write the formula with symbols assembled from
dashes and backslashes and forward slashes :-)
The definition of an expected value for a random variable A that takes
on values a_1, a_2, a_3 ... a_n is given by
E(A) = summation over i = 1 to n of ( a_i * Probability[ A = a_i] ).
Consider a random variable X that takes on n distinct values x_1, x_2,
x_3 ... x_n. Let A be a random variable defined as
A = -log( Probability[X] )
so the random variable A takes on values a_i = -log(Probability[ X =
x_i]).
How often does A take on a particular value a_i? As often as X takes on
the particular value x_i. So the Probability[ A = a_i ] = Probability[
X = x_i ].
What's the expected value of A?
E(A) = summation over i = 1 to n of ( -log( Probability[ X = x_i ] ) *
Probability[ X = x_i ] )
and that's the Shannon entropy. That's what I tried to say with these
statements
"The Shannon entropy of X is the expected (average) value
of the summation of the -log( Probability[ X = x] ) over
the range of x."
where a_i = -log( Probability[ X = x ] ),
and
"The Shannon entropy of Y is the expected (average) value
of the summation of the -log( Probability[ Y = y] ) over
the range of y."
where a_i = -log( Probability[ Y = y ] ), and in both cases I assumed
the reader would apply the definition for expected value as described
above.
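The equivalence can be checked numerically. Here is a small sketch (the toy
distribution is mine, not from the post), computing the entropy once as the
direct sum and once as the expected value of A = -log( Probability[X] ):

```python
import math

# A toy distribution for X: values x_i with probabilities p_i.
probs = [0.5, 0.25, 0.125, 0.125]

# Shannon entropy written directly as a sum over the range of X ...
H_direct = -sum(p * math.log2(p) for p in probs)

# ... and as the expected value E(A) of the random variable
# A = -log2(Probability[X]), which takes value -log2(p_i) with
# probability p_i.
a_values = [-math.log2(p) for p in probs]
H_expected = sum(a * p for a, p in zip(a_values, probs))

print(H_direct, H_expected)  # both 1.75 bits for this distribution
```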
Maybe next time I'll just render the formula in piece-part hyphens and
slashes :-)
John A. Malley
[EMAIL PROTECTED]
------------------------------
Date: Sun, 03 Dec 2000 06:23:25 +0000
From: Richard Heathfield <[EMAIL PROTECTED]>
Subject: Re: I need to crack a key
Tom St Denis wrote:
>
> In article <90c8id$idj$[EMAIL PROTECTED]>,
> "r.e.s." <[EMAIL PROTECTED]> wrote:
> >
> > "Tom St Denis" <[EMAIL PROTECTED]> wrote ...
> > [...]
> > | Only lower case chars:
> > | 1. Length of 7 chars, # of keys = 8,031,810,176 (2^33)
> > | 2. Length of 14 chars, # of keys = 64,509,974,703,297,150,976
> > (2^65)
> > [...]
> > | Lower+Upper
> > | 1. Length of 7 chars, # of keys = 1,028,071,702,528 (2^40)
> > | 2. Length of 14 chars, # of keys =
> > 1,056,931,425,538,820,521,590,784
> > | (2^80)
> >
> > Sorry to nitpick, but ...
> >
> > n 2^n
> > ------------------------------
> > 33 8589934592
> > 40 1099511627776
> > 65 36893488147419103232
> > 80 1208925819614629174706176
>
> I truncated the fractions... my point was that there are lots of keys...
>
> (e.g 26^14 ~ 2^65.8 ~ 2^65)
Then it might have been an idea to provide only an appropriate number of
significant figures. In this case, that would appear to be 1 SF in three
cases, and - um - 0 SF in the fourth. ;-)
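For anyone who wants to reproduce the nitpick, the exact key counts and
their base-2 logarithms are a one-liner (a quick sketch, not from either
post):

```python
import math

# Alphabet sizes and passphrase lengths from the thread:
# 26 = lowercase only, 52 = lowercase + uppercase.
for alphabet, length in [(26, 7), (26, 14), (52, 7), (52, 14)]:
    keys = alphabet ** length
    print(f"{alphabet}^{length} = {keys:,} ~ 2^{math.log2(keys):.1f}")
```

This makes the rounding visible: 26^14 is about 2^65.8, so "2^65"
truncates nearly a full bit.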
--
Richard Heathfield
"Usenet is a strange place." - Dennis M Ritchie, 29 July 1999.
C FAQ: http://www.eskimo.com/~scs/C-faq/top.html
K&R answers, C books, etc: http://users.powernet.co.uk/eton
------------------------------
From: [EMAIL PROTECTED] (Scott Craver)
Subject: Re: Newbie
Date: 3 Dec 2000 06:24:12 GMT
Michael <[EMAIL PROTECTED]> wrote:
>
>In your worst case scenario, someone has to be there waiting to take
>advantage of it.
>I don't think that is reality (in this real world scenario.)
>I do understand what you are saying.
Hi,
Allow me to provide you with an illustrative real-world
example. Recently, the recording industry has been
developing a system called SDMI (the Secure Digital Music
Initiative), in which secret messages are embedded in
the music on Compact Discs. The idea is that your
CD player/MP3 player at home will have a program in it
that will find and decrypt those messages, and
could refuse to play or record music on the basis of
the hidden message. The main goal, I guess, is to
keep people from making unauthorized copies of music.
But notice! All these devices have to know how to
extract the hidden messages. These devices are like
your friends, with whom you share your algorithm.
The industry can't use a changing secret key or
password, because all MP3 players will have to be
able to extract the hidden data from all music clips.
So it's just the algorithm, and it must be kept secret.
We analyzed their system as part of a challenge, in
which they wouldn't describe to anyone the algorithm,
of course. Yet hackers were able to determine quite
a bit about some of the methods involved. In one
case (I swear, the technical report is on its way)
an elaborate method of hiding data could be seen clear
as a bell by computing a plain old freq. response.
The method was so unique that it only took a short
search to find the ====> US PATENT <==== belonging to
the company, describing it in relative detail. Um, oops.
It is more realistic to expect an algorithm to be
compromised, or even reverse-engineered by good
cryptanalysis, than it is to rely on security through
obscurity.
-S
------------------------------
From: [EMAIL PROTECTED] (John Savard)
Subject: Re: The Next Step After OTP
Date: Sun, 03 Dec 2000 06:44:32 GMT
On Sun, 03 Dec 2000 06:42:35 GMT, [EMAIL PROTECTED]
(John Savard) wrote, in part:
>Inspect the following table:
> Key
>P A B C D E F G H I J K L
>------------------------------------------
>00 | 00 01 10 11 00 01 10 11 00 01 10 11
>01 | 01 00 11 10 10 11 00 01 11 10 01 00
>10 | 10 11 00 01 11 10 01 00 01 00 11 10
>11 | 11 10 01 00 01 00 11 10 10 11 00 01
Incidentally, although this is an application of orthogonal Latin
squares, I don't _think_ it's one Terry Ritter has invented...although
it could have been invented long ago before both of us.
John Savard
http://home.ecn.ab.ca/~jsavard/crypto.htm
------------------------------
From: [EMAIL PROTECTED] (John Savard)
Subject: The Next Step After OTP
Date: Sun, 03 Dec 2000 06:42:35 GMT
The most common means of carrying out a stream cipher is to XOR the
output of a PRNG with the plaintext.
This, like the classical methods of applying an OTP by addition or
XOR, is vulnerable to the bit-flipping attack.
In general, the method for avoiding a bit-flipping attack is either to
switch to the use of a block cipher, or to use a complicated stream
cipher such as a simulated rotor machine.
However, particularly in the case where one wishes to retain the
absolute security of the OTP, these alternatives seem too complicated,
or less secure. One could combine an OTP with a block cipher.
But is there an approach similar to the OTP which also protects
against bit-flipping?
Inspect the following table:
Key
P A B C D E F G H I J K L
==========================================
00 | 00 01 10 11 00 01 10 11 00 01 10 11
01 | 01 00 11 10 10 11 00 01 11 10 01 00
10 | 10 11 00 01 11 10 01 00 01 00 11 10
11 | 11 10 01 00 01 00 11 10 10 11 00 01
If the one-time pad consists of random letters from the set A to L,
each used to encipher a pair of bits, then not only is the ciphertext
pair of bits any of the four possibilities with equal probability, but
in addition, for each possible plaintext-ciphertext pair, the three
possible alphabets take the three other ciphertexts to the three other
possible plaintexts.
Thus, if one alters any part of a ciphertext with a known plaintext,
the bit pairs are altered to _one_ of the three possible other values,
but there is no way to know which one.
It is not necessarily an absolutely minimal method to achieve this:
one could likely use a smaller table if one was encrypting in base 3,
using a pad with six possible characters, having equivocation between
the two other plaintexts if the ciphertext is altered.
Because of the way the alphabets are grouped above, one could even use
a one-time pad to provide perfect secrecy for the message combined
with a conventional stream cipher to protect against bit-flipping: the
stream cipher would produce base-3 symbols selecting one of the groups
(A,B,C,D), (E,F,G,H), (I,J,K,L), and the one-time pad would select the
letter within the group.
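Both claimed properties of the table can be verified mechanically. The
sketch below transcribes the table by columns (00..11 written as 0..3) and
checks that (a) under a uniformly random key letter every ciphertext is
equally likely, and (b) the three keys consistent with any
plaintext-ciphertext pair decrypt every other ciphertext to the three
other plaintexts:

```python
from itertools import product

# Columns A..L of the table in the post; entry [p] is the ciphertext
# for plaintext p, with the bit pairs 00,01,10,11 encoded as 0..3.
table = {
    'A': [0, 1, 2, 3], 'B': [1, 0, 3, 2], 'C': [2, 3, 0, 1], 'D': [3, 2, 1, 0],
    'E': [0, 2, 3, 1], 'F': [1, 3, 2, 0], 'G': [2, 0, 1, 3], 'H': [3, 1, 0, 2],
    'I': [0, 3, 1, 2], 'J': [1, 2, 0, 3], 'K': [2, 1, 3, 0], 'L': [3, 0, 2, 1],
}

# (a) For each plaintext, each ciphertext occurs under exactly 3 of
# the 12 key letters, so a random key gives a uniform ciphertext.
for p in range(4):
    counts = [sum(1 for k in table if table[k][p] == c) for c in range(4)]
    assert counts == [3, 3, 3, 3]

# (b) For each (plaintext, ciphertext) pair there are exactly 3
# consistent keys, and they take any other ciphertext to the three
# other plaintexts -- so a bit-flipper cannot control the outcome.
for p, c in product(range(4), repeat=2):
    keys = [k for k in table if table[k][p] == c]
    assert len(keys) == 3
    for c2 in range(4):
        if c2 != c:
            decrypts = {table[k].index(c2) for k in keys}
            assert decrypts == set(range(4)) - {p}

print("both properties hold")
```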
John Savard
http://home.ecn.ab.ca/~jsavard/crypto.htm
------------------------------
From: [EMAIL PROTECTED] (JPeschel)
Date: 03 Dec 2000 07:20:51 GMT
Subject: Re: I need to crack a key
[EMAIL PROTECTED] writes, in part:
>I packed away som important stuff in a PGP-disk file (using Network
>Associates PGP disk 6.0.2i).
>The passphrase is rather short, so I reckon it will probably not take a
>brute force password cracker long to hammer all theoretical
>combinations away..
>
Your best bet might be a dictionary attack.
Create a dictionary with passphrases that are
similar to the one you used. Then mutate and scramble
the dictionary. As for the cracking program
itself -- I think AccessData has, or had, a free
one.
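A sketch of the kind of mutation Joe describes -- the specific rules here
(case variants, leetspeak substitutions, appended digits) are illustrative,
not taken from any particular cracking tool:

```python
def mutations(phrase):
    """Generate simple variants of a candidate passphrase."""
    variants = {phrase, phrase.lower(), phrase.upper(), phrase.capitalize()}
    # Common character substitutions (leetspeak-style).
    subs = str.maketrans({'a': '@', 'e': '3', 'i': '1', 'o': '0', 's': '$'})
    variants.add(phrase.translate(subs))
    # Appended digits are a frequent habit.
    for digit in '0123456789':
        variants.add(phrase + digit)
    return sorted(variants)

for candidate in mutations("secret"):
    print(candidate)
```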
Joe
__________________________________________
Joe Peschel
D.O.E. SysWorks
http://members.aol.com/jpeschel/index.htm
__________________________________________
------------------------------
From: Paul Rubin <[EMAIL PROTECTED]>
Subject: Re: Pentium 4 and modular exponential
Date: 02 Dec 2000 23:24:15 -0800
Wei Dai <[EMAIL PROTECTED]> writes:
> The 32x32 -> 64 packed multiply instruction (PMULUDQ) in SSE2 is
> clearly designed with modular exponentiation in mind.
What makes you say that??? If it were made for big-integer arithmetic
it would have a wide carry register or large accumulator. The best
architecture I've seen yet for modexp is the Motorola 56000 series DSP.
This can start a 24x24 multiply-accumulate every cycle, with a 56 bit
accumulator, so you can add up to 256 partial products before you
have to worry about carry overflow.
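The arithmetic behind that limit is easy to check: a 24x24 unsigned
multiply yields at most a 48-bit product, so a 56-bit accumulator leaves 8
guard bits, and 2^8 = 256 worst-case products fit before overflow. A quick
sketch (treating everything as unsigned magnitudes for simplicity):

```python
max_product = (2**24 - 1) ** 2   # largest 24x24 unsigned product (< 2^48)
acc_limit = 2**56                # capacity of a 56-bit accumulator

# 256 worst-case partial products still fit ...
assert 256 * max_product < acc_limit
# ... but a 257th would overflow.
assert 257 * max_product >= acc_limit
print("256 partial products fit in a 56-bit accumulator")
```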
------------------------------
Date: Sun, 03 Dec 2000 18:58:09 +1100
From: Mark Harrop <[EMAIL PROTECTED]>
Subject: CoderPunks ?
Hi all...
Could some kind soul pls email me, or post here, the address and commands
to join the CODERPUNKS list ?
Cheers!
Mark Harrop
[EMAIL PROTECTED]
Moderator of the following Programming Lists:
Send an empty message to:
[EMAIL PROTECTED]
[EMAIL PROTECTED]
[EMAIL PROTECTED]
[EMAIL PROTECTED]
[EMAIL PROTECTED]
------------------------------
From: "Scott Fluhrer" <[EMAIL PROTECTED]>
Subject: Re: The Next Step After OTP
Date: Sun, 3 Dec 2000 00:07:42 -0800
John Savard <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> The most common means of carrying out a stream cipher is to XOR the
> output of a PRNG with the plaintext.
>
> This, like the classical methods of applying an OTP by addition or
> XOR, is vulnerable to the bit-flipping attack.
>
> In general, the method for avoiding a bit-flipping attack is either to
> switch to the use of a block cipher, or to use a complicated stream
> cipher such as a simulated rotor machine.
Actually, the usual approach to avoid bit-flipping (or any other form of
active attack) is to add a MAC (message authentication code). A MAC is a
cryptographic object that takes a message and a key and produces a
"checksum". Both sides share a key. The sender runs the MAC on the data
and his key, and appends the checksum to the message. The receiver runs the
MAC on the received data[1] and his key, and compares that to the appended
checksum. If they don't match, the message is rejected as a forgery.
The fundamental property of a secure MAC is that an attacker cannot produce
a new valid data/checksum pair with nontrivial probability, even with mounds
of valid data/checksum pairs.
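In practice one usually reaches for a standard keyed construction such as
HMAC. A minimal sketch of the send/verify flow Scott describes, using
Python's stdlib hmac module (the key and message are invented for
illustration):

```python
import hmac
import hashlib

key = b"shared secret key"            # both sides hold this
message = b"pay alice 100 dollars"

# Sender: run the MAC on the data and the key, append the checksum.
tag = hmac.new(key, message, hashlib.sha256).digest()
packet = message + tag

# Receiver: recompute the MAC over the data only (not the appended
# checksum) and compare in constant time.
data, received_tag = packet[:-32], packet[-32:]
expected = hmac.new(key, data, hashlib.sha256).digest()
assert hmac.compare_digest(received_tag, expected)

# A bit-flipped packet no longer matches its checksum and is rejected.
tampered = bytes([packet[0] ^ 1]) + packet[1:]
data, received_tag = tampered[:-32], tampered[-32:]
forged_ok = hmac.compare_digest(
    hmac.new(key, data, hashlib.sha256).digest(), received_tag)
assert not forged_ok
print("valid packet accepted, tampered packet rejected")
```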
>
> Thus, if one alters any part of a ciphertext with a known plaintext,
> the bit pairs are altered to _one_ of the three possible other values,
> but there is no way to know which one.
You mean that the attacker has a 33% chance of being able to forge the
message he wants (not to mention he has a 100% chance of passing some
altered plaintext to the application)? That's a really low security
guarantee...
[1] But not on the appended checksum, of course
--
scott
------------------------------
From: Bryan Olson <[EMAIL PROTECTED]>
Subject: Re: The Next Step After OTP
Date: Sun, 03 Dec 2000 08:37:27 GMT
John Savard wrote:
> The most common means of carrying out a stream cipher is
> to XOR the output of a PRNG with the plaintext.
>
> This, like the classical methods of applying an OTP by
> addition or XOR, is vulnerable to the bit-flipping attack.
The OTP is typically presented as a secrecy system,
in which case the bit-flipping attack is irrelevant.
> In general, the method for avoiding a bit-flipping attack is either to
> switch to the use of a block cipher, or to use a complicated stream
> cipher such as a simulated rotor machine.
I disagree. The solution is to add an authentication
mechanism. All of the three major classes of ciphers (OTP,
complexity based symmetric, and public-key) also offer
signature schemes.
[...]
> But is there an approach similar to the OTP which also protects
> against bit-flipping?
Yes. How to use an OTP for provable authentication has been
described here several times, and of course I'm most
familiar with my own posts. One of those is:
http://www.deja.com/getdoc.xp?AN=556818127
I dislike your proposed solution because it does not detect
forgeries. It is still vulnerable to bit-flipping, though
the attacker cannot choose the decrypted outcome.
Still, your solution has a similarity to the OTP
authentication method of using universal hash functions.
(Some authors adopt a more general notion of universal
hash functions, in which the ones I'm talking about
are called "2-universal".)
> Inspect the following table:
> Key
> P A B C D E F G H I J K L
> ------------------------------------------
> 00 | 00 01 10 11 00 01 10 11 00 01 10 11
> 01 | 01 00 11 10 10 11 00 01 11 10 01 00
> 10 | 10 11 00 01 11 10 01 00 01 00 11 10
> 11 | 11 10 01 00 01 00 11 10 10 11 00 01
>
> If the one-time pad consists of random letters from the set A to L,
> each used to encipher a pair of bits, then not only is the ciphertext
> pair of bits any of the four possibilities with equal probability, but
> in addition, for each possible plaintext-ciphertext pair, the three
> possible alphabets take the three other ciphertexts to the three other
> possible plaintexts.
Suppose we set aside for the moment the question of secrecy,
and look only at authentication. We can use your table to
specify the signature rather than the ciphertext. For
example, if the text is 01 and the signing-key is C, then
the signature is 11.
Now if the attacker knows the message and the signature, he
can determine that the signing key was one of C, F or I. He
cannot tell which one. If he wants to change the message to
any of the others, he cannot determine the proper signature,
since C, F and I will always induce three different
signatures.
The table falls short of being a universal hash function; in
a universal hash all four 2-bit signatures must be equally
likely on each altered message (even given the true message
and its signature).
In practice we would of course use a signature larger than
two bits. We might use 80-bit signatures, regardless of
message length. That would require at least 160 bits of key
- here's why: Without knowing the key, any 80-bit signature
must be equally likely on the true message. Thus seeing the
actual signature resolves 80 bits of key entropy. In order
to make each signature equally likely on another message,
(even after 80 bits has been resolved) there must be at
least 80 more bits of key entropy.
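Bryan's twice-the-signature-length key argument is exactly the shape of a
one-time polynomial MAC over a prime field: the signature of m is
(a*m + b) mod p, with a fresh key pair (a, b) per message. A toy sketch
(the prime and parameters are mine, chosen for illustration):

```python
import secrets

P = 2**89 - 1   # a Mersenne prime, comfortably larger than a message block

def keygen():
    """Fresh one-time key: two field elements, twice the signature size."""
    return secrets.randbelow(P), secrets.randbelow(P)

def sign(key, m):
    # 2-universal: for fixed m != m', the pair (sign(key, m), sign(key, m'))
    # is uniform over the key, so seeing one signature reveals nothing
    # about the signature of any altered message.
    a, b = key
    return (a * m + b) % P

key = keygen()
m = 123456789
sig = sign(key, m)
assert sign(key, m) == sig   # verifier recomputes and compares
print("signature:", sig)
```

Seeing (m, sig) pins down b as a function of a, burning one field
element's worth of entropy; the surviving uncertainty in a is what keeps
every signature on an altered message equally likely.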
--Bryan
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Entropy paradox
Date: Sun, 03 Dec 2000 10:00:12 +0100
Bryan Olson wrote:
>
> Mok-Kong Shen wrote:
> >
> >
> > Bryan Olson wrote:
> > >
> > > Mok-Kong Shen wrote:
> > > > "Douglas A. Gwyn" wrote:
> > > > >
> > > > > Mok-Kong Shen wrote:
> > > > > > to generate u bits, with u >> m. We know that (accepting
> > > > > > certain plausible assumptions) the u bits are provably
> > > > > > secure.
> > > > >
> > > > > There is a simple proof that on average at least u-m
> > > > > bits are predictable from knowledge of the other m.
> > > > > So probably you're being confused by the sloppy use of
> > > > > "provably secure". You need to refer to the *specific*
> > > > > theorem that has been proven in order to figure out
> > > > > exactly what is meant by that phrase.
> > > >
> > > > That would mean any m bits suffice to predict the rest
> > > > of u (u could be made fairly large!).
> > >
> > > Not true. The m bits may have less than m bits of entropy.
> >
> > See below.
>
> Nothing of consequence there. The rest of u need not
> be predictable even with unbounded computation.
>
> > > > But this can only
> > > > mean theoretical predictability, not practical predictability,
> > > > for otherwise one can't use any output longer than m and
> > > > the generator would be useless (for one could just as well
> > > > use the original m bits). Anyway, I suppose it is not
> > > > necessary to argue further about the paradox, see the
> > > > follow-up of John A. Malley.
> > >
> > > That follow-up was serious?????
> > >
> > > The correct answer has appeared over and over: entropy and
> > > polynomial-time predictability are not the same. Deterministic
> > > computation cannot increase the entropy of it's starting
> > > state, but there's no similar result about computation-limited
> > > predictability.
> >
> > I suppose that you don't doubt that the paper cited was
> > serious????? (That was a published paper in a well-known
> > journal written by persons having some good names!)
>
> Of course the paper was serious. It re-enforces what
> people have been telling you all along.
It reinforces what I maintained, namely that there is another
quantity, rather similar to the theoretical 'entropy'
but different, and of use in practical considerations.
M. K. Shen
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************