Cryptography-Digest Digest #523, Volume #14 Tue, 5 Jun 01 11:13:01 EDT
Contents:
Re: Def'n of bijection (Tim Tyler)
Re: about DH parameters & germain primes (Bob Silverman)
Re: Def'n of bijection ([EMAIL PROTECTED])
Re: Keyed hash functions (Tim Tyler)
Re: Def'n of bijection ([EMAIL PROTECTED])
Re: Def'n of bijection (SCOTT19U.ZIP_GUY)
Re: Def'n of bijection (Tim Tyler)
Re: Def'n of bijection (Tim Tyler)
Re: Def'n of bijection ([EMAIL PROTECTED])
Re: PRP => PRF (TRUNCATE) ("Scott Fluhrer")
Re: Def'n of bijection ([EMAIL PROTECTED])
----------------------------------------------------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Def'n of bijection
Reply-To: [EMAIL PROTECTED]
Date: Tue, 5 Jun 2001 13:00:06 GMT
Tom St Denis <[EMAIL PROTECTED]> wrote:
: "JPeschel" <[EMAIL PROTECTED]> wrote in message
:> Reviewing, from time to time, the first few chapters from Menezes's book
:> might be a good way to reinforce the notation. Beats being corrected.
: Good tip. Menezes's book is... HAC?
Very likely: HAC [http://cacr.math.uwaterloo.ca/hac/]
- by Alfred J. Menezes, Paul C. van Oorschot and Scott A. Vanstone
--
__________
|im |yler [EMAIL PROTECTED] Home page: http://alife.co.uk/tim/
------------------------------
From: [EMAIL PROTECTED] (Bob Silverman)
Subject: Re: about DH parameters & germain primes
Date: 5 Jun 2001 06:12:42 -0700
quequ <[EMAIL PROTECTED]> wrote in message
news:<[EMAIL PROTECTED]>...
> Hi, I've found this tip about Diffie-Hellman parameters:
>
> "If p, (p-1)/2 both prime, then you can just use any
> g you please [other than 0, 1, and -1], and you'll
> get a very large order [at least (p-1)/2]"
>
> Is that right?
Ask yourself: what are the *possible* orders of the group?
0 is not in the group. The set {1,-1} forms a sub-group of order 2,
whose index is (p-1)/2. What other sub-groups are there?
Look up "Lagrange's Theorem".
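For a toy safe prime, the possibilities can be checked directly. A small Python sketch (the naive order loop and the tiny prime are illustrative only, not how you'd do it for cryptographic sizes):

```python
# For a safe prime p = 2q + 1 (q prime), Lagrange's theorem says every
# element order divides p - 1 = 2q, so the only options are 1, 2, q, 2q.
# Only g = 1 has order 1, and only g = p - 1 (i.e. -1) has order 2,
# so every other g has order at least q = (p-1)/2.

def order(g, p):
    """Multiplicative order of g modulo p (naive loop; fine for tiny p)."""
    x, k = g % p, 1
    while x != 1:
        x = (x * g) % p
        k += 1
    return k

p, q = 23, 11                   # 23 = 2*11 + 1, both prime
for g in range(2, p - 1):       # skip 0, 1 and p-1 (= -1 mod p)
    assert order(g, p) in (q, 2 * q)
```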
------------------------------
Subject: Re: Def'n of bijection
From: [EMAIL PROTECTED]
Date: 05 Jun 2001 09:44:10 -0400
Tim Tyler <[EMAIL PROTECTED]> writes:
> [EMAIL PROTECTED] wrote:
>
>: He seems to be trying to create a scheme w.r.t. which every message
>: of size <n is the possible compression of a meaningful message.
>
> Yes, this is essentially the idea David is aiming for.
Note, however, that I ``don't know what I'm talking about.''
> Yes, achieving it for the domain of English language sentences is
> probably an impractical goal. However, note that progress towards the
> goal is itself worthwhile.
Why? What does that goal buy us?
> Even if you don't have "perfect" compression, good compression still
> helps.
Good compression exists. Under various hypotheses, optimal compression
exists. What do you believe is lacking? And what do you think ``perfect
compression'' means?
> If you can't make all the cyphertexts smaller than n decrypt to
> correct-looking messages, increasing the proportion that do may
> still be worthwhile.
Note that even OTP does not do this. All messages of size n are equally
likely as decrypts, but most possible decrypts are not meaningful. This
condition is essentially the best that can be hoped for, and in general
it requires that the keyspace be as large as the message space--i.e.,
that keys be as long as the message.
(Though I conjecture that this property can also be achieved if keys
are only as long as the compressed message. I.e., if keys contain
exactly as much entropy as the original message. But my observation is
probably trite; as a mathematician but a non-cryptologist, I further
conjecture that this is what the theorems on OTP actually say.)
Anyway, that seems to be the problem here: Scott (and some others) are
conflating the notions of ``compression'' with the desirable
properties of an OTP, and then expressing their confused ideas with
confusing language.
Len.
--
Frugal Tip #64:
Find a lucrative new use for mildew.
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Keyed hash functions
Reply-To: [EMAIL PROTECTED]
Date: Tue, 5 Jun 2001 13:41:35 GMT
Mark Wooding <[EMAIL PROTECTED]> wrote:
: Tim Tyler <[EMAIL PROTECTED]> wrote:
:> Are such "keyed hash functions" recognised as a primitive cryptographic
:> type, distinct from MACs?
: I believe that your idea of a `keyed hash' is attempting to capture what
: is usually referred to as a `pseudo-random function family'. A PRF
: family is suitable for use as a MAC. However, not all MACs are good
: PRFs. [...]
: I think there is a hope (rather than anything better informed) that
: HMAC with a decent hash function is a passable PRF. [...]
:> [Note: I *don't* want to talk about hash(CTR|KEY) schemes]
: Very wise.
A helpful post - thanks.
FWIW, my final comment was intended to cover /all/ constructions involving
keying orthodox hash functions, /including/ things like HMAC.
I'd ideally like an aesthetically pleasing construction for generating a
keyed pseudo-random function - and the idea of repeatedly feeding the key
in at the input and increasing the volume of data hashed is not terribly
appealing - especially if it's necessary to use multiple serial hashes
(as HMAC does) in order to get the proper level of security.
Maybe I'm just being squeamish. I can see how using an existing hash
function might save a lot of time, from the designer's point of view.
--
__________
|im |yler [EMAIL PROTECTED] Home page: http://alife.co.uk/tim/
------------------------------
Subject: Re: Def'n of bijection
From: [EMAIL PROTECTED]
Date: 05 Jun 2001 09:57:26 -0400
Tim Tyler <[EMAIL PROTECTED]> writes:
> [EMAIL PROTECTED] wrote:
>: "Tom St Denis" <[EMAIL PROTECTED]> writes:
>:>
>:> My question now is ... replace A..Z with 0..255. Now why isn't CBC mode
>:> encryption a "BIJECTIVE" operation?
>
>: If you are now defining a message as ``any possible array of bytes'', and
>: you are viewing CBC mode as a mapping from possible messages to possible
>: messages, then it is indeed a bijection.
>
> How do you think you encrypt a 2 byte message with a conventional block
> cypher in CBC mode?
Scott (and now you) are correct; I didn't allow for padding. Note that
this is not a very interesting quibble, though. I imagine it's customary
to sidestep the issue by defining a ``message'' as the equivalence class
of messages modulo padding. The only information we lose is how many
trailing bytes were padding, and how many were actually message content
(and the answer is ``none'').
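The quibble can be made concrete with a toy sketch of two padding styles (the block size and scheme names here are illustrative assumptions, not anything from the thread):

```python
BLOCK = 16

def zero_pad(msg: bytes) -> bytes:
    """Naive padding: extend with NUL bytes to a block boundary."""
    n = (-len(msg)) % BLOCK
    return msg + b"\x00" * n

def pkcs7_pad(msg: bytes) -> bytes:
    """PKCS#7-style padding: always add 1..BLOCK bytes, each holding the pad length."""
    n = BLOCK - len(msg) % BLOCK
    return msg + bytes([n]) * n

# Zero-padding is not injective: two distinct messages collide, so CBC
# over zero-padded input cannot be a bijection on arbitrary byte strings.
assert zero_pad(b"hi") == zero_pad(b"hi\x00")

# PKCS#7-style padding is injective, but a 2-byte message still becomes
# a full 16-byte block -- the map is no longer onto same-length strings.
assert pkcs7_pad(b"hi") != pkcs7_pad(b"hi\x00")
assert len(pkcs7_pad(b"hi")) == BLOCK
```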
>: He proposes to ``remedy'' that by first applying a bijection between
>: meaningful messages and possible messages, which he calls ``bicom
>: compression''.
>
> Ah. Now it transpires that you don't know what you're talking about.
> http://www3.sympatico.ca/mtimmerm/bicom/bicom.html
You're right; BICOM is even stupider than I conjectured. He says of BICOM
that ``you can appreciate this bijective quality, because bijectivity
implies that the compression adds no known plaintext to its output.''
There are plenty of ways to suppress known plaintext in a compressed file.
Suggesting a good way is useful, but it's not some sort of breakthrough.
In particular, BICOM merely adds a layer of obscurity to a message; one
simply decompresses the message and THEN checks it for meaningful content.
Not useless, but again not breakthrough material.
Len.
--
Frugal Tip #35:
Don't be afraid to pay extra for quality bungee jumping equipment. In
the long run, you'll save!
------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Def'n of bijection
Date: 5 Jun 2001 14:20:36 GMT
[EMAIL PROTECTED] wrote in <[EMAIL PROTECTED]>:
>In particular, BICOM merely adds a layer of obscurity to a message; one
>simply decompresses the message and THEN checks it for meaningful
>content. Not useless, but again not breakthrough material.
>
But here is where it appears to be a breakthrough as far as open
crypto is concerned. With BICOM, any KEY leads to a file that might
have been compressed and encrypted. With the lesser non-bijective
schemes currently in use this does not happen, and most KEYS
can be rejected right away as impossible. If you have ever played
with common compression-plus-encryption methods you would see that
your solution of "simply decompress" is not the real world
of current open crypto. It's an easy trap to fall into, since
virtually any file compresses (by compress I mean it runs
through the compressor and a file comes out -- though most files get
larger; one just hopes that, for the class of files of interest, they
get smaller). One falsely assumes that most files would also
decompress. The common myth is that you just drop the headers and then
it will decompress nicely. Things are not quite that simple.
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE "OLD VERSION"
http://www.jim.com/jamesd/Kong/scott19u.zip
My website http://members.nbci.com/ecil/index.htm
My crypto code http://radiusnet.net/crypto/archive/scott/
MY Compression Page http://members.nbci.com/ecil/compress.htm
**NOTE FOR EMAIL drop the roman "five" ***
Disclaimer:I am in no way responsible for any of the statements
made in the above text. For all I know I might be drugged or
something..
No I'm not paranoid. You all think I'm paranoid, don't you!
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Def'n of bijection
Reply-To: [EMAIL PROTECTED]
Date: Tue, 5 Jun 2001 14:21:10 GMT
[EMAIL PROTECTED] wrote:
: Tim Tyler <[EMAIL PROTECTED]> writes:
:> [EMAIL PROTECTED] wrote:
:>: He seems to be trying to create a scheme w.r.t. which every message
:>: of size <n is the possible compression of a meaningful message.
:>
:> Yes, this is essentially the idea David is aiming for.
: Note, however, that I ``don't know what I'm talking about.''
Yes, I believe I'd already publicly noted that.
:> Yes, achieving it for the domain of English language sentences is
:> probably an impractical goal. However, note that progress towards the
:> goal is itself worthwhile.
: Why? What does that goal buy us?
Progress towards the goal helps because it makes more decrypts
look like plausible messages, thereby hindering any cryptanalytic
effort to identify which of them was intended.
:> Even if you don't have "perfect" compression, good compression still
:> helps.
: Good compression exists. Under various hypotheses, optimal compression
: exists. What do you believe is lacking? [...]
I don't think there was really any clear idea about what optimal
compression of files might mean until David's original treatment of
the issue - let alone any implementation of it. A number of people
(including me) have attempted to locate prior art - but without any success.
: And what do you think ``perfect compression'' means?
Well, it can mean many things - but one of the things that it implies is
that the range is properly packed and that there's only one compressed
representation of each uncompressed file. Any compressor that wastes its
range storing multiple representations of any uncompressed files is
not going to deserve the title.
:> If you can't make all the cyphertexts smaller than n decrypt to
:> correct-looking messages, increasing the proportion that do may
:> still be worthwhile.
: Note that even OTP does not do this. All messages of size n are equally
: likely as decrypts, but most possible decrypts are not meaningful. This
: condition is essentially the best that can be hoped for, and in general
: it requires that the keyspace be as large as the message space--i.e.,
: that keys be as long as the message.
Yes. The case where the key is as large as the original message is not
where compression helps.
: (Though I conjecture that this property can also be achieved if keys
: are only as long as the compressed message. I.e., if keys contain
: exactly as much entropy as the original message. But my observation is
: probably trite; as a mathematician but a non-cryptologist, I further
: conjecture that this is what the theorems on OTP actually say.)
I don't think they mention the possibility of compression.
: Anyway, that seems to be the problem here: Scott (and some others) are
: conflating the notions of ``compression'' with the desirable
: properties of an OTP, and then expressing their confused ideas with
: confusing language.
A bizarrely inaccurate representation of the situation IMO.
AFAICS, nobody is "conflating the notions of ``compression''" with anything.
--
__________
|im |yler [EMAIL PROTECTED] Home page: http://alife.co.uk/tim/
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Def'n of bijection
Reply-To: [EMAIL PROTECTED]
Date: Tue, 5 Jun 2001 14:34:13 GMT
[EMAIL PROTECTED] wrote:
: Tim Tyler <[EMAIL PROTECTED]> writes:
:> [EMAIL PROTECTED] wrote:
:>: "Tom St Denis" <[EMAIL PROTECTED]> writes:
:>: He proposes to ``remedy'' that by first applying a bijection between
:>: meaningful messages and possible messages, which he calls ``bicom
:>: compression''.
:>
:> Ah. Now it transpires that you don't know what you're talking about.
:> http://www3.sympatico.ca/mtimmerm/bicom/bicom.html
: You're right; BICOM is even stupider than I conjectured. He says of BICOM
: that ``you can appreciate this bijective quality, because bijectivity
: implies that the compression adds no known plaintext to its output.''
Yes...
: There are plenty of ways to suppress known plaintext in a compresssed file.
AFAIK, only David Scott's compressors and BICOM succeed in avoiding adding
information to the compressed file that was not present in the original
message.
Consequently, all other compressors add information which can be used by a
cryptanalyst to reject decrypts - in much the same way that they can use
known plaintext.
If there are other compressors (besides David's and BICOM) that perform
similarly, I would be delighted to hear about them.
: In particular, BICOM merely adds a layer of obscurity to a message;
The aim of the compressor is mainly compression - not adding obscurity.
Compression is useful for many reasons - including depriving the adversary
of cyphertext, and increasing the entropy-per-bit of the input text.
This decreases redundancies in the cyphertext which might otherwise be
used to assist frequency analysis.
: [...] one simply decompresses the message and THEN checks it for
: meaningful content.
I *keep* hearing this - as though it is some sort of mantra.
In cases where checking for meaningful content might have worked with
uncompressed messages, it will not necessarily work where the messages
have been compressed, since there are likely to be more decrypts that
look as though they contain meaningful content.
Compression increases the resulting unicity distance for a system.
This makes it likely that some messages which had unique decrypts will no
longer do so if compression is applied.
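To make the effect concrete, a rough sketch using Shannon's estimate U = H(K)/D (the entropy figures below are illustrative assumptions, not measurements):

```python
def unicity_distance(key_bits: float, redundancy_per_char: float) -> float:
    """Shannon's estimate U = H(K) / D: roughly how many characters of
    cyphertext are needed before the correct decrypt becomes unique."""
    return key_bits / redundancy_per_char

# English in 8-bit ASCII: roughly 1.5 bits/char of real entropy, so
# about 6.5 bits/char of redundancy (both figures are rough assumptions).
u_plain = unicity_distance(128, 8.0 - 1.5)

# An imperfect compressor that leaves, say, 0.5 bits/char of residual
# redundancy pushes the unicity distance out by an order of magnitude.
u_compressed = unicity_distance(128, 0.5)

assert u_compressed > 10 * u_plain
```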
--
__________
|im |yler Try my latest game - it rockz - http://rockz.co.uk/
------------------------------
Subject: Re: Def'n of bijection
From: [EMAIL PROTECTED]
Date: 05 Jun 2001 10:56:18 -0400
Tim Tyler <[EMAIL PROTECTED]> writes:
>: it requires that the keyspace be as large as the message space
>
> Yes. The case where the key is as large as the original message is not
> where compression helps.
Smaller message ==> smaller key. Compression is a great way to economize
on your OTP key material.
>: if keys contain exactly as much entropy as the original message...I
>: further conjecture that this is what the theorems on OTP actually say.
>
> I don't think they mention the possibity of compression.
But I'll bet you a beer they mention entropy.
Loosely speaking, the best compression is the one for which ``bits in
output file'' equals ``bits of entropy in input file''. By the definition
of entropy, better (lossless) compression is impossible. And perfect
security can probably be achieved if ``bits of entropy in key'' equals
``bits of entropy in message''.
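A concrete way to see the accounting (this sketch uses only zeroth-order byte-frequency entropy; a real source has structure that this estimate misses, so the numbers are illustrative):

```python
import math
import zlib

def entropy_bits(data: bytes) -> float:
    """Zeroth-order (byte-frequency) Shannon entropy of `data`, in bits.
    An illustration only: real sources have structure beyond byte
    frequencies, so this is not the true source entropy."""
    counts = {}
    for b in data:
        counts[b] = counts.get(b, 0) + 1
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values()) * n

text = b"the quick brown fox jumps over the lazy dog " * 50
h = entropy_bits(text)                  # zeroth-order estimate, in bits
out = 8 * len(zlib.compress(text, 9))   # compressed size, in bits

# On this highly repetitive input, a decent compressor beats the
# zeroth-order estimate by exploiting higher-order structure; no
# lossless compressor can beat the *true* source entropy on average.
assert out < h
```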
>: Anyway, that seems to be the problem here: Scott (and some others) are
>: conflating the notions of ``compression'' with the desirable
>: properties of an OTP, and then expressing their confused ideas with
>: confusing language.
>
> A bizarrely inaccurate representation of the situation IMO. AFAICS,
> nobody is "conflating the notions of ``compression''" with anything.
Then think carefully about it. With gzip, brute-forcing the key might
mean, ``decrypt, and check for a valid gzip header.'' With BICOM, it might
mean ``decrypt, then decompress, then check for meaningful content.''
If done right, that does add work to the decryption stage, which is nice.
But that is effectively a form of cipher chaining. IOW, BICOM represents
a compressor which also obscures information. Applying ROT128 to a gzipped
file would have a similar effect, as would using a compressor with a fixed
(detached) codec.
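For instance, a hypothetical key-rejection filter against gzip output might look like this sketch (the filter and the test data are illustrative, not any particular tool's code):

```python
import gzip

# A candidate decrypt that does not start with gzip's magic bytes
# (0x1f 0x8b) cannot be a gzip stream, so almost every wrong key is
# rejected after looking at just two bytes of its decrypt.
GZIP_MAGIC = b"\x1f\x8b"

def plausible_gzip(candidate: bytes) -> bool:
    """Cheap known-plaintext test on a candidate decrypt."""
    return candidate[:2] == GZIP_MAGIC

real = gzip.compress(b"attack at dawn")    # a genuine gzip stream
garbage = b"\x9c\x02 wrong-key output"     # typical failed decrypt

assert plausible_gzip(real)
assert not plausible_gzip(garbage)
```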
In short, if the cipher is so easy to brute force that spotting gzip's
header does a lot for you, then the cipher is no good. Adding work is
nice, but you really want a cipher which stands on its own merits. It
should be perfectly safe to encrypt ASCII-coded English.
Compression is done (1) to save work and bandwidth, and (2) for good
measure.
Len.
--
You're a perfect example of a phenomenon described by Knuth in 1974:
"...premature optimization is the root of all evil..."
-- Dan Bernstein
------------------------------
From: "Scott Fluhrer" <[EMAIL PROTECTED]>
Subject: Re: PRP => PRF (TRUNCATE)
Date: Tue, 5 Jun 2001 07:50:09 -0700
Tom St Denis <[EMAIL PROTECTED]> wrote in message
news:VC2T6.31386$[EMAIL PROTECTED]...
> Reading the paper David pointed to a bit ago I see they have one way to go
> from PRP to PRF by truncating bits of the output.
>
> Obviously there will be a lot of PRPs that make the same PRF. Wouldn't a
> better method of truncating be reducing modulo a prime?
>
> I.e if you want a four bit PRF you do (PRP mod 17) mod 16 = PRF. That way
> the higher order bits will affect the output. Did I misunderstand the
> original intent?
Well, yes. For one, if you assume a perfect PRP, what advantage would your
approach have over simple truncation? He achieved a provable result -- can
you make the analogous proof with your more complicated method?
In addition, if the two moduli are too close, you'll get problems. In the
simplistic example you gave, 0 is output twice as often as any other
value.
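A quick count over a toy 8-bit domain makes the bias visible (modelling a perfect PRP as a uniform value, which is an assumption of the sketch):

```python
from collections import Counter

# Reducing a uniform value "mod 17 then mod 16" sends both residue 0
# and residue 16 to the 4-bit value 0, so 0 shows up roughly twice as
# often as any other output value.
counts = Counter((x % 17) % 16 for x in range(256))  # toy 8-bit "PRP" domain

assert len(counts) == 16              # all 4-bit values occur...
assert counts[0] > 1.9 * counts[1]    # ...but 0 about twice as often
```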
--
poncho
------------------------------
Subject: Re: Def'n of bijection
From: [EMAIL PROTECTED]
Date: 05 Jun 2001 11:05:11 -0400
Tim Tyler <[EMAIL PROTECTED]> writes:
>
>: In particular, BICOM merely adds a layer of obscurity to a message;
>
> The aim of the compressor is mainly compression - not adding obscurity.
Precisely. Which is why you (and other BICOM fans) are slightly confused.
You yourself just said, ``all other compressors add information which
can be used by a cryptanalyst to reject decrypts''. In other words, you
wish the compressor would contribute obscurity.
Hint: If compressors are used to compress, not add obscurity, then the
presence of ``known plaintexts'' is a non-issue. I could have encrypted
the original ASCII file--which makes it trivially easy to reject decrypts.
>: [...] one simply decompresses the message and THEN checks it for
>: meaningful content.
>
> I *keep* hearing this - as though it is some sort of mantra.
Because that's what one does. Why are you upset that people point out
what ``brute-forcing'' means for BICOM?
> In cases where checking for meaningful content might have worked with
> uncompressed messages, it will not necessarily work where the messages
> have been compressed, since there are likely to be more decrypts that
> look as though they contain meaningful content.
That statement requires proof. (Hint: It's false. Each compressed file
has exactly one decompressed original. One merely decompresses and THEN
looks at the result.)
> Compression increases the resulting unicity distance for a system.
> This makes it likely that some messages which had unique decrypts will no
> longer do so if compression is applied.
Since BICOM compression is ``bijective'', there is exactly one message
per compressed file. All I have to do is decompress--I am then guaranteed
to be looking at the file whose compression I started with. There are not
``lots of possible'' decompressions.
Len.
--
POSIX is a ``standard'' designed by a vendor consortium several years
ago to eliminate progress and protect the installed base.
-- Dan Bernstein
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list by posting to sci.crypt.
End of Cryptography-Digest Digest
******************************