Cryptography-Digest Digest #584, Volume #14 Mon, 11 Jun 01 06:13:01 EDT
Contents:
Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY LONG
(SCOTT19U.ZIP_GUY)
Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY LONG (Tim
Tyler)
Re: Unicity distance and compression for AES (Tim Tyler)
Re: Unicity distance and compression for AES (Tim Tyler)
Re: Shannon's definition of perfect secrecy (wtshaw)
Re: Unicity distance and compression for AES ([EMAIL PROTECTED])
Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY ("John A.
Malley")
best encryption? ("Dirk Heidenreich")
Re: Brute-forcing RC4 (S Degen)
Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY (Mok-Kong
Shen)
Re: Shannon's definition of perfect secrecy (Mok-Kong Shen)
Re: Unicity distance and compression for AES (Mok-Kong Shen)
Re: National Security Nightmare? (Mok-Kong Shen)
Re: Help with Comparison Of Complexity of Discrete Logs, Knapsack, and (Mok-Kong
Shen)
Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY (JPeschel)
Re: National Security Nightmare? (JPeschel)
Re: Crypto Links ("Tom St Denis")
Re: best encryption? ("Tom St Denis")
Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY LONG (Tim
Tyler)
----------------------------------------------------------------------------
From: [EMAIL PROTECTED] (SCOTT19U.ZIP_GUY)
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY LONG
Date: 11 Jun 2001 02:55:16 GMT
[EMAIL PROTECTED] (John A. Malley) wrote in
<[EMAIL PROTECTED]>:
>The paper is on-line, in pages scanned and posted as PDF files, at
>
>http://www3.edgenet.net/dcowley/docs.html
>
>
I have those images. But where do you see them as PDF files?
I see no PDF files there.
David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE "OLD VERSION"
http://www.jim.com/jamesd/Kong/scott19u.zip
My website http://members.nbci.com/ecil/index.htm
My crypto code http://radiusnet.net/crypto/archive/scott/
MY Compression Page http://members.nbci.com/ecil/compress.htm
**NOTE FOR EMAIL drop the roman "five" ***
Disclaimer:I am in no way responsible for any of the statements
made in the above text. For all I know I might be drugged or
something..
No I'm not paranoid. You all think I'm paranoid, don't you!
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY LONG
Reply-To: [EMAIL PROTECTED]
Date: Mon, 11 Jun 2001 03:03:37 GMT
John A. Malley <[EMAIL PROTECTED]> wrote:
:> > Tim Tyler wrote:
:> > > M.K. Shen wrote:
:> > > : My memory of Shannon's paper is no good, but I don't think that he
:> > > : considered the length of the messages.
:> > >
:> > > I don't think it was mentioned either - all the messages were the same
:> > > length in the system in question.
: Just a comment - the messages in a finite set do NOT need to be of the
: same length for the cipher to achieve perfect secrecy. [...]
...but they *do* if one is using an OTP to encrypt them.
Apologies if the fact that an OTP was intended was not clear from the context.
--
__________
|im |yler [EMAIL PROTECTED] Home page: http://alife.co.uk/tim/
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Reply-To: [EMAIL PROTECTED]
Date: Mon, 11 Jun 2001 03:10:08 GMT
Tom St Denis <[EMAIL PROTECTED]> wrote:
: "SCOTT19U.ZIP_GUY" <[EMAIL PROTECTED]> wrote in message
:> [EMAIL PROTECTED] wrote in <[EMAIL PROTECTED]>:
[...]
:> >If the compressor doesn't reject meaningless messages in the original
:> >language, then meaningless messages will be compressed and decompression
:> >can be used to distinguish between compressed decryptions that are
:> >meaningful and compressed decryption that are not meaningful.
:>
:>    Actually you're quite wrong; there is no need to reject meaningless
:> messages by compression. What compression does is make a large
:> set of files smaller. Some are meaningless and some are not. But
:> since many files get compressed smaller, many more get longer. The
:> ones that get longer will most likely be truly meaningless.
: This is not true. In fact it's just the opposite. Any good codec makes a
: few files smaller.
You err. Most codecs make an infinite set of files smaller.
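A quick demonstration of the point (a sketch using zlib; any general-purpose codec behaves similarly): redundant, English-like text shrinks, while uniformly random bytes of the same length almost always expand once the codec's framing overhead is counted.

```python
import os
import zlib

# Redundant text compresses well; random bytes of the same
# length expand slightly due to the codec's framing overhead.
text = b"the quick brown fox jumps over the lazy dog " * 20
noise = os.urandom(len(text))

print(len(text), "->", len(zlib.compress(text)))    # shrinks a lot
print(len(noise), "->", len(zlib.compress(noise)))  # grows slightly
```

Since there are arbitrarily long redundant inputs, the set of files a codec shrinks is indeed infinite; the counting argument only forces *some* inputs to grow.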
--
__________
|im |yler [EMAIL PROTECTED] Home page: http://alife.co.uk/tim/
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Reply-To: [EMAIL PROTECTED]
Date: Mon, 11 Jun 2001 03:26:04 GMT
[EMAIL PROTECTED] wrote:
: I concur that a compression algorithm that compressed messages
: which are meaningful in the original language, and expanded messages
: which are meaningless in the original language, would increase unicity
: distance if only compressed messages were used as inputs to the
: encryption algorithm.
Ah good. There's no need for me to reply to your original message, then
;-)
: It seems that one would have to demonstrate that only meaningful
: messages are compressed in order to say that compression increases
: unicity distance.
No, that would be an error, also perpetrated by other participants here.
It is *not* necessary to show that *only* meaningful messages are
compressed in order to say that compression increases unicity distance.
All that is necessary is for meaningful messages to be compressed. It
matters not-in-the-slightest that loads of other junk is compressed as
well.
: What isn't clear to me is how a compression algorithm can be intelligent
: enough to distinguish "meaningful" from "meaningless" inputs (although
: it would be easier if the compression algorithm knew the input language).
Compression algorithms need to do no such thing in order
for the unicity distance to be increased.
All they really need to do is compress plausible-looking messages
on average - and face it - if they didn't do that, it would be hard
to justify calling them compressors in the context of the target data.
--
__________
|im |yler [EMAIL PROTECTED] Home page: http://alife.co.uk/tim/
------------------------------
From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: Shannon's definition of perfect secrecy
Date: Sun, 10 Jun 2001 21:37:19 -0600
In article <[EMAIL PROTECTED]>, Mok-Kong Shen
<[EMAIL PROTECTED]> wrote:
> One could even go in the other direction. A long message
> could be divided into pieces and 'formally' sent as
> a number of different messages (to be put together by
> the recipient). (Between these pieces other messages
> may intervene.) I personally would think that a practically
> viable way is to always send 'records' of a quite natural
> constant length, say 160 bytes, and one doesn't provide to
> the outside world the notion of (individual) messages at
> all. (The messages have headers to distinguish themselves
> but these are encrypted and can't be found by the opponent,
> since the encryption is considered secure.) So the
> opponent sees a stream of such records and has no idea of
> how many messages are in there at all.
>
> M. K. Shen
With a minimum of separation, any number of messages might be put
into a single file. Imagine several people sending such to each other
or a central hub, having them sorted, then forwarded. It is necessary
that secret keys be used for each dedicated hop. If messages are
multiply encrypted with an inner envelope for distribution, the
originator and end recipient can share an additional secret key for
resolving the final layer of processing.
--
To make a person into a puppet, start with one with a wooden head.
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: Unicity distance and compression for AES
Date: Sun, 10 Jun 2001 19:06:19 -0800
Tim Tyler wrote:
>
> [EMAIL PROTECTED] wrote:
>
> : I concur that a compression algorithm that compressed messages
> : which are meaningful in the original language, and expanded messages
> : which are meaningless in the original language, would increase unicity
> : distance if only compressed messages were used as inputs to the
> : encryption algorithm.
>
> Ah good. There's no need for me to reply to your original message, then
> ;-)
>
> : It seems that one would have to demonstrate that only meaningful
> : messages are compressed in order to say that compression increases
> : unicity distance.
>
> No, that would be an error, also perpetrated by other participants here.
> It is *not* necessary to show that *only* meaningful messages are
> compressed in order to say that compression increases unicity distance.
> All that is necessary is for meaningful messages to be compressed. It
> matters not-in-the-slightest that loads of other junk is compressed as
> well.
>
> : What isn't clear to me is how a compression algorithm can be intelligent
> : enough to distinguish "meaningful" from "meaningless" inputs (although
> : it would be easier if the compression algorithm knew the input language).
>
> Compression algorithms need to do no such thing in order
> for the unicity distance to be increased.
>
> All they really need to do is compress plausible-looking messages
> on average - and face it - if they didn't do that, it would be hard
> to justify calling them compressors in the context of the target data.
Someone is going to have to do a better job of explaining this before I
can buy it.
The explanation should be simple:
   n_0 = H(K)/D, where n_0 is the unicity distance, H(K) is the keyspace
   entropy, and D = redundancy = r_0 - r_n, with r_0 = r_n if all
   possible messages are meaningful.
Explain how compression effectively reduces the redundancy (i.e. even if
I am able to decompress decryptions before determining whether or not
the key is spurious) if both meaningful and meaningless messages are
compressed. H(K) is assumed constant, so the only way to increase n_0 is
to decrease D. So to show that compression increases unicity distance,
one has to show that compression effectively reduces the redundancy D.
I don't see how compression can reduce D unless it filters out
meaningless messages.
There's been plenty of hand-waving already...just the math please. ;}
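For what it's worth, here is the arithmetic in runnable form -- a sketch of Shannon's formula n_0 = H(K)/D using his oft-quoted ~1.5 bits/char estimate for English over a 26-letter alphabet (the key size and rate figures are illustrative assumptions, not measurements):

```python
import math

def unicity_distance(key_bits, r_max, r_actual):
    """Shannon's n_0 = H(K)/D, with D = r_max - r_actual in bits/char."""
    d = r_max - r_actual
    if d <= 0:
        return math.inf  # no redundancy: every decryption looks plausible
    return key_bits / d

# Assumed figures: 26-letter alphabet (r_max = log2(26) ~ 4.7 bits/char)
# and ~1.5 bits/char for the true entropy of English text.
print(unicity_distance(128, math.log2(26), 1.5))  # ~40 characters
# Ideal compression raises r_actual toward r_max, so D -> 0 and n_0 -> inf:
print(unicity_distance(128, math.log2(26), math.log2(26)))
```

The contested question in this thread is whether real compressors raise the *effective* r_actual of the message set an attacker must sift through, which the formula alone does not settle.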
------------------------------
From: "John A. Malley" <[EMAIL PROTECTED]>
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY
Date: Sun, 10 Jun 2001 23:49:51 -0700
"SCOTT19U.ZIP_GUY" wrote:
> >The paper is on-line, in pages scanned and posted as PDF files, at
> >
> >http://www3.edgenet.net/dcowley/docs.html
> >
> >
>
> I have those images. But you comment on them as PDF files
> WHERE. I see no PDF files there.
>
My mistake, that site has each page of the Shannon paper scanned and
converted to JPEG files.
Joe Peschel posted he'd heard The Bell System Technical Journal planned
to post an electronic copy of Shannon's paper on-line. Mad Cow replied
to that post and explained how he converted the scanned photocopies to
PDF with unreadable results. That's what I mis-remembered.
I just checked The Bell System Technical Journal web site and didn't
find it posted yet. :-(
But at least I have the literal image-copy of the paper from Mad Cow :-)
John A. Malley
[EMAIL PROTECTED]
------------------------------
From: "Dirk Heidenreich" <[EMAIL PROTECTED]>
Subject: best encryption?
Date: Fri, 8 Jun 2001 15:00:29 +0200
Hello,
i am not in use to any security programms. But i am interested into how to
safe my data. I am looking for a very good programm, so which one would you
suggest? What is objective the best?
Thanks for your help.
Dirk Heidenreich
------------------------------
From: S Degen <[EMAIL PROTECTED]>
Subject: Re: Brute-forcing RC4
Date: Mon, 11 Jun 2001 09:34:15 +0200
David Wagner wrote:
>
> If you want to break WEP encryption, there are many ways to do so
> without recovering the RC4 key. (You can see the paper to be presented
> at MOBICOM 2001 for some discussion, for instance.)
I know, I am doing research on WLAN security, but I simply want to
recover the key :) Where can I find this paper?
>
> Alternatively, if for some reason it is crucial to recover the RC4 key,
> it seems likely to dramatically speed up the 40-bit search by exploiting
> flaws in WEP. All the WEP cards that I've seen start their IV off at 0
> when they are reset, and count up incrementally from there. Moreover,
> known plaintext is often available in the form of DHCP Discover messages,
> etc. (see Arbaugh's work).
I know these too, but the 0 IV "bug" should be fixed in the latest
cards.
(For example Cisco:
http://www.cisco.com/warp/public/cc/pd/witc/ao350ap/prodlit/1281_pp.htm
)
>
> Therefore, you could use Hellman's time-space tradeoff (precomputed
> with an IV of 0 or some other small number) to greatly reduce the
> cost of cryptanalysis, if you wanted to recover more than one RC4 key.
> I believe one can expect to break each RC4 key with only 2^27 work per
> key and 2^26 storage, after a one-time 2^40 precomputation. Of course,
> these remarks apply only to the 40-bit version of WEP; to break 104-bit
> WEP, you'll want the non-key-recovery attacks in the MOBICOM paper.
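To make the known-plaintext point concrete, here is a sketch (the IV, the 5-byte secret, and the LLC/SNAP header bytes are illustrative assumptions, not a real WEP capture) of how a predictable plaintext prefix hands an attacker the RC4 keystream prefix for that IV -- exactly what a Hellman-style precomputed table would then be matched against:

```python
def rc4_keystream(key, n):
    # Key-scheduling algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA)
    out, i, j = [], 0, 0
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

# WEP-style framing: per-packet RC4 key = 3-byte IV || 40-bit shared secret.
iv, secret = b"\x00\x00\x00", b"SECRT"
plaintext = b"\xaa\xaa\x03\x00\x00\x00"  # predictable LLC/SNAP header
keystream = rc4_keystream(iv + secret, len(plaintext))
ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))

# An eavesdropper who guesses the plaintext recovers the keystream prefix:
recovered = bytes(c ^ p for c, p in zip(ciphertext, plaintext))
assert recovered == keystream
```

With the IV fixed at 0 by resetting cards, those recovered keystream prefixes are what make the precomputation pay off across many keys.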
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY
Date: Mon, 11 Jun 2001 10:36:40 +0200
Tim Tyler wrote:
>
> John A. Malley <[EMAIL PROTECTED]> wrote:
> :> > Tim Tyler wrote:
> :> > > M.K. Shen wrote:
>
> :> > > : My memory of Shannon's paper is no good, but I don't think that he
> :> > > : considered the length of the messages.
> :> > >
> :> > > I don't think it was mentioned either - all the messages were the same
> :> > > length in the system in question.
>
> : Just a comment - the messages in a finite set do NOT need to be of the
> : same length for the cipher to achieve perfect secrecy. [...]
>
> ...but they *do* if one is using an OTP to encrypt them.
>
> Apologies if the fact that an OTP was intended was not clear from the context.
Opinions seem to differ here. So let me once again ask:
Has Shannon proved the perfect security of the conventional
OTP (for messages of finite but varying length) or not?
I would like to know the result clearly and unambiguously. Thanks.
M. K. Shen
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Shannon's definition of perfect secrecy
Date: Mon, 11 Jun 2001 10:47:40 +0200
wtshaw wrote:
>
> Mok-Kong Shen<[EMAIL PROTECTED]> wrote:
>
> > One could even go in the other direction. A long message
> > could be divided into pieces and 'formally' sent as
> > a number of different messages (to be put together by
> > the recipient). (Between these pieces other messages
> > may intervene.) I personally would think that a practically
> > viable way is to always send 'records' of a quite natural
> > constant length, say 160 bytes, and one doesn't provide to
> > the outside world the notion of (individual) messages at
> > all. (The messages have headers to distinguish themselves
> > but these are encrypted and can't be found by the opponent,
> > since the encryption is considered secure.) So the
> > opponent sees a stream of such records and has no idea of
> > how many messages are in there at all.
> >
> > M. K. Shen
>
> With a minimum of separation, any number of messages might be put
> into a single file. Imagine several people sending such to each other
> or a central hub, having them sorted, then forwarded. It is necessary
> that secret keys be used for each dedicated hop. If messages are
> multiply encrypted with an inner envelope for distribution, the
> originator and end recipient can share an additional secret key for
> resolving the final layer of processing.
One may add additional layers to increase the complexity
for the opponent, but that's not necessary, if the
encryption is otherwise secure. Consider the normal case
of business letters. They each have sufficient header
information (including page numbers), such that, if the
pages of several letters get mixed up, they can be
separated. What I said is that one could always have
the headers included in the encryption processing and
have a number of messages concatenated, with the
result sent as a number of 'records' of some fixed
constant length. (The last message, if not urgent, could
be sent only partly, with the remaining sent on the next
day, say.)
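As a sketch of the record framing described above (the 160-byte size and zero padding are placeholders; a real scheme needs length headers inside the encrypted plaintext so the recipient can split the messages back apart):

```python
RECORD = 160  # assumed fixed record size, as suggested above

def to_records(stream):
    # Concatenate messages, then cut the stream into fixed-size records.
    # Zero padding of the tail is a placeholder; real framing would carry
    # per-message headers inside the plaintext before encryption.
    padded = stream + b"\x00" * (-len(stream) % RECORD)
    return [padded[i:i + RECORD] for i in range(0, len(padded), RECORD)]

records = to_records(b"A" * 250 + b"B" * 100)  # two messages, one stream
assert all(len(r) == RECORD for r in records)
```

The opponent then sees only a uniform stream of same-sized records, with message boundaries hidden inside the encryption.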
M. K. Shen
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Unicity distance and compression for AES
Date: Mon, 11 Jun 2001 10:53:30 +0200
Tim Tyler wrote:
>
> Tom St Denis <[EMAIL PROTECTED]> wrote:
> : This is not true. In fact it's just the opposite. Any good codec makes a
> : few files smaller.
>
> You err. Most codecs make an infinite set of files smaller.
A compressor appropriate for a given application should
compress the files of that application, on average,
to smaller sizes. One certainly needn't care about files
that don't belong to the application.
M. K. Shen
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: National Security Nightmare?
Date: Mon, 11 Jun 2001 11:03:54 +0200
JPeschel wrote:
>
> "Boyd Roberts" [EMAIL PROTECTED] writes:
>
> >yeah, wrong. i'm pleading an 'upper respiratory infection' defence.
> >
>
> Watch out! The grammar police is a comin'. Or is it, "are a comin'?"
>
> Nevermind. They got us. Apparently, we have the right to remain silent...
I heard that in France there is a national institute
that decides authoritatively on language issues of French.
Is there a similar one for the English world? If yes, I
suppose we should forward some of the posts of the thread
there and simply wait for the genuinely correct answers.
(Or else are there some internet groups devoted to
linguistics?)
M. K. Shen
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Help with Comparison Of Complexity of Discrete Logs, Knapsack, and
Date: Mon, 11 Jun 2001 11:17:39 +0200
"Douglas A. Gwyn" wrote:
>
> Mok-Kong Shen wrote:
> > While my knowledge is too far from being able to understand
> > the matter, I would very much like to know the name of
> > a good reference where it is claimed/established that the
> > theory which Whitehead and Russell developed (or further
> > developed) is wrong. Could you please supply such a
> > reference?
>
> Just look up "Hilbert's program" (or programme) and you should
> soon find it.
My point was that the work of Whitehead and Russell
is not wrong (mathematically). They had the ambitious
goal of, in a sense, reducing ALL mathematics to logic.
That goal failed. But that doesn't mean that the content
of their book is in any way wrong. In fact, they refined
the theoretical apparatus of Dedekind and Peano in the
axiomatization of arithmetic. If you accept Peano,
you couldn't call Whitehead and Russell wrong.
There is an analogous case, namely that of the legendary
Nicolas Bourbaki (officially declared dead a few years ago).
He attempted to axiomatize the whole of mathematics
but failed. But that does not mean that anything he
wrote is wrong.
M. K. Shen
==========================
http://home.t-online.de/home/mok-kong.shen
------------------------------
From: [EMAIL PROTECTED] (JPeschel)
Date: 11 Jun 2001 09:34:24 GMT
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY
"John A. Malley" [EMAIL PROTECTED] writes, in part:
>Joe Peschel posted he'd heard The Bell System Technical Journal planned
>to post an electronic copy of Shannon's paper on-line.
Fran Grimes told me in early March that the paper would be posted
in about a week.
>I just checked The Bell System Technical Journal web site and didn't
>find it posted yet. :-(
Yeah, I know. I wrote to her again in late March, but I still haven't received
an answer. Maybe The Journal changed its mind; maybe Fran forgot;
maybe more of us should write to her.
Joe
__________________________________________
Joe Peschel
D.O.E. SysWorks
http://members.aol.com/jpeschel/index.htm
__________________________________________
------------------------------
From: [EMAIL PROTECTED] (JPeschel)
Date: 11 Jun 2001 09:48:30 GMT
Subject: Re: National Security Nightmare?
Mok-Kong Shen [EMAIL PROTECTED] writes:
>In France I heard that there is a national instute
>that decides authoritatively on language issues of French.
>Is there a similar one for the English world?
Yes. They told me you should listen to me and Len, er, Len and
me... I mean, uh, Len and I...
Joe
__________________________________________
Joe Peschel
D.O.E. SysWorks
http://members.aol.com/jpeschel/index.htm
__________________________________________
------------------------------
From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: Crypto Links
Date: Mon, 11 Jun 2001 09:55:08 GMT
Please post in TEXT format, not HTML. Also, I suggest that if you want to
learn crypto you should buy some books.
Tom
============
"news.singnet.com.sg" <[EMAIL PROTECTED]> wrote in message
news:9g1i2i$fn4$[EMAIL PROTECTED]...
Can anyone provide a list of links to go to where I could find general info
about Cryptography from general issues all the way to the nitty grittys of
each cipher technique?
Just realised I had links for all my other hobbies but none for Crypto!
(except for this newsgroup link)
Thanks in advance!
Annie L.
------------------------------
From: "Tom St Denis" <[EMAIL PROTECTED]>
Subject: Re: best encryption?
Date: Mon, 11 Jun 2001 09:56:16 GMT
"Dirk Heidenreich" <[EMAIL PROTECTED]> wrote in message
news:9g1rnk$t4o$04$[EMAIL PROTECTED]...
> Hello,
>
> i am not in use to any security programms. But i am interested into how to
> safe my data. I am looking for a very good programm, so which one would
you
> suggest? What is objective the best?
> Thanks for your help.
I take it English is not your first lang. Hmm no problem though :-)
The "best encryption" algorithm depends on what your task is. If it's just
file crypto try a cryptosystem like PGP or GnuPG.
Tom
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Best, Strongest Algorithm (gone from any reasonable topic) - VERY LONG
Reply-To: [EMAIL PROTECTED]
Date: Mon, 11 Jun 2001 09:58:56 GMT
Mok-Kong Shen <[EMAIL PROTECTED]> wrote:
: Tim Tyler wrote:
:> John A. Malley <[EMAIL PROTECTED]> wrote:
:> :> > Tim Tyler wrote:
:> :> > > M.K. Shen wrote:
:> :> > > : My memory of Shannon's paper is no good, but I don't think that he
:> :> > > : considered the length of the messages.
:> :> > >
:> :> > > I don't think it was mentioned either - all the messages were
:> :> > > the same length in the system in question.
:>
:> : Just a comment - the messages in a finite set do NOT need to be of the
:> : same length for the cipher to achieve perfect secrecy. [...]
:>
:> ...but they *do* if one is using an OTP to encrypt them.
:>
:> Apologies if the fact that an OTP was intended was not clear from the
:> context.
: Opinions seem to differ here. So let me once again ask:
: Has Shannon proved the perfect security of the conventional
: OTP (for messages of finite but varying length) or not?
: I would like to know the result clearly and unambiguously. Thanks.
I thought John Malley at least was fairly clear and unambiguous in writing:
``3) WHY ENCIPHERING A FINITE SET OF MESSAGES BY XORING RANDOM BINARY
STRINGS AS LONG AS THE MESSAGES DOES *NOT* GUARANTEE PERFECT SECRECY''
...though maybe a qualification about there not being 2^n messages all of
the same length needs to be tacked onto that headline.
It appears that Shannon only mentioned the OTP while dealing with
the case of infinite streams of data.
You say "opinions seem to differ here". Who disagrees at this stage?
Are you referring to Tom St Denis?
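A toy illustration (assuming a fresh uniform pad per message, with the two example messages being my own) of why varying lengths matter for the OTP: the pad hides the contents perfectly, but |ciphertext| = |plaintext| is visible to everyone, so over a message set with differing lengths the ciphertext is not independent of the message.

```python
import os

def otp_encrypt(msg):
    pad = os.urandom(len(msg))  # fresh uniform pad, as long as the message
    return bytes(m ^ p for m, p in zip(msg, pad))

msgs = [b"ATTACK AT DAWN", b"RETREAT"]
cts = [otp_encrypt(m) for m in msgs]

# Contents are perfectly hidden, but lengths are not:
assert [len(c) for c in cts] == [14, 7]
```

Padding all messages to a common length restores the same-length condition under which the perfect-secrecy proof goes through.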
--
__________
|im |yler Try my latest game - it rockz - http://rockz.co.uk/
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list by posting to sci.crypt.
End of Cryptography-Digest Digest
******************************