Cryptography-Digest Digest #305, Volume #9 Tue, 30 Mar 99 21:13:03 EST
Contents:
Norton diskreet (Hudobija)
Norton Diskreet...again... (Borut Ratej)
Re: True Randomness & The Law Of Large Numbers ("karl malbrain")
Re: Live from the Second AES Conference (Jim Gillogly)
Re: True Randomness & The Law Of Large Numbers (R. Knauer)
Re: My Book "The Unknowable" (karl malbrain)
Re: True Randomness & The Law Of Large Numbers (R. Knauer)
Re: Live from the Second AES Conference (wtshaw)
Re: Live from the Second AES Conference ("Brian Gladman")
Re: newbie question ("Kryspin Ziemski")
Re: What is fast enough? ([EMAIL PROTECTED])
Re: fast enough (sequel) (Bryan G. Olson; CMSC (G))
----------------------------------------------------------------------------
From: Hudobija <[EMAIL PROTECTED]>
Subject: Norton diskreet
Date: Tue, 30 Mar 1999 23:36:40 +0200
I have a file encrypted with Norton Diskreet and would like to decrypt
it, but I (of course) forgot the password. Is there a program that can
decrypt the file, or is there someone who would decrypt it for me? I
can't afford the price AccessData charges, but I am willing to pay a
small fee (if there's no other way) if someone decrypts this file for
me.
Please e-mail me if you can help me.
Thanks in advance!
Borut
------------------------------
From: Borut Ratej <[EMAIL PROTECTED]>
Subject: Norton Diskreet...again...
Date: Tue, 30 Mar 1999 23:42:23 +0200
(I forgot to include my e-mail in the previous posting, so here it is
again...sorry for spamming the newsgroup)
I have a file encrypted with Norton Diskreet and would like to decrypt
it, but I (of course) forgot the password. Is there a program that can
decrypt the file, or is there someone who would decrypt it for me? I
can't afford the price AccessData charges, but I am willing to pay a
small fee (if there's no other way) if someone decrypts this file for
me.
Please e-mail me if you can help me.
Thanks in advance!
Borut
------------------------------
From: "karl malbrain" <[EMAIL PROTECTED]>
Subject: Re: True Randomness & The Law Of Large Numbers
Date: Tue, 30 Mar 1999 13:56:18 -0800
Dave Knapp <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED]...
> The distance from the origin is indeed a correlated property,
> especially if, as you claim, one must look at it as a time series.
>
(...)
> If you follow a _particular particle_ in time, it tends to be, on
> average, further from the origin than an ensemble of particles created
> using the distribution of distances from the origin at a particular
> time. (... remainder snipped ...)
What we're trying for here is a MEASURE of the ENSEMBLE as the outcome of
the motions of ALL the particles, together. It applies to cryptography as
BRUTE FORCE applies to ALL the key possibilities, together.
I believe Mr. Knauer means to illustrate BROWNIAN motion, as DIFFERENTIATED into
RANDOM=CHAOS/COMPLEXITY. Again, the QUOTIENT here can be <<moved to another
position>> as MYSTICISM. Karl M
------------------------------
From: Jim Gillogly <[EMAIL PROTECTED]>
Subject: Re: Live from the Second AES Conference
Date: Tue, 30 Mar 1999 13:52:17 -0800
Reply-To: [EMAIL PROTECTED]
David Wagner wrote:
>
> In article <[EMAIL PROTECTED]>, Jim Gillogly <[EMAIL PROTECTED]> wrote:
>
> > My suggestion was that to use E2.RC6, where each of E2 and RC6 is to
> > be used in its 128-bit mode, you would need a 256-bit key, with all
> > 256 bits independently chosen. E2 would use 128 bits, and RC6 would
> > use 128 bits, each using its own key schedule.
>
> Ok, I see now.
> But then I don't see how this can be reasonably promulgated as an
> AES standard, because it has only (at most) 128 bits of strength, not
> 256 bits of strength. I believe that if we standardize on an algorithm
> with a 256-bit key, it should have 256 bits of strength. Moreover, it
> was a design requirement for AES that the ciphers be able to support
> 128 bit key lengths, which E2.RC6 would not satisfy.
I agree that this does not produce a candidate for AES, and this was not
in fact Dianelos' intent when he brought it up. He introduced it as a
lunch-time question to an unnamed cryppie, and framed it as A and B
representing two strong ciphers, using as a specific example two AES
candidate ciphers. I reacted to his question on that basis.
> Thus, I don't think cascading multiple algorithms will give us a good
> AES standard. (Doesn't mean it's not a useful technique, just that it
> doesn't immediately solve the problem of finding an AES cipher.)
Agreed, again. I'm reminded of King Gama (from Princess Ida), who found
that he couldn't persuade anybody to argue with him.
Before ditching the question completely I'll also agree with William
Hugh Murray's point that crypto strength is unlikely to be the weak
link in one's security. However, if CPU is not an issue, cascading
protects against unexpected breakage of one of the components.
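For concreteness, a minimal sketch of that key handling (toy XOR ciphers
stand in for the two components here - NOT real E2 or RC6 - since only
the structure matters: a 256-bit key feeding two independent 128-bit key
schedules):

    #include <stdint.h>
    #include <stdio.h>

    /* Toy stand-ins for the two component ciphers. Each consumes its
       own independent 128-bit key; the XOR bodies are placeholders. */
    static void toy_a_encrypt(const uint8_t key[16], uint8_t block[16])
    {
        for (int i = 0; i < 16; i++)
            block[i] ^= key[i];
    }

    static void toy_b_encrypt(const uint8_t key[16], uint8_t block[16])
    {
        for (int i = 0; i < 16; i++)
            block[i] ^= (uint8_t)(key[i] + 0x5A);
    }

    /* A.B cascade: the first cipher is keyed by the first 128 bits,
       the second by the last 128 bits, each with its own schedule. */
    static void cascade_encrypt(const uint8_t key256[32], uint8_t block[16])
    {
        toy_a_encrypt(key256, block);
        toy_b_encrypt(key256 + 16, block);
    }

    int main(void)
    {
        uint8_t key[32] = { 1, 2, 3 };        /* 256 independent bits */
        uint8_t blk[16] = "fifteen chars..";  /* one 128-bit block    */
        cascade_encrypt(key, blk);
        for (int i = 0; i < 16; i++)
            printf("%02x", blk[i]);
        printf("\n");
        return 0;
    }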
--
Jim Gillogly
Sterday, 8 Astron S.R. 1999, 21:40
12.19.6.1.3, 5 Akbal 16 Cumku, Fifth Lord of Night
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: True Randomness & The Law Of Large Numbers
Date: Tue, 30 Mar 1999 22:05:16 GMT
Reply-To: [EMAIL PROTECTED]
On Tue, 30 Mar 1999 14:13:29 -0700, "Tony T. Warnock"
<[EMAIL PROTECTED]> wrote:
>Z-score is (x-mean)/standard_deviation; it's a standard term, common in
>statistics. In the binomial case, the dispersion is getting larger by
>Sqrt(N)/2 and your +-5% of the mean is getting larger by .1N. This means
>that an ever larger fraction is within 5% of the mean. In statistical
>mechanics, N is about 10^23 (give or take a factor of a billion), so the
>resulting distribution looks like a spike.
But this is not what I was trying to communicate. The ink-in-water
diffusion experiment hardly looks like a delta function as time
progresses. In fact it looks like exactly the opposite: a broad, flat
distribution, not a spike centered at the origin.
What does the Z-score actually measure anyway? The thing of interest
to me was the area under the Gaussian in the range of +- 5% of the
mean, divided by the total area (if the Gaussian is not normalized).
That tells you the fraction of sequences which are within +- 5% of the
origin after N steps. Physical intuition tells you that fraction must
decrease with time as the ink diffuses away from the origin.
The reciprocal of your Z-score, namely (standard_deviation/mean),
seems to be a better measure of what I am interested in
(dispersion/mean), and it grows smaller as N grows larger - the
opposite of a spike.
In any event, how does your Z-score tell us about the non-randomness
of a TRNG?
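Both quantities in dispute are easy to compute directly. A minimal C
sketch (rand() is only a stand-in for a true random source) measures,
for N-step +-1 walks, the dispersion of the endpoint S_N and the
fraction of walks ending within +-0.05N of the origin; the dispersion
grows like Sqrt(N) even as the fraction tends to 1, so the two
descriptions are measuring different things:

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    int main(void)
    {
        const int trials = 5000;
        srand(12345);
        for (long N = 100; N <= 10000; N *= 10) {
            double sumsq = 0.0;
            int within = 0;
            for (int t = 0; t < trials; t++) {
                long s = 0;
                for (long i = 0; i < N; i++)
                    s += (rand() & 1) ? 1 : -1;   /* one +-1 step   */
                sumsq += (double)s * (double)s;
                if (labs(s) <= 0.05 * N)          /* within 5% of N */
                    within++;
            }
            printf("N=%5ld  sd(S_N)=%7.1f  frac(|S_N| <= 0.05N)=%.3f\n",
                   N, sqrt(sumsq / trials), (double)within / trials);
        }
        return 0;
    }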
Bob Knauer
"The laws in this city are clearly racist. All laws are racist.
The law of gravity is racist."
- Marion Barry, Mayor of Washington DC
------------------------------
From: karl malbrain <[EMAIL PROTECTED]>
Subject: Re: My Book "The Unknowable"
Date: Tue, 30 Mar 1999 21:44:14 GMT
In article <7dec5m$8gn$[EMAIL PROTECTED]>,
karl malbrain <[EMAIL PROTECTED]> wrote:
> Under materialism, CHAOS is BROKEN DOWN in order to LIQUIDATE RANDOMNESS (see
> the <<equation>> above, the unusable part being the dividend). It's the
> REMAINDER that I'm after. This should read ABILITY to USE -- USABILITY is
> DETERMINED by MATTER'S admittance of INFORMATION -- see an electronics
> definition of OPERATIONAL AMPLIFIERS. Karl M
CORRECTION: The QUOTIENT is LIQUIDATED as MYSTICISM.
============= Posted via Deja News, The Discussion Network ============
http://www.dejanews.com/ Search, Read, Discuss, or Start Your Own
------------------------------
From: [EMAIL PROTECTED] (R. Knauer)
Subject: Re: True Randomness & The Law Of Large Numbers
Date: Tue, 30 Mar 1999 22:26:54 GMT
Reply-To: [EMAIL PROTECTED]
On Tue, 30 Mar 1999 21:28:05 GMT, Dave Knapp <[EMAIL PROTECTED]> wrote:
>Please, please, please go read a first-year book on statistics, OK? And
>if you've already tried that, go re-read it and try to _understand_ it.
It is clear from your posts that it is you who needs to hit the books.
I have done my homework, and demonstrated it by quoting from
acknowledged experts like Feller. All we have seen from you is bluster
and pontification.
> The _cause_ of this behavior is correlation: the position of a
>_given_ particle as a function of time is a highly correlated value.
That is a very strange use of the term "correlation". No wonder
statistics is so useless in characterizing a TRNG.
Successive steps of the random walk are completely independent of one
another, therefore there is no correlation between one step and the
next.
>Let the position of the particle at time i be Xi and the random variable
>for its motion be theta.
Maybe you should read some books on probability. The conventional
notation for the position after n steps is Sn and the random variable
for each step is Xi. That is, the uniform random walk, which results
from a UBP, is characterized by n random variables Xi such that the
path is X1X2X3...Xn, and the location after n steps is
Sn = Sum (Xi), where Xi can take on the values +- 1.
Using this, let's see what you are going to say below in more
conventional terms:
>Then Xi+1 = Xi + theta.
You are saying Sn+1 = Sn + Xn, which is wrong. The correct statement
is Sn = Sn-1 + Xn.
But that is a small nitpick.
>Notice the "Xi" in
>that equation: it says that Xi+1 depends on Xi. That is the
>_definition_ of correlation.
Not in my book.
The underlying process for the random walk is the UBP, and it is
completely uncorrelated. Each step is completely independent of all
other steps, including the one preceding it.
Sn measures the bias of the path after n steps - it is the net
difference between steps in one direction versus steps in the other
direction. There is no "correlation" involved.
Nowhere have I said I was going to use the Sn of the random walk to
make a TRNG keystream. Nowhere. It is the sequence X1X2X3...Xn that is
used as the keystream.
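The two usages of "correlation" can be separated numerically. A minimal
C sketch (rand() is only a stand-in for a true random source) estimates
the lag-1 sample correlation of the steps Xi and of the running sums Sn;
the steps come out near zero while the positions come out near one:
independent increments, correlated partial sums.

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    #define N 100000

    static double x[N];   /* steps Xi = +-1               */
    static double s[N];   /* positions Sn = X1 + ... + Xn */

    /* Plain sample correlation coefficient of a[0..n-1], b[0..n-1]. */
    static double corr(const double *a, const double *b, int n)
    {
        double ma = 0, mb = 0, sab = 0, saa = 0, sbb = 0;
        for (int i = 0; i < n; i++) { ma += a[i]; mb += b[i]; }
        ma /= n; mb /= n;
        for (int i = 0; i < n; i++) {
            sab += (a[i] - ma) * (b[i] - mb);
            saa += (a[i] - ma) * (a[i] - ma);
            sbb += (b[i] - mb) * (b[i] - mb);
        }
        return sab / sqrt(saa * sbb);
    }

    int main(void)
    {
        srand(7);
        double run = 0;
        for (int i = 0; i < N; i++) {
            x[i] = (rand() & 1) ? 1.0 : -1.0;
            run += x[i];
            s[i] = run;
        }
        /* Lag-1 correlations: steps ~ 0, positions ~ 1. */
        printf("corr(X_i, X_i+1) = %+.4f\n", corr(x, x + 1, N - 1));
        printf("corr(S_n, S_n+1) = %+.4f\n", corr(s, s + 1, N - 1));
        return 0;
    }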
> So the statistical properties you are trumpeting as proof that
>statistics is useless in characterizing random number generators are
>based upon a correlation effect.
This is complete nonsense. Nowhere in Feller's book on probability does
he discuss any such "correlation" in either his extensive writeups on
the UBP or the random walk.
You are just a garden-variety troll who is trying to confuse people
for perverted sport. OK - so you have had your little demented laugh.
Now go play in another sand box.
<plonk>
Bob Knauer
"The laws in this city are clearly racist. All laws are racist.
The law of gravity is racist."
- Marion Barry, Mayor of Washington DC
------------------------------
From: [EMAIL PROTECTED] (wtshaw)
Subject: Re: Live from the Second AES Conference
Date: Tue, 30 Mar 1999 12:27:08 -0600
In article <[EMAIL PROTECTED]>, Robert Harley
<[EMAIL PROTECTED]> wrote:
....
> *** What is going on? ***
...
> The purpose of the whole thing is to replace DES and that is not
> because of DES's performance: it is because DES is not secure enough.
> As Dianelos mentioned, speed is a relatively easy issue....
> In fact speed is by far the lesser concern compared to security.
...
> Thus it seems clear that the important question is NOT:
> Which is the fastest algorithm (with decent security)? but instead:
>
> Which is the most secure algorithm (with decent performance)?
> This is the more difficult question but it is the one we have to ask
> and the one we have to answer.
>
> I hope the guys at NIST can still see the wood for the trees.
>
Sometimes it is best to let the hot bloods run around in circles; they
seem to enjoy it so much. As you point out, the serious government need
is not for speed, whatever the needs of other people in other places
may be.
One might wonder whether the whole business is more a PR measure, or a
political statement, or a sincere effort to appeal for help in a fair
and open manner. We doubt the latter, since we are so used to government
doing so much for the chosen, and in private, and there are those who
continue to do so.
An adopted standard may not really mean much after all, other than
setting up a target for additional study, which may or may not reach
results on a preset schedule. I mention this because it is another area
where time pressure, like the emphasis on speed, may push evaluation in
a wrong-headed direction.
A slow and deliberate process is necessary to maintain proper
perspective. And the government has the option to can the whole thing,
regardless of the effort that has been involved. After all, there are
always more, and surely new, options if the goal is really to weigh the
security need first.
--
Too much of a good thing can be much worse than none.
------------------------------
From: "Brian Gladman" <[EMAIL PROTECTED]>
Subject: Re: Live from the Second AES Conference
Date: Tue, 30 Mar 1999 21:52:08 +0100
Terje Mathisen <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> Bruce Schneier wrote:
> >
> > On Mon, 29 Mar 1999 16:15:16 GMT, [EMAIL PROTECTED]
> > (John Savard) wrote:
> [snip]
> > >If everybody's C code is compiled on the same compiler, one may be
> > >comparing optimizations or something, but one isn't comparing
> > >compilers.
> >
> > One is comparing both how well the coder optimized his code, and how
> > well the compiler optimizes the particular algorithm. For example,
> > the Borland C compiler can't do rotates well. Any algorithm using
> > rotates will look relatively worse than an algorithm that does not, if
> > compared using a Borland compiler. This relative difference won't
> > exist if the algorithms are compared using a Microsoft compiler.
>
> Indeed.
>
> Even though there might exist crypto algorithms which would happen to
> compile into near-optimal code on almost all compilers, I believe a new
> standard encryption algorithm is more than important enough to deserve
> being implemented in hand-optimized asm code for all major cpu
> architectures.
>
> I.e. there is no particular reason to handicap an algorithm just because
> it uses a normal CPU instruction which is hard/impossible to describe
> directly in portable C.
>
> This is why I really like the AES analysis submitted by B. Schneier's
> group, where they compared the relative speed of a theoretically perfect
> asm implementation of each algorithm.
>
> The numbers they came up with seem to correlate well with what good
> coders have been able to do on several of the algorithms.
>
I wonder why :-)
Brian Gladman
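For reference, the rotate point is easy to see in code: portable C has
no rotate operator, so it is spelled as a shift pair, which some
compilers collapse into a single rotate instruction and others (the
Borland case above) do not. A minimal sketch:

    #include <stdint.h>
    #include <stdio.h>

    /* Portable spelling of a 32-bit left rotate. The masking keeps
       every shift count defined, including n = 0. */
    static uint32_t rotl32(uint32_t x, unsigned n)
    {
        n &= 31;
        return (x << n) | (x >> ((32 - n) & 31));
    }

    int main(void)
    {
        printf("%08x\n", rotl32(0x80000001u, 1));  /* prints 00000003 */
        return 0;
    }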
------------------------------
From: "Kryspin Ziemski" <[EMAIL PROTECTED]>
Subject: Re: newbie question
Date: Tue, 30 Mar 1999 02:29:05 -0500
My understanding of public key cryptography is that each person is
assigned two keys, one private and one public. The sender encrypts the
message with his private key and then sends the message to the reader.
The reader gets the sender's public key from a directory and then
converts the public key to the private key using some mathematical
relation and decrypts the message using the private key. If this is how
public key cryptography works, what stops me from obtaining the sender's
public key and debugging the program to find the code that works on the
public key to convert it to the private key, which I'm assuming is done
locally on the computer?
David A Molnar <[EMAIL PROTECTED]> wrote in message
news:7dpnj8$a7u$[EMAIL PROTECTED]...
> Kryspin Ziemski <[EMAIL PROTECTED]> wrote:
> > i was not specific enough in my question
> > public key cryptography works on the principle that there is a
> > mathematical relation between the public key and private key
> > if you knew the specific algorithm to convert from the public key to the
> > private key then you could decrypt the message. couldn't you then
> > analyze the code that decrypts the message and find the specific
> > parameters (by this i mean if the general algorithm was y = a*x + b,
> > find a and b; simple example but you get my meaning) am i missing
> > something there or is an
>
> We design algorithms so that the amount of data which needs to be kept
> secret is clearly identified. "data" here includes things like 'workings
> of algorithm, parts of a secret key, third parties', and so on. The rest
> of the data is considered as if it's public.
> (warning - not standard terminology. I want something that includes both
> "algorithm details", "protocol details" and "key material." suggestions
> or pointers?)
>
> Your question, if I've got it right, is "why don't we reverse-engineer a
> piece of software and see what the private key is, even though the
> algorithm works well and can't be inverted even though its workings are
> known."
> The answer is... well:
>
> a) we clearly identify the secret data and then go to great lengths
> not to store it anywhere permanent.
>
> b) when we do have to deal with secret data, like to perform
> decryption, we try to use it for as short a time as possible and
> in as difficult to observe a manner as possible.
> you may think these are difficult. you're right. see all the fun things
> you can do to smart cards. or check out www.cryptography.com
>
> So in a well-designed system, no DLL would contain the private key of the
> user. The "a and b" of your example would be parameters fed to some
> general encryption function in the DLL. The "a and b" would be kept by
> the user in some place considered secure, like their head, in a separate
> (possibly encrypted) file, on a floppy, in their genetic code, etc. etc.
>
> Not all systems are well designed in this sense. would anyone like to
> give a definition of what it means to say a cryptosystem "sucks" ?
>
>
>
> > actual problem but there is a counter measure.
> > what would stop me from just debugging a dll and knowing the public key of
>
> Public keys don't give you any ability to decrypt. They allow you to
> encrypt. You would need to find someone's private key to retrieve a
> message (assuming that no better way of attacking the system is
> available).
> Systems that keep a private key lying around in a DLL are not well
> designed. If you encountered such a system, then, yes, this works.
> Fortunately, with the popularity of cryptography these days, maybe after
> a while we will see fewer such systems.
>
> -David
>
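To make the asymmetry concrete, here is a toy RSA sketch (textbook-sized
numbers from p=61, q=53; hopelessly small for real use). The encrypting
side holds only the public pair (n, e); the private exponent d never
appears in that code path, and recovering d from (n, e) amounts to
factoring n, which is what real key sizes make infeasible.

    #include <stdio.h>
    #include <stdint.h>

    /* Square-and-multiply modular exponentiation. */
    static uint64_t modpow(uint64_t b, uint64_t e, uint64_t m)
    {
        uint64_t r = 1;
        b %= m;
        while (e > 0) {
            if (e & 1)
                r = r * b % m;
            b = b * b % m;
            e >>= 1;
        }
        return r;
    }

    int main(void)
    {
        const uint64_t n = 3233, e = 17;   /* PUBLIC key            */
        const uint64_t d = 2753;           /* PRIVATE key           */
        uint64_t m = 65;                   /* the message           */
        uint64_t c = modpow(m, e, n);      /* anyone can encrypt    */
        uint64_t back = modpow(c, d, n);   /* only d can decrypt    */
        printf("m=%llu c=%llu decrypted=%llu\n",
               (unsigned long long)m, (unsigned long long)c,
               (unsigned long long)back);
        return 0;
    }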
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: What is fast enough?
Date: 30 Mar 1999 23:02:59 GMT
Reply-To: [EMAIL PROTECTED]
"Lassi Hippeläinen" <[EMAIL PROTECTED]> writes:
>[EMAIL PROTECTED] wrote:
>> [EMAIL PROTECTED] writes:
>> >Jack Lloyd and I are currently working on a cipher together. I was just
>> >wondering (from the community's point of view) what are acceptable speeds?
>> >
>> >Right now, in unoptimized C code, on a 233MHz Cyrix MII, I get between
>> >1.4MB/sec and 2.9MB/sec (32 rounds and 16 rounds respectively).
>> >
>> >Isn't anything above 1MB/sec considered fast enough? I mean, my hd
>> >controller only works at 4.5MB/sec anyway!
>>
>> 100baseT is 12.5MB/sec.
>>
>> --
>> Lamont Granquist ([EMAIL PROTECTED])
>> ICBM: 47 39'23"N 122 18'19"W
>
>That is the raw bit rate, right?
Sure, multiply by 8 and you get 100 Mb/s.
>Reserve something for framing, protocol overhead(s), an occasional
>retransmission, etc., and you're quite a bit slower. I still haven't
>seen too many systems whose sustained payload rate is more than half of
>the raw rate.
I've gotten >80 Mb/s of actual data transfer between an Alphaserver 4100
running DU4.0D and a 533MHz (non-DEC) Alpha running Linux. Just used a
short bit of code to open a socket (TCP) and shove data through as fast as
possible. Oh yeah, and they're both connected to a 3Com SSII 3300 which
was otherwise fairly unloaded.
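For reference, a back-of-envelope sketch of where that ceiling sits
(assuming full 1500-byte MTU frames, 40 bytes of TCP/IP headers, and 38
bytes of Ethernet preamble, header, FCS and inter-frame gap per frame;
retransmissions ignored):

    #include <stdio.h>

    int main(void)
    {
        double raw_MBps = 100.0 / 8.0;          /* 12.5 MB/s raw      */
        double payload  = 1500.0 - 20.0 - 20.0; /* TCP payload bytes  */
        double on_wire  = 1500.0 + 38.0;        /* bytes on the wire  */
        double ceiling  = raw_MBps * payload / on_wire;
        printf("payload ceiling ~ %.1f MB/s (~ %.0f Mb/s)\n",
               ceiling, ceiling * 8.0);  /* ~11.9 MB/s, ~95 Mb/s */
        return 0;
    }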
>But then, the 1 Gb/s Ethernet is just around the corner, even though not yet on
>every desktop. It all depends on the application.
Mmmmmm.... 1 Gb/s...
--
Lamont Granquist ([EMAIL PROTECTED])
ICBM: 47 39'23"N 122 18'19"W
------------------------------
From: [EMAIL PROTECTED] (Bryan G. Olson; CMSC (G))
Subject: Re: fast enough (sequel)
Date: 31 Mar 1999 01:19:23 GMT
[EMAIL PROTECTED] wrote:
: Why on earth should the algorithm work at 50mb/s on a PC? Unless you have
: specialized hardware or something...
First, fifty milli-bits a second is very slow. I'm not sure
whether you meant 50 MB/s or 50 Mb/s.
50MB/s is faster than most I/O interfaces, and I assume the
question is what the point is of out-running the I/O. The
answer is that encryption is rarely the only operation one
needs to do, even on the same data stream. If our encryption,
verification, and error correction each run at 50MB/s, the
chain of all three has a throughput of about 16.7MB/s, and we
haven't even parsed the data yet.
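The 16.7MB/s figure is just the harmonic composition of the per-stage
rates: on one processor the per-byte costs add. A minimal sketch:

    #include <stdio.h>

    int main(void)
    {
        /* Serial stages on one CPU: time per byte adds, so the chain
           rate is 1 / sum(1/rate_i). Equal 50MB/s stages give 50/3. */
        double rates[] = { 50.0, 50.0, 50.0 };  /* MB/s per stage */
        int k = sizeof rates / sizeof rates[0];
        double t = 0.0;
        for (int i = 0; i < k; i++)
            t += 1.0 / rates[i];
        printf("chain throughput = %.1f MB/s\n", 1.0 / t);  /* ~16.7 */
        return 0;
    }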
: I realize the algorithm should be efficient and fast, but on a PC I would
: think
: >= 2MB a sec is ok. Probably around 8MB a sec would be best.
Faster is always better. (Well, key-stretching is an arguable
exception but beside the point here.)
--Bryan
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************