Cryptography-Digest Digest #304, Volume #10 Thu, 23 Sep 99 18:13:03 EDT
Contents:
Re: msg for Dave Scott (Tom St Denis)
Re: frequency of prime numbers? ("Trevor Jackson, III")
Re: Mystery inc. (Beale cyphers) (Tim Tyler)
Re: Increasing password security dramatically without making it harder (Anton Stiglic)
Re: Increasing password security dramatically without making it harder (Eric Lee Green)
Re: Exclusive Or (XOR) Knapsacks (David Wagner)
Re: Exclusive Or (XOR) Knapsacks (d g)
Re: RSA weak? (Tom St Denis)
Re: Schrodinger's Cat and *really* good compression (Mok-Kong Shen)
Re: Comments on ECC ("Douglas A. Gwyn")
Re: Another bug RE: CryptAPI (Christopher Biow)
Re: frequency of prime numbers? ("Douglas A. Gwyn")
Re: frequency of prime numbers? ("karl malbrain")
Re: Decryption --Help!!! (Paul Koning)
----------------------------------------------------------------------------
From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: msg for Dave Scott
Date: Thu, 23 Sep 1999 19:21:24 GMT
In article <[EMAIL PROTECTED]>,
[EMAIL PROTECTED] (Johnny Bravo) wrote:
> On 23 Sep 1999 04:15:12 GMT, [EMAIL PROTECTED] (JPeschel) wrote:
>
> >Yes, Dave claims that the chaining modes and "short key" ciphers
> >are weak. It is, however, according to Dave, the NSA that can
> >break these.
> >
> >Which ciphers have you broken?
> >
> >Joe
>
> He's like Charles Booher, he thinks that all the crypto in the world, except
> for his own personal system, has already been broken by the NSA.
>
> Johnny Bravo
You're not talking about me, are you?
Tom
Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.
------------------------------
Date: Thu, 23 Sep 1999 16:38:27 -0400
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: frequency of prime numbers?
Douglas A. Gwyn wrote:
> "Trevor Jackson, III" wrote:
> > So truth is context-dependent. The issue that started this line of thought was
> > whether there can be unprovable false statements within a system. Now a trivial
> > line of thought is that the inverse of an unprovable true statement must be
> > unprovably false.
> > Here I'm using the slippery context-dependence by which provability is evaluated
> > within an axiomatic system (intrinsic property) while truth value is known from
> > outside the system (extrinsic property).
> > Is there a fundamental inequality that asserts unprovable true statements cannot
> > be negated?
>
> What *are* you talking about?
Read what I wrote and figure it out.
> Of *course* false statements cannot
> be proved, in any consistent system.
Sure they can. In this context "proved" does not mean "shown to be true"; it means
"truth value resolved", which applies to both possible truth values.
> I think you mean to discuss
> whether the *negation* of an arbitrary false statement can be
> proved within the system.
No, I meant to ask whether there were known-false statements whose falsity could not be
decided within the enclosing axiomatic system.
And I did.
> The answer is, of course, no. If G
> is a Goedel-undecidable (true) statement, then so is !!G, thus
> !G is a false statement whose inverse cannot be proved within the
> system.
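A compact restatement of the quoted argument, written out in LaTeX under the
assumption that the system S contains classical propositional logic (so S proves
!!G -> G):

    % Sketch: if G is true but unprovable in S, then \neg G is a false
    % statement whose negation \neg\neg G is also unprovable in S.
    \begin{align*}
      S \nvdash G \;\text{and}\; G \text{ true}
        &\;\Longrightarrow\; \neg G \text{ is false}\\
      S \vdash \neg\neg G \to G
        &\;\Longrightarrow\; (S \vdash \neg\neg G \Rightarrow S \vdash G)\\
        &\;\Longrightarrow\; S \nvdash \neg\neg G
    \end{align*}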
------------------------------
From: Tim Tyler <[EMAIL PROTECTED]>
Subject: Re: Mystery inc. (Beale cyphers)
Reply-To: [EMAIL PROTECTED]
Date: Thu, 23 Sep 1999 19:40:30 GMT
Roger Fleming <[EMAIL PROTECTED]> wrote:
>1. The quantity of treasure: some 1.3 metric tonnes of gold (value today,
>around $US 18 million) and 2.3 tonnes of silver (a little under half a
>million), plus gems (wildly guessing average CPI over the last 178 years at
>3%, somewhere around several million). This is a _huge_ hoard of treasure;
>quite enough to make the suckers salivate and run out and buy pamphlets.
FWIW, you can get hold of all three cyphertexts by getting hold of a
copy of Simon Singh's "The Code Book".
Alternatively all three codes are available at:
http://treasurehunt.miningco.com/bllet.htm?pid=2804&cob=home
--
__________
|im |yler The Mandala Centre http://www.mandala.co.uk/ [EMAIL PROTECTED]
End the pollution of cyberspace: destroy Microsoft.
------------------------------
From: Anton Stiglic <[EMAIL PROTECTED]>
Crossposted-To: alt.security.pgp,comp.security.pgp
Subject: Re: Increasing password security dramatically without making it harder
Date: Thu, 23 Sep 1999 16:41:31 -0400
"Thomas J. Boschloo" wrote:
> -----BEGIN PGP SIGNED MESSAGE-----
>
> Instead of hashing the whole pass phrase, you hash the pass phrase with
> some random data appended. I think I'll patent it! It's a great idea and
> it is funny nobody thought of it before.
Congratulations!!!! (duh)
You just re-invented a passphrase with salt system...
------------------------------
From: Eric Lee Green <[EMAIL PROTECTED]>
Crossposted-To: alt.security.pgp,comp.security.pgp
Subject: Re: Increasing password security dramatically without making it harder
Date: Thu, 23 Sep 1999 13:40:09 -0700
"Thomas J. Boschloo" wrote:
> -----BEGIN PGP SIGNED MESSAGE-----
> Instead of hashing the whole pass phrase, you hash the pass phrase with
> some random data appended. I think I'll patent it! It's a great idea and
> it is funny nobody thought of it before.
It's been thought of before. In fact, a paper published in 1978
presented the idea.
Here is a reference to the paper:
http://plan9.bell-labs.com/7thEdMan/vol2/password
Note that this describes the password file format for Unix Version 7.
The basic problem is that you are still susceptible to dictionary
attacks even after you've hashed in the extra random data, because you
have to send or store the random data somewhere so that you can get the
same hash when you try to validate the password. Robert Morris Sr. did
not think of that possibility when he invented this scheme because, on a
2 MHz PDP-11 16-bit minicomputer, the thought of cracking DES within a
human lifetime was laughable. Or maybe he did think of that (RM Sr. was
known to have close contact with the No Such Agency).
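To make that concrete, here is a minimal sketch of a salted-hash scheme (my own
illustration using SHA-256, not the actual V7 crypt routine). The salt has to be
stored next to the hash so the verifier can recompute it, which is exactly why a
per-account dictionary attack still works:

    # Minimal sketch of salted password hashing; the format and hash function
    # are illustrative choices, not the V7 Unix scheme.
    import hashlib, os

    def hash_password(passphrase, salt=None):
        if salt is None:
            salt = os.urandom(8)            # random salt, stored with the hash
        digest = hashlib.sha256(salt + passphrase.encode()).hexdigest()
        return salt.hex() + "$" + digest

    def check_password(passphrase, stored):
        salt_hex, digest = stored.split("$")
        salt = bytes.fromhex(salt_hex)
        return hashlib.sha256(salt + passphrase.encode()).hexdigest() == digest

    # The salt defeats precomputed tables shared across accounts, but the
    # stored record contains the salt, so guessing still works per account:
    stored = hash_password("correct horse")
    for guess in ["password", "letmein", "correct horse"]:
        if check_password(guess, stored):
            print("dictionary attack found:", guess)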
--
Eric Lee Green http://members.tripod.com/e_l_green
mail: [EMAIL PROTECTED]
There Is No Conspiracy
------------------------------
From: [EMAIL PROTECTED] (David Wagner)
Subject: Re: Exclusive Or (XOR) Knapsacks
Date: 23 Sep 1999 12:41:10 -0700
In article <[EMAIL PROTECTED]>,
Douglas A. Gwyn <[EMAIL PROTECTED]> wrote:
> David Wagner wrote:
> > I think you (and another poster) misread his question.
> > You're thinking of the case where the number of elements in the set
> > is _less_ than the length of the bitvectors, ...
>
> SVD works in any case.
>
> > Extra elements don't hurt you; Gaussian elimination still works.
>
> So long as there is no inconsistency in the input matrix.
I've been trying to puzzle out this post for a few days:
like those statements from a Greek oracle, it sounds really
interesting, but it's non-trivial to parse.
Would you mind giving a few brief clues to your meaning?
I'm no SVD expert, but I've seen it used, and I can't figure
out its application to this setting. How does an SVD help?
And what do you mean by inconsistency? It wasn't clear to
me that there was any notion of consistency in the problem
statement...
Thanks.
------------------------------
From: d g <[EMAIL PROTECTED]>
Subject: Re: Exclusive Or (XOR) Knapsacks
Date: 23 Sep 1999 14:27:52 -0700
[EMAIL PROTECTED] (David Wagner) writes:
> In article <[EMAIL PROTECTED]>,
> Douglas A. Gwyn <[EMAIL PROTECTED]> wrote:
> > David Wagner wrote:
> > > I think you (and another poster) misread his question.
> > > You're thinking of the case where the number of elements in the set
> > > is _less_ than the length of the bitvectors, ...
> >
> > SVD works in any case.
> >
> > > Extra elements don't hurt you; Gaussian elimination still works.
> >
> > So long as there is no inconsistency in the input matrix.
>
> I've been trying to puzzle out this post for a few days:
> like those statements from a Greek oracle, it sounds really
> interesting, but it's non-trivial to parse.
>
> Would you mind giving a few brief clues to your meaning?
>
> I'm no SVD expert, but I've seen it used, and I can't figure
> out its application to this setting. How does an SVD help?
I can't figure this out either --- given that we're talking about
linear forms over a finite field and not over R (or even Q), what does
SVD provide us?
> And what do you mean by inconsistency? It wasn't clear to
> me that there was any notion of consistency in the problem
> statement...
Perhaps he meant that the matrix may not be invertible? Or
ill-conditioned matrices? (again, I fail to see the relevance of
conditioning for the case under discussion: GL(F_p))
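For concreteness, here is a minimal sketch of what Gaussian elimination over
GF(2) looks like for this problem, under my reading of the question (find a
subset of given bitvectors whose XOR equals a target); the code and problem
statement are my own illustration:

    # XOR knapsack via Gaussian elimination over GF(2).
    # Bitvectors are Python ints; bit i of the int is coordinate i.
    def solve_xor_knapsack(vectors, target):
        basis = {}                       # leading bit -> (vector, combo mask)
        for i, v in enumerate(vectors):
            combo = 1 << i               # records which inputs are XORed into v
            while v:
                b = v.bit_length() - 1
                if b not in basis:
                    basis[b] = (v, combo)
                    break
                bv, bc = basis[b]
                v ^= bv
                combo ^= bc
        combo, t = 0, target             # now reduce the target against the basis
        while t:
            b = t.bit_length() - 1
            if b not in basis:
                return None              # target outside the span: no solution
            bv, bc = basis[b]
            t ^= bv
            combo ^= bc
        return [i for i in range(len(vectors)) if (combo >> i) & 1]

    # More vectors than bits just means dependent rows; they reduce to zero
    # during insertion and are ignored, so "extra elements don't hurt".
    print(solve_xor_knapsack([0b1010, 0b0110, 0b0011], 0b1001))   # -> [0, 2]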
Cheers,
Dipankar
[EMAIL PROTECTED]
------------------------------
From: Tom St Denis <[EMAIL PROTECTED]>
Subject: Re: RSA weak?
Date: Thu, 23 Sep 1999 21:23:49 GMT
In article <7scv8s$5er$[EMAIL PROTECTED]>,
"Kem" <[EMAIL PROTECTED]> wrote:
> Hello:
>
> I am new to all this crypto stuff, and maybe I am about to make a crazy
> claim. Please confirm this for me.....
>
> The RSA algorithm is based on generating a key from two very large prime
> numbers. Prime numbers have the quality of being divisible only by
> themselves and 1, but maybe two different (non-prime) numbers have the
> quality of generating the same key, and maybe it is possible to take
> advantage of this to break the algorithm with "little" changes....
> I know this is a silly claim, but it is only an idea. What's your
> opinion?
> Thx.
>
> P.S.: is there any good reference where the RSA algorithm is explained in
> "simple" words?
I have a copy of an RSA paper if you want.
BTW, if the two numbers p and q (n = pq) are prime, then n has only two prime
factors (p and q); it cannot be written as a product of any other primes.
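For a picture of where p and q enter, here is a toy "textbook RSA" sketch of my
own (tiny primes, no padding; purely illustrative, and pow(e, -1, phi) needs
Python 3.8+):

    # Toy textbook RSA: shows the role of the primes p and q. Do not use
    # numbers this small (or unpadded RSA at all) for real security.
    p, q = 61, 53
    n = p * q                    # public modulus; its prime factors are p and q
    phi = (p - 1) * (q - 1)
    e = 17                       # public exponent, coprime to phi
    d = pow(e, -1, phi)          # private exponent (modular inverse, Python 3.8+)

    m = 42                       # a message, 0 <= m < n
    c = pow(m, e, n)             # encrypt
    assert pow(c, d, n) == m     # decrypt recovers m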
Tom
Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.
------------------------------
From: Mok-Kong Shen <[EMAIL PROTECTED]>
Subject: Re: Schrodinger's Cat and *really* good compression
Date: Thu, 23 Sep 1999 22:03:32 +0200
Douglas A. Gwyn wrote:
>
> But Schroedinger's cat is neither; it is a "thought experiment"
> that one could readily *actually perform*. There is no analogy
> being drawn, nor is it a metaphor for anything.
Perhaps it should be more 'strongly' described as a theatrical play
according to a work that is not realistic. For if a cat is one
of our cats (of this real world and not a mathematically or
otherwise 'defined' cat) then we know from biology that at any
time point it is either alive or dead. Further, a real-life cat also
has eyes and hence also 'observes' even though its intelligence is
much lower than that of humans. So I am of the opinion that the whole
setup is just fundamentally unrealistic and thus the 'thought
experiment' is not a proper thought experiment that serves to save
some actual experimentation cost but a 'pseudo-experiment'. Well,
this is of course my layman's view and could be the result of my
misunderstanding and lack of knowledge and consequently could well
be sheer nonsense in the eyes of professionals.
M. K. Shen
------------------------------
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: Comments on ECC
Date: Thu, 23 Sep 1999 20:11:25 GMT
Jerry Coffin wrote:
> If ... somebody proves that P=NP, then being NP-complete
> doesn't mean much anymore, since it's then proven that all
> NP problems really have deterministic solutions in polynomial
> time.
Unless somebody also invents an algorithm for converting
any NP-complete problem into a P problem, knowing that
P=NP wouldn't be of any practical use.
> "Lucky guess" and "non-deterministic" basically mean the
> same thing...
No! In this context, nondeterministic refers to the automaton
model, which is allowed to occasionally branch "at random"
rather than according to a preset recipe.
The "lucky guess" approach is merely used for counterexample
purposes, to show that so long as verification of the answer
is cheap, there is no *guaranteed* lower bound to the
resource requirements of a problem for *all* inputs.
The fixed algorithm
if try(1) then success!
else loop forever
won't crack a general case of a problem, but if the answer
can sometimes be "1", then it does crack at least one special
case.
The reason I recently mentioned lucky-guessing is to point
out that the *standard* measures of difficulty are either
for a worst-case problem, or statistical over the universe
of problems. The significance of this observation may be
more apparent if you consider what happened to "knapsack"
encryption: although the general knapsack problem is
NP-complete, the actual knapsack-based cryptosystem was
broken with a vastly reduced work factor, because that
did not involve always solving a maximally-hard instance
of the knapsack problem.
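As a concrete illustration of the "cheap verification" point (my own example,
using the subset-sum form of the knapsack problem): checking a guessed solution
takes linear time, even though finding one is NP-hard in general.

    # Verifying a proposed subset-sum solution (the NP "certificate") is cheap.
    def verify_subset_sum(weights, target, subset):
        return sum(weights[i] for i in subset) == target

    weights = [267, 493, 869, 961, 1000, 1153, 1246, 1598, 1766, 1922]
    guess = [1, 4, 5, 9]                  # a "lucky guess" of indices
    print(verify_subset_sum(weights, 4568, guess))   # -> True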
Many years ago (before the knapsack system was cracked),
I was violently flamed in this newsgroup for saying that
complexity theory wasn't nearly as relevant to cryptology
as was generally believed at the time. Another reason I
gave then was that complexity theory primarily (at the time,
nearly exclusively) dealt with asymptotic limits as the
problem size approaches infinity, but no actual problem is
that large! The big-O approach of dropping constant
factors is very misleading in cases where those factors
dominate for all practical sizes.
------------------------------
From: Christopher Biow <[EMAIL PROTECTED]>
Crossposted-To: talk.politics.crypto
Subject: Re: Another bug RE: CryptAPI
Date: Thu, 23 Sep 1999 16:35:46 -0400
Eric Lee Green <[EMAIL PROTECTED]> wrote:
>Christopher Biow wrote:
>> Paul Mikesell <[EMAIL PROTECTED]> wrote:
>> >I'd like to point out that a non-dll implementation will only make
>> >the system more secure through obfuscation, which does not really make it
>> >more secure.
>> By analogy, passwords are security through obfuscation, but their use has
>> non-trivial effects upon security.
>Err, passwords are keys. No security system is secure if its keys are
>known.
The analogy is at best partial, as keys are generally not transmitted in a
clear channel, whereas passwords usually are. Good keys have 128 bits of
entropy; very few passwords do.
>You're saying that your car is insecure because if someone has the key,
>they can open the door and drive off in it. Not a good analogy at all...
I did not claim that passwords are "insecure". To the contrary, I was
attacking claims that leading desktop OS's can be meaningfully split among
the categories "secure" and "insecure". I stated that both passwords and
closed-source code have non-trivial effects upon security, in some respects
providing analogous enhancement of security over other methods.
>Basically: a good encryption algorithm is secure whether its details are
>known or not...
Of course that is true. It is also aside from the original subject of this
thread, which concerns not encryption algorithms, but rather the protection
of crypto-related code from malicious alteration. In the absence of
theoretical security comparable to that of a good cipher, closed-source
code without an exposed, intermediate interface makes it much more
expensive to conduct such an attack. Admittedly, the analogy is inexact, as
a good password must be either sniffed or attacked brute-force, whereas
object code can be decompiled and analyzed.
>Because you can bet that attackers will discover the details eventually...
Actually, before there was good reason to place great trust in the
algorithms (analogous to our present lack of meaningful system security), a
great deal of effort went into preserving the obscurity of the algorithms.
For Enigma, that obscurity was the only effective protection. Without
access to a machine (i.e. the algorithm), Enigma would not have been
cracked. With the machine in hand, cracking was expensive but possible.
If, someday, we have operating systems (and attached operators) with
security and procedures as good as those of modern cryptographic systems,
then that will be clearly better than "security through obscurity." But we
are a very long way from that OS technology, especially in the context of
the systems that run MS CryptoAPI. Had it been feasible, monolithic code
would have significantly raised the cost of attacking that API over the
system with exposed DLL interfaces on both ends.
Back to the closed/open source comparison: Note that it was the leaking of
source-code symbol information which allowed the identification of the "NSA
key" vulnerability. Recovering that information from the object code (to
the point of noting the vulnerability) would have been much more expensive.
Recovering the semantics provided by "NSA key" from the object code would
probably not have been possible.
Open source is essential for verifying the strength of highly secure
cryptographic algorithms and the quality of their implementation, both of
which are reasonably verifiable by inspection of that code. By contrast,
open source operating system code for probably insecure systems is a
double-edged sword. It results in *greater* vulnerability by at least some
measures (e.g. the amount of time during 1999 for which the average,
moderately-maintained desktop system had unpatched, well-known remote
root/admin exploits, comparing NT with *nix).
------------------------------
From: "Douglas A. Gwyn" <[EMAIL PROTECTED]>
Subject: Re: frequency of prime numbers?
Date: Thu, 23 Sep 1999 20:21:37 GMT
Patrick Juola wrote:
> An unprovable false statement is simply a false statement which is not
> provably so.
But you're mixing levels. There is proof within the (axiomatic)
system, and metaproof about statements of the system. When you
say that a false statement is not provably so, you're talking
about metaproof. I took the original poster's "unprovable false
statement" to be talking about proof within the system. The
correct term for the metaproperty would be "unprovably false".
------------------------------
Reply-To: "karl malbrain" <[EMAIL PROTECTED]>
From: "karl malbrain" <[EMAIL PROTECTED]>
Subject: Re: frequency of prime numbers?
Date: Thu, 23 Sep 1999 14:36:02 -0700
Trevor Jackson, III <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
>
>
> Douglas A. Gwyn wrote:
(...)
> > I think you mean to discuss
> > whether the *negation* of an arbitrary false statement can be
> > proved within the system.
>
> No, I meant to ask whether there were known-false statements whose falsity
> could not be decided within the enclosing axiomatic system.
>
> And I did.
>
> > The answer is, of course, no. If G
> > is a Goedel-undecidable (true) statement, then so is !!G, thus
> > !G is a false statement whose inverse cannot be proved within the
> > system.
It's time to ANTE up guys. No, you cannot DIFFERENTIATE anything from
ZEROES, you just get more zeroes. Have the COMPILER emit a warning message
for( !! ) if you must, else dismiss as a SINGULARITY at run-time. Karl M
------------------------------
From: Paul Koning <[EMAIL PROTECTED]>
Subject: Re: Decryption --Help!!!
Date: Thu, 23 Sep 1999 17:21:39 -0400
[EMAIL PROTECTED] wrote:
>
> Anybody out there know where I can find a trigram table -- you know, a
> listing of the frequencies of three-letter sequences in English text (ENT,
> etc.)?
Not online, but... get "Cryptanalysis" by Helen Fouche Gaines
(Dover). A small paperback originally published in the 1930s.
It has a collection of statistics in the back, including a trigram
frequency table as far as I remember.
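If an online table can't be found, rough figures can be tallied from any
reasonably large English text; a minimal sketch (the filename is a placeholder):

    # Count trigram frequencies in a plain-text corpus.
    from collections import Counter

    text = open("corpus.txt").read().upper()
    letters = [c for c in text if c.isalpha()]
    trigrams = Counter("".join(letters[i:i+3]) for i in range(len(letters) - 2))
    for tri, count in trigrams.most_common(20):
        print(tri, count)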
paul
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************