Cryptography-Digest Digest #123, Volume #11 Mon, 14 Feb 00 19:13:01 EST
Contents:
Re: Basic Crypto Question 3 (John Savard)
Re: Basic Crypto Question 3 (John Savard)
Re: Odp: Odp: New encryption algorithm ABC. (John Savard)
Re: Using Gray Codes to help crack DES (Peter Trei)
Re: Has some already created a DATA DIODE? (Terry Ritter)
Re: New standart for encryption software. ("Trevor Jackson, III")
Re: Large Floating Point Library? ([EMAIL PROTECTED])
Newbie - Determining encryption Bit Level ("Brian Bosh")
Re: parece que me colge (Shawn Willden)
Re: Which compression is best? (SCOTT19U.ZIP_GUY)
Re: Basic Crypto Question 3 (David Wagner)
Re: Which compression is best? (SCOTT19U.ZIP_GUY)
Re: Has some already created a DATA DIODE? (John Savard)
Re: Guaranteed Public Key Exchanges (Darren New)
----------------------------------------------------------------------------
From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Basic Crypto Question 3
Date: Mon, 14 Feb 2000 15:16:47 GMT
"Douglas A. Gwyn" <[EMAIL PROTECTED]> wrote, in part:
>Bruce Schneier wrote:
>> First, understand that you can't mathematically prove anything more
>> than: a cascade of block ciphers is as strong as the weakest block
>> cipher in the cascade.
>I don't think even that is provable in general.
True, but _if_ you can rule out all interactions (there are no back
channels available to the ciphers, all the keys are independent, and
_none of the ciphers is allowed to expand the message_ - if IVs are
allowed as an exception to this, strict conditions have to be placed
on how they're used - and compressing plaintext, then expanding
afterwards, is still expansion) then the cascade is at least as strong
as the _strongest_ cipher in the cascade.
I haven't seen an explicit statement of the required conditions, but I
would class them as "intuitively obvious".
Conditions for safe cipher cascade:
- none of the ciphers being cascaded is in the form of a "black box"
or object code; the algorithm being used for each step is known and
validated
- no cipher is given any information about the keys supplied to the
other ciphers
- any compression of plaintext takes place prior to the first cipher
in the cascade, and is independent of the ciphers
- if a cipher uses a message-unique random value, such as an IV
(initialization vector), message indicator, or "salt", this value is
taken from an external trusted random source and supplied to the
cipher
- other than the IV noted above, the cipher must not lengthen messages
submitted to it; where the input is a random n-bit message, the output
must also be a random sequence of n bits (the equivalent condition
applying to non-binary encryption, _pace_ W. T. Shaw)
Any I missed? (question, of course, directed to those permitted to
answer)
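For concreteness, here is a toy sketch in C of the shape these
conditions allow - two length-preserving ciphers applied in sequence
under independent keys. The "ciphers" are throwaway keystream XORs
with no security value; only the structure is the point.

#include <stddef.h>
#include <stdint.h>

/* Toy keystream cipher: a keyed LCG XORed onto the buffer.
   Illustrative only - NOT secure. */
static void toy_cipher(uint8_t *buf, size_t n, uint32_t key)
{
    uint32_t s = key;
    for (size_t i = 0; i < n; i++) {
        s = s * 1103515245u + 12345u;
        buf[i] ^= (uint8_t)(s >> 24);
    }
}

/* Cipher 1 under k1, then cipher 2 under an independent k2.
   Output length equals input length: no message expansion, and
   neither stage sees the other's key. */
void cascade_encrypt(uint8_t *buf, size_t n, uint32_t k1, uint32_t k2)
{
    toy_cipher(buf, n, k1);
    toy_cipher(buf, n, k2);
}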
John Savard (jsavard<at>ecn<dot>ab<dot>ca)
http://www.ecn.ab.ca/~jsavard/crypto.htm
------------------------------
From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Basic Crypto Question 3
Date: Mon, 14 Feb 2000 15:18:07 GMT
[EMAIL PROTECTED] (David Wagner) wrote, in part:
>Ahh, you'll want to read Biham's papers.
>He has serious attacks on inner feedback.
>(Isn't this in Applied Cryptography?)
The existence of the papers and attacks is noted there, yes.
John Savard (jsavard<at>ecn<dot>ab<dot>ca)
http://www.ecn.ab.ca/~jsavard/crypto.htm
------------------------------
From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Odp: Odp: New encryption algorithm ABC.
Date: Mon, 14 Feb 2000 15:19:30 GMT
"Bogdan Tomaszewski" <[EMAIL PROTECTED]> wrote, in part:
>My new encryption algorithm is the absolutely random function.
Is that a new encryption algorithm? The one-time-pad has been around
for quite a while. And it is, indeed, secure.
John Savard (jsavard<at>ecn<dot>ab<dot>ca)
http://www.ecn.ab.ca/~jsavard/crypto.htm
------------------------------
From: [EMAIL PROTECTED] (Peter Trei)
Subject: Re: Using Gray Codes to help crack DES
Date: 14 Feb 2000 17:22:40 -0500
In article <[EMAIL PROTECTED]>,
John Savard <[EMAIL PROTECTED]> wrote:
>A while back I noted that if one uses Gray code for the analogue
>values in a biometric, keeping only the error-checking part of an
>error-correcting code on file allows the information in a biometric
>to be used as an encryption key, despite the inescapable problem of
>analogue values occasionally crossing a threshold no matter how
>coarse the grid used. (Instead of just storing the full key, and
>making it available based on a pass/fail test.)
>
>Here is another cryptographic use of Gray codes:
>
>Let us suppose one has a table of 56 sets of 16 48-bit masks,
>corresponding to how each bit in the DES key propagates through the
>subkeys of DES. This speeds up using a new key, since one doesn't have
>to repeatedly use Permuted Choice II.
>
>By cycling through the 56-bit DES keys in Gray code order, one can
>ensure that one has to do only *one* set of 16 XORs to proceed from
>one key being tried to the next key to test.
>
>Doubtless someone else has already thought of, and used, this
>particular optimization.
>
>John Savard (teneerf <-)
>http://www.ecn.ab.ca/~jsavard/index.html
Back when I was working on the original DES cracker (deskr) for the
first RSA challenge (long before I came to work at RSA Security),
I invented this method. Perry Metzger improved it by suggesting
searching keys in Gray code order. This doubled the speed, since
only one bit changed for each key, instead of an average of two.
Together, these changes sped up the generation of new subkey
tables more than 100-fold over the 'standard' method. I posted
a description to cypherpunks (and it got onto sci.crypt) at the
time. It became standard for all non-parallel DES key searchers.
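For anyone curious, a rough sketch in C of the combined method (the
names are mine, and delta[][] is assumed to have been filled in
beforehand by running the key schedule with each key bit set alone -
the schedule only moves key bits around, so it is linear over XOR):

#include <stdint.h>

static uint64_t delta[56][16]; /* assumed precomputed: effect of key
                                  bit b on 48-bit round subkey r */

void search(uint64_t nkeys, int (*try_key)(const uint64_t sk[16]))
{
    uint64_t subkey[16] = {0}; /* schedule of the all-zero key */
    for (uint64_t i = 1; i < nkeys; i++) {
        /* the key now under test is gray(i) = i ^ (i >> 1);
           successive Gray codes differ in exactly one bit, the
           lowest set bit of i, so stepping costs just 16 XORs */
        int b = __builtin_ctzll(i); /* GCC builtin */
        for (int r = 0; r < 16; r++)
            subkey[r] ^= delta[b][r];
        if (try_key(subkey))
            return;
    }
}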
Peter Trei
------------------------------
From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Has some already created a DATA DIODE?
Date: Mon, 14 Feb 2000 22:34:02 GMT
On Mon, 14 Feb 2000 13:04:27 GMT, in
<[EMAIL PROTECTED]>, in sci.crypt
[EMAIL PROTECTED] (John Savard) wrote:
>[...]
>LFSRs and LCGs are insecure for the same reason, both being linear, so
>I don't think it matters which one you use for MacLaren-Marsaglia.
Right. Any weak generator may well be attacked at its point of
weakness because the M-M combiner leaks information.
>The literature references to MacLaren-Marsaglia generators being
>cracked involved short-period LCGs, where the entire output, including
>the LSBs, with periods down to two, was used in the output stream, so
>I don't think they can be used to support a conclusion that the
>technique ought to be abandoned.
The first system in the literature did use two LCGs with an M-M
combiner. But if the LCGs had been thought strong, there would have
been no need for a combiner. The point of the combiner, then, was to
add strength. It failed.
The *reason* the system failed was because the M-M combiner leaks
information. The attacker can use that information in a whole range
of attacks, not just the one or two given in the literature. Avoiding
those particular attacks simply means that some other attack will have
to be devised. But there is no reason to suspect such an attack to be
impossible, because THE M-M COMBINER LEAKS INFORMATION!
One alternative is to just use strong generators. But if we have
strong generators we don't need the M-M combiner.
>However, I would recommend, at a
>minimum, a technique like this: have _two_ MacLaren-Marsaglia
>generators, and use the XOR of their output as input to a _third_
>buffer.
But what does M-M bring to the party? Shall we simply XOR two LCG's
and claim that system is strong? If we have a weak system and XOR it
with another weak system, what do we expect to get?
>I have little fear that this kind of technique will suddenly be proven
>insecure.
Then I think you need to develop more fear.
>However, the PRNGs used for the three buffers must have
>rel-prime periods.
---
Terry Ritter [EMAIL PROTECTED] http://www.io.com/~ritter/
Crypto Glossary http://www.io.com/~ritter/GLOSSARY.HTM
------------------------------
Date: Mon, 14 Feb 2000 17:46:25 -0500
From: "Trevor Jackson, III" <[EMAIL PROTECTED]>
Subject: Re: New standart for encryption software.
"Albert P. Belle Isle" wrote:
> On Sun, 13 Feb 2000 23:34:55 -0500, "Trevor Jackson, III"
> <[EMAIL PROTECTED]> wrote:
>
> >>
> >> Although our source code is available for review under NDA, any
> >> INFOSEC professional knows that spiking cryptosystem implementations
> >> at the object code level is a much greater threat than "backdoors"
> >> spelled-out in well-documented source code. Hence, the emphasis on
> >> testing performance of the cryptosystem, rather than trusting pretty
> >> source code listings.
> >>
> >> (Of course, that doesn't seem to inhibit the calls by sci.crypt
> >> posters to "show me the source code." Any professional spiker would be
> >> all too happy to get the resulting "seal of approval" <g>.)
> >
> >You have mixed (possibly confused) two distinct problems that haunt
> >software offered by untrusted implementors. First, and unquestionably
> >foremost, is the threat of incompetence. An implementor may design a weak
> >cipher, or poorly implement a strong cipher, or perfectly implement a
> >strong cipher but overlook a security weakness in some supporting aspect
> >of the software. Source code inspection -- peer review -- addresses these
> >kinds of threats.
> >
>
> Perhaps you do, but I know of no "black bag jobs" that involved
> replacing source code. I also don't have access to all of MSFT's
> source code but, again, perhaps you do.
>
> If you re-read the first line of the above quote from my original
> posting, or the past four days' worth of subsequent postings in which I
> clearly restated our belief in the _necessity_ of source code review,
> I fail to understand how my insistence on their not being _sufficient_
> for INFOSEC against professional attackers could be construed as the
> straw man you're attacking - i.e., discouragement of source code
> reviews by qualified reviewers.
While you have since made it clear that you consider source code review
necessary to security, your post as quoted above does nothing but ridicule the
concept of source code review. I believe this explains why I, among a number
of others, interpreted your position to be that source code review was
unnecessary.
>
>
> I certainly don't discourage the use of seat belts, but as I always
> told my children, they won't protect you against all hazards. Can that
> be somehow construed as my offering an inducement to ignore them?
Certainly not. But in the context of the messages to which you were
replying, it certainly carried that connotation.
>
> >The second kind of threat is that of a malicious vendor who purposefully
> >implements a weakness or a back door. This is a dramatically smaller
> >threat. And, BTW, one that source code review _does_ reduce, because it
> >is quite hard to hide such a back door from an inspector able to recreate
> >the binary. Given the same tools the binaries should be close to
> >indistinguishable. And a debug script that works on one ought to produce
> >the same log when applied to the other. So even patched binaries are not
> >hard to uncover.
> >
>
> With your carefully stated qualifying "givens," I'd agree that a
> single, crudely-spiked executable file _could_ be caught out (if you
> add the proviso that it be inspected on a trusted system with all
> compilers, linkers, debuggers, report generators and other tools
> included in the evaluation of what constitutes a TCB).
That's reasonable. You inspect on your system and I'll inspect on mine. ;-)
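Concretely, the check I have in mind is no more than this (a toy
sketch; the file names are made up, and in practice you would first
mask build timestamps and the like):

#include <stdio.h>

/* Returns 1 if the two files have identical contents, else 0. */
static int same_file(const char *a, const char *b)
{
    FILE *fa = fopen(a, "rb"), *fb = fopen(b, "rb");
    int same = (fa != NULL && fb != NULL);
    while (same) {
        int ca = fgetc(fa), cb = fgetc(fb);
        same = (ca == cb);
        if (ca == EOF) /* both streams ended together */
            break;
    }
    if (fa) fclose(fa);
    if (fb) fclose(fb);
    return same;
}

int main(void)
{
    /* rebuild from the reviewed source, then compare */
    puts(same_file("shipped.exe", "rebuilt.exe") ? "match" : "MISMATCH");
    return 0;
}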
>
>
> However, large file sets, installed to multiple directories (running
> as both applications and services), with the possibility of chained
> aliasing between them, can present a more challenging "INsecurity
> through obscurity," to coin a phrase.
I suspect this problem is exaggerated. Yes, testing a big app is hard -
not because the needle is so carefully camouflaged, but because you've
got acres of hay in which to hide it. (Soon Micros~1 will use acres to
measure the amount of CD surface area they need to ship their software
service packs.) However, given that I build such a system piecemeal,
and that QA testing tools can now reliably measure source code
coverage, I do not find the problem unresolvable.
>
>
> Spiking supposedly standard OS function libraries (MFC*.DLLs, for
> instance), whose accompanying debug (.MAP) files are always "updated"
> along with them, could give such statements about the ease of spiking
> detection a rather embarrassing quality.
True, but then you've changed the whole thesis under discussion. Rather than
spiking an application, you'd be using the application install as a cover for
inserting a trojan horse. While certainly a threat, it bears no resemblance to
your original claims re patching executables.
> Patching and using KERNEL32.DLL's IsDebuggerPresent() function affords
> some interesting possibilities on NT platforms, and there's always
> that old favorite ReadProcessMemory().
You can wax lyrical about what will happen when you get a hostile application
installed on a system. But you have not addressed the issue of getting it
there, past a thorough source code review.
------------------------------
From: [EMAIL PROTECTED]
Subject: Re: Large Floating Point Library?
Date: Mon, 14 Feb 2000 22:27:13 GMT
In article <[EMAIL PROTECTED]>,
Runu Knips <[EMAIL PROTECTED]> wrote:
> Mok-Kong Shen wrote:
> > Clockwork wrote:
> > > There are numerous large integer libraries, but does anyone know
> > > of a large floating point library?
> >
> > http://cse.eng.lmu.edu/~acad/personal/faculty/dmsmith/FMLIB.html
>
> That is a broken link. Even ~acad doesn't exist!!
Try:
http://www.lmu.edu/acad/personal/faculty/dmsmith2/FMLIB.html
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: "Brian Bosh" <[EMAIL PROTECTED]>
Subject: Newbie - Determining encryption Bit Level
Date: Mon, 14 Feb 2000 16:10:44 -0700
How do you determine what the encryption bit level is?
--
Brian
http://becubed.cjb.net
------------------------------
Date: Mon, 14 Feb 2000 16:13:57 -0700
From: Shawn Willden <[EMAIL PROTECTED]>
Subject: Re: parece que me colge
"Matias C. Szmulewiez" wrote:
> How are you, love.. ahhh I miss you, so here I am back again bothering
> you :) as you'll see I got hooked up again here at sinectis... and
> well.. but I'm definitely out of here before 5 : )
> Hey call me ok... ahhh and you know I love you a lot no??? well know it.. you're
[...]
So, is there some steganographic message hidden in the misspellings, dropped
syllables, varied line endings and odd punctuation? Or is this *really* a
mispost from a semi-illiterate Argentine in love? This being sci.crypt, we
may never know...
:-)
Shawn
------------------------------
From: SCOTT19U.ZIP_GUY <[EMAIL PROTECTED]>
Subject: Re: Which compression is best?
Date: Mon, 14 Feb 2000 23:12:14 GMT
In article <[EMAIL PROTECTED]>,
Jerry Coffin <[EMAIL PROTECTED]> wrote:
> In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] says...
>
> There seems to be a LOT of discussion that's either unrelated to the
> original subject, or else is simply being done at cross-purposes, or
> nit-picking about language used rather than anything substantive about
> the real effects of compression on security. As such, it seemed to me
> more profitable to try to summarize my opinions rather than continue a
> discussion that's mostly going nowhere.
>
> First of all, WRT Dave Scott's compression, there are really two
> separate areas that are being intermixed rather freely, despite being
> more or less unrelated:
>
> 1) All-or-nothing transformations.
> 2) 1-1 compression.
>
> I believe all-or-nothing transformations, such as what David Scott gets
> by running compression in both directions, can be of some use, as long
> as you're a great deal more concerned with ensuring against interception
> than you are with ensuring that the intended recipient receives at least
> a partial message. I think it's essentially up to an individual user to
> decide whether that applies to his/her case or not, but I'm certain
> that in MANY situations, the reverse is really true.
>
> I'm convinced that 1-1 compression rarely accomplishes anything useful
> at all. Its primary claim to fame is making it more difficult to say
It is too bad you have such a short-sighted view of the future.
Your writing ability far exceeds mine, just like the earlier pompous
assholes who thought we would never get to the moon or that noble gas
compounds were impossible. History, my friend, will prove you're an ass.
> whether a particular trial decryption is valid or not. Some forms of
> compression allow the rejection of certain keys based on the structure
> of the file produced, while leaving others as possible outputs. With a 1-
> 1 compressor, every decryption will produce an output that can be
> decompressed, but statistical analysis of that decompressed output
> suffices to reject the vast majority of keys, at least under normal
> circumstances. In theory, it's barely possible to imagine
> transmitting data with so little internal structure that it really
> would be difficult or impossible to analyze and use to reject
> incorrect keys; in that case, 1-1 compression is worth considering. I
> believe this is such a rare situation as to merit no consideration
> beyond noting the theoretical possibility of its existence (and even
> that's giving it more time than it REALLY deserves).
You seem not to see the obvious. Yes, if the user knows
the characteristics of the data being compressed, then he can
check the decrypted file to see if those characteristics are there.
But this is true with any compression. The thing you can't seem
to get into your pea brain is that with non-1-1 compression many more
files can be rejected before one needs to look for the known
characteristics of the file that was compressed. In fact, for most
files I would guess inferior compression would lead to only one correct
solution with non-1-1 compression, but for some reason you don't
care and seem too stupid to understand the obvious.
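To put the disputed step in code (a sketch only - the gzip magic-byte
test stands in here for whatever structure a non-1-1 format leaves
behind):

#include <stddef.h>
#include <stdint.h>

/* With a non-1-1 compressor, most wrong trial keys decrypt to bytes
   that are not even a syntactically valid compressed stream, so the
   attacker throws them out before running any statistics. Example:
   a gzip stream must begin with the magic bytes 1f 8b. */
static int could_be_gzip(const uint8_t *buf, size_t len)
{
    return len >= 2 && buf[0] == 0x1f && buf[1] == 0x8b;
}

/* With a 1-1 compressor there is no such shortcut: every byte string
   decompresses to *something*, so every key survives to the far more
   expensive statistical test on the decompressed output. */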
>
> At the present time, forms of compression that are not 1-1 provide
> substantially better compression than those that are. Though it's not
> guaranteed, I believe this is unlikely to change. I believe the
> restrictions placed on compression to make it 1-1 limit the
> possibilities on compression too much to hope for it to match other
> forms any time soon.
Well, let's at least see what happens in the next few years, because I
believe this last statement is pure SHIT. You haven't the foggiest
notion of how hard it is to add this to other compression schemes.
>
> A completely unrelated benefit of compression (i.e., unrelated to those
> mentioned above) is simply reducing the amount of text encrypted with
> a particular key. WRT this benefit, the only relevant criterion is
> compression ratio. In theory, as has been mentioned previously,
> producing truly optimal compression ratios would more or less force
> the algorithm to be 1-1 as well. Despite this, the best algorithms
> known are neither 1-1, nor of a nature that appears likely to be
> amenable to conversion to 1-1.
The only glimmer of hope in this is that you seem to have the
brains to understand that truly optimal compression must have
this 1-1 property. But you wrongly think that just because the
people who design compression seem to have given this area little
thought, it is unreachable. Well, I have more respect for the
human mind than you. Either you want research to stop in this area
because the NSA might not want it out, or you think you know the
limits of what others can do, and I doubt you're that bright.
...snip - the rest is just the same crap over and over, with little thought
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
http://members.xoom.com/ecil/index.htm
NOTE EMAIL address is for SPAMERS
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: [EMAIL PROTECTED] (David Wagner)
Subject: Re: Basic Crypto Question 3
Date: 14 Feb 2000 15:23:40 -0800
In article <[EMAIL PROTECTED]>,
John Savard <[EMAIL PROTECTED]> wrote:
> True, but _if_ you can rule out all interactions (there are no back
> channels available to the ciphers, all the keys are independent, and
> _none of the ciphers is allowed to expand the message_ - if IVs are
> allowed as an exception to this, strict conditions have to be placed
> on how they're used - and compressing plaintext, then expanding
> afterwards, is still expansion) then the cascade is at least as strong
> as the _strongest_ cipher in the cascade.
>
> I haven't seen an explicit statement of the required conditions, but I
> would class them as "intuitively obvious".
Then you need to read the reference I posted.
It gives an example of a pair of ciphers with no back channels,
independent keys, no message expansion, etc., yet the cascade
can be as weak as the weaker of the two. The example is under
a probable-plaintext attack model (you know the ciphertext, you
know the distribution on the plaintext, nothing else).
The example shows that cascades are trickier than you might expect,
and that there is a definite need for precise, formal, rigorous
reasoning here, lest we be swayed by misleading intuition.
------------------------------
From: SCOTT19U.ZIP_GUY <[EMAIL PROTECTED]>
Subject: Re: Which compression is best?
Date: Mon, 14 Feb 2000 23:34:17 GMT
In article <[EMAIL PROTECTED]>,
Runu Knips <[EMAIL PROTECTED]> wrote:
> [EMAIL PROTECTED] wrote:
> > As I've read here, it's good to compress before you encrypt the
> > data.
>
> Not everything you can read here is true. This idea in
> particular is totally wrong.
>
> > Now I've got 2 questions about this:
> > 1) From a security perspective, how important is compression?
>
> If you compress your data before encrypting, the encrypted data has
> a known structure which can, for example, be tested more easily in a
> brute-force attack, and maybe helps the attacker in other ways,
> too.
Compression can be very important from a security perspective,
but it is not something the phoney crypto gods want people to learn
about, since the art of encryption is still very much kept in the dark.
It was easier for the Chinese to steal our latest nuclear weapons
than to get the darker secrets on crypto from the NSA. Our government
considers crypto very important so we can spy on everybody. Even
the French are starting to wake up to the fact that we steal business
secrets from them so we can stay a superpower. We steal, make treaties,
and pretend to be honest. But I think, since we have such a corrupt,
immoral crook for a president, Europe may wake up to the fact that we
read and know just about every dirty secret they try to hide. My point
is that anything that could weaken the NSA's absolute control over
breaking information is suppressed at all costs. 1-1 compression should
have been common for years, but it is not. WHY??
>
> > Is prior
> > compression just a kind of "weak enhancement", or is it considered an
> > integral part of the encryption process as a whole?
>
> It has nothing to do with encrypting. You compress to save space
> and you encrypt to make information secret.
>
> Every little change in an encryption algorithm, especially "little
> improvements" like compression, will almost always weaken security.
>
BULL SHIT
use 1-1 compression
if you use compression at all
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE
http://www.jim.com/jamesd/Kong/scott19u.zip
http://members.xoom.com/ecil/index.htm
NOTE EMAIL address is for SPAMERS
Sent via Deja.com http://www.deja.com/
Before you buy.
------------------------------
From: [EMAIL PROTECTED] (John Savard)
Subject: Re: Has some already created a DATA DIODE?
Date: Mon, 14 Feb 2000 16:45:19 GMT
[EMAIL PROTECTED] (Terry Ritter) wrote, in part:
>On Mon, 14 Feb 2000 13:04:27 GMT, in
><[EMAIL PROTECTED]>, in sci.crypt
>[EMAIL PROTECTED] (John Savard) wrote:
>>The literature references to MacLaren-Marsaglia generators being
>>cracked involved short-period LCGs, where the entire output, including
>>the LSBs, with periods down to two, was used in the output stream, so
>>I don't think they can be used to support a conclusion that the
>>technique ought to be abandoned.
>The first system in the literature did use two LCGs with an M-M
>combiner. But if the LCGs had been thought strong, there would have
>been no need for a combiner. The point of the combiner, then, was to
>add strength. It failed.
No; the attack given in the literature was not as trivial as the one
against a single LCG. So it did add strength. Since a super-weak LCG
was used - all of the output, not just the most significant bits - the
attack was possible.
>The *reason* the system failed was because the M-M combiner leaks
>information. The attacker can use that information in a whole range
>of attacks, not just the one or two given in the literature. Avoiding
>those particular attacks simply means that some other attack will have
>to be devised. But there is no reason to suspect such an attack to be
>impossible, because THE M-M COMBINER LEAKS INFORMATION!
It is true that the buffer, unlike the one in DynSub or alleged RC4,
doesn't consist of all 256 values uniformly. Is this what you mean?
If so, the cure is obvious: use a big buffer.
>One alternative is to just use strong generators. But if we have
>strong generators we don't need the M-M combiner.
Ah, but can you be sure the generator is strong?
>>However, I would recommend, at a
>>minimum, a technique like this: have _two_ MacLaren-Marsaglia
>>generators, and use the XOR of their output as input to a _third_
>>buffer.
>But what does M-M bring to the party? Shall we simply XOR two LCG's
>and claim that system is strong? If we have a weak system and XOR it
>with another weak system, what do we expect to get?
XOR two LCGs, and you get an LCG. This is not true about an M-M
generator. Suddenly, all the leaked information disappears: each byte
of output could be produced by many different combinations of input
from the generators themselves.
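For reference, a minimal sketch in C of the MacLaren-Marsaglia
construction under discussion (the LCG constants are common textbook
ones, chosen purely for illustration):

#include <stdint.h>

#define BUF 256

static uint32_t sa = 12345, sb = 67890; /* illustrative seeds */
static uint32_t buf[BUF];

/* two ordinary LCGs */
static uint32_t lcg_a(void) { return sa = sa * 1664525u + 1013904223u; }
static uint32_t lcg_b(void) { return sb = sb * 22695477u + 1u; }

void mm_init(void)
{
    for (int i = 0; i < BUF; i++)
        buf[i] = lcg_a(); /* generator A fills the buffer */
}

/* Generator B picks which buffered value to emit; generator A refills
   the slot. An observer of the output does not learn which slot was
   read, nor when that slot was last written. */
uint32_t mm_next(void)
{
    uint32_t i = lcg_b() >> 24; /* top 8 bits -> index 0..255 */
    uint32_t out = buf[i];
    buf[i] = lcg_a();
    return out;
}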
John Savard (jsavard<at>ecn<dot>ab<dot>ca)
http://www.ecn.ab.ca/~jsavard/crypto.htm
------------------------------
From: Darren New <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: Guaranteed Public Key Exchanges
Date: Tue, 15 Feb 2000 00:07:14 GMT
No Brainer wrote:
> OK Guys, now you're really being paranoid :)
Not at all. My example is exactly what you are asking about.
> I see your point though...all I want to do is exchange public keys over the
> Internet...that can't be too hard could it?
No. It's very easy. You just email them. The question is "who does this
public key belong to?" That's the hard one to ask. Because you have no way
of knowing who is *supposed* to read the email, as you specified in your
problem.
> Let's assume that the e-mail address I have in my hand is the right contact
> point for the person (whom I may have or may have not seen).
What do you mean by "contact point"? That's the crux of the problem. Let's
say "[EMAIL PROTECTED]" is the email address you have. What does that tell you
about Joe? Nothing. You can't communicate with Joe. You can't trust Joe. All
you can communicate with is that email address. What do you mean by "contact
point"? If ...
1) You trust anyone who can read the mail sent to [EMAIL PROTECTED] Then yes,
send that address a message, ask them to encrypt it with the private key,
and return it with the public key. *By definition*, the man in the middle is
trusted, because he can read the mail sent to [EMAIL PROTECTED]
2) You know that mail delivered to [EMAIL PROTECTED] is read only by Joe,
because he's across the hall from you at the NSA or something and therefore
there can't be a man in the middle. Simple. Ask him for the key.
Define precisely what you are asking in terms of the email address and the
public keys. Forget about the people behind them, which you say you don't
know anyway. Then ask the question again. You'll see it's almost impossible
to even ask the question that way in a way that addresses your concerns.
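To put option (1) in code - a toy sketch, textbook RSA with tiny
numbers, purely illustrative. Note what the check proves: only that
whoever reads that mailbox holds the matching private key. It says
nothing about Joe.

#include <stdint.h>
#include <stdio.h>

/* modular exponentiation: b^e mod n (fine for tiny toy numbers) */
static uint64_t powmod(uint64_t b, uint64_t e, uint64_t n)
{
    uint64_t r = 1;
    b %= n;
    while (e) {
        if (e & 1) r = r * b % n;
        b = b * b % n;
        e >>= 1;
    }
    return r;
}

int main(void)
{
    uint64_t n = 3233, e = 17, d = 2753;      /* toy key: n = 61 * 53 */
    uint64_t challenge = 123;                 /* the nonce we mailed out */
    uint64_t reply = powmod(challenge, d, n); /* done at the far end */
    /* verify against the public key (n, e) they sent back: */
    printf("mailbox holds the private key: %s\n",
           powmod(reply, e, n) == challenge ? "yes" : "no");
    return 0;
}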
--
Darren New / Senior MTS / Invisible Worlds Inc.
San Diego, CA, USA (PST). Cryptokeys on demand.
There is no safety in disarming only the fearful.
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list (and sci.crypt) via:
Internet: [EMAIL PROTECTED]
End of Cryptography-Digest Digest
******************************