Re: Maybe It's Snake Oil All the Way Down

2003-06-08 Thread Jaap-Henk Hoepman
I thought the 3G (UMTS) cellphones at least were going to use reasonably good
crypto; don't know about the overall security architecture though.

Jaap-Henk 

On Fri, 06 Jun 2003 14:30:04 -0400 Ian Grigg <[EMAIL PROTECTED]> writes:
> John Kelsey wrote:
>
>> So, what can I do about it, as an individual?  Make the cellphone companies
>> build good crypto into their systems?  Any ideas how to do that?
>
> Nope.  Cellphone companies are big slow moving
> targets.  They get their franchise from the
> government.  If the NSA wants weak crypto, they
> do weak crypto.

-- 
Jaap-Henk Hoepman   |  I've got sunshine in my pockets
Dept. of Computer Science   |  Brought it back to spray the day
University of Nijmegen  |Gry "Rocket"
(w) www.cs.kun.nl/~jhh  |  (m) [EMAIL PROTECTED]
(t) +31 24 36 52710/531532  |  (f) +31 24 3653137



Re: Maybe It's Snake Oil All the Way Down

2003-06-08 Thread Frederick Hirsch
Rich Salz wrote:

> Perhaps a few "best practices" papers are in order.  They might help
> the secure (distributed) computing field a great deal.
> /r$

The new book, Practical Cryptography, by Niels Ferguson and
Bruce Schneier, is useful.
regards,

Frederick



Re: Maybe It's Snake Oil All the Way Down

2003-06-08 Thread Anne & Lynn Wheeler
At 04:42 PM 6/4/2003 -0700, Eric Rescorla wrote:
>Nonsense. One can simply cache the certificate, exactly as
>one does with SSH. In fact, Mozilla at least does exactly
>this if you tell it to. The reason that this is uncommon
>is because the environments where HTTPS is used
>are generally spontaneous and therefore certificate caching
>is less useful.


there are actually two scenarios ... one is to pre-cache it (so that its 
transmission never actually has to happen) and the other is to compress it 
to zero bytes. detailed discussion of certificate pre-caching and 
certificate zero byte compression:
http://www.garlic.com/~lynn/ansiepay.htm#aadsnwi2

the typical use for HTTPS for e-commerce is to hide the account number on 
its way to the financial institution. for the most part the merchant is 
primarily interested in the response from the consumer's financial 
institution on whether or not the merchant gets paid. If it weren't for the 
associated business processes, the merchant could get by with never knowing 
anything at all about the consumer (the merchant just passes the account 
number on ... and gets back what they are really interested in ... the 
notification from the bank that they will get paid).

So a HTTPS type solution is that the consumer pre-caches their bank's 
certificate (when they establish a bank account) ... and they transmit the 
account number "hidden" using the bank's public key. This happens to pass 
thru the merchant's processing ... but for purposes of the authorization, 
the merchant never really has to see it. The protocol would have to handle 
minor issues like replay attacks ... and could be done in a single round 
trip ... w/o all the SSL protocol chatter. Actually, it isn't so much 
pre-caching their bank's certificate ... as loading their bank's public key 
into a table ... analogous to the way CA public keys are loaded into tables 
(aka using out-of-band processing ... the convention that they may be 
self-signed and encoded in a certificate format is an anomaly of available 
software and in no way implies a PKI). The primary purpose of HTTPS hasn't 
been to have a secure channel with the merchant; the primary purpose of 
HTTPS is to try and hide the consumer's account number as it makes its way 
to the consumer's financial institution.
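
as a rough sketch of the consumer-side step in that pre-cached-key model 
(not the actual protocol ... a real one would also need the replay 
protection and single-round-trip framing mentioned above), using the 
python "cryptography" package; the key file name and account number 
below are made-up placeholders:

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

# the consumer already holds the bank's public key (pre-cached when the
# account was opened); the file name is a placeholder
bank_pub = serialization.load_pem_public_key(
    open("bank_pubkey.pem", "rb").read())

account_number = b"4000123412341234"   # illustrative value only
blob = bank_pub.encrypt(
    account_number,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)
# 'blob' passes thru the merchant unchanged; only the bank's private
# key can recover the account number, so the merchant never sees it

the merchant just relays the opaque blob and waits for the paid/not-paid 
answer from the bank.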

The other solution is the X9.59 standard (preserve the integrity of the 
financial infrastructure for all electronic retail payments, not just 
internet, not just POS, not just credit, ALL: credit, debit, stored value, 
etc) that creates authenticated transactions and account numbers that can 
only be used in authenticated transactions. The consumer's public key is 
registered in their bank account (out-of-band process, again no PKI). X9.59 
transactions are signed and transmitted. Since the account number can only 
be used in authenticated transactions ... it changes from needing 
encryption to hide the value as part of a shared-secret paradigm to purely 
a paradigm that supports integrity and authentication. As in the above 
scenario, the merchant passes the value thru on its way to the consumer's 
financial institution, and is focused on getting the approved/disapproved 
answer back about whether they will be paid. As in the bank HTTPS scenario 
where the bank's public key is pre-cached at the consumer, pre-caching the 
consumer's public key at the bank involves no PKI business processes 
(although there may be some similarities in that the transport of the 
public key involves encoding in a certificate-defined format).  misc. x9.59 
refs:
http://www.garlic.com/~lynn/index.html#x959
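
a minimal illustration of the authenticated-transaction idea (this is not 
the X9.59 wire format, just the sign/verify shape of it, using ed25519 
from the python "cryptography" package; the transaction string is 
invented):

from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
)

consumer_key = Ed25519PrivateKey.generate()    # held by the consumer
registered_pub = consumer_key.public_key()     # on file in the bank account

transaction = b"acct=12345678;amount=49.95;currency=USD;merchant=example"
signature = consumer_key.sign(transaction)

# ... transaction + signature flow thru the merchant to the bank ...
registered_pub.verify(signature, transaction)  # raises InvalidSignature if tampered

since the signature, not the secrecy of the account number, is what 
authorizes the transaction, harvesting the account number buys an attacker 
nothing.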

Both pre-caching solutions are between the business entities that are 
directly involved; the consumer and the consumer's financial institution 
based on having an established business relationship.

The invention of PKI was primarily to address the issue of an event between 
two parties that had no prior business relationship, possibly weren't going 
to have any future business relationship, and would conclude their business 
relying on some mutual trust in the integrity of a 3rd party w/o actually 
having to resort to an online environment. The e-commerce scenario is that 
there is a real-time, online transaction with the trusted 3rd party (the 
consumer's financial institution) involving a prior business relationship. 
This negates the basic original assumptions about the environment that 
PKIs are targeted at addressing.
--
Anne & Lynn Wheeler   http://www.garlic.com/~lynn/
Internet trivia 20th anv http://www.garlic.com/~lynn/rfcietff.htm



Re: Maybe It's Snake Oil All the Way Down

2003-06-08 Thread Ian Grigg
John Kelsey wrote:

> So, what can I do about it, as an individual?  Make the cellphone companies
> build good crypto into their systems?  Any ideas how to do that?

Nope.  Cellphone companies are big slow moving
targets.  They get their franchise from the
government.  If the NSA wants weak crypto, they
do weak crypto.

There is literally no point in hoping the cell
phone company - or any large franchise holder -
will help you in your fight against big brother.

OTOH, what you can do is argue for reasonable
crypto.

(Similar to GSM's.  That is hard to attack;
there is AFAIR no 'trivial' attack, you have to
get access to the SIM or you have to probe the
phone with another phone over a period of hours.
I.e., the attacker leaves tracks, and he does so
in a way that will move him on to another mode
of tapping, such as purchasing a straight listening
device.)

Now, it seems that the US standards didn't get
even that.  There's definitely a case for arguing
for better crypto in the US.  And, market forces
and all that, one would think that this would
happen in due course.

But arguing for strong crypto end-to-end - save
your breath.

John Kelsey (paraphrased):
> The only way I can see getting decent security [in any application] is to do
> something that doesn't require the rest of the world's permission or
> assistance.

(I edited the above to broaden the assertion!)

Opportunistic crypto - that which uses the tools
immediately available and delivers crypto that
is the best available right now - is the only
crypto that will work for *you* the user in any
application.  Anything that defers security off
to some external party has a result of slowing
or killing the application, or delivering less
or no security than if you'd gone ahead in the
first place.

This isn't saying anything new.  It's the Internet,
after all.  On the Internet, one doesn't ask for
permission to participate.  That's no accident,
it's a core reason it arose.  Any protocol
that has a step of "now ask for permission" is,
IMHO, breaking one of the major principles of the
Internet.

> ... I
> have an old Comsec 3DES phone at home.  It's nice technology.  I think I've
> used it twice.  If you're not a cryptographer or a cocaine smuggler, you
> probably don't know anyone who owns an encrypting phone or would
> particularly want to.  Even if you'd like to improve your own privacy, you
> can't buy an end-to-end encrypting phone and improve it much.  That's what
> I'd like to see change.

I guess there's no reason why you couldn't load
up speakfreely on a custom Unix box with a flashed
OS, put in the USB headset, and sell it as an end
to end encrypting phone.  The software's all free,
a cheap machine is $300 at Walmart, some enterprising
crypto guy could ship out a network appliance for
$500.

(Or, put it in a PDA that's got the right hooks?)

Half the price of your old Comsec, wasn't it selling
for $1000?

-- 
iang



Re: Maybe It's Snake Oil All the Way Down

2003-06-08 Thread Tim Dierks
At 10:09 PM 6/4/2003, James A. Donald wrote:
>Eric Rescorla
> > Nonsense. One can simply cache the certificate, exactly as
> > one does with SSH. In fact, Mozilla at least does exactly
> > this if you tell it to. The reason that this is uncommon is
> > because the environments where HTTPS is used are generally
> > spontaneous and therefore certificate caching is less useful.
>Certificate caching is not the problem that needs solving.  The
>problem is all this spam attempting to fool people into logging
>in to fake BofA websites and fake e-gold websites, to steal
>their passwords or credit card numbers

I don't think this problem is easier to solve (or at least I sure don't 
know how to solve it). It seems to me that you could tell a user every time 
they go to a new site that it's a new site, and hope that users would 
recognize that e-g0ld.com shouldn't be "new", since they've been there 
before. However, people go to a large enough number of sites that they'd be 
seeing the "new" alert all the time, which leads me to believe that it 
wouldn't be taken seriously.

Fundamentally, making sure that people's perception of the identity of a 
web site matches the true identity of the web site has a technical 
component that is, at most, a small fraction of the problem and solution. 
Most of it is the social question of what it means for the identity to 
match and the UI problem of determining the user's intent (hard one, that), 
and/or allowing the user to easily and reliably match their intent against 
the "reality" of the true "identity".

Any problem that has as a component the fact that the glyphs for 
"lower-case L" and "one" look pretty similar isn't going to be easy to 
solve technologically.
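
A toy illustration of that glyph problem (the substitution table and the 
"known sites" list are invented; real homograph detection over Unicode 
confusables is considerably harder):

LOOKALIKES = str.maketrans({"0": "o", "1": "l", "5": "s", "3": "e"})
KNOWN_SITES = {"e-gold.com", "bankofamerica.com"}

def looks_like_known_site(host):
    """Return the known site this host imitates, or None."""
    normalized = host.lower().translate(LOOKALIKES)
    if host.lower() in KNOWN_SITES:
        return None          # genuinely a site the user already knows
    if normalized in KNOWN_SITES:
        return normalized    # suspiciously similar to one
    return None

print(looks_like_known_site("e-g0ld.com"))   # -> 'e-gold.com', likely a fake

Even this only helps against the handful of substitutions you thought to 
list, which is rather the point.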

 - Tim



Re: Maybe It's Snake Oil All the Way Down

2003-06-08 Thread Eric Rescorla
[EMAIL PROTECTED] (Peter Gutmann) writes:

> Bodo Moeller <[EMAIL PROTECTED]> writes:
> 
> >Using an explicit state machine helps to get code suitable for multiplexing
> >within a single thread various connections using non-blocking I/O.
> 
> Is there some specific advantage here, or is it an academic exercise?  Some
> quirk of supporting certain types of hardware like nCipher boxes that do async
> crypto/scatter-gather?
I've had to do this in environments where threads weren't a viable
option. See, for instance, my paper from USENIX Security 2002.

-Ekr
-- 
[Eric Rescorla   [EMAIL PROTECTED]
http://www.rtfm.com/



Re: Maybe It's Snake Oil All the Way Down

2003-06-07 Thread Anne & Lynn Wheeler
At 04:24 PM 6/6/2003 -0700, James A. Donald wrote:

>I don't think so.

??? public key registered in place of shared-secret?

NACHA debit trials using digitally signed transactions did it with both 
software keys as well as hardware tokens.
http://internetcouncil.nacha.org/News/news.html
in the above, scroll down to July 23, 2001 ... it has a pointer to the detailed report.

X9.59 straightforwardly establishes it as a standard ... with some activity 
moving on to ISO
http://www.garlic.com/~lynn/index.html#x959

the pk-init draft for kerberos specifies that a public key can be 
registered in place of a shared secret.

the following has a demo of it with radius, with public keys registered in 
place of shared-secrets.
http://www.asuretee.com/
the radius implementation has been done by a number of people.

in all of these cases, there is change in the business process and/or 
business relationship ... but it doesn't introduce totally unrelated 
parties to the business activities. there is digital signing on the 
sender's side (actually a subset of existing PKI technology) and digital 
signature verification on the receiver's side (again a subset of existing 
PKI technology). To the extent that there is impact on existing business 
process ... it is like in the case of introducing x9.59 authentication for 
credit transactions that have relatively little authentication currently 
... and as a result would eliminate a major portion of the existing credit 
card transaction related fraud.

The big issue isn't the availability of the technology ... although there 
is a slight nit in the asuretee case being FIPS186-2, ecdsa ... and having 
support in CAPI and related infrastructures. Its not working (easily) is 
like when my wife and I were doing the original payment gateway ... with 
this little client/server startup in menlo park (later moved to mountain 
view and since bought by AOL) and people saying that SSL didn't exist ... 
misc refs from the past:
http://www.garlic.com/~lynn/aadsm5.htm#asrn2
http://www.garlic.com/~lynn/aadsm5.htm#asrn3

--
Anne & Lynn Wheeler   http://www.garlic.com/~lynn/
Internet trivia 20th anv http://www.garlic.com/~lynn/rfcietff.htm



Re: Maybe It's Snake Oil All the Way Down

2003-06-07 Thread James A. Donald
--
On 4 Jun 2003 at 20:58, Anne & Lynn Wheeler wrote:
> it is relatively trivial to demonstrate that public keys can
> be registered in every business process that currently
> registers shared- secrets (pins, passwords, radius, kerberos,
> etc, etc)

I don't think so.

Suppose that e-gold, to prevent this sea of spam trying to get
people to login to fake e-gold sites, wanted people to use
public keys instead of shared secrets, making your secret key
the instrument that controls the account instead of your shared
password.

They could not do this using the standard IE webbrowser.  They
would have to get users to download a custom client, or at
least, like hushmail, a custom control inside IE.

HTTPS assumes that the certificate shall be blessed by the
administrator out of band, and has no mechanism for using a
private key to establish that a user is simply the same user as
last time. 

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 q1a1Whb1YeRws7qoDm6h15qfDstFHciUyP2I4fte
 42lCFXf0IqXfh5Mz2mFtznxv6N40EuqpKvQJhLBgS



Re: Maybe It's Snake Oil All the Way Down

2003-06-07 Thread James A. Donald
--
On 7 Jun 2003 at 19:05, Dave Howe wrote:
> issuing certs to someone is trivial from both a server and a 
> user endpoint - the user just gets a "click here to request 
> your key" and hits ok on a few dialog boxes; the server 
> simply hosts some pretty off-the-shelf cgi.
>[...]
> its surprisingly reliable and easy - particuarly if your end 
> users are just using the MS keystore, which requires them to 
> do no more than double-click the pkcs file and hit "next" a 
> few times.

This sounds more like what I was looking for.

Probably someone has already pointed out the url to this, but 
if they did, when I looked at it I was snowed under by 
verisign oriented shit, which assumes a large budget and ample 
administrator time for face to face contact with certified 
people, a very small number of clients, some hours of work by
each client, a manual, user training, etc, and failed to grasp
it.

Could you point me somewhere that illustrates server issued 
certs, certification with zero administrator overhead and small 
end user overhead?

Also, I have many times heard that public key operations were 
surprisingly easy, and have been key administrator for several 
companies, and have unfailingly found that I was the only 
person capable of doing these operations at that company. 

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 v6gZFuZoUgyGH55ME+JoilJSfw5LrufrbWWB454U
 4FhiB65yyXwp1RgeJrLADfEYBoqz0YAch8fJ0Fisp



Re: Maybe It's Snake Oil All the Way Down

2003-06-07 Thread Ian Grigg
Derek asks the pertinent question:
> The question is:  how do we convince M$ and Netscape to include something
> else in their software?  If it's not supported in IE, then it wont be
> available to the vast majority of users out there.

My view, again, IMHO:  ignore Microsoft.  Concentrate
on the open source solutions:  KDE, Mozilla, Apache.

These groups will always lead in security, because
they are not twisted by institutional conflicts;
they can examine the historical security model from the
point of view of interested professionals, rather
than commercial actors trying to preserve this or
that revenue stream.

The trick is to understand whether HTTPS as it
currently is can be improved.  If it can, then
those above guys can do it.

Once the improvements are shown to work, Microsoft
will follow along.  They are a follower company,
not an innovator, and they need to see it work in
practice before doing anything.  As Derek suggests,
the vast majority of users will have to wait.

Along those lines, there's one piece of excellent
news:

Eric Rescorla wrote:
> One can simply cache the certificate, exactly as
> one does with SSH. In fact, Mozilla at least does exactly
> this if you tell it to.

That's fantastic!  I never knew that.  How does one
set that option on Mozilla?  (I'm using 5.0 / 1.3.1.)

-- 
iang



Re: Maybe It's Snake Oil All the Way Down

2003-06-07 Thread Anonymous Sender
James A. Donald writes:
> Suppose the e-gold, to prevent this sea of spam trying to get 
> people to login to fake e-gold sites, wanted people to use 
> public keys instead of shared secrets, making your secret key 
> the instrument that controls the account instead of your shared 
> password. 
>
> They could not do this using the standard IE webbrowser. They 
> would have to get users to download a custom client, or at 
> least, like hushmail, a custom control inside IE. 

Why do you say that?  You were already given pointers to how they
could configure their web servers to use certificate based client
authentication.  These techniques work with standard browsers.  I have
used Netscape to access corporate-internal sites which required me to
have a client certificate.
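
For what it's worth, the server side of that is a few lines of
configuration; an illustrative Apache mod_ssl fragment (file paths are
placeholders) that demands a client certificate issued by the site's own
CA for one protected area:

SSLCACertificateFile /etc/ssl/egold-client-ca.pem
<Location /account>
    SSLVerifyClient require
    SSLVerifyDepth  2
</Location>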

> HTTPS assumes that the certificate shall be blessed by the 
> administrator out of band, and has no mechanism for using a 
> private key to establish that a user is simply the same user as 
> last time.

HTTPS is just HTTP over SSL/TLS.  It says nothing about the trust model
for the certificates; it merely specifies how each side can deliver its
cert(s) to the other side.  Deciding which ones to trust is out of scope
for TLS or HTTPS.

E-Gold could set things up to allow its customers to authenticate with
certs issued by Verisign, or with considerably more work it could even
issue certs itself that could be used for customer authentication.
Why doesn't it do so?  Well, it's a lot of work, and it would have some
disadvantages - for one thing, customers would have difficulty accessing
their accounts from multiple sites, like at home and at work.  Further,
it would require customers to use some features of their browser that most
of them have never seen, which is going to be difficult and error-prone
for most users.



Re: Maybe It's Snake Oil All the Way Down

2003-06-07 Thread Dave Howe
Anonymous Sender wrote:
> James A. Donald writes:
> E-Gold could set things up to allow its customers to authenticate with
> certs issued by Verisign, or with considerably more work it could even
> issue certs itself that could be used for customer authentication.
> Why doesn't it do so?  Well, it's a lot of work,
Nope. issuing certs to someone is trivial from both a server and a user
endpoint - the user just gets a "click here to request your key" and hits ok
on a few dialog boxes; the server simply hosts some pretty off-the-shelf
cgi.

> and it would have
> some disadvantages - for one thing, customers would have difficulty
> accessing their accounts from multiple sites, like at home and at
> work.
Not so much that as a much bigger security issue. Maintaining keys
securely would then become a task for the client, and while keeping a
written password secret is something most people can handle the concept of,
keeping a block of computer data safe from random trojans while exporting it
to be transported between machines is much, much harder.
Of course, you *could* generate the key entirely on the server, deliver it
over a protected HTTPS download, and protect it with the end user's
password (not sure how secure the PKCS#12 password is - if it isn't, then use
some self-decoding exe like the 7z one) but that still wouldn't stop the
end user from just hitting "import" and having it stored insecurely on their
client machine.
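
A sketch of that server-side flow with the python "cryptography" package
(the CA files, user name, and password are placeholders, and a real
deployment would add key-usage extensions and so on): generate the user's
key on the server, sign a certificate with the site's own CA key, and hand
back a password-protected pkcs12 bundle the user just imports.

import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives.serialization import pkcs12

ca_key = serialization.load_pem_private_key(
    open("ca.key", "rb").read(), password=None)
ca_cert = x509.load_pem_x509_certificate(open("ca.crt", "rb").read())

user_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
subject = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, u"customer-1234")])

user_cert = (
    x509.CertificateBuilder()
    .subject_name(subject)
    .issuer_name(ca_cert.subject)
    .public_key(user_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.datetime.utcnow())
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=365))
    .sign(ca_key, hashes.SHA256())
)

# bundle key + cert into a password-protected pkcs12 file, so the client
# side really is just "double-click and hit next"
p12 = pkcs12.serialize_key_and_certificates(
    b"customer-1234", user_key, user_cert, [ca_cert],
    serialization.BestAvailableEncryption(b"user-chosen-password"),
)
open("customer-1234.p12", "wb").write(p12)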

> Further,
> it would require customers to use some features of their browser that
> most of them have never seen, which is going to be difficult and
> error-prone for most users.
its surprisingly reliable and easy - particularly if your end users are just
using the MS keystore, which requires them to do no more than double-click
the pkcs file and hit "next" a few times.



Re: Maybe It's Snake Oil All the Way Down

2003-06-07 Thread t . c . jones
my site has one.
ca0.net
.tom
> --
> On 7 Jun 2003 at 19:05, Dave Howe wrote:
> > issuing certs to someone is trivial from both a server and a 
> > user endpoint - the user just gets a "click here to request 
> > your key" and hits ok on a few dialog boxes; the server 
> > simply hosts some pretty off-the-shelf cgi.
> >[...]
> > its surprisingly reliable and easy - particuarly if your end 
> > users are just using the MS keystore, which requires them to 
> > do no more than double-click the pkcs file and hit "next" a 
> > few times.
> 
> This sounds more like what I was looking for.
> 
> Probably someone has already pointed out the url to this, but 
> if they did, when I looked at it I was snowed under by 
> verisign oriented shit, which assumes a large budget and ample 
> administrator time for face to face contact with certified 
> people, a very small number of clients, some hours of work by
> each client, a manual, user training, etc, and failed to grasp
> it.
> 
> Could you point me somewhere that illustrates server issued 
> certs, certification with zero administrator overhead and small 
> end user overhead?
> 
> Also, I have many times heard that public key operations were 
> surprisingly easy, and have been key administrator for several 
> companies, and have unfailingly found that I was the only 
> person capable of doing these operations at that company. 
> 
> --digsig
>  James A. Donald
>  6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
>  v6gZFuZoUgyGH55ME+JoilJSfw5LrufrbWWB454U
>  4FhiB65yyXwp1RgeJrLADfEYBoqz0YAch8fJ0Fisp
> 
> 
> -
> The Cryptography Mailing List
> Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]



Re: Maybe It's Snake Oil All the Way Down

2003-06-07 Thread James A. Donald
--
James A. Donald:
> > Certificate caching is not the problem that needs solving.
> > The problem is all this spam attempting to fool people into
> > logging in to fake BofA websites and fake e-gold websites,
> > to steal their passwords or credit card numbers

On 6 Jun 2003 at 15:04, Tim Dierks wrote:
> I don't think this problem is easier to solve (or at least I
> sure don't know how to solve it).

It is a hard problem with many well known solutions, none of
which have to my knowledge been implemented in HTTPS.  For
example one can use SPEKE, in which case setting up the account
involves sharing (or issuing) a password, but logging in to the
account does not require one to reveal the password to the site
where one is logging in.   In this case the fake website would
gain no useful information by luring the user to login to it.

The most HTTPS-like solution would be to generate a keyfile
containing a private key and a self-signed certificate on one's
computer, and whenever one hits the website, it would do the HTTPS
handshake to log you in to that website's account for the public key
corresponding to your private key; however, HTTPS does not seem
to directly support this model.   In this case the bogus web
site could log you in, but this would not leak any information
that would enable the operators of the bogus web site to login
to the real web site. 
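
The raw mechanics of presenting such a locally generated credential do
exist in stock TLS stacks; what is missing, as noted, is any server-side
convention for treating it as "the same user as last time".  A python
sketch of the client side (host and file names are hypothetical):

import socket, ssl

ctx = ssl.create_default_context()
# a key and self-signed certificate generated locally, once
ctx.load_cert_chain(certfile="my_selfsigned.crt", keyfile="my_selfsigned.key")

with socket.create_connection(("bank.example", 443)) as raw:
    with ctx.wrap_socket(raw, server_hostname="bank.example") as tls:
        # the server sees the same public key on every visit, *if* it has
        # been configured to request a client certificate at all
        tls.sendall(b"GET /login HTTP/1.0\r\nHost: bank.example\r\n\r\n")
        print(tls.recv(4096))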

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 /JhekrYM+sQCMQKXhiWzhB3RnOv6PZROgxYwprXj
 4LHJfuGlcn7fO4tcfo20/t0cdEy/HyK++XiBVvMFy



Re: Maybe It's Snake Oil All the Way Down

2003-06-07 Thread Harmon Seaver
On Fri, Jun 06, 2003 at 06:08:34PM -0400, Ian Grigg wrote:
> > Derek asks the pertinent question:
> > The question is:  how do we convince M$ and Netscape to include something
> > else in their software?  If it's not supported in IE, then it wont be
> > available to the vast majority of users out there.
> 
> My view, again, IMHO:  ignore Microsoft.  Concentrate
> on the open source solutions:  KDE, Mozilla, Apache.

   Mozilla already has a pretty neat interface to gnupg, called Enigmail. See 
http://enigmail.mozdev.org/

-- 
Harmon Seaver   
CyberShamanix
http://www.cybershamanix.com



Re: Maybe It's Snake Oil All the Way Down

2003-06-07 Thread Dave Howe
James A. Donald wrote:
> Could you point me somewhere that illustates server issued
> certs, certification with zero administrator overhead and small
> end user overhead?
Been a while since I played with it, but IIRC OpenCA (www.openca.org) is a
full implementation of a CA, in perl cgi, with no admin intervention
required.  Obviously, that involves browser-based key generation.
If you want server-based key generation, then take a look at
http://symlabs.com/Net_SSLeay/smime.html

If you are iis/asp rather than perl, then there are activex components that
will give you access to x509 certificates - EBCrypt is probably the easiest,
but there is an activex wrapper for cryptlib too, iirc.



Re: Maybe It's Snake Oil All the Way Down

2003-06-07 Thread Peter Gutmann
Derek Atkins <[EMAIL PROTECTED]> writes:

>Actually, the ASN.1 part is a major factor in the X.509 interoperability
>problems.  Different cert vendors include different extensions, or different
>encodings.  They put different information into different parts of the
>certificate (or indeed the same information into different parts).  Does the
>FQDN for a server cert belong in the DN or some extension?  What about the
>email address for a user cert?

That doesn't really have anything to do with ASN.1 though.  You can make just
as big a mess with XML (actually even bigger, in my experience), or EDIFACT,
or whatever.  The problem isn't the bit-bagging format, it's that it's
accumulated such a mass of cruft that no two people can agree on what to put
in there.  Whether the resulting mess is wrapped in ASN.1 or XML or EDIFACT or
plastic pooper scooper bags doesn't really make any difference.

Peter.



Re: Maybe It's Snake Oil All the Way Down

2003-06-06 Thread Derek Atkins
Eric Rescorla <[EMAIL PROTECTED]> writes:

> This isn't really true in the SSL case:
> To a first order, everyone ignores any extensions (except sometimes
> the constraints) and uses the CN for the DNS name of the server.

Except some CAs make certs that can only work as an SSL server and not
an SSL client, or don't work with certain verifiers, or can't be
parsed right, or have the "commit-bit" set on some extensions.  It's
been a major pain in a problem that I'm working on -- not all vendors'
certs work properly.
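
One concrete example of the kind of restriction involved: a quick check of
a cert's extended key usage with the python "cryptography" package (the
file name is a placeholder; the extension may also simply be absent, in
which case get_extension_for_class raises ExtensionNotFound):

from cryptography import x509
from cryptography.x509.oid import ExtendedKeyUsageOID

cert = x509.load_pem_x509_certificate(open("vendor_cert.pem", "rb").read())
eku = cert.extensions.get_extension_for_class(x509.ExtendedKeyUsage).value
print("serverAuth:", ExtendedKeyUsageOID.SERVER_AUTH in eku)
print("clientAuth:", ExtendedKeyUsageOID.CLIENT_AUTH in eku)

A cert listing serverAuth but not clientAuth is exactly the "only works as
an SSL server" case.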

> -Ekr

-derek

-- 
   Derek Atkins
   Computer and Internet Security Consultant
   [EMAIL PROTECTED] www.ihtfp.com



Re: Maybe It's Snake Oil All the Way Down

2003-06-06 Thread Adam Shostack
On Wed, Jun 04, 2003 at 07:15:13PM -0400, John Kelsey wrote:
| At 03:50 PM 6/3/03 -0700, Eric Blossom wrote:
| ...
| >GSM and CDMA phones come with the crypto enabled.  The crypto's good
| >enough to keep out your neighbor (unless he's one of us) but if you're
| >that paranoid, you should opt for the end-to-end solution.  The CDMA
| >stuff (IS-95) is pretty broken: *linear* crypto function, takes 1
| >second worst case to gather data sufficient to solve 42 equations in
| >42 unknowns, but again, what's your threat model?  Big brother and
| >company are going to get you at the base station...
| 
| Big brother has a limited budget, just like the rest of us.  If he has to 
| produce a warrant or tap a wire somewhere to listen in on me, he probably 
| won't bother.
| 
| The only thing protecting my cellphone calls right now is trivially-broken 
| encryption, the need for some moderately expensive equipment, and some laws 
| prohibiting cellphone eavesdropping.  That means that some bad guys may be 
| eavesdropping now, and there's no telling how many bad guys will be doing 
| so tomorrow.  Nobody here knows how much eavesdropping is being done, 

More bad guys will be listening tomorrow, because SDR and Moore's law
will drive down the cost.  At some point, we'll hit a knee in the
curve, and cell phones will be either made more secure, or we'll live
with the fact that all our calls are being listened to, much like the
Brits are always on video.

Adam

-- 
"It is seldom that liberty of any kind is lost all at once."
   -Hume



Re: Maybe It's Snake Oil All the Way Down

2003-06-06 Thread John Kelsey
At 03:50 PM 6/3/03 -0700, Eric Blossom wrote:
...
>GSM and CDMA phones come with the crypto enabled.  The crypto's good
>enough to keep out your neighbor (unless he's one of us) but if you're
>that paranoid, you should opt for the end-to-end solution.  The CDMA
>stuff (IS-95) is pretty broken: *linear* crypto function, takes 1
>second worst case to gather data sufficient to solve 42 equations in
>42 unknowns, but again, what's your threat model?  Big brother and
>company are going to get you at the base station...
Big brother has a limited budget, just like the rest of us.  If he has to 
produce a warrant or tap a wire somewhere to listen in on me, he probably 
won't bother.

The only thing protecting my cellphone calls right now is trivially-broken 
encryption, the need for some moderately expensive equipment, and some laws 
prohibiting cellphone eavesdropping.  That means that some bad guys may be 
eavesdropping now, and there's no telling how many bad guys will be doing 
so tomorrow.  Nobody here knows how much eavesdropping is being done, 
because communications intercepts can be done without leaving any record 
anywhere.  Do the police in some cities troll for interesting cellphone 
calls?  Does the NSA do that in the US, quietly?  Do Russian or French 
intelligence agencies?  How would we know?

So, what can I do about it, as an individual?  Make the cellphone companies 
build good crypto into their systems?  Any ideas how to do that?

The only way I can see getting decent security on my cellphone is to do 
something that doesn't require the rest of the world's permission or 
assistance.  The simplest version of that is to have a box at my house 
that's connected to two phone lines, and have all calls to and from my 
cellphone go through that box.  Calls to other secure cellphones can be 
encrypted end-to-end.  Calls to everyone else get encrypted between my 
phone and my box at home.  I spend a little extra for extra security, 
nobody else has to pay anything, and I can call friends on my cellphone 
without being susceptible to trivial eavesdropping.

Can the bad guys defeat this?  Sure, they can tap my landline, or bug my 
car, or do all sorts of other things.  But none of those are cheap enough 
to do to everyone, and probably none are cheap enough to do to me.  Tapping 
my landline either means interacting with the phone company, or paying 
someone to go install a tap, each of which implies a risk of getting 
caught, practical limits on how often it can be done, etc.

This also bypasses the "network effect" of encrypting phones, where you get 
approximately zero benefit from having one until they're widespread.  I 
have an old Comsec 3DES phone at home.  It's nice technology.  I think I've 
used it twice.  If you're not a cryptographer or a cocaine smuggler, you 
probably don't know anyone who owns an encrypting phone or would 
particularly want to.  Even if you'd like to improve your own privacy, you 
can't buy an end-to-end encrypting phone and improve it much.  That's what 
I'd like to see change.

...
>Eric
--John Kelsey, [EMAIL PROTECTED]
PGP: FA48 3237 9AD5 30AC EEDD  BBC8 2A80 6948 4CAA F259


Re: Maybe It's Snake Oil All the Way Down

2003-06-06 Thread Eric Rescorla
Derek Atkins <[EMAIL PROTECTED]> writes:

> Eric Murray <[EMAIL PROTECTED]> writes:
> 
> > Too often people see something like Peter's statement above and say
> > "oh, it's that nasty ASN.1 in X.509 that is the problem, so we'll just
> > do it in XML instead and then it'll work fine" which is simply not true.
> > The formatting of the certificates is such a minor issue that it is lost
> > in the noise of the real problems.  And Peter publishes a fine tool
> > for printing ASN.1, so the "human readable" argument is moot.
> 
> Actually, the ASN.1 part is a major factor in the X.509
> interoperability problems.  Different cert vendors include different
> extensions, or different encodings.  They put different information
> into different parts of the certificate (or indeed the same
> information into different parts).  Does the FQDN for a server cert
> belong in the DN or some extension?  What about the email address for
> a user cert?
This isn't really true in the SSL case:
To a first order, everyone ignores any extensions (except sometimes
the constraints) and uses the CN for the DNS name of the server.
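
That practice is easy to see from a client; a short python illustration
(stdlib only, host name is just an example) that pulls the CN out of the
peer certificate the way most HTTPS clients effectively do:

import socket, ssl

ctx = ssl.create_default_context()
with socket.create_connection(("www.example.com", 443)) as raw:
    with ctx.wrap_socket(raw, server_hostname="www.example.com") as tls:
        # subject is a tuple of RDN tuples; flatten it into a dict
        subject = dict(x[0] for x in tls.getpeercert()["subject"])
        print("CN =", subject.get("commonName"))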

-Ekr

-- 
[Eric Rescorla   [EMAIL PROTECTED]
http://www.rtfm.com/



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread Derek Atkins
Eric Murray <[EMAIL PROTECTED]> writes:

> Too often people see something like Peter's statement above and say
> "oh, it's that nasty ASN.1 in X.509 that is the problem, so we'll just
> do it in XML instead and then it'll work fine" which is simply not true.
> The formatting of the certificates is such a minor issue that it is lost
> in the noise of the real problems.  And Peter publishes a fine tool
> for printing ASN.1, so the "human readable" argument is moot.

Actually, the ASN.1 part is a major factor in the X.509
interoperability problems.  Different cert vendors include different
extensions, or different encodings.  They put different information
into different parts of the certificate (or indeed the same
information into different parts).  Does the FQDN for a server cert
belong in the DN or some extension?  What about the email address for
a user cert?

> Note that there isn't a real running global PKI using SPKI
> or PGP either.

That's a different problem (namely that the "big guys" like RSA
Security, Microsoft, and Verisign don't sell PGP-enabled software or
PGP certificates).  PGP's problem is an integration problem, making
it easy to use for non-techies.  That has been the barrier to entry
for PGP.

> The largest problem with X.509 is that various market/political forces
> have allowed Verisign to dominate the cert market and charge way too
> much for them.  There is software operable by non-cryptographers that
> will generate reasonable cert reqs (it's not standard Openssl) but
> individuals and corporations alike balk at paying $300-700 for each cert.
> (yes I know about the free "individual" certs, the failure of
> S/MIME is a topic for another rant).

This is only part of the problem... It is not all of it.  Indeed the
cost (both in money, time, and headache) has always been a barrier to
entry.  I don't believe that market or political forces are the largest
problem with X.509... I will certainly agree that the cost is a
major impediment.

The question is:  how do we convince M$ and Netscape to include something
else in their software?  If it's not supported in IE, then it won't be
available to the vast majority of users out there.

-derek

-- 
   Derek Atkins
   Computer and Internet Security Consultant
   [EMAIL PROTECTED] www.ihtfp.com



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread Eric Rescorla
"James A. Donald" <[EMAIL PROTECTED]> writes:
> Eric Rescorla
> > Nonsense. One can simply cache the certificate, exactly as 
> > one does with SSH. In fact, Mozilla at least does exactly 
> > this if you tell it to. The reason that this is uncommon is 
> > because the environments where HTTPS is used are generally 
> > spontaneous and therefore certificate caching is less useful.
> 
> Certificate caching is not the problem that needs solving.  The 
> problem is all this spam attempting to fool people into logging 
> in to fake BofA websites and fake e-gold websites, to steal 
> their passwords or credit card numbers 

The only solutions to that problem involve getting rid of
passwords and credit card numbers. SSL does that job about
as well as we know how.

-Ekr


-- 
[Eric Rescorla   [EMAIL PROTECTED]
http://www.rtfm.com/



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread Sunder
Depends on how it gets passed from the web servers to that computer.  If
it's encrypted with a public key on the web server, where only the database
machine has the private half, you're safe from someone sniffing that
"proprietary one-way interface."
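
A hedged sketch of that pattern (the public-key file is a placeholder, and
the hybrid wrapping shown is one common way to do it, not necessarily what
Amazon does): the web tier can only seal card numbers for the offline
machine, never open them.

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.fernet import Fernet

db_pub = serialization.load_pem_public_key(
    open("cardstore_pub.pem", "rb").read())

def seal_for_cardstore(card_number):
    data_key = Fernet.generate_key()                     # per-record symmetric key
    sealed_card = Fernet(data_key).encrypt(card_number)  # encrypt the card number
    wrapped_key = db_pub.encrypt(                        # wrap the key for the DB box
        data_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    return wrapped_key, sealed_card   # only the private-key holder can unwrap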

However, if someone's already broken into the web server, they can collect
the cc:'s before they get sent to the secure db.

So if you're an old Amazon customer and don't change your CC >BEFORE<
someone hacks into their web server, you're safe.

It's certainly better than storing all CC's on the web server.

Now if those CC's are in raw text on the DB end, Amazon is up shit's creek
if someone walks away with a db dump, backup tape, or whatever.

I don't claim to know what they're using, but a long, long time ago, in
another galaxy, I used to work with a product from OpenMarket that worked
similarly, but they held all credit cards encrypted in the DB, making it
much harder.  (Of course if you have the key it's as good as cleartext,
but it was at least another layer of protection.)

Ultimately they'll need either a cybercash interface or some interface to
a bank to charge your card.  If the bad guy intercepts at that level or
gets unencrypted access to the DB, or you change your CC while the web
server is compromised, you are in for some interesting CC statements.


However, this is in a lot of ways MORE secure than handing that waiter or
store clerk your CC.  Remember that nice yellow slip has your signature,
CC number and expiration date on it.  Very useful for an attacker.
In fact, they likely had physical access to the CC and have that extra 3
digit # on the back too.

Some stores even ask for your driver's license to prove that you are you,
which at least in NY has your date of birth and address as well.  Even
more useful to the evildoer.  If they can also get your SSN on top of
that, you're at their mercy.  Think about any credit application type
transactions ... these days, buying (some) cell phones, or a car, or
signing up for satellite TV requires these.


I feel safer with Amazon's use of my CC than the above, don't you?



--Kaos-Keraunos-Kybernetos---
 + ^ + :25Kliters anthrax, 38K liters botulinum toxin, 500 tons of   /|\
  \|/  :sarin, mustard and VX gas, mobile bio-weapons labs, nukular /\|/\
<--*-->:weapons.. Reasons for war on Iraq - GWB 2003-01-28 speech.  \/|\/
  /|\  :Found to date: 0.  Cost of war: $800,000,000,000 USD.\|/
 + v + :   The look on Sadam's face - priceless!   
[EMAIL PROTECTED] http://www.sunder.net 

On Tue, 3 Jun 2003, Jeroen van Gelderen wrote:

> "To provide you with an additional layer of security, all credit card 
> numbers provided to Amazon.com are stored on a computer that is not 
> connected to the Internet. After you type or call it in, your complete 
> credit card number is transferred to this secure machine across a 
> proprietary one-way interface. This computer is not accessible by 
> network or modem, and the number is not stored anywhere else."
> 
> Now I'm not sure how they get to use the number during the billing 
> process but hey... :)
> 
> I don't know if I'd feel much better if Amazon didn't have my CC on 
> file. The danger of a disgruntled sysadmin snarfing the numbers while 
> they pass through the system for one time use during a single billing 
> cycle seems too real for me.



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread Eric Rescorla
"James A. Donald" <[EMAIL PROTECTED]> writes:

> --
> On 3 Jun 2003 at 15:04, James A. Donald wrote:
> > I never figured out how to use a certificate to authenticate 
> > a client to a web server, how to make a web form available to 
> > one client and not another.  Where do I start?
> >
> > What I and everyone else does is use a shared secret, a 
> > password stored on the server, whereby the otherwise 
> > anonymous client gets authenticated, then gets an ephemeral 
> > cookie identifying him..   I cannot seem to find any how-tos 
> > or examples for anything better, whether for IIS or apache.
> >
> > As a result we each have a large number of shared secret 
> > passwords, whereby we each log into a large number of 
> > webservers.  Was this what the people who created this 
> > protocol intended?
> 
> Or to say the same thing in different words -- why can't HTTPS 
> be more like SSH?Why are we seeing a snow storm of scam
> mails trying to get us to login to e-g0ld.com? 
Because HTTPS is designed to let you talk to people you've
never talked before, which is an inherently harder problem
than allowing you to talk to people you have.

-Ekr

-- 
[Eric Rescorla   [EMAIL PROTECTED]
http://www.rtfm.com/



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread James A. Donald
--
James A. Donald
> > Or to say the same thing in different words -- why can't
> > HTTPS be more like SSH?Why are we seeing a snow storm
> > of scam mails trying to get us to login to e-g0ld.com?

Eric Rescorla
> Because HTTPS is designed to let you talk to people you've 
> never talked before, which is an inherently harder problem 
> than allowing you to talk to people you have.

In attempting to solve the hard problem, it fails to make
provision for solving the easy problem.

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 bZy6QJLI0fL6IOhhS8lxNx/EUctBs0cj1se8YRt5
 4LvAbyVinp/3mbNkE+8/qx6UYDSxykTEFMpTXzsoD



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread Peter Gutmann
Bodo Moeller <[EMAIL PROTECTED]> writes:

>Using an explicit state machine helps to get code suitable for multiplexing
>within a single thread various connections using non-blocking I/O.

Is there some specific advantage here, or is it an academic exercise?  Some
quirk of supporting certain types of hardware like nCipher boxes that do async
crypto/scatter-gather?  I have a vague idea from discussions with some
OpenSSL-engine developers that they had some requirement for supporting async
hardware in non-threaded environments, but from hearing the complaints about
how hard this ended up being I had the impression that this was a major
rewrite rather than something the state-machine implementation had been
specifically designed for (sorry, I don't have that much technical info, the
discussions tended to devolve into griping sessions about how hard async
crypto hardware was to work with, not helped by comments like "That's because
you're taking the path of most resistance, just use threads" :-).

I also don't know if that explains why, years before this was an issue,
everyone was already treating SSL as a state machine problem.

Peter.



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread Anne & Lynn Wheeler
At 04:25 PM 6/4/2003 -0700, James A. Donald wrote:
> --
>Everyone in America has several shared secrets identifying them
>-- the number of the beast to identify them to the state, and
>their credit card numbers identifying them to various financial
>institutions, plus a hundred passwords to  login to their
>email, their bank, their network provider, e-gold, etc.
>
>The PKI idea was that we would instead use PK in place of
>shared secrets, but if an ordinary person had a private key,
>what could he use it for?
>
>The spam that seeks to get us to login to e-g0ld and the
>BankOf4merica.com works because the logins are based on shared
>secrets, not private keys, and the networks are setup to rely
>on shared secrets because there is no practical alternative.

one could claim that public-key is a practical alternative but it got 
significantly sidetracked by an independent business model that wanted to 
extract a huge amount of money out of existing infrastructures (say totally 
brand new independent operations wanting $100/annum for every person, 
extracted from the existing infrastructure for no significant positive 
benefit ... aka say 200m people at $100/annum is $20b/annum ... in return 
for some abstract bit vapor that doesn't change any core business issue).

it is relatively trivial to demonstrate that public keys can be registered 
in every business process that currently registers shared-secrets (pins, 
passwords, radius, kerberos, etc, etc).  the issue then becomes one of cost 
to change/upgrade those infrastructures to support digital signature 
authentication with the stored public keys in lieu of string comparison (no 
new business operations, no new significant transfer of wealth to brand new 
outside business entities, etc).
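
to make the claim concrete, here is roughly what registering a public key 
where a shared-secret used to go looks like (ed25519 via the python 
"cryptography" package; the in-memory account record is a stand-in for a 
radius/kerberos/password-file entry):

import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
)

client_key = Ed25519PrivateKey.generate()
account_record = {"user": "lynn",
                  "pubkey": client_key.public_key()}   # registration step

challenge = os.urandom(32)               # issued fresh by the server per login
response = client_key.sign(challenge)    # computed by the client

# server side: digital signature verification in lieu of string comparison
account_record["pubkey"].verify(response, challenge)

no shared secret is ever stored or transmitted, which is the whole point.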

however, think about even these simple economics for a minute ... even for 
relatively modest technology changes that don't change any of the business 
processes/relationships ... it still costs some money ... and the 
beneficiary isn't the institution, it is the individual. The individual has 
the paradigm changed from hundreds of shared-secrets to a single key-pair 
... however each institution continues to see just as many individuals and 
account records. From a very practical standpoint ... entities don't 
frequently fund things that they don't benefit from ... and typically most 
success is achieved when the entity that benefits from the change is also 
driving/funding the change.

the issue is to find out how the individual pays for the change ... or 
figure out how the institutions are going to benefit.
--
Anne & Lynn Wheeler   http://www.garlic.com/~lynn/
Internet trivia 20th anv http://www.garlic.com/~lynn/rfcietff.htm



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread Eric Murray
On Wed, Jun 04, 2003 at 04:32:23PM +1200, Peter Gutmann wrote:
> "James A. Donald" <[EMAIL PROTECTED]> writes:
> 
> >I never figured out how to use a certificate to authenticate a client to a
> >web server, how to make a web form available to one client and not another.
> >Where do I start?
> 
> There's a two-level answer to this problem.  At an abstract level, doing
> client certs isn't hard, there are various HOWTOs around for Apache, Microsoft
> have Technet/MSDN papers on it for IIS, etc etc.  At a practical level, it's
> almost never used because it's just Too Hard.  That's not the SSL client-cert
> part, it's the using-X.509 part.

It's the I part of PKI that's hard.  That the assumptions built
into X.509 (i.e. a rigid certificate hierarchy) don't work everywhere
just makes it harder.  And the obstinacy of the standards organizations
involved doesn't help.

Too often people see something like Peter's statement above and say
"oh, it's that nasty ASN.1 in X.509 that is the problem, so we'll just
do it in XML instead and then it'll work fine" which is simply not true.
The formatting of the certificates is such a minor issue that it is lost
in the noise of the real problems.  And Peter publishes a fine tool
for printing ASN.1, so the "human readable" argument is moot.

Note that there isn't a real running global PKI using SPKI
or PGP either.


The largest problem with X.509 is that various market/political forces
have allowed Verisign to dominate the cert market and charge way too
much for them.  There is software operable by non-cryptographers that
will generate reasonable cert reqs (it's not standard Openssl) but
individuals and corporations alike balk at paying $300-700 for each cert.
(yes I know about the free "individual" certs, the failure of
S/MIME is a topic for another rant).

This is why lne.com's STARTTLS cert is self-signed.  Verisign
isn't getting any more of my money.


Eric



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread Bill Frantz
At 8:07 AM -0700 6/4/03, Sunder wrote:
>Depends on how it gets passed from the web servers to that computer.  If
>it's encrypted with a public key on the web server that only the database
>has the private half, you're safe from someone sniffing that "proprietary
>one-way interface."
>
>However, if someone's already broken into the web server, they can collect
>the cc:'s before they get sent to the secure db.
>
>So if you're an old Amazon customer and don't change your CC >BEFORE<
>someone hacks into their web server, you're safe.
>
>It's certainly better than storing all CC's on the web server.
>
>Now if those CC's are in raw text on the DB end, Amazon is up shit's creek
>if someone walks away with a db dump, backup tape, or whatever.
>
>
>
>However, this is in a lot of ways MORE secure than handing that waiter or
>store clerk your CC.  Remember that nice yellow slip has your signature,
>CC number and expiration date on it.  Very useful for an attacker.
>In fact, they likely had physical access to the CC and have that extra 3
>digit # on the back too.
>
>...
>
>I feel safer with Amazon's use of my CC than the above, don't you?

Well, I've only ordered from Amazon 2 or 3 times since they've been in
business.  Having my CC on file gives a much longer exposure time than the
brief periods of time it would be "in transit".  So, no I don't feel much
safer.  The $50 limit on unauthorized charges is what makes me feel safer.

Cheers - Bill


-
Bill Frantz   | Due process for all| Periwinkle -- Consulting
(408)356-8506 | used to be the | 16345 Englewood Ave.
[EMAIL PROTECTED] | American way.  | Los Gatos, CA 95032, USA



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread Bodo Moeller
[EMAIL PROTECTED] (Peter Gutmann):

> [0] Note that my SSL implementation follows the standard SSL ladder diagram
> rather than the state-machine that SSL implementations are usually
> described as, which made it trivial to switch over for SSHv2 use.  I've
> never understood why every explanation of the SSL protocol I've ever seen
> uses ladder diagrams but once they talk about implementation details they
> assume you're doing it as a state machine, which makes it vastly harder to
> implement.  For example all the stuff about pending cipher suites and
> whatnot follows automatically (and transparently) from the ladder diagram,
> but is a real pain to sort out in a state machine.

Using an explicit state machine helps to get code suitable for
multiplexing within a single thread various connections using
non-blocking I/O.


-- 
Bodo Möller <[EMAIL PROTECTED]>
PGP http://www.informatik.tu-darmstadt.de/TI/Mitarbeiter/moeller/0x36d2c658.html
* TU Darmstadt, Theoretische Informatik, Alexanderstr. 10, D-64283 Darmstadt
* Tel. +49-6151-16-6628, Fax +49-6151-16-6036



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread Bill Frantz
At 7:40 AM -0700 6/4/03, Eric Murray wrote:
>Note that there isn't a real running global PKI using SPKI
>or PGP either.

I'm not sure SPKI was ever meant to be a global PKI.  It was more meant to
do authorization in a "verifier-centric" system.

Cheers - Bill


-
Bill Frantz   | Due process for all| Periwinkle -- Consulting
(408)356-8506 | used to be the | 16345 Englewood Ave.
[EMAIL PROTECTED] | American way.  | Los Gatos, CA 95032, USA



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread Bodo Moeller
On Thu, Jun 05, 2003 at 10:11:45PM +1200, Peter Gutmann wrote:
> Bodo Moeller <[EMAIL PROTECTED]> writes:

>> Using an explicit state machine helps to get code suitable for multiplexing
>> within a single thread various connections using non-blocking I/O.

> Is there some specific advantage here, or is it an academic exercise?
> [...]   I have a vague idea from discussions with some
> OpenSSL-engine developers that they had some requirement for supporting async
> hardware in non-threaded environments, [...] the
> discussions tended to devolve into griping sessions about how hard async
> crypto hardware was to work with, not helped by comments like "That's because
> you're taking the path of most resistance, just use threads" :-).

I don't mind working with threads, but there's a lot of software out
there that uses single-threaded multiplexing, and adding SSL/TLS to
such software becomes much easier if the SSL/TLS library supports this
multiplexing paradigm.  (Not that it would be impossible otherwise --
another option, for Unix anyway, is to fork off a process that
handles an SSL/TLS connection and communicates with the main process
via a pipe.)
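
For concreteness, a minimal sketch of that single-threaded multiplexing
style in python (the host is only an example): the handshake is driven as
an explicit state machine, and each WANT_READ/WANT_WRITE tells the
selector which event to wait for before advancing again.

import socket, ssl, selectors

HOST = "www.example.com"
ctx = ssl.create_default_context()
sel = selectors.DefaultSelector()

raw = socket.create_connection((HOST, 443))
raw.setblocking(False)
tls = ctx.wrap_socket(raw, server_hostname=HOST,
                      do_handshake_on_connect=False)

sel.register(tls, selectors.EVENT_WRITE)
while True:
    try:
        tls.do_handshake()          # advance the handshake state machine
        break                       # handshake complete
    except ssl.SSLWantReadError:
        sel.modify(tls, selectors.EVENT_READ)
    except ssl.SSLWantWriteError:
        sel.modify(tls, selectors.EVENT_WRITE)
    sel.select()                    # wait until the socket is ready again
sel.unregister(tls)
print("negotiated", tls.version())
tls.close()

The same loop generalizes to many connections registered with the one
selector, which is where the non-blocking style pays off.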


-- 
Bodo Möller <[EMAIL PROTECTED]>
PGP http://www.informatik.tu-darmstadt.de/TI/Mitarbeiter/moeller/0x36d2c658.html
* TU Darmstadt, Theoretische Informatik, Alexanderstr. 10, D-64283 Darmstadt
* Tel. +49-6151-16-6628, Fax +49-6151-16-6036



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread Rich Salz
  The problems that this creates are demonstrated by what happens when
  technically skilled users are required to work with certificates.
If you haven't already seen it, I highly recommend Don Davis's 
"compliance defects" paper (and slides!) available at 
http://world.std.com/~dtd.  Abstract follows:
 Public-key cryptography has low infrastructural overhead because
 public-key users bear a substantial but hidden administrative burden.
  A public-key security system trusts its users
 to validate each others' public keys rigorously and to manage
 their own private keys securely. Both tasks are hard to do well,
 but public-key security systems lack a centralized infrastructure
 for enforcing users' discipline.  A "compliance defect" in a
 cryptosystem is such a rule of operation that is both difficult
 to follow and unenforceable.  This paper presents five compliance
 defects that are inherent in public-key cryptography; these
 defects make public-key cryptography more suitable for server-to-server
 security than for desktop applications.



--
Rich Salz, Chief Security Architect
DataPower Technology http://www.datapower.com
XS40 XML Security Gateway    http://www.datapower.com/products/xs40.html


Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread Eric Rescorla
"James A. Donald" <[EMAIL PROTECTED]> writes:

> --
> James A. Donald
> > > Or to say the same thing in different words -- why can't
> > > HTTPS be more like SSH?Why are we seeing a snow storm
> > > of scam mails trying to get us to login to e-g0ld.com?
> 
> Eric Rescorla
> > Because HTTPS is designed to let you talk to people you've 
> > never talked before, which is an inherently harder problem 
> > than allowing you to talk to people you have.
> 
> In attempting to solve the hard problem, it fails to make
> provision for solving the easy problem.

Nonsense. One can simply cache the certificate, exactly as
one does with SSH. In fact, Mozilla at least does exactly
this if you tell it to. The reason that this is uncommon
is because the environments where HTTPS is used
are generally spontaneous and therefore certificate caching
is less useful.
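
For anyone who wants the SSH behaviour without browser support, the
"cache the certificate" idea is a few lines of code; a python sketch
(cache path and host are placeholders) that remembers each host's
certificate fingerprint on first contact and objects if it later changes:

import hashlib, json, os, socket, ssl

CACHE = os.path.expanduser("~/.https_known_hosts.json")

def check_pinned(host, port=443):
    known = json.load(open(CACHE)) if os.path.exists(CACHE) else {}
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as raw:
        with ctx.wrap_socket(raw, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
            fp = hashlib.sha256(der).hexdigest()
    if host not in known:
        known[host] = fp                      # trust on first use, like SSH
        json.dump(known, open(CACHE, "w"))
    elif known[host] != fp:
        raise ssl.SSLError("certificate for %s changed - possible spoof" % host)

check_pinned("www.example.com")

The spontaneous-use caveat above is exactly why this helps less on the
public web than it does for SSH.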

-Ekr

-- 
[Eric Rescorla   [EMAIL PROTECTED]
http://www.rtfm.com/



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread Peter Gutmann
Eric Murray <[EMAIL PROTECTED]> writes:

>Too often people see something like Peter's statement above and say "oh, it's
>that nasty ASN.1 in X.509 that is the problem, so we'll just do it in XML
>instead and then it'll work fine" which is simply not true. The formatting of
>the certificates is such a minor issue that it is lost in the noise of the
>real problems.  And Peter publishes a fine tool for printing ASN.1, so the
>"human readable" argument is moot.
>
>Note that there isn't a real running global PKI using SPKI or PGP either.

A debate topic I've thought of occasionally in the last year or two: If
digital signatures had never been invented, would we now be happily using
passwords, SecurIDs, challenge-response tokens, etc etc to do whatever we need
rather than having spent the last 20-odd years fruitlessly chasing the PKI
dream?  There was some interesting work being done on non-PKI solutions to
problems in the 1970s before it all got drowned out by PKI, but most of it
seems to have stagnated since then outside a few niche areas like wholesale
banking, where it seems to work reasonably well.

(Hmm, now *that* would make an interesting panel session for the next RSA
 conference).

Peter.



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread Rich Salz
> In attempting to solve the hard problem, it fails to make
> provision for solving the easy problem.

That's a deployment issue, not a technical issue.  D-H key exchange, for
example, would be just fine.  It just so happens that the SSL creators had
a particular business goal in mind:  e-commerce, with a "certificate"
re-assuring the nervous customer that they were handing their credit card
to jcrew.com, not jscrew.com.  Yes, SSL was invented to solve a
particular problem.  They did a reasonable job at it.
/r$
--
Rich Salz Chief Security Architect
DataPower Technology  http://www.datapower.com
XS40 XML Security Gateway http://www.datapower.com/products/xs40.html



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread Anne & Lynn Wheeler
At 12:02 PM 6/4/2003 +0100, Dave Howe wrote:
For that matter, our system here discards the CC after use (the pre-auth
step with the merchant bank agent gives us back a "fulfillment handle" that
can only be used to fulfill or cancel that individual transaction - but of
course Amazon *want* to keep your CC details so they can do their
fast-checkout patented thingy.)
the ground rules given the x9a10 working group for the x9.59 standard were 
to preserve the integrity of the financial infrastructure for all (credit, 
debit, stored-value, POS, internet, non-internet, aka ALL) electronic 
retail payments. it was one of the things that led us down the path of 
certless operation. We had previously done the work on the original payment 
gateway and had to perform various kinds of due diligence on all the major 
CA vendors ... during which it started to dawn on us that stale, static 
certificates were actually redundant and superfluous in the financial 
business process.
http://www.garlic.com/~lynn/aadsm5.html#asrn2
http://www.garlic.com/~lynn/aadsm5.html#asrn3

sort of the clincher was starting to do operational and performance profiling 
of the existing payment networks ... and it was evident that there 
was a huge mismatch between the existing payment transaction payload size 
and any of the commonly used certificates (even the drastically simplified 
relying-party-only certificates carrying only an account number and public 
key).

Two major characteristics of X9.59 were that it would provide 1) end-to-end 
authentication (aka the consumer's financial institution would be the one 
responsible for performing authentication) and 2) account numbers used in 
X9.59 transactions could not be used in unauthenticated transactions.

Some of the '90s digital-signature-oriented specifications had 
authentication occurring at the internet boundary and stripped off the 
certificate (avoiding the extreme certificate payload penalty in the 
payment network). The downside was that the party performing the 
authentication didn't necessarily have the consumer's interest in mind; 
Visa subsequently presented statistics at an ISO standards meeting on the 
number of transactions flowing through the network that 1) carried a flag 
claiming to have been digital-signature authenticated, where 2) Visa could 
prove that no digital signature technology was ever involved.

Eavesdropping, sniffing or harvesting account numbers in the current 
infrastructure (at any point in the process, by insiders or outsiders, 
traditionally financial exploits have been 90 percent insiders) can result 
in fraudulent transactions. As a result, existing account numbers 
effectively become a form of shared-secret and need to be protected. With 
the X9.59 business rule requiring the account number to only be used in 
authenticated transactions, simple harvesting of a X9.59 account number 
doesn't result in fraud. Issuing financial institutions then can use 
existing business processes that support mapping of different account 
numbers to the same account.  A discussion of the security proportional to 
risk with regard to credit card transactions:
http://www.garlic.com/~lynn//2001h.html#61 Net banking, is it safe?

The issue with the use of SSL for protecting credit card transactions isn't 
addressing all or even the major vulnerability to the infrastructure. 
Eliminating the account number as a form of shared secret addresses all of 
the vulnerabilities, not just the transaction-in-flight vulnerability 
addressed by SSL. As a byproduct of addressing all of the shared-secret 
related vulnerabilities, it also eliminates the need to use SSL for 
protecting the shared secret while being transmitted.

Detailed report of its use in the NACHA debit network trials can be found at
http://internetcouncil.nacha.org/News/news.html
scroll down to "July 23, 2001: Digital Signatures Can Secure ATM Card Payments"
More details of X9.59 standard:
http://www.garlic.com/~lynn/index.html#x959
--
Anne & Lynn Wheeler    http://www.garlic.com/~lynn/
Internet trivia 20th anv http://www.garlic.com/~lynn/rfcietff.htm


Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread James A. Donald
--
Everyone in America has several shared secrets identifying them 
-- the number of the beast to identify them to the state, and 
their credit card numbers identifying them to various financial 
institutions, plus a hundred passwords to log in to their
email, their bank, their network provider, e-gold, etc.

The PKI idea was that we would instead use PK in place of 
shared secrets, but if an ordinary person had a private key, 
what could he use it for?

The spam that seeks to get us to log in to e-g0ld and the 
BankOf4merica.com works because the logins are based on shared 
secrets, not private keys, and the networks are set up to rely 
on shared secrets because there is no practical alternative. 

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 r9lUivpSt7tWiPOxVr17a9sjkgXnnbC5matqsa6/
 4UovWiFVbzH8bFEhVsekeydmrrDmez+5/B/3ZSo4B



Re: Maybe It's Snake Oil All the Way Down

2003-06-05 Thread James A. Donald
--
James A. Donald
> > > > Or to say the same thing in different words -- why 
> > > > can't HTTPS be more like SSH?  Why are we seeing a 
> > > > snow storm of scam mails trying to get us to login to 
> > > > e-g0ld.com?

Eric Rescorla
> > > Because HTTPS is designed to let you talk to people 
> > > you've never talked before, which is an inherently harder 
> > > problem than allowing you to talk to people you have.

James A. Donald:
> > In attempting to solve the hard problem, it fails to make 
> > provision for solving the easy problem.

Eric Rescorla
> Nonsense. One can simply cache the certificate, exactly as 
> one does with SSH. In fact, Mozilla at least does exactly 
> this if you tell it to. The reason that this is uncommon is 
> because the environments where HTTPS is used are generally 
> spontaneous and therefore certificate caching is less useful.

Certificate caching is not the problem that needs solving.  The 
problem is all this spam attempting to fool people into logging 
in to fake BofA websites and fake e-gold websites, to steal 
their passwords or credit card numbers.

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 /UOLlqGTeq9SAB5W/aJJuwULFBNMCVzKJnIRlhES
 48E3I0Yo+68OTvTwztxirTXc41yFVicJtskuBB/dU



Re: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread Ian Grigg
Bill Stewart wrote:
> 
> At 11:38 AM 06/03/2003 -0400, Ian Grigg wrote:
> >I (arbitratrily) define the marketplace for SSL as browsing.
> ...
> >There, we can show statistics that indicate that SSL
> >has penetrated to something slightly less than 1% of servers.
> 
> For transmitting credit card numbers on web forms,
> I'd be surprised if there were 1% of the servers that *don't* use SSL/TLS.


I've seen it a lot.  Not that I pay much
attention, but I'd suspect it is less
than 10%, but much more than 1%.  Also, a lot
of credit card numbers get delivered by email.
These all go to small-time merchants who have
MOTO agreements without the net part, but take
the CCs anyway.  After all, a sale is a sale,
and nobody ever heard of a credit card number
being lost over the net...



OK, I'm teased by this:  how many sites use
open unencrypted CC delivery?  I went to google
and searched on:  "..."

> Virtually all deployed browsers support SSL, except a few
> special-purpose versions.  The web servers supporting
> almost all of the web support SSL if they have keys installed.
> While many of them haven't bothered paying money for certified keys
> or doing self-signed keys, I'd be surprised if it's really
> as low as 1%.  What's your source for that figure?



http://www.securityspace.com/s_survey/sdata/200305/index.html

Total SSL servers 131,566.  Now go to here:

http://www.securityspace.com/s_survey/data/200305/domain.html

Total webservers 10,432,910 (derived from 5,280,096 / 0.5061).

That gives SSL penetration as 131,566 / 10,432,910 == 1.26%


(Darn!  I was wrong, it's slightly more than 1%, not less.
I should be stoned and cursed!)
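
[Editorial illustration: the arithmetic behind the figure, as a
two-line check using the survey numbers quoted above.]

    ssl_servers = 131566
    all_servers = 5280096 / 0.5061          # ~10,432,910 total web servers
    print("%.2f%%" % (100.0 * ssl_servers / all_servers))   # -> 1.26%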



> While only a small fraction of web pages, and a much smaller
> fraction of web bits transmitted, use SSL, that's appropriate,
> because most web pages are material the publisher wants the public to see,
> so eavesdropping isn't particularly part of the threat model,
> and even integrity protection is seldom a realistic worry.



Hmmm...  You might say that, but I would have said it
was the other way around!  There is - surprisingly -
not much of a threat model for eavesdropping of credit
cards (and - shockingly - even less of an MITM threat
model).

It's easier for a crook to break in and hack the DB, and
pick up tens of thousands than to haunt the net looking
for an elusive 16 digit number out of a browser page.

But, there is a big personal cost with reputational
information.  Few people would want to see my credit
card info, but I can think of lots that would be keen
on seeing my adult browsing, my gaming addiction, or
my participation in my kleptomaniacal therapy group,
not to mention anything embarrassing I might get up
to!



What I find curious is why all those open source people
worked so hard to build in the crypto to protect credit
cards, but didn't want to protect anything else.  I can
understand Netscape programmers - they wanted to sell
secure servers for cash.

But I don't understand why Apache and KDE and Mozilla
deliver software tuned to protect credit cards.  It
would make sense if they were all paid to do this by
the credit card companies ... but they aren't, are
they?  What's their incentive?



> (By contrast, eavesdropping protection and integrity protection
> are critical to telnet-like applications, so SSH is a big win.)
> 
> It's nice to have routine web traffic encrypted,
> so that non-routine traffic doesn't stand out,
> and so that traffic analysis is much harder,
> but there is a significant CPU hit from the public-key phase,
> which affects the number of pages per hour that can be served.



We run a dozen or more web servers here, and
I can never tell the difference between the
unprotected ones and the protected ones, so
I'm not sure what to make of the argument
that SSL should be reserved for "important
credit card numbers".

I think CPU has gotten so cheap that running
out of CPU is a great sign of a successful
business, no more.  The last time I made a
serious business decision based on CPU
horsepower was back in 1989.  We are almost
at the point where raw PCs can do 1000 RSAs
per second.  Companies like Visa and Mastercard
process in the order of 1000 - 10,000 transactions
per second.  Which means if they were using an
efficient payment system - one or 2 RSAs per
transaction - they could be now thinking about
putting their entire crypto processing on one PC.
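
[Editorial illustration: a back-of-envelope check of the figures just
quoted.  At one RSA per transaction and the low end of the rate range a
single box does suffice; at two RSAs and the high end it is a small
handful of machines rather than one.]

    rsa_per_sec_per_pc = 1000            # "1000 RSAs per second" per raw PC
    for tx_per_sec in (1000, 10000):     # Visa/Mastercard range quoted above
        for rsa_per_tx in (1, 2):        # "one or 2 RSAs per transaction"
            pcs = tx_per_sec * rsa_per_tx / float(rsa_per_sec_per_pc)
            print(tx_per_sec, rsa_per_tx, "->", pcs, "PC(s)")
    # prints 1, 2, 10 and 20 PCs for the four combinations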

Maybe it's only an issue if one is serving
continuously... in which case, maybe one could
either "use less crypto" like switch back to
smaller keys - way more secure than no keys -
or buy a faster box?


-- 
iang



Re: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread Dave Howe
Bill Frantz wrote:
> I know of one system that takes credit cards over HTTPS, and then
> sends the credit card number, encrypted with GPG to a backend system
> for processing.
For that matter, our system here discards the CC after use (the pre-auth
step with the merchant bank agent gives us back a "fulfillment handle" that
can only be used to fulfill or cancel that individual transaction - but of
course Amazon *want* to keep your CC details so they can do their
fast-checkout patented thingy.)



Re: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread Donald Eastlake 3rd
On Tue, 3 Jun 2003, John Kelsey wrote:

> Date: Tue, 03 Jun 2003 10:42:01 -0400
> From: John Kelsey <[EMAIL PROTECTED]>
> Subject: Re: Maybe It's Snake Oil All the Way Down
> 
> ...
> 
> I keep wondering how hard it would be to build a cordless phone system on 
> top of 802.11b with some kind of decent encryption being used.  I'd really 
> like to be able to move from a digital spread spectrum cordless phone 
> (which probably has a 16-bit key for the spreading sequence or some such 
> depressing thing) to a phone that can't be eavesdropped on without tapping 
> the wire.

See http://www.silicon.com/news/148/1/3828.html?source=nh

> ...
>
> --John Kelsey, [EMAIL PROTECTED]
> PGP: FA48 3237 9AD5 30AC EEDD  BBC8 2A80 6948 4CAA F259

Thanks,
Donald
==
 Donald E. Eastlake 3rd   [EMAIL PROTECTED]
 155 Beaver Street  +1-508-634-2066(h) +1-508-851-8280(w)
 Milford, MA 01757 USA   [EMAIL PROTECTED]



Re: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread Peter Gutmann
"James A. Donald" <[EMAIL PROTECTED]> writes:

>I never figured out how to use a certificate to authenticate a client to a
>web server, how to make a web form available to one client and not another.
>Where do I start?

There's a two-level answer to this problem.  At an abstract level, doing
client certs isn't hard, there are various HOWTOs around for Apache, Microsoft
have Technet/MSDN papers on it for IIS, etc etc.  At a practical level, it's
almost never used because it's just Too Hard.  That's not the SSL client-cert
part, it's the using-X.509 part.  To save having to type in a long
explanation, I'll lift a representative paragraph from a (not-yet-published,
don't ask :-) paper on PKI usability:

  There is considerable evidence from mailing lists, Usenet newsgroups and web
  forums, and directly from the users themselves, that acquiring a certificate
  is the single biggest hurdle faced by users [1].  For example various user
  comments indicate that it takes a skilled technical user between 30 minutes
  and 4 hours work to obtain a certificate from a public CA that performs
  little to no verification, depending on the CA and the procedure being
  followed.  Obtaining one from non-public CAs that carry out various levels
  of verification before issuing the certificate can take as long as a month.
  A representative non-technical user who tried to obtain an (unverified)
  certificate from a public CA took well over an hour for the process, which
  involved [...] eventually the user gave up.

and that doesn't even get into the mess of managing private keys, handling
revocation, etc etc etc ad nauseum:

  The problems that this creates are demonstrated by what happens when
  technically skilled users are required to work with certificates.  The
  OpenSSL toolkit [2][3] includes a Perl script CA.pl that allows users to
  quickly generate so-called clown suit certificates (ones that 'have all the
  validity of a clown suit' when used for identification purposes [4]), which
  is widely-used in practice.  The cryptlib toolkit [5][6] contains a similar
  feature in the form of Xyzzy certificates (added with some resistance and
  only after the author grew tired of endless requests for it), ones with
  dummy X.500 names, an effectively infinite lifetime, and no restrictions on
  usage.  Most commercial toolkits include similar capabilities, usually
  disguised as 'test certificates' for development purposes only, which end up
  being deployed in live environments because it's too difficult to do it the
  way X.509 says it should be done.  Certificates used with mailers that
  support the STARTTLS option consist of ones that are 'self-signed, signed by
  the default Snake Oil CA, signed by an unknown test CA, expired, or have the
  wrong DN' [7].  The producer of one widely-used Windows MUA reports that in
  their experience 90% of the STARTTLS-enabled servers that they encounter use
  self-signed certificates [8].  This reduces the overall security of the
  system to that of unauthenticated Diffie-Hellman key exchange, circa 1976.
  In all of these cases, the entire purpose of certificates has been
  completely short-circuited by users because it's just too difficult to do
  the job properly.  The problematic nature of X.509 is echoed in publications
  both technical and non-technical, with conference papers and product
  descriptions making a feature of the fact that their design or product works
  without requiring a PKI.  For example, one recent review of email security
  gateways made a requirement for consideration in the review that the product
  'have no reliance on PKI' [9].  As an extreme example of this, the inaugural
  PKI Research Workshop, attended by expert PKI users, required that
  submitters authenticate themselves with plaintext passwords because of the
  lack of a PKI to handle the task [10][11].

>As a result we each have a large number of shared secret passwords, whereby
>we each log into a large number of webservers.  Was this what the people who
>created this protocol intended?

The assumption of the protocol's creators was that someone would figure out
how to make X.509 PKI work by the time SSL took off, and everyone would have
their own certificates and whatnot.

At least they got *most* of the design right :-).

Peter.



Re: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread Ng Pheng Siong
On Tue, Jun 03, 2003 at 03:04:54PM -0700, James A. Donald wrote:
> I never figured out how to use a certificate to authenticate a
> client to a web server, how to make a web form available to one
> client and not another.  Where do I start?

Start by looking up the OpenSSL wrappers for your favourite high-level
"scripting" language. There exists wrappers for Perl, Python, tcl, Ruby,
etc. Some popular languages have several.

Many of these programming language environments come with HTTP server
implementations, and many of the OpenSSL wrappers hook into said HTTP
server code to add HTTPS, and a number demonstrate how to do client-side
certificates.

My M2Crypto adds HTTPS to the popular web application server Zope
(www.zope.org) and has some code to hook client-side certificates into
Zope's own user authentication machinery. (By faking HTTP basic
authentication, just like Apache's SSL module does.) Once you have that, you can
choose to serve whatever content you want.
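
[Editorial illustration: the same pattern using Python's standard
ssl module (a later arrival than the wrappers discussed above) -- an
HTTPS server that refuses connections without a client certificate.
The file names are placeholders.]

    import http.server, ssl

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain(certfile="server-cert.pem", keyfile="server-key.pem")
    ctx.load_verify_locations(cafile="client-ca.pem")   # CA for client certs
    ctx.verify_mode = ssl.CERT_REQUIRED                  # no cert, no session

    httpd = http.server.HTTPServer(("", 4443),
                                   http.server.SimpleHTTPRequestHandler)
    httpd.socket = ctx.wrap_socket(httpd.socket, server_side=True)
    # Per-connection, the verified client certificate is available via
    # getpeercert(); that is where a mapping to application users would go.
    httpd.serve_forever()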


> What I and everyone else does is use a shared secret, a
> password stored on the server, whereby the otherwise anonymous
> client gets authenticated, then gets an ephemeral cookie
> identifying him.. 

It seems HMAC'ing cookies is getting popular for this purpose.
OpenACS, another popular web application server, uses this:

   http://openacs.org/doc/openacs-4/security-design.html

My Python crypto kit has an implementation of the scheme described here:

http://www.pdos.lcs.mit.edu/cookies/pubs/webauth.html

I'll be interested to hear this list's view on such schemes. From my
app-plumber's perspective, such a technique is good enough provided it
is 'secure' enough.
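
[Editorial illustration: a minimal HMAC'd login cookie along these
lines; the schemes referenced above differ in detail, and the secret
key shown is of course a placeholder.]

    import hashlib, hmac, time

    SECRET = b"server-side secret key"

    def make_cookie(username, ttl=3600):
        expires = str(int(time.time()) + ttl)
        payload = "%s|%s" % (username, expires)
        tag = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
        return "%s|%s" % (payload, tag)

    def check_cookie(cookie):
        payload, _, tag = cookie.rpartition("|")
        username, _, expires = payload.rpartition("|")
        good = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
        if hmac.compare_digest(tag, good) and int(expires) > time.time():
            return username              # cookie is authentic and unexpired
        return None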

People understand passwords. Private keys, certificates, smart cards, etc.,
are more difficult. (I recall a paper on PGP UI usability testing called
"Why Johnny cannot encrypt" or something like that.)


> As a result we each have a large number of shared secret
> passwords, whereby we each log into a large number of
> webservers.  Was this what the people who created this protocol
> intended?

Actually, this is the crypto-wielding-open-source-hacker-wannabe's wet
dream: So what you need now to track (or generate strong) passwords is a
GUI "password safe"! (Like the one offered on (the old?) Counterpane site.)

Again, Perl, Python, Ruby, yada yada, you name it, people are going to
implement them for free. ;-)

Especially since there are usually 3-5 GUI toolkits and 2-4 database toolkits
for these language environments. Enough combinations to suit everyone.


-- 
Ng Pheng Siong <[EMAIL PROTECTED]> 

http://firewall.rulemaker.net  -+- Manage Your Firewall Rulebase Changes
http://www.post1.com/home/ngps -+- Open Source Python Crypto & SSL



Re: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread Ian Grigg
Tim Dierks wrote:
> 
> At 09:11 AM 6/3/2003, Peter Gutmann wrote:
> >"Lucky Green" <[EMAIL PROTECTED]> writes:
> > >Given that SSL use is orders of magnitude higher than that of SSH, with no
> > >change in sight, primarily due to SSL's ease-of-use, I am a bit puzzled by
> > >your assertion that ssh, not SSL, is the "only really successful net crypto
> > >system".
> >
> >I think the assertion was that SSH is used in places where it matters, while
> >SSL is used where no-one really cares (or even knows) about it.  Joe Sixpack
> >will trust any site with a padlock GIF on the page.  Most techies won't access
> >a Unix box without SSH.  Quantity != quality.
> 
> I have my own opinion on what this assertion means. :-) I believe it
> intends to state that ssh is more successful because it is the only
> Internet crypto system which has captured a large share of its use base.
> This is probably true: I think the ratio of ssh to telnet is much higher
> than the ratio of https to http, pgp to unencrypted e-mail, or what have you.


Certainly, in measurable terms, Tim's description
is spot on.  I agree with Peter's comments, but
that's another issue indeed.


> However, I think SSL has been much more successful in general than SSH, if
> only because it's actually used as a transport layer building block rather
> than as a component of an application protocol. SSL is used for more
> Internet protocols than HTTP: it's the standardized way to secure POP,
> IMAP, SMTP, etc. It's also used by many databases and other application
> protocols. In addition, a large number of proprietary protocols and custom
> systems use SSL for security: I know that Certicom's SSL Plus product
> (which I originally wrote) is (or was) used to secure everything from
> submitting your taxes with TurboTax to slot machine jackpot notification
> protocols, to the tune of hundreds of customers. I'm sure that when you add
> in RSA's customers, those of other companies, and people using
> OpenSSL/SSLeay, you'll find that SSL is much more broadly used than ssh.


Design wins!  Yes, indeed, another way of measuring
the success is to measure the design wins.  Using
this measure, SSL is indeed ahead.  This probably
also correlates with the wider support that SSL
garners in the cryptography field.


> I'd guess that SSL is more broadly used, in a dollars-secured or
> data-secure metric, than any other Internet protocol. Most of these uses
> are not particularly visible to the consumer, or happen inside of
> enterprises. Of course, the big winners in the $-secured and data-secured
> categories are certainly systems inside of the financial industry and
> governmental systems.


That would depend an awful lot on what was meant
by "dollars-secured" and "data-secured" ?  Sysadmins
move some pretty hefty backups by SSH on a routine
basis.

-- 
iang



Re: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread Dave Howe
At 10:09 AM 6/2/03 -0400, Ian Grigg wrote:
>  (One doesn't hear much about
> crypto phones these days.  Was this really a need?)
As a minor aside - most laptops can manage pgpfone using only onboard
hardware these days, either using an integrated modem or (via infrared) a
mobile phone.



Re: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread James A. Donald
--
On 3 Jun 2003 at 15:04, James A. Donald wrote:
> I never figured out how to use a certificate to authenticate 
> a client to a web server, how to make a web form available to 
> one client and not another.  Where do I start?
>
> What I and everyone else does is use a shared secret, a 
> password stored on the server, whereby the otherwise 
> anonymous client gets authenticated, then gets an ephemeral 
> cookie identifying him..   I cannot seem to find any how-tos 
> or examples for anything better, whether for IIS or apache.
>
> As a result we each have a large number of shared secret 
> passwords, whereby we each log into a large number of 
> webservers.  Was this what the people who created this 
> protocol intended?

Or to say the same thing in different words -- why can't HTTPS 
be more like SSH?  Why are we seeing a snow storm of scam
mails trying to get us to login to e-g0ld.com? 

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 QtiFX0Q654gHh54NAMlLGE1FGDveixyzL0ZnAOVS
 4hprBkT1zeYk/HdBOXiquwvz5vLUwF/21wW1Jf411



Re: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread Anne & Lynn Wheeler
some generic reasons for hooking radius (or one of the AAA technologies) 
into your webserver for authentication are:

1) supports a variety of authentication mechanisms on an account-by-account 
basis. on day one, none of the users actually need to see any difference 
(single administrative interface supporting all the client authentication 
options that might be in use): existing userid/password, challenge/response 
and, in the referenced asuretee url, ecdsa digital signature.

2) single administrative interface for both client authentication options 
as well as all of their authorization and privilege options.

3) client database is accessible in real-time by the webserver; real-time 
updates can occur to both authentication information as well 
as authorization, permission and privilege information using a single 
consistent administrative operation

4) there is no disconnect between client administration and static, stale, 
redundant and superfluous certificates that are a subset of a r/o 
administrative database entry (RADIUS). Updates can take place in real time 
and are immediately reflected. The certificate story (as mentioned previously, 
created for an offline, disconnected environment) basically would do something 
like a) invalidate the old certificate, b) issue new CRLs, c) possibly 
update an OCSP LDAP, d) update the master database permissions entry for 
that client, e) generate a certificate that represents a subset of the 
master information, f) distribute it to the client and g) then have the 
client install the new certificate. This of course becomes unnecessary if 
the certificate doesn't actually contain any information and the webserver 
accesses the authorization and permissions from an online database. 
However, as has repeatedly been pointed out before, if the certificate 
doesn't contain any information and the webserver is accessing an online 
database for authorizations and permissions ... then the webserver can 
access the online database for the authentication material also. The 
certificate then is static, stale, redundant and superfluous and you are 
back to a single online, real-time comprehensive administrative facility 
(like radius) that supports client/account specifics for authentication, 
authorization, permissions, accounting, privileges, etc.
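
[Editorial illustration: the online-lookup pattern being described,
reduced to a short Python sketch.  aaa_lookup() is a hypothetical
stand-in for a real-time RADIUS (or similar AAA) query.]

    def aaa_lookup(account):
        """Hypothetical real-time query of the administrative database;
        returns current authentication material plus permissions."""
        raise NotImplementedError

    def handle_request(account, credential, verify, operation):
        entry = aaa_lookup(account)          # always current; nothing stale,
                                             # nothing to revoke or reissue
        if not verify(credential, entry["auth"]):   # password, OTP, digital
            return False                            # signature, etc.
        return operation in entry["permissions"]    # authz from same entry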



--
Anne & Lynn Wheeler    http://www.garlic.com/~lynn/
Internet trivia 20th anv http://www.garlic.com/~lynn/rfcietff.htm


[eay@pobox.com: Re: Maybe It's Snake Oil All the Way Down]

2003-06-04 Thread Eric Murray
- Forwarded message from Eric Young <[EMAIL PROTECTED]> -

Date: Wed, 04 Jun 2003 01:05:24 +1000
From: Eric Young <[EMAIL PROTECTED]>
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US; rv:1.3)
  Gecko/20030312
X-Accept-Language: en-us, en
To: [EMAIL PROTECTED]
X-Orig-To: [EMAIL PROTECTED]
CC: EKR <[EMAIL PROTECTED]>, Eric Murray <[EMAIL PROTECTED]>,
   Scott Guthery
  <[EMAIL PROTECTED]>, Rich Salz <[EMAIL PROTECTED]>,
   Bill
  Stewart <[EMAIL PROTECTED]>, cypherpunks <[EMAIL PROTECTED]>,
   [EMAIL PROTECTED]
Subject: Re: Maybe It's Snake Oil All the Way Down
In-Reply-To: <[EMAIL PROTECTED]>

Ian Grigg wrote:

>It's like the GSM story, whereby 8 years
>down the track, Lucky Green cracked the
>crypto by probing the SIMs to extract
>the secret algorithm over a period of
>many months (which algorithm then fell to
>Ian Goldberg and Dave Wagner in a few hours).
>
>In that case, some GSM guy said that, it
>was good because it worked for 8 years,
>that shows the design was good, doesn't
>it?
>
>And Lucky said, now you've got to replace
>hundreds of millions of SIMs, that's got
>to be a bad design, no?
>  
>
Well the point here is that the data encryption in GSM is not relevant to
the people running the network.  The authentication is secure,
so there is no fraud, so they still get the money from network
usage.  Privacy was never really there since
the traffic is not encrypted once it hits the base station, so the
relevant government agencies can be kept happy.
The encryption was only relevant to protect the consumers
from each other.

eric (hopefully remembering things correctly)

- End forwarded message -



Re: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread John Young
The White House Communications Agency is also working
hard to secure presidential communications, with legacy
systems needing ever-increasing maintenance and upgrades,
the market continuing to outpace the big-ticket legacy
clunker equipment, too expensive to chuck outright, yet having
flaws begging for discovery, patches galore (most relying
upon obscurity and secrecy), and the operators from the
four military branches which run the system turning over
regularly and each new wave needing special training to 
work the patchwork klutz, with retiring old salts who are
the only ones who know how the hybrids work and whether
they are truly secure, and not least, NSA doing its damnedest
to get new systems installed in all the prez's habitats and
vehicles and layovers around the world, deploying crypto
tools partly off the shelf, partly purpose-built at Ft Meade -- 
and the whole precarious mess subject to a 20-year-old 
pulling a thumb out of the dike and letting flow proof that the 
leader of the free world is up to what you'd expect despite 
the multi-million rig to hide the obvious. Rumor is that 98%
of what is handled top secretly is trivial fluff, as with most
mil comm, SIGINT, cellphone, microwave, fiber-optic, so that
snake oil is apt protection. If all telecomm was shut down no
more would change than pulling the plug on television.

The other 2% is what the billions and billions are trying to find
among the EM cataract of plaintext and speak smoke and whine 
-- by whoever may be plotting a world of pure bugfuck. But that
could also be discovered by thoughtful analysis of any singular
mania, whether religion, higher-ed, sport, stock market, politics, 
or mil-biz.

Here's a recent account from "Army Communicator" of 
what's up at ever busier and harried and thumbplugging
WHCA:

  http://cryptome.org/whca2003.pdg  (680KB)

WHCA itself is recruiting thumbs:

  http://www.disa.mil/whca



[eb@comsec.com: Re: Maybe It's Snake Oil All the Way Down]

2003-06-04 Thread Eric Murray
- Forwarded message from Eric Blossom <[EMAIL PROTECTED]> -

Date: Tue, 3 Jun 2003 13:25:50 -0700
From: Eric Blossom <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED]
X-Orig-To: John Kelsey <[EMAIL PROTECTED]>
Cc: [EMAIL PROTECTED], EKR <[EMAIL PROTECTED]>,
   Scott Guthery
  <[EMAIL PROTECTED]>, Rich Salz <[EMAIL PROTECTED]>,
   Bill
  Stewart <[EMAIL PROTECTED]>, cypherpunks <[EMAIL PROTECTED]>,
   [EMAIL PROTECTED]
Subject: Re: Maybe It's Snake Oil All the Way Down
In-Reply-To: <[EMAIL PROTECTED]>
User-Agent: Mutt/1.4i

On Tue, Jun 03, 2003 at 10:42:01AM -0400, John Kelsey wrote:
> At 10:09 AM 6/2/03 -0400, Ian Grigg wrote:
> ...
> > (One doesn't hear much about
> >crypto phones these days.  Was this really a need?)

Yes, I believe there is a need.

In my view, there are two factors in the way of wide spread adoption:
cost and ease of use.

Having spent many years messing with these things, I've come to the
conclusion that what I personally want is a cell phone that implements
good end-to-end crypto.  This way, I've always got my secure
communication device with me, there's no "bag on the side", and it can
be made almost completely transparent.

> And for cellphones, I keep thinking we need a way to sell a secure 
> cellphone service that doesn't involve trying to make huge changes to the 
> infrastructure, ...

Agreed.  Given a suitably powerful enough Java or whatever equipped
cell phone / pda and an API that provides access to a data pipe and
the speaker and mic, you can do this without any cooperation from the
folks in the middle.  I think that this platform will be common within
a couple of years.  The Xscale / StrongARM platform certainly has
enough mips to handle both the vocoding and the crypto.

Also on the horizon are advances in software radio that will enable
the creation of ad hoc self organizing networks with no centralized
control.  There is a diverse collection of people supporting this
revolution in wireless communications.  They range from technologists,
to economists, lawyers, and policy wonks.  For background on spectrum
policy issues see http://www.reed.com/openspectrum,
http://cyberlaw.stanford.edu/spectrum or http://www.law.nyu.edu/benklery

Free software for building software radios can be found at the 
GNU Radio web site http://www.gnu.org/software/gnuradio

Eric

- End forwarded message -



[eb@comsec.com: Re: Maybe It's Snake Oil All the Way Down]

2003-06-04 Thread Eric Murray
- Forwarded message from Eric Blossom <[EMAIL PROTECTED]> -

Date: Tue, 3 Jun 2003 15:50:37 -0700
From: Eric Blossom <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED]
X-Orig-To: John Kelsey <[EMAIL PROTECTED]>
Cc: [EMAIL PROTECTED], EKR <[EMAIL PROTECTED]>,
   Scott Guthery
  <[EMAIL PROTECTED]>, Rich Salz <[EMAIL PROTECTED]>,
   Bill
  Stewart <[EMAIL PROTECTED]>, cypherpunks <[EMAIL PROTECTED]>,
   [EMAIL PROTECTED]
Subject: Re: Maybe It's Snake Oil All the Way Down
In-Reply-To: <[EMAIL PROTECTED]>
User-Agent: Mutt/1.4i

On Tue, Jun 03, 2003 at 06:17:12PM -0400, John Kelsey wrote:
> At 01:25 PM 6/3/03 -0700, Eric Blossom wrote:
> ...

> I agree end-to-end encryption is worthwhile if it's available, but even 
> when someone's calling my cellphone from a normal landline phone, I'd like 
> it if at least the over-the-air part of the call was encrypted.  That's a 
> much bigger vulnerability than someone tapping the call at the base station 
> or at the phone company.

GSM and CDMA phones come with the crypto enabled.  The crypto's good
enough to keep out your neighbor (unless he's one of us) but if you're
that paranoid, you should opt for the end-to-end solution.  The CDMA
stuff (IS-95) is pretty broken: *linear* crypto function, takes 1
second worst case to gather data sufficient to solve 42 equations in
42 unknowns, but again, what's your threat model?  Big brother and
company are going to get you at the base station...

At our house we've pretty much given up on wired phone lines.  We use
cell phones as our primary means of communication.  Turns out that
with the bundled roaming and long distance, it works out cheaper than
what we used to pay for long distance service.  There is that pesky
location transponder problem though.

> ...which will basically never be secured end-to-end if 
> this requires each of those people to buy a special new phone, or do some 
> tinkering with configuring secure phone software for their PDA.  "Hmmm, 
> which key size do I need?  Is 1024 bits long enough?  Why do I have to move 
> the mouse around, again, anyway?"

It doesn't have to be hard.  No requirement for PKI.  Just start with
an unauthenticated 2k-bit Diffie-Hellman and be done with it.
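
[Editorial illustration: what "unauthenticated 2k-bit Diffie-Hellman"
amounts to, sketched with the third-party Python 'cryptography'
package (an assumption; any bignum library would do).]

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import dh
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Parameter generation is slow; a real phone would ship a fixed,
    # well-known 2048-bit group instead of generating one per call.
    params = dh.generate_parameters(generator=2, key_size=2048)

    alice = params.generate_private_key()
    bob = params.generate_private_key()

    # Each side sends only its public value: no certificates, no PKI,
    # and (the trade-off) no protection against an active MITM.
    secret_a = alice.exchange(bob.public_key())
    secret_b = bob.exchange(alice.public_key())
    assert secret_a == secret_b

    session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                       info=b"secure phone call").derive(secret_a)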

Eric

- End forwarded message -



RE: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread Tim Dierks
At 09:11 AM 6/3/2003, Peter Gutmann wrote:
"Lucky Green" <[EMAIL PROTECTED]> writes:
>Given that SSL use is orders of magnitude higher than that of SSH, with no
>change in sight, primarily due to SSL's ease-of-use, I am a bit puzzled by
>your assertion that ssh, not SSL, is the "only really successful net crypto
>system".
I think the assertion was that SSH is used in places where it matters, while
SSL is used where no-one really cares (or even knows) about it.  Joe Sixpack
will trust any site with a padlock GIF on the page.  Most techies won't access
a Unix box without SSH.  Quantity != quality.
I have my own opinion on what this assertion means. :-) I believe it 
intends to state that ssh is more successful because it is the only 
Internet crypto system which has captured a large share of its use base. 
This is probably true: I think the ratio of ssh to telnet is much higher 
than the ratio of https to http, pgp to unencrypted e-mail, or what have you.

However, I think SSL has been much more successful in general than SSH, if 
only because it's actually used as a transport layer building block rather 
than as a component of an application protocol. SSL is used for more 
Internet protocols than HTTP: it's the standardized way to secure POP, 
IMAP, SMTP, etc. It's also used by many databases and other application 
protocols. In addition, a large number of proprietary protocols and custom 
systems use SSL for security: I know that Certicom's SSL Plus product 
(which I originally wrote) is (or was) used to secure everything from 
submitting your taxes with TurboTax to slot machine jackpot notification 
protocols, to the tune of hundreds of customers. I'm sure that when you add 
in RSA's customers, those of other companies, and people using 
OpenSSL/SSLeay, you'll find that SSL is much more broadly used than ssh.

I'd guess that SSL is more broadly used, in a dollars-secured or 
data-secure metric, than any other Internet protocol. Most of these uses 
are not particularly visible to the consumer, or happen inside of 
enterprises. Of course, the big winners in the $-secured and data-secured 
categories are certainly systems inside of the financial industry and 
governmental systems.

 - Tim



Re: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread Adam Shostack
On Wed, Jun 04, 2003 at 01:11:51AM +1200, Peter Gutmann wrote:
| "Lucky Green" <[EMAIL PROTECTED]> writes:
| 
| >I trust that we can agree that the volume of traffic and number of
| >transactions protected by SSL are orders of magnitude higher than those
| >protected by SSH. As is the number of users of SSL. The overwhelming majority
| >of which wouldn't know ssh from telnet. Nor would they know what to do at a
| >shell prompt and therefore have no use for either ssh or telnet.
| 
| Naah, that third sentence is wrong.  It's:
| 
|   The overwhelming majority of [SSL users] wouldn't know SSL from HTTP with a
|   padlock GIF in the corner.
| 
| >Given that SSL use is orders of magnitude higher than that of SSH, with no
| >change in sight, primarily due to SSL's ease-of-use, I am a bit puzzled by
| >your assertion that ssh, not SSL, is the "only really successful net crypto
| >system".
| 
| I think the assertion was that SSH is used in places where it matters, while
| SSL is used where no-one really cares (or even knows) about it.  Joe Sixpack
| will trust any site with a padlock GIF on the page.  Most techies won't access
| a Unix box without SSH.  Quantity != quality.
| 
| If you could wave a magic wand and make one of the two protocols vanish, I'd
| notice the loss of SSH immediately (I couldn't send this message for
| starters), but it would take days or weeks before I noticed the loss of SSL.

One of the papers at the security and econ workshop last week asserted
that the reason ssh took off was actually because it makes life easier
if you need to munge X displays.

ADam

-- 
"It is seldom that liberty of any kind is lost all at once."
   -Hume



Re: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread Eric Rescorla
Ian Grigg <[EMAIL PROTECTED]> writes:
> Eric Rescorla wrote:
> True, although, that begs the question as
> to how they learn.  Only by doing, I'd say.
> I think one learns a lot more from making
> mistakes and building ones own attempt than
> following the words of wise.
One learns by *practicing*.

That said, though, there's next to no need for people to know how
to design their own communications security protocols, so it's
not really that important for them to learn. 

> OK.  Then I am confused about the post that
> came out recently.  It would be very interesting
> to hear the story, written up.
The rough version of it is in my book.

-Ekr

-- 
[Eric Rescorla   [EMAIL PROTECTED]
http://www.rtfm.com/



Re: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread Ian Grigg
Lucky Green wrote:
> 
> Ian Grigg wrote:
> > Also, a lot of cryptosystems are put together
> > by committees.  SSH was originally put together
> > by one guy.  He did the lot.  Allegedly, a fairly
> > grotty protocol with a number of weakneses, but
> > it was there and up and running.  And SSH-2 is
> > apparantly nice, elegant and easy to understand,
> > now that it has been fixed up.
> 
> ssh2 is in essence a re-invention of what SSL did without having to use
> X.509 keys. This reinvention was, IMHO, largely the result of the
> limitations of the ssh1 design.

OK.  Learning more every day :-)

> > (SSH is the only really successful net crypto
> > system, IMHO, in that it actually went into its
> > market and made a mark.  It's the only cryptosystem
> > that is as easy to use as its non-crypto competitor,
> > telnet.  It's the only one where people switch and
> > never return.)
> 
> I trust that we can agree that the volume of traffic and number of
> transactions protected by SSL are orders of magnitude higher than those
> protected by SSH. As is the number of users of SSL. The overwhelming
> majority of which wouldn't know ssh from telnet. Nor would they know
> what to do at a shell prompt and therefore have no use for either ssh or
> telnet.

Indeed!  Although I trust that we can also look at many
different ways of measuring success.

In order to *compare* success, like for like, we have
to start with an understanding of the marketplace for
each system, and assume that the marketplace for each
application is its universe.

I (arbitrarily) define the marketplace for SSL as
browsing.  (I.e., HTTP, as used between a browser
and a webserver.  The SSL protected part might be
referred to as HTTPS.  This of course ignores all
the other users of the protocol.)

There, we can show statistics that indicate that SSL
has penetrated to something slightly less than 1% of
servers.  It would of course be interesting to see
what the bandwidth figures are like, for example,
but I wouldn't be surprised if they are also less
than 1% (think about all those yahoo monsters that
overflow your POTS).

The fact that a user of SSL is neither aware nor
capable of being protected by SSH is irrelevant,
neither is a sysadmin concerned in his job with
protecting his work with SSL.

(Actually that's not true;  there was an SSL terminal
system for a while, as an adjunct to SSLeay, but
that is a dead or dying protocol, rapidly replaced
by SSH whenever the two entered competition.  Which
is a good thing, the SSL terminal was a nightmare
to get going, due to its insistence on hand-crafting
certificates.)

> Given that SSL use is orders of magnitude higher than that of SSH, with
> no change in sight, primarily due to SSL's ease-of-use, I am a bit
> puzzled by your assertion that ssh, not SSL, is the "only really
> successful net crypto system".

SSL's 1% penetration into the browsing market doesn't
strike me as successful.

If I was "selling SSL" as a business, I'd be looking
at the other 99% and wondering why it's just sitting
there, not being sold.  And there are big expensive
companies doing just that, so I guess they have
tried.

Have a look at the penetration reports on
http://www.securityspace.com/

On the other hand, SSH, as a cryptosystem, as an
application (think: replacement for telnet, not as
competitor to the SSL protocol) penetrates its market
very well.  I have no more than anecdotal evidence
for that, but any sysadmin knows that once they
started using SSH, they would never go back to the
alternate unless forced, kicking and screaming.

It would be very interesting to find out what SSH
v. telnet traffic looks like.

That's what I mean by success.  Within its market
place, SSH rules.

-- 
iang



Re: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread Ian Grigg
Eric Rescorla wrote:
> 
> Ian Grigg <[EMAIL PROTECTED]> writes:
> > Eric Murray wrote:
> > It may be that the SSL underlying code is
> > perfect.  But that the application is weak
> > because the implementor didn't understand
> > how to drive it;  in which case, if he can
> > roll his own, he may end up with a more
> > secure overall package.
> I don't think this is likely to be true. In my experience,
> people who learn enough to design their own thing also learn
> enough to be able to do SSL properly.

True, although, that begs the question as
to how they learn.  Only by doing, I'd say.
I think one learns a lot more from making
mistakes and building ones own attempt than
following the words of wise.

> > > SSLv2, which was also designed by an
> > > individual, also had major flaws.  And that was the
> > > second cut!  I haven't seen v1, maybe Eric can
> > > shed some light on how bad it was.
> >
> > [ Someone commented before that v1 was not deemed
> > serious (Marc A?) and v2 was the more acceptable
> > starting point (Weinsteins?). ]
> That's not true as far as I know. V1 and V2 were designed
> by the same guy (Kipp Hickman). V1 is actually very similar
> to V2, except that the integrity stuff is all screwed up.
> As far as I can tell, the fact of the matter is that Kipp
> didn't understand the security issues until Abadi and
> to some extent Schiffman sold them some clues.


OK.  Then I am confused about the post that
came out recently.  It would be very interesting
to hear the story, written up.


> > Sure.  If someone does roll their own, then they
> > should get it reviewed.
> That's not my experience. WEP and PPTP come to mind.

Ah, good point:  There should be some
point on that list about building ones
cryptosystem outside the domain of an
institution, which tends to have too
many conflicting requirements, and
cannot limit itself to a simple system.

(And, yes, some protocols don't get
peer reviewed.  I wasn't debating that.)


> > But, that assumes an awful lot.  For a start,
> > that it exists.  SSL is touted as the answer
> > to everything, but it seems to be a connection
> > oriented protocol, which would make it less
> > use for speech, media, mail, chat (?), by way
> > of example.

> SSL is quite fine for chat, actually. It's one of the
> major things that people use for IM. The issue with
> speech and media isn't connection-orientation but
> rather datagram versus stream data.

I knew I was in trouble on chat, that's
why I stuck the interrogation mark in
there :-)  We recently added an email-like
capability to our (homegrown) crypto
system, and intend to expand that to
chat.  But, in order to do that, we
have to expand the crypto subsystem
(SOX) to include connection-oriented
modes.

[ Hence, an open question floating
around here is "why don't we use SSL"
which hasn't been definitively answered
as yet. ]

> > Then there is understanding, both of the
> > protocol, and the project's needs.  I know
> > that when I'm in a big project and I come
> > across a complex new requirement, often, it
> > is an open question as to whether make or
> > buy is the appropriate choice.  I do know
> > that 'make' will always teach me about the
> > subject, and eventually, it will teach me
> > which one to buy, or it will give me a
> > system tuned to my needs.

> The history of people who go this course suggests otherwise.
> They generally get lousy solutions.

I think it would be very interesting to
do a study of all the cryptosystems out
there and measure what succeeds, what
doesn't, what's secure, and what's not.

What cost too much money and what saved
money.

One of the issues that we see is that
too many security people assume that
"insecure" is "bad".  What they fail
to perceive is that an insecure system
is often sufficient for the times and
places.

WEP for example is perfectly fine, unless
you are attacked by a guy with a WEP
cracking kit!  Then it's a perfectly
lousy cryptosubsystem.

It's like the GSM story, whereby 8 years
down the track, Lucky Green cracked the
crypto by probing the SIMs to extract
the secret algorithm over a period of
many months (which algorithm then fell to
Ian Goldberg and Dave Wagner in a few hours).

In that case, some GSM guy said that, it
was good because it worked for 8 years,
that shows the design was good, doesn't
it?

And Lucky said, now you've got to replace
hundreds of millions of SIMs, that's got
to be a bad design, no?

(Lucky might be able to confirm the real
story there.)

Different ways of looking at the same
thing.  They are both valid points of
view.  To work out the difference, we
need to go to costs and benefits.  Who
won and who lost?  I never heard how
it panned out.

-- 
iang



RE: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread Peter Gutmann
"Lucky Green" <[EMAIL PROTECTED]> writes:

>I trust that we can agree that the volume of traffic and number of
>transactions protected by SSL are orders of magnitude higher than those
>protected by SSH. As is the number of users of SSL. The overwhelming majority
>of which wouldn't know ssh from telnet. Nor would they know what to do at a
>shell prompt and therefore have no use for either ssh or telnet.

Naah, that third sentence is wrong.  It's:

  The overwhelming majority of [SSL users] wouldn't know SSL from HTTP with a
  padlock GIF in the corner.

>Given that SSL use is orders of magnitude higher than that of SSH, with no
>change in sight, primarily due to SSL's ease-of-use, I am a bit puzzled by
>your assertion that ssh, not SSL, is the "only really successful net crypto
>system".

I think the assertion was that SSH is used in places where it matters, while
SSL is used where no-one really cares (or even knows) about it.  Joe Sixpack
will trust any site with a padlock GIF on the page.  Most techies won't access
a Unix box without SSH.  Quantity != quality.

If you could wave a magic wand and make one of the two protocols vanish, I'd
notice the loss of SSH immediately (I couldn't send this message for
starters), but it would take days or weeks before I noticed the loss of SSL.

Peter.



Re: Maybe It's Snake Oil All the Way Down

2003-06-04 Thread Peter Gutmann
Ian Grigg <[EMAIL PROTECTED]> writes:

>It's also very much oriented to x.509 and similar certificate/PKI models,
>which means it is difficult to use in web of trust (I know this because we
>started on the path of adding web of trust and text signing features to x.509
>before going back to OpenPGP), financial and nymous applications whereby
>trust is bootstrapped a different way.

That's a red herring.  It happens to use X.509 as its preferred bit-bagging
format for public keys, but that's about it.  People use self-signed certs,
certs from unknown CAs [0], etc etc, and you don't need certs at all if you
don't need them.  I've just done an RFC draft that uses
shared secret keys for mutual authentication of client and server, with no
need for certificates of any kind, so the use of
certs, and in particular a hierarchical PKI, is merely an optional extra.
It's no more required in SSL than it is in SSHv2.

>Has anyone read Ferguson and Schneier's _Practical Cryptography_ ?  Does it
>address this issue of how an outsider decides how to "make or buy"?  I just
>read the reviews on Amazon, they are ... entertaining!

They spend a nontrivial portion of the book reinventing SSL/SSHv2.  I guess
they lean towards the roll-your-own side of the argument :-).  I'm firmly in
the opposite camp (see "Lessons Learned in Implementing and Deploying Crypto
Software", links off my home page at http://www.cs.auckland.ac.nz/~pgut001/).
I think that providing an abstract description of a fairly complex security
protocol *in a book targeted at security novices* and then hoping that they
manage to implement it correctly is asking for trouble.  OTOH it's fun going
through the thought processes involved in designing the protocol.  I just wish
they'd applied the process to SSL or SSHv2 instead, so that at the end of it
they could tell the reader to go out and grab an implementation that someone
else has got right for them.

Peter.

[0] The vendor of one widely-used MTA once told me that 90% of the certs they
saw used in STARTTLS applications were non-big name CA-issued ones (self-
signed, etc etc).



RE: Maybe It's Snake Oil All the Way Down

2003-06-03 Thread Lucky Green
Ian Grigg wrote:
> Also, a lot of cryptosystems are put together
> by committees.  SSH was originally put together
> by one guy.  He did the lot.  Allegedly, a fairly
> grotty protocol with a number of weakneses, but
> it was there and up and running.  And SSH-2 is
> apparantly nice, elegant and easy to understand,
> now that it has been fixed up.

ssh2 is in essence a re-invention of what SSL did without having to use
X.509 keys. This reinvention was, IMHO, largely the result of the
limitations of the ssh1 design.

> (SSH is the only really successful net crypto
> system, IMHO, in that it actually went into its
> market and made a mark.  It's the only cryptosystem
> that is as easy to use as its non-crypto competitor,
> telnet.  It's the only one where people switch and
> never return.)

I trust that we can agree that the volume of traffic and number of
transactions protected by SSL are orders of magnitude higher than those
protected by SSH. As is the number of users of SSL. The overwhelming
majority of which wouldn't know ssh from telnet. Nor would they know
what to do at a shell prompt and therefore have no use for either ssh or
telnet.

Given that SSL use is orders of magnitude higher than that of SSH, with
no change in sight, primarily due to SSL's ease-of-use, I am a bit
puzzled by your assertion that ssh, not SSL, is the "only really
successful net crypto system".

--Lucky



Re: Maybe It's Snake Oil All the Way Down

2003-06-03 Thread Tim May
On Monday, June 2, 2003, at 07:09  AM, Ian Grigg wrote:


PGP was also mildly successful, and was done by
one guy, PRZ.  The vision was very clear.  All others
had to do was to fix the bugs...  Sadly, free versions
never quite made the jump into GUI mail clients, so
widespread success was denied to it.
I would've characterized PGP version 2, 1992, as the first usable 
version. And it was done by about half a dozen people. The first 
version was not, to my knowledge, actually used by anyone.

It might have done better had creeping featuritis and the "integration 
with mailers and other programs" and the "better GUI" distractions not 
dissipated so much energy.

Also, the Clipper chip politics and the belief that PRZ was about to be 
arrested gave PGP a certain kind of notoriety...it became "cool" 
("bad," "def") to use PGP.

These days, "that's _so_ 90s."

--Tim May



Re: Maybe It's Snake Oil All the Way Down

2003-06-03 Thread Eric Rescorla
Ian Grigg <[EMAIL PROTECTED]> writes:
> Eric Murray wrote:
> It may be that the SSL underlying code is
> perfect.  But that the application is weak
> because the implementor didn't understand
> how to drive it;  in which case, if he can
> roll his own, he may end up with a more
> secure overall package.
I don't think this is likely to be true. In my experience,
people who learn enough to design their own thing also learn
enough to be able to do SSL properly.

> > SSLv2, which was also designed by an
> > individual, also had major flaws.  And that was the
> > second cut!  I haven't seen v1, maybe Eric can
> > shed some light on how bad it was.
> 
> [ Someone commented before that v1 was not deemed
> serious (Marc A?) and v2 was the more acceptable
> starting point (Weinsteins?). ]
That's not true as far as I know. V1 and V2 were designed
by the same guy (Kipp Hickman). V1 is actually very similar
to V2, except that the integrity stuff is all screwed up.
As far as I can tell, the fact of the matter is that Kipp 
didn't understand the security issues until Abadi and
to some extent Schiffman sold them some clues.

> > Peer review is not "design by committee".
> 
> Let me clarify.  SSL - the protocol - was not
> designed by committee, but, the size of the teams
> involved in the crypto systems was in excess of
> the people who were intimately familiar with the
> protocol.  For the most familiar example, browsing,
> there were, it seems, many people involved in the
> overall grafting of SSL into the original HTML/HTTP
> system.
As far as I know, that's not the case. The original Netscape
team was very small and there really weren't any significant
choices to be made.

> > It is
> > the way to get strong protocols.  When I have to roll my
> > own (usually because it's working in a limited environment
> > and I don't have a choice)
> > I get it reviewed.  The protocol designer usually misses
> > something in his own protocol.
> 
> Sure.  If someone does roll their own, then they
> should get it reviewed.
That's not my experience. WEP and PPTP come to mind.

> > > I'd say that conditions for Internet crypto system
> > > success would include:
> > 
> > 0. USE EXISTING SECURITY PRIMITIVES
> 
> :-)
> 
> I know this is the mantra of the field.
> 
> Question is:  which PRIMITIVES?
> 
> 1.  RSA?
> 2.  SSL, written from the RFC?
> 3.  OpenSSL, the toolkit?  EKR's fine effort?
> 4.  RSADSI security consultants, selling you
> theirs?
> 5.  ...
I would say the highest level primitives you can get away with.

> But, that assumes an awful lot.  For a start,
> that it exists.  SSL is touted as the answer
> to everything, but it seems to be a connection-
> oriented protocol, which would make it of less
> use for speech, media, mail, chat (?), by way
> of example.
SSL is quite fine for chat, actually. It's one of the 
major things that people use for IM. The issue with
speech and media isn't connection-orientation but
rather datagram versus stream data.
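
To make the stream-versus-datagram point concrete, here is a minimal sketch
of a chat-style client over TLS using Python's standard ssl module; the host
name and port are placeholders, not anything from this thread.

    # Minimal sketch: a chat-style client sending a stream of messages over
    # one TLS connection.  HOST and PORT are placeholders for illustration.
    import socket
    import ssl

    HOST, PORT = "chat.example.net", 6697   # hypothetical server

    context = ssl.create_default_context()  # verifies certificate and host name

    with socket.create_connection((HOST, PORT)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
            tls_sock.sendall(b"hello over TLS\r\n")   # stream data, like IM/chat
            reply = tls_sock.recv(4096)
            print(reply.decode(errors="replace"))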

> Then there is understanding, both of the
> protocol, and the project's needs.  I know
> that when I'm in a big project and I come
> across a complex new requirement, often, it
> is an open question as to whether make or
> buy is the appropriate choice.  I do know
> that 'make' will always teach me about the
> subject, and eventually, it will teach me
> which one to buy, or it will give me a
> system tuned to my needs.
The history of people who go this course suggests otherwise.
They generally get lousy solutions.

-Ekr

-- 
[Eric Rescorla   [EMAIL PROTECTED]
http://www.rtfm.com/



Re: Maybe It's Snake Oil All the Way Down

2003-06-03 Thread Ian Grigg
Eric Murray wrote:
> 
> On Mon, Jun 02, 2003 at 10:09:06AM -0400, Ian Grigg wrote:
> > A lot of the tools and blocks are too hard to
> > understand.  "Inaccessible" might be the proper
> > term.  This might apply to, for example, SSL,
> > and more so to IPSec.  These have a lower survival
> > rate, simply because as developers look at them,
> > their eyes glaze over and they move on.  I heard
> > one guy say that "you can read SSH in an hour
> > and understand what's going on, but not SSL."
> 
> Some who can't understand SSL won't be able to do better.
> Especially since there is at least one very good book on it.

That presupposes that one can do "better"
using SSL because SSL is "better".  It is
a challenge to translate SSL's strong peer
reviewed heritage into a secure crypto
system.

In practice, if the tool is hard to use, an
implementation opens itself up for problems
in its usage of SSL.  There can be bugs in
the interface, bugs in the architecture
reflected by the complexity of the interface,
and there can be bugs in the underlying tools.

It may be that the SSL underlying code is
perfect.  But that the application is weak
because the implementor didn't understand
how to drive it;  in which case, if he can
roll his own, he may end up with a more
secure overall package.
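
As a sketch of the kind of usage bug being described (not code from any
system discussed in this thread), compare a client that quietly disables
certificate checking with one that keeps the library defaults; the endpoint
below is a placeholder.

    # Sketch: the same TLS library, used weakly and used soundly.
    import socket
    import ssl

    HOST, PORT = "www.example.com", 443   # placeholder endpoint

    # Weak usage: the handshake still succeeds, but nothing checks who we
    # are talking to, so an active attacker can present any certificate.
    weak = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    weak.check_hostname = False
    weak.verify_mode = ssl.CERT_NONE

    # Sound usage: the default context verifies the certificate chain and
    # the host name -- exactly the part implementors most often get wrong.
    sound = ssl.create_default_context()

    with socket.create_connection((HOST, PORT)) as raw:
        with sound.wrap_socket(raw, server_hostname=HOST) as tls:
            print("negotiated", tls.version(), "with a verified peer")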

> > Also, a lot of cryptosystems are put together
> > by committees.  SSH was originally put together
> > by one guy.  He did the lot.
> 
> The original SSH protocol had holes so large that
> you could drive a truck through them.   Tatu posted
> it to various lists and got lots of advice on
> how to clean it up.  It still had holes that were being
> found years later.

Yep.  But the application got up and going,
he didn't wait for the protocol to be perfected,
which meant that the application had a much
greater chance of ultimate success, and many
more scenarios were protected than otherwise
would have been.

Now it's a good protocol (Peter G reports that
it is highly analogous to SSL, but with its own
packet formats).  Its hole-filled first effort
doesn't seem to have done it so much harm.

> SSLv2, which was also designed by an
> individual, also had major flaws.  And that was the
> second cut!  I haven't seen v1, maybe Eric can
> shed some light on how bad it was.

[ Someone commented before that v1 was not deemed
serious (Marc A?) and v2 was the more acceptable
starting point (Weinsteins?). ]

> Peer review is not "design by committee".

Let me clarify.  SSL - the protocol - was not
designed by committee, but, the size of the teams
involved in the crypto systems was in excess of
the people who were intimately familiar with the
protocol.  For the most familiar example, browsing,
there were, it seems, many people involved in the
overall grafting of SSL into the original HTML/HTTP
system.  Hence, SSL as a protocol might be a fine
piece of work.  SSL as a browsing application is
flawed, and that's partly because too many different
people and agendas were involved.

(I think the design-by-committee criticism would
stick more strongly to IPSec.)

> It is
> the way to get strong protocols.  When I have to roll my
> own (usually because it's working in a limited environment
> and I don't have a choice)
> I get it reviewed.  The protocol designer usually misses
> something in his own protocol.

Sure.  If someone does roll their own, then they
should get it reviewed.

> > I'd say that conditions for Internet crypto system
> > success would include:
> 
> 0. USE EXISTING SECURITY PRIMITIVES

:-)

I know this is the mantra of the field.

Question is:  which PRIMITIVES?

1.  RSA?
2.  SSL, written from the RFC?
3.  OpenSSL, the toolkit?  EKR's fine effort?
4.  RSADSI security consultants, selling you
theirs?
5.  ...


> which allows you to
> 
> >   4.  Concentrate on the application, not the crypto.
> 
> Rolling your own crypto is where 95% of crypto apps fail...
> the developers either take too much time on it to the detriment
> of the usability because it is the sexy thing to work on, or
> they write an insecure algorithm/protocol/system.  Usually
> they do both!

It's true that if there is a perfectly good
alternative available, it is probably more
expensive to roll your own than to use the
perfectly good alternative.

But, that assumes an awful lot.  For a start,
that it exists.  SSL is touted as the answer
to everything, but it seems to be a connection-
oriented protocol, which would make it of less
use for speech, media, mail, chat (?), by way
of example.

It's also very much oriented to x.509 and
similar certificate/PKI models, which means
it is difficult to use in web of trust (I
know this because we started on the path of
adding web of trust and text signing features
to x.509 before going back to OpenPGP),
financial and nymous applications whereby
trust is bootstrapped a different way.
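
A rough sketch of that alternative bootstrap, assuming a certificate obtained
out of band and pinned as the only trust anchor; the host name and file name
below are hypothetical, not anything from this thread.

    # Sketch: trust exactly one out-of-band certificate instead of the usual
    # CA/PKI hierarchy.  Endpoint and PEM file are hypothetical.
    import socket
    import ssl

    HOST, PORT = "bank.example.net", 443        # hypothetical endpoint
    PINNED_CERT = "bank.pem"                    # certificate obtained out of band

    ctx = ssl.create_default_context(cafile=PINNED_CERT)

    with socket.create_connection((HOST, PORT)) as raw:
        with ctx.wrap_socket(raw, server_hostname=HOST) as tls:
            print("connected; peer matched the pinned certificate")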

Then there is understanding, both of the
protocol, and the project's needs.  I know
that when I'm in a big project and I come
across a complex new requirement, often, it
is an open question as to whether make or
buy is the appropriate choice.  I do know
that 'make' will always teach me about the
subject, and eventually, it will teach me
which one to buy, or it will give me a
system tuned to my needs.

Re: Maybe It's Snake Oil All the Way Down

2003-06-03 Thread Peter Gutmann
Ian Grigg <[EMAIL PROTECTED]> writes:

>Also, a lot of cryptosystems are put together by committees.  SSH was
>originally put together by one guy.  He did the lot.  Allegedly, a fairly
>grotty protocol with a number of weaknesses, but it was there and up and
>running.  And SSH-2 is apparently nice, elegant and easy to understand, now
>that it has been fixed up.

Actually SSHv2 is just SSL with a different packet format (when I did my SSHv2
implementation I recycled the code from the SSL engine, it was that close
[0]).  That's probably a good indication that SSL/SSHv2 is a fairly optimal
(security/functionality/implementability/etc) design for an application-level
security protocol if two groups independently came up with the same design,
which brings us back the original question of why on earth Nullsoft tried to
roll their own.

Peter.

[0] Note that my SSL implementation follows the standard SSL ladder diagram
rather than the state-machine that SSL implementations are usually
described as, which made it trivial to switch over for SSHv2 use.  I've
never understood why every explanation of the SSL protocol I've ever seen
uses ladder diagrams but once they talk about implementation details they
assume you're doing it as a state machine, which makes it vastly harder to
implement.  For example all the stuff about pending cipher suites and
whatnot follows automatically (and transparently) from the ladder diagram,
but is a real pain to sort out in a state machine.
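
A toy sketch of the difference in shape; nothing below is real SSL, and the
record-layer I/O is faked with a canned transcript purely so the control flow
runs.

    # Toy sketch of 'ladder diagram' vs. 'state machine' structure.  The I/O
    # is faked: we record what we send and replay a canned server transcript.
    from collections import deque

    def make_conn():
        return {"sent": [],
                "recv": deque(["ServerHello", "Certificate", "ServerHelloDone",
                               "ChangeCipherSpec", "Finished"])}

    def write_msg(conn, msg):
        conn["sent"].append(msg)        # pretend to send a handshake message

    def read_msg(conn):
        return conn["recv"].popleft()   # pretend to receive the peer's next one

    def handshake_ladder(conn):
        """Straight-line (ladder) style: each protocol step is simply a line,
        and 'pending' vs. 'active' cipher state falls out of code position."""
        write_msg(conn, "ClientHello")
        read_msg(conn)                  # ServerHello
        read_msg(conn)                  # Certificate
        read_msg(conn)                  # ServerHelloDone
        write_msg(conn, "ClientKeyExchange")
        write_msg(conn, "ChangeCipherSpec")
        write_msg(conn, "Finished")
        read_msg(conn)                  # ChangeCipherSpec
        read_msg(conn)                  # Finished

    def handshake_state_machine(conn):
        """The same exchange as an explicit state machine: every step needs
        bookkeeping, including tracking pending cipher suites by hand."""
        state = "SEND_CLIENT_HELLO"
        while state != "DONE":
            if state == "SEND_CLIENT_HELLO":
                write_msg(conn, "ClientHello")
                state = "WAIT_SERVER_HELLO"
            elif state == "WAIT_SERVER_HELLO":
                read_msg(conn)
                state = "WAIT_CERTIFICATE"
            # ... one branch per message, plus pending-cipher bookkeeping ...
            else:
                state = "DONE"

    handshake_ladder(make_conn())
    handshake_state_machine(make_conn())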



Re: Maybe It's Snake Oil All the Way Down

2003-06-03 Thread Eric Murray
On Mon, Jun 02, 2003 at 10:09:06AM -0400, Ian Grigg wrote:
> A lot of the tools and blocks are too hard to
> understand.  "Inaccessible" might be the proper
> term.  This might apply to, for example, SSL,
> and more so to IPSec.  These have a lower survival
> rate, simply because as developers look at them,
> their eyes glaze over and they move on.  I heard
> one guy say that "you can read SSH in an hour
> and understand what's going on, but not SSL."

Some who can't understand SSL won't be able to do better.
Especially since there is at least one very good book on it.


> Also, a lot of cryptosystems are put together
> by committees.  SSH was originally put together
> by one guy.  He did the lot.

The original SSH protocol had holes so large that
you could drive a truck through them.   Tatu posted
it to various lists and got lots of advice on
how to clean it up.  It still had holes that were being
found years later.

SSLv2, which was also designed by an
individual, also had major flaws.  And that was the
second cut!  I haven't seen v1, maybe Eric can
shed some light on how bad it was.

Peer review is not "design by committee".  It is
the way to get strong protocols.  When I have to roll my
own (usually because it's working in a limited environment
and I don't have a choice)
I get it reviewed.  The protocol designer usually misses
something in his own protocol.

> I'd say that conditions for Internet crypto system
> success would include:


0. USE EXISTING SECURITY PRIMITIVES

which allows you to

>   4.  Concentrate on the application, not the crypto.

Rolling your own crypto is where 95% of crypto apps fail...
the developers either take too much time on it to the detriment
of the usability because it is the sexy thing to work on, or
they write an insecure algorithm/protocol/system.  Usually
they do both!


Eric



Re: Maybe It's Snake Oil All the Way Down

2003-06-03 Thread Ian Grigg
A lot of the tools and blocks are too hard to
understand.  "Inaccessible" might be the proper
term.  This might apply to, for example, SSL,
and more so to IPSec.  These have a lower survival
rate, simply because as developers look at them,
their eyes glaze over and they move on.  I heard
one guy say that "you can read SSH in an hour
and understand what's going on, but not SSL."

(This was the point raised by the chap who
recently wanted to roll his own from a pouch
of fine cut RSA.)

Also, a lot of cryptosystems are put together
by committees.  SSH was originally put together
by one guy.  He did the lot.  Allegedly, a fairly
grotty protocol with a number of weaknesses, but
it was there and up and running.  And SSH-2 is
apparently nice, elegant and easy to understand,
now that it has been fixed up.

(SSH is the only really successful net crypto
system, IMHO, in that it actually went into its
market and made a mark.  It's the only cryptosystem
that is as easy to use as its non-crypto competitor,
telnet.  It's the only one where people switch and
never return.)

PGP was also mildly successful, and was done by
one guy, PRZ.  The vision was very clear.  All others
had to do was to fix the bugs...  Sadly, free versions
never quite made the jump into GUI mail clients, so
widespread success was denied to it.

I'd say that conditions for Internet crypto system
success would include:

  1.  One guy, or one very small, very close team.

  2.  The whole application is rolled out, ready to use.

  3.  Crypto is own-rolled, tuned to the application.

  4.  Concentrate on the application, not the crypto.

  5.  The application meets a ready need, and

  6.  The app is easy to use.

  7.  User doesn't need to ask anyone's permission.

These aren't very strong indicators of success, if
only because there have been so few fires, for so
much smoke.

A counterexample is Speak Freely, which was again
one lone hacker (John Walker?).  Maybe it stalled
on the latter points.  (One doesn't hear much about
crypto phones these days.  Was this really a need?)

My own "interested" protocol (SOX, done by Gary H,
not me) tries to meet the above criteria and hasn't
succeeded, like all other money protocols.  I leave
speculation on why success is still just around the
corner to others :-)



So, I'm with Scott on that.  When it comes down
to it, there's an awful lot of smoke, and precious
little real life crypto success out there.  It's
no wonder that people roll their own.

-- 
iang



Re: Maybe It's Snake Oil All the Way Down

2003-06-02 Thread Eric Rescorla
"Scott Guthery" <[EMAIL PROTECTED]> writes:
> Suppose.  Just suppose.  That you figured out a factoring
> algorithm that was polynomial.  What would you do?  Would
you post it immediately to cypherpunks?  Well, OK, maybe
> you would but not everyone would.  In fact some might
> even imagine they could turn a sou or two.  And you can
> bet the buyer wouldn't be doing any posting. With apologies
> to Bon Ami, "Hasn't cracked yet" is not a compelling security 
> story.

It's vastly better than "just designed last week by someone
who has no relevant experience".

-Ekr

-- 
[Eric Rescorla   [EMAIL PROTECTED]
http://www.rtfm.com/



Re: Maybe It's Snake Oil All the Way Down

2003-06-02 Thread Rich Salz
> There are a number of standard building blocks (3DES, AES, RSA, HMAC,
> SSL, S/MIME, etc.). While none of these building blocks are known
> to be secure ..

So for the well-meaning naif, a literature search should result in "no
news is good news."  Put more plainly, if you looked up hash and didn't
find news of a SHA break, then you should know to use SHA.  That assumes
you've heard of SHA in the first place.

Perhaps a few "best practices" papers are in order.  They might help
the secure (distributed) computing field a great deal.
/r$
--
Rich Salz Chief Security Architect
DataPower Technology  http://www.datapower.com
XS40 XML Security Gateway http://www.datapower.com/products/xs40.html



Re: Maybe It's Snake Oil All the Way Down

2003-06-02 Thread Eric Rescorla
"Scott Guthery" <[EMAIL PROTECTED]> writes:
> When I drill down on the many pontifications made by computer
> security and cryptography experts all I find is given wisdom. Maybe 
> the reason that folks roll their own is because as far as they can see 
> that's what everyone does.  Roll your own then whip out your dick and 
> start swinging around just like the experts.
>  
> Perhaps I'm not looking in the right places. I wade through papers from 
> the various academic cryptography groups, I hit the bibliographies 
> regularly, I watch the newsgroups, and I follow the patent literature.  After
> you blow the smoke away, there's always an "assume a can opener" 
> assumption. The only thing that really differentiates the experts from the 
> naifs is the amount of smoke.

Hmm I'd characterize the situation a little differently.

There are a number of standard building blocks (3DES, AES, RSA, HMAC,
SSL, S/MIME, etc.). While none of these building blocks are known
to be secure, we know that:

(1) They have withstood a lot of concerted attempts to attack them.
(2) Prior attempts at building such systems revealed a lot of problems
which these building blocks are designed to avoid.
(3) People who attempt to design new systems generally make some
of the mistakes from (2) and so generally design a system inferior
to the standard ones.

We're slowly proving the correctness of these building blocks and
replacing the weaker ones with others that rely upon tighter
proofs (e.g. OAEP for PKCS-1) but it's a long process. However, I don't
think it's helpful to design a new system that doesn't have any 
obvious advantages over one of the standard systems.
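
As a small illustration of leaning on an existing primitive rather than
improvising one (a sketch of the general point, not of any particular
protocol in this thread):

    # Use an existing, heavily reviewed primitive (HMAC-SHA256) rather than
    # an ad-hoc 'hash the key and the message together' construction.
    import hashlib
    import hmac
    import secrets

    key = secrets.token_bytes(32)              # per-session secret key
    message = b"example application record"

    tag = hmac.new(key, message, hashlib.sha256).digest()

    # Verify with a constant-time comparison, another detail home-grown code
    # usually gets wrong.
    ok = hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).digest())
    print("tag verified:", ok)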

-Ekr


-- 
[Eric Rescorla   [EMAIL PROTECTED]
http://www.rtfm.com/



RE: Maybe It's Snake Oil All the Way Down

2003-06-02 Thread Scott Guthery
Suppose.  Just suppose.  That you figured out a factoring
algorithm that was polynomial.  What would you do?  Would
you post it immediately to cypherpunks?  Well, OK, maybe
you would but not everyone would.  In fact some might
even imagine they could turn a sou or two.  And you can
bet the buyer wouldn't be doing any posting. With apologies
to Bon Ami, "Hasn't cracked yet" is not a compelling security 
story.
 
Cheers, Scott

-Original Message- 
From: Rich Salz [mailto:[EMAIL PROTECTED] 
Sent: Sun 6/1/2003 6:16 PM 
To: Eric Rescorla 
Cc: Scott Guthery; cypherpunks; [EMAIL PROTECTED] 
    Subject: Re: Maybe It's Snake Oil All the Way Down



> There are a number of standard building blocks (3DES, AES, RSA, HMAC,
> SSL, S/MIME, etc.). While none of these building blocks are known
> to be secure ..

So for the well-meaning naif, a literature search should result in "no
news is good news."  Put more plainly, if you looked up hash and didn't
find news of a SHA break, then you should know to use SHA.  That assumes
you've heard of SHA in the first place.

Perhaps a few "best practices" papers are in order.  They might help
the secure (distributed) computing field a great deal.
/r$
--
Rich Salz Chief Security Architect
DataPower Technology  http://www.datapower.com
XS40 XML Security Gateway http://www.datapower.com/products/xs40.html



Re: Maybe It's Snake Oil All the Way Down

2003-06-02 Thread Adam Shostack
The assumption that "having cracked a cipher" leads to "can make lots
of money from the break" is one held mostly by those who have never
attacked real systems, which have evolved with lots of checks and
balances.

The very best way to make money from cracking ciphers seems to be to
patent the break, and the fixes, and then consult to those who use the
cipher, because they need your expertise to fix their systems.  P. may
have a patent on this method.

Adam


On Sun, Jun 01, 2003 at 07:05:44PM -0400, Scott Guthery wrote:
| Suppose.  Just suppose.  That you figured out a factoring
| algorithm that was polynomial.  What would you do?  Would
| you post it immediately to cypherpunks?  Well, OK, maybe
| you would but not everyone would.  In fact some might
| even imagine they could turn a sou or two.  And you can
| bet the buyer wouldn't be doing any posting. With apologies
| to Bon Ami, "Hasn't cracked yet" is not a compelling security 
| story.
|  
| Cheers, Scott
| 
|   -Original Message- 
|   From: Rich Salz [mailto:[EMAIL PROTECTED] 
|   Sent: Sun 6/1/2003 6:16 PM 
|   To: Eric Rescorla 
|   Cc: Scott Guthery; cypherpunks; [EMAIL PROTECTED] 
|       Subject: Re: Maybe It's Snake Oil All the Way Down
|   
|   
| 
|   > There are a number of standard building blocks (3DES, AES, RSA, HMAC,
|   > SSL, S/MIME, etc.). While none of these building blocks are known
|   > to be secure ..
|   
|   So for the well-meaning naif, a literature search should result in "no
|   news is good news."  Put more plainly, if you looked up hash and didn't
|   find news of a SHA break, then you should know to use SHA.  That assumes
|   you've heard of SHA in the first place.
|   
|   Perhaps a few "best practices" papers are in order.  They might help
|   the secure (distributed) computing field a great deal.
|   /r$
|   --
|   Rich Salz Chief Security Architect
|   DataPower Technology  http://www.datapower.com
|   XS40 XML Security Gateway http://www.datapower.com/products/xs40.html

-- 
"It is seldom that liberty of any kind is lost all at once."
   -Hume



Re: Maybe It's Snake Oil All the Way Down

2003-06-01 Thread Major Variola (ret)
At 08:32 PM 5/31/03 -0400, Scott Guthery wrote:
>Hello, Rich ...
>
>When I drill down on the many pontifications made by computer
>security and cryptography experts all I find is given wisdom.  Maybe
>the reason that folks roll their own is because as far as they can see
>that's what everyone does.  Roll your own then whip out your dick and
>start swinging around just like the experts.

Are you trying to confirm that either the WASTE folks are homosexual, or
puerile, as one might guess from the names of some of their projects?  (Not
that either impugns their code.)

On the other hand, both AES and 3DES are US gov't approved.  Which is
sufficient reason to use Blowfish.

Some of the other critiques of WASTE's methods are substantial, however;
in particular, the SSL recommendations are useful tidbits to remember.