Re: [cryptography] Allergy for client certificates

2013-10-10 Thread Guido Witmond
On 10/09/13 15:50, Michael Rogers wrote:
 On 09/10/13 10:56, Guido Witmond wrote:
 You might want to take a look at my experiments. It's a user agent
 that does all the key management for you.
 
 It even does it without ever asking anything more difficult than
 what username you want to have at a site.
 
 Hi Guido,
 
 It looks like you've worked around the UX issues by inserting an
 EC-aware proxy between the client and server. Who would be responsible
 for deploying such proxies?

That proxy lives on the end user's computer. Right now, the user needs
to install the proxy. I hope to get time and funding to make it a
Firefox plug-in, and I hope that when it proves useful, browsers will
adopt it.


 What happens if a user creates an EC account from a client machine
 with an EC-aware proxy and then wants to use the account from a client
 machine without a proxy?

You need a user agent that is aware of EC. The alternative is to do all
the RSA calculations with your grey cells. :-(


 This touches on another question I've been meaning to ask you: what
 happens if a user creates an account from a client machine, thus
 installing a client cert on that machine, and then wants to use the
 account from another machine?

Just share the certificate with Firefox Sync, like you share saved
passwords and bookmarks.


 Also, what happens if a user installs a client cert on a machine and
 then walks away, leaving their client cert exposed to the next user?
 With passwords there's an expectation that once you've logged out, the
 next user can't log into your account. But client certs break that
 expectation.

First, don't do that. Using public-access computers (at a library) is a
bad idea anyway. Even if you can trust the library administrators, you
don't know who was at that computer before you.

Second, I expect that every user has their own device. And that device
has enough protection against abuse by others. For example, a screen
lock that monitors whether it is you who's using the phone and requests
a fingerprint scan and a PIN code or swipe pattern to unlock. (Cypherpunk
Snow Crash style).

Third, you have many certificates. Some have more value than others. You
(the end user) have the option to encrypt certain private keys with a
passphrase and a short unlock-interval, while not bothering with that
for other keys.
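The "short unlock-interval" idea could be sketched client-side as a small
cache of decrypted keys that expire after a timeout. This is purely an
illustrative sketch of the policy, with my own assumed class name; the
actual passphrase-based decryption is stubbed out:

```python
import time

class UnlockedKeyCache:
    """Sketch of the short unlock-interval policy: a decrypted private
    key stays usable for `ttl` seconds after the passphrase is entered,
    then must be unlocked again.  Decryption itself is stubbed out."""

    def __init__(self, ttl: float):
        self.ttl = ttl
        self._unlocked = {}  # key id -> (plaintext key, expiry time)

    def unlock(self, key_id: str, plaintext_key: bytes) -> None:
        # Called after the user typed the passphrase and the key was
        # decrypted; remember it only until the interval elapses.
        self._unlocked[key_id] = (plaintext_key, time.monotonic() + self.ttl)

    def get(self, key_id: str):
        entry = self._unlocked.get(key_id)
        if entry is None:
            return None
        key, expiry = entry
        if time.monotonic() > expiry:
            del self._unlocked[key_id]  # interval elapsed: require re-unlock
            return None
        return key
```

High-value keys would get a short `ttl` (or none at all), low-value keys
a long one, which matches the per-key policy described above.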

Each certificate is a separate identity. You don't share certificates
over multiple web sites.

If you leave your phone in a hurry, someone can use those keys at their
respective web sites to impersonate you, until the screen saver kicks in.

If your phone doesn't allow unprotected access to the filesystem where
the keys are stored, they can't copy any of the keys.


These are important usability questions that will need to be addressed
sooner or later. However, they are all client-side decisions. The server
is not involved at all. So when the need arises, they can be implemented
quickly, and different people can have different solutions.

However, I want to keep it simple for now. I want to show how easy it
can be to use certificates, to break that horrible browser UX.


Regards, Guido.



___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] Allergy for client certificates

2013-10-10 Thread Guido Witmond
On 10/09/13 16:47, stef wrote:
 On Wed, Oct 09, 2013 at 02:50:59PM +0100, Michael Rogers wrote:
 This touches on another question I've been meaning to ask you: what
 happens if a user creates an account from a client machine, thus
 installing a client cert on that machine, and then wants to use the
 account from another machine?
 
 i guess the user has to use the crappy ui of the browser to extract it. while
 the browser vendors are polishing rounded transparent tabs instead.

Talking about leaving your users in the dark ...


 Also, what happens if a user installs a client cert on a machine and
 then walks away, leaving their client cert exposed to the next user?
 With passwords there's an expectation that once you've logged out, the
 next user can't log into your account. But client certs break that
 expectation.
 
 indeed, client auth is bound to the browser in this sense and needs to be
 understood by the users, this is a cognitive entry barrier to usage.

There is nothing that prevents a proper browser from sharing client
certificates, with their private keys, via Firefox Sync among a user's
devices.

In fact, that would be good as a backup strategy too: losing a private
key means losing the account.

Regards, Guido.





[cryptography] Cryptographers condemn US National Security Agency’s tapping and tampering, but mathematicians shrug.

2013-10-10 Thread Eugen Leitl

http://www.nature.com/news/researchers-split-over-nsa-hacking-1.13911

Researchers split over NSA hacking

Cryptographers condemn US National Security Agency’s tapping and tampering,
but mathematicians shrug.

Ann Finkbeiner 08 October 2013

The National Security Agency is the largest employer of mathematicians in the
United States.


The US National Security Agency (NSA) has upset a great many people this
year. Since June, newspapers have been using documents leaked by former
intelligence worker Edward Snowden to show how the secretive but powerful
agency has spied on the communications of US citizens and foreign
governments. Last month, the media reported that the NSA, which is based in
Fort Meade, Maryland, had undermined Internet security standards. The
revelations have sparked international outrage at the highest levels — even
the president of Brazil cancelled a visit to the United States because of the
spying.

Yet amid the uproar, NSA-supported mathematicians and computer scientists
have remained mostly quiet, to the growing frustration of others in similar
fields. “Most have never met a funding source they do not like,” says Phillip
Rogaway, a computer scientist at the University of California, Davis, who has
sworn not to accept NSA funding and is critical of other researchers’
silence. “And most of us have little sense of social responsibility.”

Mathematicians and the NSA are certainly interdependent. The agency declares
that it is the United States’ largest maths employer, and Samuel Rankin,
director of the Washington DC office of the American Mathematical Society,
estimates that the agency hires 30–40 mathematicians every year. The NSA
routinely holds job fairs on university campuses, and academic researchers
can work at the agency on sabbaticals. In 2013, the agency’s mathematical
sciences programme offered more than US$3.3 million in research grants.

Furthermore, the NSA has designated more than 150 colleges and universities
as centres of excellence, which qualifies students and faculty members for
extra support. It can also fund research indirectly through other agencies,
and so the total amount of support may be much higher. A leaked budget
document says that the NSA spends more than $400 million a year on research
and technology — although only a fraction of this money might go to research
outside the agency itself.

Many US researchers, especially those towards the basic-research end of
the spectrum, are comfortable with the NSA’s need for their expertise.
Christopher Monroe,
a physicist at the University of Maryland in College Park, is among them. He
previously had an NSA grant for basic research on controlling cold atoms,
which can form the basis of the qubits of information in quantum computers.
He notes that he is free to publish in the open literature, and he has no
problems with the NSA research facilities in physical sciences,
telecommunications and languages that sit on his campus. Monroe is
sympathetic to the NSA’s need to track the develop­ment of quantum computers
that could one day be used to crack codes beyond the ability of conventional
machines. “I understand what’s in the newspapers,” he says, “but the NSA is
funding serious long-term fundamental research and I’m happy they’re doing
it.”

Dena Tsamitis, director of education, outreach and training at Carnegie
Mellon University’s cybersecurity research centre in Pittsburgh,
Pennsylvania, also wants to maintain the relationship. She oversees visitors
and recruiters from the NSA but her centre gets no direct funding. She says
that her graduate students understand the NSA’s public surveillance to be “a
policy decision, not a technology decision. Our students are most interested
in the technology.” And the NSA, she says — echoing many other researchers —
“has very interesting technology problems”.

The academics who are professionally uneasy with the NSA tend to lie on the
applied end of the spectrum: they work on computer security and cryptography
rather than pure mathematics and basic physics. Matthew Green, a
cryptographer at Johns Hopkins University in Baltimore, Maryland, says that
these researchers are unsettled in part because they are dependent on
protocols developed by the US National Institute of Standards and Technology
(NIST) to govern most encrypted web traffic. When it was revealed that the
NSA had inserted a ‘back door’ into the NIST standards to allow snooping,
some of them felt betrayed. “We certainly had no idea that they were
tampering with products or standards,” says Green. He is one of 47
technologists who on 4 October sent a letter to the director of a group
created last month by US President Barack Obama to review NSA practices,
protesting because the group does not include any independent technologists.

Edward Felten, who studies 

[cryptography] was this FIPS 186-1 (first DSA) an attemped NSA backdoor?

2013-10-10 Thread Adam Back

Some may remember that Bleichenbacher found a random number generator
bias in the original DSA spec that could leak the key after some number
of signatures, depending on the circumstances.

It's described in this summary of DSA issues by Vaudenay, "Evaluation
Report on DSA":

http://www.ipa.go.jp/security/enc/CRYPTREC/fy15/doc/1002_reportDSA.pdf

Bleichenbacher's attack is described in section 5.


The conclusion is: "Bleichenbacher estimates that the attack would be
practical for a non-negligible fraction of qs with a time complexity of
2^63, a space complexity of 2^40, and a collection of 2^22 signatures.
We believe the attack can still be made more efficient."
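The shape of the bias Bleichenbacher exploited is easy to see in
miniature: reducing a fixed-width random value modulo q makes small
residues more likely than large ones. A toy version with 3-bit values
and q = 5 (real DSA reduced a 160-bit value modulo a 160-bit q, where
the skew is far smaller but, per the report, still enough to leak the
key from enough signatures):

```python
from collections import Counter

# Reduce every 3-bit value (0..7) modulo q = 5, the way the original
# FIPS 186 reduced a fixed-width PRNG output modulo the subgroup order.
q = 5
counts = Counter(x % q for x in range(2 ** 3))

# Residues 0, 1, 2 each have two preimages (e.g. 0 and 5 both map to 0),
# while residues 3 and 4 have only one, so k is visibly non-uniform.
```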

NIST reacted by issuing special publication SP 800-xx to address it, and
I presume that was folded into FIPS 186-3. Of course NIST's site is down
due to the USG political-level stupidity (why they take the extra work
to switch off the web server on the way out, I don't know).

That means 186-1 and 186-2 were vulnerable.

An even older NSA sabotage spotted by Bleichenbacher?

Anyway, it highlights the significant design fragility in DSA/ECDSA: not
just in the entropy of the secret key, but in the generation of each and
every k value. That leads to the better (but non-NIST-recommended) idea,
adopted by various libraries and applied-crypto people, of using
k = H(m, d), so that the signature is in fact deterministic, and the
same k value will only ever be used with the same message (which is
harmless, as that's just reissuing the bitwise-same signature).


What happens if a VM is rolled back, including the RNG, and it outputs
the same k value for a different network-dependent m value? Etc. It's
just unnecessarily fragile in its NIST/NSA-mandated form.
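A minimal sketch of the k = H(m, d) idea (illustrative only; the hash
choice, the byte encoding, and the final reduction are my assumptions,
and production code should follow a vetted construction such as RFC
6979 rather than this):

```python
import hashlib

def deterministic_k(q: int, d: int, message: bytes) -> int:
    """Derive the DSA nonce k from the message m and the private key d,
    so the same (key, message) pair always produces the same k and a VM
    rollback cannot reuse k with a different message."""
    d_bytes = d.to_bytes((q.bit_length() + 7) // 8, "big")
    digest = hashlib.sha256(
        hashlib.sha256(message).digest() + d_bytes
    ).digest()
    # Reduce into [1, q-1]; a real construction avoids this modulo bias.
    return int.from_bytes(digest, "big") % (q - 1) + 1
```

Two signatures over the same message then reuse the same k, which is
harmless, while any change to m or d yields an unrelated nonce.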

Adam


Re: [cryptography] was this FIPS 186-1 (first DSA) an attemped NSA backdoor?

2013-10-10 Thread James A. Donald

On 2013-10-10 23:30, Adam Back wrote:
 Of course NIST is down due to the USG political level stupidity (why
 take the extra work to switch off the web server on the way out I
 don't know).


Note that the Obamacare websites are still open, and that parks that are
normally operated by private contractors (who normally pay rent to the
government for their concession stands) now have government employees
present to prevent people from operating them.


So chances are that NIST is still busily plotting against security, but
has turned off outside access to its websites.


It would seem that the 85% of government that is still operating is the 
part that no voters will notice, and the 15% that is shut down is the 
part that voters are likely to notice, and, the government hopes, put 
pressure on the Republican party.


Logically therefore, we should shut down the 85%, and keep the 15% open.





Re: [cryptography] Allergy for client certificates

2013-10-10 Thread ianG

On 9/10/13 01:41 AM, Tony Arcieri wrote:


We use client certs extensively for S2S authentication where I work
(Square).

As for web browsers, client certs have a ton of problems:



I have successfully used them in a PHP website of my own design. I just
plugged away until they worked. I grant they have a ton of problems, but
it may be a case of half-empty or half-full. Here are my point-by-point
experiences, partly because the whole exercise for me was in order to
find out...




1) UX is *TERRIBLE*. Even if you tell your browser to use a client
cert for a given service, and you go back to that service again,
browsers often don't remember and prompt you EVERY TIME to pick which
cert to use from a giant list. If you have already authenticated against
a service with a given client cert, and that service's public key hasn't
changed, there's absolutely no reason to prompt the user every single
time to pick the cert from all of the client certs they have installed.



Yes, that part doesn't work.  So what my site did was to take every cert 
provided and hook it up to the account in question.  This was a major 
headache to code up because I had to interpolate the contents of the 
certs and do things like match email addresses.


It has a number of interesting edge cases such as correct name but 
different email address.  Also, the name isn't unique, or is it?


I solved these edge cases by leaning on CAcert's systems of governance,
and simply asking the user: "is this you?" If they lie, I can fall back
on Arb to solve everything. (OK, I cheated a bit there to get it to
work; there are other solutions and other possibilities, but I wanted a
seamless non-support solution.)




2) HTML keygen tag workflow is crap and confusing. It involves
instructing users to install the generated cert in their browser, which
is weird and unfamiliar to begin with. Then what? There's no way to
automatically direct users elsewhere; you have to leave a big list of
instructions saying "Please install the cert, then after the cert is
installed (how will the user know?) click this link to continue".


This is a problem that is outsourced from the website/user to the 
CA/user interface.  It can be done.  I don't know how the coding is 
done, but CAs do handle this well enough.  I think again it is just a 
matter of plugging away until you get the code going.


(What is not easy is using the certs for email.  That's a fail, unless
you are using some form of automatic cert distribution.)



3) Key management UX is crap: where are my keys? That varies from
browser to browser. Some implement their own certificate stores. Others
use the system certificate store. How do I get to my keys? For client
certs to replace passwords, browsers need common UI elements that make
managing, exporting, and importing keys an easy process.



It is true that key management is crap, but how much do you care?  As 
long as the keys work, everything is cool ... for *users*.  They never 
need to look at keys.  Only developers need that ... ;-)


(Yes, this is skipping the whole privacy question, but having worked 
through it, you aren't going to do any worse with certs.)




Passwords may be terrible, but they're familiar and people can actually
use them to successfully log in. This is not the case for client certs.
They're presently way too confusing for the average user to understand.



I think, if you can crack the "get a cert in the browser" problem, that
whole equation flips around.  In my experience, client certs worked way
easier than passwords.  They just worked.


The big benefit we had in our community was that our target audience
already had to have put their cert into their browser; it was part of
the Assurer test.


What was not easy is the websites.  Taking a random site X, like a wiki,
and engaging it for immediate auth with the cert is hard, mostly because
these systems out there have never really considered certs, and often
enough they haven't even considered SSL.
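The TLS-level server side of engaging a site for client-cert auth can
be sketched with Python's stdlib ssl module (an illustration, not what
any site discussed here actually runs; the file names are placeholder
assumptions):

```python
import ssl

def make_client_auth_context(server_cert: str, client_ca: str) -> ssl.SSLContext:
    """Build a server-side TLS context that demands a client certificate.
    server_cert holds the site's own certificate and private key;
    client_ca is the CA that signed the acceptable client certs."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.verify_mode = ssl.CERT_REQUIRED       # handshake fails without a client cert
    ctx.load_cert_chain(server_cert)          # e.g. "server.pem" (placeholder)
    ctx.load_verify_locations(client_ca)      # e.g. "clients-ca.pem" (placeholder)
    return ctx
```

The application behind this still has to map the presented cert to an
account, which is exactly the part described above as the headache.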





iang



ps:   More here:
http://wiki.cacert.org/Technology/KnowledgeBase/ClientCerts/theOldNewThing