Re: [cryptography] SSL session resumption defective (Re: What project would you finance? [WAS: Potential funding for crypto-related projects])

2013-07-02 Thread Ryan Sleevi
On Tue, July 2, 2013 2:02 pm, Paul Hoffman wrote:
  On Jul 2, 2013, at 1:52 PM, Ben Laurie b...@links.org wrote:

  Alternatively, we stay in this world, clients expire sessions hourly,
  and we're all happy.

  Is this what most recent browsers do? They expire their TLS sessions after
  an hour? That would be nice.

  --Paul Hoffman

Firefox and Chrome use a 24-hour period, as recommended - see
http://mxr.mozilla.org/nss/source/lib/ssl/sslnonce.c#21

CryptoAPI/SChannel defaults to 10 hours, but I don't know if IE is
tweaking that at all - see dwSessionLifespan in
http://msdn.microsoft.com/en-us/library/windows/desktop/aa379810(v=vs.85).aspx

OS X/SecureTransport uses ten minutes as the default - see
SESSION_CACHE_TTL in
http://www.opensource.apple.com/source/Security/Security-55179.11/libsecurity_ssl/security_ssl/appleSession.c
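
As a minimal sketch of imposing such a cap on the server side - assuming a
Node.js deployment, where tls.createServer() accepts a sessionTimeout in
seconds, and with placeholder key/cert paths:

import * as tls from "node:tls";
import * as fs from "node:fs";

const server = tls.createServer({
  key: fs.readFileSync("server-key.pem"),    // placeholder paths
  cert: fs.readFileSync("server-cert.pem"),
  // Server-side sessions stop being resumable after 24 hours,
  // mirroring the NSS/Chrome default cited above.
  sessionTimeout: 24 * 60 * 60,
});

server.listen(8443);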



Re: [cryptography] Web Cryptography API (W3C Working Draft 8 January 2013)

2013-03-09 Thread Ryan Sleevi
On Sat, March 9, 2013 5:25 pm, Tony Arcieri wrote:
  On Sat, Mar 9, 2013 at 4:16 PM, Jeffrey Walton noloa...@gmail.com wrote:

  The Web Cryptography Working Group looks well organized, provides a
  very good roadmap, and offers good documentation.
  http://www.w3.org/2012/webcrypto/.


  I have a blog post about it forthcoming, but I'd like to share the tl;dr
  version here:

  The normative parts of the specification seem mostly fine.

  The specification provides no normative advice about what algorithms to
  use, and worse, provides a non-normative listing of algorithms which are
  not authenticated encryption modes (for symmetric ciphers, the only
  authenticated mode listed in the spec is AES-GCM).

That is correct. This is not a "How to use cryptography" spec. This is an
API.

This is not an evangelical API. I realize the crypto clergymen may not
like this, but APIs that proselytize do not somehow educate more.  They
merely get in the way of people who know what they're doing, and the
people who don't know what they're doing will find plenty of other ways to
screw up (e.g. XSS, XSRF, insecure cookies, clickjacking, framing, etc.).
Plus, it only takes one Stack Overflow question & answer, or one bad
W3CSchools post (redundant much?), to undermine whatever message was
intended for those crypto black sheep.
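
To make the "it's an API, not a tutorial" point concrete, here is a sketch
of AES-GCM through the SubtleCrypto interface. Note the promise-based
signatures below reflect the shape the API later standardized on; the
January 2013 draft under discussion still used CryptoOperation objects.

async function seal(plaintext: Uint8Array) {
  // Generate a non-extractable 256-bit AES-GCM key.
  const key = await crypto.subtle.generateKey(
    { name: "AES-GCM", length: 256 },
    false,
    ["encrypt", "decrypt"],
  );
  // 96-bit random nonce, as conventionally used with GCM.
  const iv = crypto.getRandomValues(new Uint8Array(12));
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    plaintext,
  );
  return { key, iv, ciphertext };
}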



  At the very least, I'd like to see the non-normative examples section
  expanded to include a lot more authenticated encryption modes (EAX mode
  comes to mind, and seeing support for NaCl algorithms like crypto_box and
  crypto_secretbox would be super). Right now they give some rather poor
  recommendations; for example, they recommend CBC mode, which is fraught
  with problems.

What use case makes the NaCl algorithms (whose specification is merely
"use NaCl", which boils down to "use Salsa+Curve25519") worthwhile? If you
don't trust developers to be able to use the API correctly, what makes you
think that they can sufficiently understand the security guarantees that
NaCl does - and *doesn't* - provide? And how can we be sure that the
problems that NaCl sets out to solve are the same problems developers want
or need to solve, especially when all the evidence suggests otherwise?

Arguably, whatever use case you imagine NaCl meeting can almost certainly
be met through JOSE, if and when it ever gets its act together. If you
want something high level, use something designed to be interoperable
(and hopefully, JOSE will actually use JSON by then). As much respect as I
have for DJB, Sodium's existence is proof positive of what NaCl does and
doesn't set out to do.

Finally, the recommendations are for what implementations should support.
There is not any mandatory-to-implement suite at this point. Instead, the
list reflects the algorithms in vast, sweeping use today in a number of
protocols and applications - the algorithms developers will expect or need
supported to implement a variety of applications *that already exist
today*.


  Finally, it'd be great to see someone like NIST or ECRYPT provide browser
  vendors with normative advice on algorithms to standardize on. The
  existing WebCrypto spec leaves browser vendors to their own devices, and
  in that eventuality, the browser vendors will probably wind up
  implementing the W3C spec's (poorly chosen) non-normative recommendations.

NIST or ECRYPT? Why not KISA or GOST? After all, everyone loves SEED and
GOST 28147-89...

The answer is that the choice of algorithms was motivated by two factors:
1) As stated in the charter, exposing (some of) the cryptographic services
already inherent in browser applications today. [In order to provide
constant-time, correct, validated implementations of the algorithms -
things impossible in JS today]

2) The choice of algorithms that are meaningful to web application
developers - which includes the W3C SysApps WG, which has an *entirely*
different security model than the drive-by web. That includes support for
legacy algorithms in order to support those "esoteric" protocols like SSH,
PGP, or S/MIME (or would you rather your browser bake them in? *shudder*),
as well as the choice of algorithms that are suitable for future work
(and, notably, being explored in JOSE).


  For an in-depth look at the problems, I'd recommend checking out Matt
  Green's blog post:

  http://blog.cryptographyengineering.com/2012/12/the-anatomy-of-bad-idea.html

Matt's post, besides being entertaining and certainly having some
meritorious points, basically sums up as "No backwards compatibility, and
only give people what the priesthood accepts." Respectfully, that doesn't
lead to more secure code, nor does it lead to what smart people - people
who know what they're doing - *actually* want or need (as determined
through repeated surveys of participants and non-participants of the WG).

While I can understand that, on a list such as this, people are well
trained to turn their noses up at bad cryptography, this is not
going to usher in some Apocalypse that 

Re: [cryptography] OAEP for RSA signatures?

2013-01-26 Thread Ryan Sleevi
On Sat, January 26, 2013 5:53 pm, Peter Gutmann wrote:
  ianG i...@iang.org writes:

 Could OAEP be considered reasonable for signatures?

  You need to define "appropriate".  For example, if you mean
  "interoperable" then OAEP isn't even appropriate for encryption, let
  alone signatures.  If you're worried about timing channels then OAEP is
  also pretty inappropriate for any use.  PKCS #1 OTOH will interop with
  pretty much anything, and you can do the padding check in close enough
  to constant time that it doesn't matter.

  Peter.

... Did you just suggest that the timing channels in PKCS#1 v1.5 are
easier to get right than the timing channels of OAEP? The same PKCS#1 v1.5
encryption that's confounding people a decade [1] after the original
attacks [2]?

Encryption vs. signatures aside, what am I missing here? Implementing OAEP
validation in constant time is trivial compared to the pain of not leaking
whether the padding was correct for PKCS#1.
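
To illustrate the "trivial in constant time" claim, here is a sketch of
the kind of comparison an OAEP decoder needs (a hypothetical helper, not
from any particular library; a complete decoder must also fold the 0x01
separator scan and leading-zero checks into the same accumulator rather
than branching early):

function constantTimeEqual(a: Uint8Array, b: Uint8Array): boolean {
  if (a.length !== b.length) return false;
  let diff = 0;
  // Accumulate differences without data-dependent branches or early exits.
  for (let i = 0; i < a.length; i++) {
    diff |= a[i] ^ b[i];
  }
  return diff === 0;
}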

[1]
http://www.nds.rub.de/media/nds/veroeffentlichungen/2012/07/11/XMLencBleichenbacher.pdf
[2] http://archiv.infsec.ethz.ch/education/fs08/secsem/Bleichenbacher98.pdf



Re: [cryptography] How much does it cost to start a root CA ?

2013-01-05 Thread Ryan Sleevi
On Sat, January 5, 2013 10:10 pm, John Case wrote:

  Jon,

  Many thanks for this very informative post - really appreciated.

  Some comments, below...


  On Sat, 5 Jan 2013, Jon Callas wrote:

  Now, that $250K that I spent got an offline root CA and an intermediate
  online CA. The intermediate was not capable of supporting workloads that
  would make you a major business. You need a data center after that, one
  that supports the workloads your business requires. But of course, you
  can grow that with your customer workload, and you can buy the
  datacenter space you need.


  You're the second person in this thread to mention hardware and datacenter
  costs ... and while I don't want to drift too far into a blood-and-guts
  sysadmin rundown, I am curious...  Are you talking about the
  customer-facing, retail side of things, with the webservers and the load
  balancers and all of the things that make a robust web presence, or are
  you talking strictly about the X.509 components?

  Because it seems to me (naively?) that even a very high volume X.509
  signing operation is ... maybe a pair of good 1U servers and a rack at a
  decent (SAS 70/PCI/blah/blah) datacenter ... ?  OK, a firewall and maybe
  some IDS system ... but we're still only a handful of 1U boxes and a
  quarter of a rack...

  Perhaps it's this kind of thinking that leads to failed audits :)

It will, it does, and the information is readily available from the
previous post:

https://www.cabforum.org/Baseline_Requirements_V1_1.pdf - see Sections 14
through 16.

Additionally, https://www.cabforum.org/Network_Security_Controls_V1.pdf
describes a series of controls jointly developed by the browsers and CAs.
While I'm not aware of any browser program requiring them *yet*, I think
any person concerned about trust online would say "Yes, these are all
sensible requirements" - stuff that should be obvious for any entity
granted the ability to affect global Internet trust.

You can further find the details of the *existing* requirements for
Physical Security by looking through the recognized Audit programs, such
as WebTrust. See http://www.webtrust.org/homepage-documents/item54279.pdf
- in particular, Sections 3.4 and 3.5

Is it a perfect system? No. But even if the CA/Browser Forum is not fully
open (yet?), improvements can certainly be made to and through Mozilla,
given the openness and transparency that they maintain with their root
certificate policies. As always,
https://lists.mozilla.org/listinfo/dev-security-policy is where you can
discuss things such as Mozilla's proposed policy changes:
http://www.mozilla.org/projects/security/certs/policy/WorkInProgress/InclusionPolicy.html



  There are rumors, which you've read here about how there are lots of
  underhanded obstacles in the way of becoming a CA. My experience is that
  the only underhanded part of the industry is that no one in it dispels
  the rumors that there are underhanded obstacles in your path. This is
  pretty much the first time I have, so I suppose I'm as guilty as anyone
  else.


  That's nice to know, and I'm heartened that all the way into 2012 this is
  still the case, but ... boy oh boy does this look and smell like a
  marketplace ripe for monopolization and a cartel ... it's almost a classic
  case.

  I think the presence of a major browser that is a community, independent
  effort is an interesting wrinkle, and the fickleness of the browsing
  public (how fast did Chrome shoot up the charts? Safari?) adds a
  wrinkle too, but ... there's no way the large, entrenched players aren't
  sitting around thinking "gee, we have a nice thing going here" ...  Not a
  conspiracy theory, just common sense...

You're disregarding the dynamics at play here. The CAs don't set the
requirements - the browsers do.

Yes, the browsers take input from the CAs, but they also (and in
particular, Mozilla) take input from their constituents. Whether you're a
closed-source vendor listening to your customers or an open-source
organization with a public process, there's still a great desire from the
browser vendors to engage the community. Nor is it in the browser vendors'
interests to ignore their users or their users' security. I don't think
any browser wants to be known as the *less* secure browser - we're all
jockeying to be *more* secure, especially where it matters most.

Any defensiveness is no doubt due to the fact that trust in the system
is shared between all participants - lose faith in one CA, and you lose
faith in all CAs. In that sense, existing CAs - particularly entrenched
ones - have incentives to improve the state of trust and security in
the overall system - the same thing users and browsers want most as well.
If the cost of improving the controls and security of the system is
excluding CAs that are not prepared for the solemn public trust that
comes from being in the root stores, then that seems like a win for all
concerned parties.

I'm not trying to write an 

Re: [cryptography] How much does it cost to start a root CA ?

2013-01-04 Thread Ryan Sleevi
On Fri, January 4, 2013 12:59 pm, Greg Rose wrote:
  You could ask the folks at CAcert... I imagine Ian Grigg will also chime
  in. Certification costs a lot, and as you have observed, the incumbents
  try very hard to keep you out. Despite some reasonable sources of funding,
  CAcert still didn't succeed.

  Greg.

Can you explain how, exactly, incumbents leverage any power to keep new
entrants out?

The policies are set by the browsers/root store operators - not CAs.

Microsoft -
http://social.technet.microsoft.com/wiki/contents/articles/3281.introduction-to-the-microsoft-root-certificate-program.aspx
Apple - http://www.apple.com/certificateauthority/ca_program.html
Mozilla - http://www.mozilla.org/projects/security/certs/policy/
Opera - http://www.opera.com/docs/ca/

Consistent among them is that they require a WebTrust or ETSI audit -
audits which were designed to reflect the collective shared policies of
the browsers, not collective action by CAs.

More recently, the browsers have begun to increase the minimum
requirements they expect of their root store participants, in light of
several prominent failures. These are memorialized in the CA/Browser
Forum's Baseline Requirements (
https://www.cabforum.org/Baseline_Requirements_V1_1.pdf ), which were
driven by browsers seeking to find a consistent, common agreement about
the requirements of their members.

CAcert's failures have nothing to do with the actions of any incumbent CA;
they stem from an inability, so far, to meet the requirements set forth by
the browser programs it was seeking to be included in. Even Ian has
attested that Mozilla's policy is both clear and fair in this regard.


Additionally, there are not, as the original poster suggested, only 30
root CAs. This can be trivially discovered by examining the lists of CAs
included in these programs - which are all public.

Mozilla - http://www.mozilla.org/projects/security/certs/included/
Microsoft -
http://social.technet.microsoft.com/wiki/contents/articles/14215.windows-and-windows-phone-8-ssl-root-certificate-program-member-cas.aspx
Apple -
http://opensource.apple.com/source/security_certificates/security_certificates-55024.2/
(OS X 10.8.2)
Opera - http://my.opera.com/rootstore/blog/


A lot of speculation on this thread, but the answers are readily and
trivially available.

Cheers,
Ryan


  On 2013 Jan 4, at 11:41 , John Case wrote:

 
  Let's assume hardware is zero ... it's a really variable cost, so I
  assume (correct me if I'm wrong) that it is a trivial cost compared to
  legal and audit costs, etc.
 
  So what does it cost to start a root CA, get properly audited (as I see
  the root CAs are) and get yourself included into, say, firefox or chrome
  ?
 
  A followup question would be:
 
  Is inclusion of a root CA in the major browsers a "shall issue" process?
  That is, you meet the criteria and you get in?  Or is it a subjective,
  political process?
 
  Finally, it seems to me that since there are so few root CAs (~30?) and
  the service provided is such an arbitrary, misunderstood one, that
  existing CAs would be actively trying to prevent new entrants ... and
  establish themselves as toll collectors with a pseudo-monopoly ... what
  evidence (if any) do we have that they are pursuing such an ecosystem?
 
  Thank you.


Re: [cryptography] How much does it cost to start a root CA ?

2013-01-04 Thread Ryan Sleevi
On Fri, January 4, 2013 3:06 pm, James A. Donald wrote:
  On 2013-01-05 8:05 AM, Ryan Sleevi wrote:
  Can you explain how, exactly, incumbents leverage any power to keep new
  entrants out?

  Such behavior is necessarily a deviation from official truth, from the
  way certification is supposed to work; thus the only way to observe such
  behavior would be if emails leaked, as in the Climategate files where we
  saw how peer review actually worked.

  Analogously, regulators, financial audits and ratings agencies were
  supposed to ensure that banks only invested in safe stuff.  When the
  proverbial hit the fan, it became apparent that regulators, financial
  audits and ratings agencies in practice ensured that banks only invested
  in politically correct stuff, but no one can explain how, exactly, this
  happened - well it is pretty obvious how it happened, and one can make a
  pretty good guess how it happened, but there is no direct official
  evidence as to how it happened.

While I appreciate a good bit of paranoia and tin-foil hat wagging as much
as the next person, I think your analogy breaks down pretty critically.

In the case you referenced, it was the role of auditors and regulators to
keep people out / keep people honest, and they failed, and so more people
/ dishonest people got in. However, the speculation about CA collusion
requires the CAs to be working hard to keep new entrants out - the exact
*opposite* behaviour.

Such a conspiracy requires auditors colluding to keep new entrants out. To
be quite frank, I would be surprised if anyone on this list, concerned
about security, would be saddened or upset if they heard horror stories of
WebTrust auditors finding actionable concerns that kept new entrants out -
such as failures to adhere to their policies or unaddressed security
concerns.

At best, it means the market is incentivizing auditors to closely examine
new entrants for best practices. Is that a bad thing and does it really
demonstrate a vast CA conspiracy? Has there ever been a new CA, attempting
to get audited, who has said with a straight face that the audits are
unreasonably thorough? Shouldn't that be the bare minimum for having the
ability to affect trust globally?

So at best, we have FUD and unsubstantiated speculation about auditors
being too strict - at the same time that the browsers are working to
make the requirements more strict.



Re: [cryptography] PKI in practice: is there a list of (widely deployed) client-certs-issuing CAs?

2012-04-27 Thread Ryan Sleevi
  A question for those who follow PKI usage trends.

  Is there a list of CAs that issue X.509 end-user certificates?

  Here is the rationale for the question:

  If an end-user has a certificate, he (more or less consciously) controls
  a private key. Suppose one deploys a web server that cares *only* about
  end-user public keys, e.g. it keeps track of end-user reputation and
  that's it for trust management. Then any type of certificate is good
  enough (self-signed, auto-issued, issued by a regular
  client-cert-issuing CA).

  This web server can have an immediate potential user base if it
  negotiates the TLS session with a long list of CA distinguished names
  (in the CertificateRequest message).

  The management tools for the contemplated web server scheme would
  include a utility to extract issuer DNs from end-user or CA certificates,
  so that the list may be augmented based on casual observations. Also,
  SSL debugging tools will report the contents of CertificateRequest
  messages from public servers supporting client certs.

  Has anyone gone through such data collection before?

  Thanks in advance.

  --
  - Thierry Moreau

  CONNOTECH Experts-conseils inc.
  9130 Place de Montgolfier
  Montreal, QC, Canada H2M 2A1

  Tel. +1-514-385-5691


Why not just send an empty list for certificate_authorities in the
CertificateRequest? Most (all?) user-facing TLS clients will then presume
the site has no restriction, and select from all of the available client
certs that the user may have.

Trying to stuff in all the names - not to mention the cross-signed
intermediates, which are often necessary - is likely to blow out the
record limits. Many TLS implementations, particularly TLS middleboxen, do
not like fragmented handshake messages, so trying to stuff the whole world
into the request will likely break things considerably.

I'm not sure how such a scheme would work for self-signed user certs;
they inevitably would not match your pre-programmed list of DNs.

See, for example, http://support.microsoft.com/kb/933430
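
A minimal sketch of the empty-list approach above, assuming a Node.js
server (whether an empty certificate_authorities list actually goes out on
the wire depends on the underlying TLS stack, so verify against a packet
capture; the key/cert paths are placeholders):

import * as tls from "node:tls";
import * as fs from "node:fs";

const server = tls.createServer({
  key: fs.readFileSync("server-key.pem"),
  cert: fs.readFileSync("server-cert.pem"),
  requestCert: true,          // send a CertificateRequest to the client
  rejectUnauthorized: false,  // accept certs we cannot chain to a known CA
});

server.on("secureConnection", (socket) => {
  // Key reputation off the presented certificate itself, not its issuer DN.
  const peer = socket.getPeerCertificate();
  console.log("client cert fingerprint:", peer?.fingerprint256 ?? "(none)");
});

server.listen(8443);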
