Re: [Cryptography] RSA recommends against use of its own products.

2013-09-29 Thread James A. Donald

On 2013-09-27 09:54, Phillip Hallam-Baker wrote:


Quite, who on earth thought DER encoding was necessary or anything 
other than incredible stupidity?


I have yet to see an example of code in the wild that takes a binary 
data structure, strips it apart and then attempts to reassemble it to 
pass to another program to perform a signature check. Yet every time 
we go through a signature format development exercise the folk who 
demand canonicalization always seem to win.


DER is particularly evil as it requires either the data structures to 
be assembled in the reverse order or a very complex tracking of the 
sizes of the data objects or horribly inefficient code. But XML 
signature just ended up broken.
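
To make the "reverse order or length tracking" point concrete, here is a
minimal sketch in Python (hypothetical helper names, not any real ASN.1
library): every DER element is tag-length-value, and the length is written
before the bytes it counts, so a parent cannot be emitted until all of its
children have already been encoded.

    # Minimal sketch (hypothetical helpers, not a real ASN.1 library) showing why
    # a DER encoder needs every child's length before it can emit the parent:
    # each element is Tag | Length | Value, and Length precedes the bytes it counts.

    def der_length(n: int) -> bytes:
        """Encode a DER length: short form below 128, long form otherwise."""
        if n < 0x80:
            return bytes([n])
        body = n.to_bytes((n.bit_length() + 7) // 8, "big")
        return bytes([0x80 | len(body)]) + body

    def der_tlv(tag: int, value: bytes) -> bytes:
        """Wrap already-encoded content in Tag-Length-Value."""
        return bytes([tag]) + der_length(len(value)) + value

    # To emit SEQUENCE { INTEGER 5, INTEGER 65537 } the children must be fully
    # encoded (or their sizes tracked) before the SEQUENCE header can be written,
    # so the structure is effectively assembled back to front.
    child1 = der_tlv(0x02, bytes([0x05]))               # INTEGER 5
    child2 = der_tlv(0x02, bytes([0x01, 0x00, 0x01]))   # INTEGER 65537
    seq = der_tlv(0x30, child1 + child2)                # SEQUENCE
    print(seq.hex())                                    # 30080201050203010001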


We have a compiler that generates C code from ASN.1 code.  Does it not 
generate code behind the scenes that does all this ugly stuff for us 
without us having to look at the code?


I have not actually used the compiler, and I have discovered that hand 
generating code to handle ASN.1 data structures is a very bad idea, but 
I am told that if I use the compiler, all will be rainbows and unicorns.


You go first.
___
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography

[Cryptography] NIST about to weaken SHA3?

2013-09-29 Thread Christoph Anton Mitterer
Hey.

Not sure whether this has been pointed out / discussed here already (but
I guess Perry will reject my mail in case it has):

https://www.cdt.org/blogs/joseph-lorenzo-hall/2409-nist-sha-3


This makes NIST seem somehow like liars... on the one hand they claim
to be surprised by the alleged NSA conspiracy around Dual_EC_DRBG and say
this would be against their intentions... on the other hand it looks as
if they're trying the same thing again.


Cheers,
Chris.

___
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography


[Cryptography] TLS2

2013-09-29 Thread ianG

On 28/09/13 20:07 PM, Stephen Farrell wrote:


b) is TLS1.3 (hopefully) and maybe some extensions for earlier
versions of TLS as well



SSL/TLS is a history of fiddling around at the edges.  If there is to be 
any hope, start again.  Remember, we know so much more now.  Call it 
TLS2 if you want.


Start with a completely radical set of requirements.  Then make it so. 
There are a dozen people here who could do it.


Why not do the requirements, then ask for competing proposals?  Choose 
1.  It worked for NIST, and committees didn't work for anyone.


A competition for TLS2 would bring out the best and leave the bureaurats 
fuming and powerless.




iang
___
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography


Re: [Cryptography] RSA equivalent key length/strength

2013-09-29 Thread Jerry Leichter
On Sep 28, 2013, at 3:06 PM, ianG wrote:
 Problem with the NSA is that it's Jekyll and Hyde. There is the good side
 trying to improve security and the dark side trying to break it. Which
 side did the push for EC come from?
 What's in Suite A?  Will probably illuminate that question...
The actual algorithms are classified, and about all that's leaked about them, 
as far as I can determine in a quick search, is the names of some of them, and 
general properties of a subset of those - e.g., according to Wikipedia, BATON 
is a block cipher with a key length of 320 bits (160 of them checksum bits - 
I'd guess that this is an overt way for NSA to control who can use stolen 
equipment, as it will presumably refuse to operate at all with an invalid key). 
 It looks as if much of this kind of information comes from public descriptions 
of equipment sold to the government that implements these algorithms, though a 
bit of the information (in particular, the name BATON and its key and block 
sizes) has made it into published standards via algorithm specifiers.  Cryptome 
has a few leaked documents as well - again, one showing BATON mentioned in 
Congressional testimony about Clipper.

Cryptographic challenge:  If you have a sealed, tamper-proof box that 
implements, say, BATON, you can easily have it refuse to work if the key 
presented doesn't checksum correctly.  In fact, you'd likely have it destroy 
itself if presented with too many invalid keys.  NSA has always been really big 
about using such sealed modules for their own algorithms.  (The FIPS specs were 
clearly drafted by people who think in these terms.  If you're looking at them 
while trying to get software certified, many of the provisions look very 
peculiar.  OK, no one expects your software to be potted in epoxy (opaque in 
the ultraviolet - or was it infrared?); but they do expect various kinds of 
isolation that only affect the blocks in a diagram of your software's 
implementation; these have no meaningful effect on security, since software, 
unlike hardware, can't enforce any boundaries between the blocks.)

Anyway, this approach obviously depends on the ability of the hardware to 
resist attacks.  Can one design an algorithm which is inherently secure against 
such attacks?  For example, can one design an algorithm that's strong when used 
with valid keys but either outright fails (e.g., produces indexes into 
something like S-boxes that are out of range) or is easily invertible if used 
with invalid keys (e.g., has a key schedule that with invalid keys produces all 
0's after a certain small number of rounds)?  You'd need something akin to 
asymmetric cryptography to prevent anyone from reverse-engineering the checksum 
algorithm from the encryption algorithm, but I know of no fundamental reason 
why that couldn't be done.
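
A toy sketch of the checksummed-key idea in Python. This is entirely
hypothetical and has nothing to do with the real (classified) BATON; it merely
models a sealed module that holds a secret and refuses to operate unless the
160-bit tag embedded in a 320-bit key verifies under that secret.

    # Hypothetical model of a "checksummed key": 160 bits of key material plus a
    # 160-bit tag computed under a secret baked into the hardware. Not BATON.
    import hmac, hashlib, os

    DEVICE_SECRET = os.urandom(32)   # imagine this potted in epoxy inside the module

    def issue_key() -> bytes:
        material = os.urandom(20)                                        # 160 bits
        tag = hmac.new(DEVICE_SECRET, material, hashlib.sha1).digest()   # 160-bit tag
        return material + tag                                            # 320-bit "key"

    def load_key(key: bytes) -> bytes:
        """The sealed module: refuse to operate unless the tag checks out."""
        material, tag = key[:20], key[20:]
        expected = hmac.new(DEVICE_SECRET, material, hashlib.sha1).digest()
        if not hmac.compare_digest(tag, expected):
            raise ValueError("invalid key -- device refuses to operate")
        return material

    load_key(issue_key())                # accepted
    try:
        load_key(os.urandom(40))         # rejected: the tag cannot be forged
    except ValueError as err:            # ...without knowing DEVICE_SECRET
        print(err)

Of course this only works while DEVICE_SECRET stays inside tamper-resistant
hardware, which is precisely the dependence the question above asks whether an
algorithm could avoid.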
-- Jerry

___
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography


Re: [Cryptography] RSA recommends against use of its own products.

2013-09-29 Thread Jerry Leichter
On Sep 26, 2013, at 7:54 PM, Phillip Hallam-Baker wrote:
 ...[W]ho on earth thought DER encoding was necessary or anything other than 
 incredible stupidity?...
It's standard.  :-)

We've been through two rounds of standard data interchange representations:

1.  Network connections are slow, memory is limited and expensive, we can't 
afford any extra overhead.  Hence DER.
2.  Network connections are fast, memory is cheap, we don't have to worry about 
them - toss in every last feature anyone could possibly want.  Hence XML.

Starting from opposite extremes, committees of standards experts managed to 
produce results that are too complex and too difficult for anyone to get right 
- and which in cryptographic contexts manage to share the same problem of 
multiple representations that make signing such a joy.
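
As an illustration of the multiple-representations problem (a sketch, not taken
from any particular implementation): BER allows the same abstract value to be
encoded in several byte-for-byte different ways, each of which hashes, and
therefore signs, differently; DER exists to forbid all but one of them.

    # Three BER encodings of the same value -- BOOLEAN TRUE inside a SEQUENCE --
    # that decode identically but hash (and therefore sign) differently.
    import hashlib

    encodings = [
        bytes.fromhex("30030101ff"),    # DER: definite short length, TRUE as 0xFF
        bytes.fromhex("300301010c"),    # BER: TRUE may be any non-zero octet
        bytes.fromhex("3081030101ff"),  # BER: long-form length where short form suffices
    ]
    for enc in encodings:
        print(enc.hex(), hashlib.sha256(enc).hexdigest()[:16])
    # Same abstract value, three different byte strings, three different digests:
    # anyone who re-encodes before verifying breaks the signature.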

BTW, the *idea* behind DER isn't inherently bad - but the way it ended up is 
another story.  For a comparison, look at the encodings Knuth came up with in 
the TeX world.  Both dvi and pk files are extremely compact binary 
representations - but correct encoders and decoders for them are plentiful.  
(And it's not as if the Internet world hasn't come up with complex, difficult 
encodings when the need arose - see IDNA.)

-- Jerry


___
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography


Re: [Cryptography] RSA equivalent key length/strength

2013-09-29 Thread Lodewijk andré de la porte
2013/9/29 James A. Donald jam...@echeque.com

 (..) fact, they are not provably random, selected (...)

fixed that for you

It seems obvious that blatant lying about qualities of procedures must have
some malignant intention, yet ignorance is as good an explanation. I don't
think lying the other way would solve anything. It's obviously not
especially secure.
___
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography

Re: [Cryptography] RSA equivalent key length/strength

2013-09-29 Thread James A. Donald

On 2013-09-30 03:14, Lodewijk andré de la porte wrote:
2013/9/29 James A. Donald jam...@echeque.com


(..) fact, they are not provably random, selected (...)

fixed that for you

It seems obvious that blatant lying about qualities of procedures must 
have some malignant intention, yet ignorance is as good an 
explanation. I don't think lying the other way would solve anything. 
It's obviously not especially secure.



The NIST ec curves are provably non random, and one can prove that NIST 
is lying about them, which is circumstantial but compelling evidence 
that they are backdoored:


   From: Gregory Maxwell gmaxw...@gmail.com
   To: tor-talk (discussion about theory, design, and development of Onion
   Routing) tor-t...@lists.torproject.org
   Subject: Re: [tor-talk] NIST approved crypto in Tor?
   Reply-To: tor-t...@lists.torproject.org

   On Sat, Sep 7, 2013 at 4:08 PM, anonymous coward
   anonymous.cow...@posteo.de wrote:

   Bruce Schneier recommends **not** to use ECC. It is safe to
   assume he knows what he says.

   I believe Schneier was being careless there. The ECC parameter
   sets commonly used on the internet (the NIST P-xxxr ones) were
   chosen using a published deterministically randomized procedure.
   I think the notion that these parameters could have been
   maliciously selected is a remarkable claim which demands
   remarkable evidence.

   On Sat, Sep 7, 2013 at 8:09 PM, Gregory Maxwell gmaxw...@gmail.com wrote:

   Okay, I need to eat my words here.

   I went to review the deterministic procedure because I wanted to see
   if I could reproduce the SECP256k1 curve we use in Bitcoin. They
   don’t give a procedure for the Koblitz curves, but they have far
   less design freedom than the non-Koblitz curves so I thought perhaps I’d
   stumble into it with the “most obvious” procedure.

   The deterministic procedure basically computes SHA1 on some seed and
   uses it to assign the parameters then checks the curve order, etc..
   wash rinse repeat.

   Then I looked at the random seed values for the P-xxxr curves. For
   example, P-256r’s seed is c49d360886e704936a6678e1139d26b7819f7e90.

   _No_ justification is given for that value. The stated purpose of
   the “verifiably random” procedure “ensures that the parameters cannot
   be predetermined. The parameters are therefore extremely unlikely to
   be susceptible to future special-purpose attacks, and no trapdoors
   can have been placed in the parameters during their generation”.

   Considering the stated purpose I would have expected the seed to be
   some small value like … “6F” and for all smaller values to fail the
   test. Anything else would have suggested that they tested a large
   number of values, and thus the parameters could embody any
   undisclosed mathematical characteristic whose rareness is only
   bounded by how many times they could run sha1 and test.

   I now personally consider this to be smoking evidence that the
   parameters are cooked. Maybe they were only cooked in ways that make
   them stronger? Maybe

   SECG also makes a somewhat curious remark:

   “The elliptic curve domain parameters over (primes) supplied at each
   security level typically consist of examples of two different types
   of parameters — one type being parameters associated with a Koblitz
   curve and the other type being parameters chosen verifiably at
   random — although only verifiably random parameters are supplied at
   export strength and at extremely high strength.”

   The fact that only “verifiably random” are given for export strength
   would seem to make more sense if you cynically read “verifiably
   random” as backdoored to all heck. (though it could be more
   innocently explained that the performance improvements of Koblitz
   wasn’t so important there, and/or they considered those curves weak
   enough to not bother with the extra effort required to produce the
   Koblitz curves).
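
To make the shape of the procedure Maxwell describes concrete, here is a rough
sketch (illustrative only -- not the actual ANSI X9.62 / FIPS 186 derivation,
and the acceptance test below is a made-up stand-in): hash a seed, turn the
digest into a candidate parameter, test the resulting curve, and bump the seed
until something passes. Nothing in such a construction limits how many seeds
the generator may have tried before publishing one, which is the crux of the
complaint.

    # Sketch of the *shape* of a "verifiably random" curve search, not the exact
    # standardized derivation: hash a seed, derive a candidate parameter, test the
    # curve, and increment the seed until a test passes.
    import hashlib

    def candidate_b(seed: bytes) -> int:
        """Derive a candidate curve coefficient from a seed (illustrative only)."""
        return int.from_bytes(hashlib.sha1(seed).digest(), "big")

    def curve_is_acceptable(b: int) -> bool:
        """Stand-in for the real checks (prime group order, cofactor, etc.)."""
        return b % 1000 == 0   # hypothetical predicate, just so the loop terminates

    # The published P-256 seed, as quoted above; any other starting point works too.
    seed = int("c49d360886e704936a6678e1139d26b7819f7e90", 16)
    while not curve_is_acceptable(candidate_b(seed.to_bytes(20, "big"))):
        seed += 1
    print(f"accepted seed: {seed:040x}")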


___
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography

Re: [Cryptography] RSA equivalent key length/strength

2013-09-29 Thread James A. Donald
Gregory Maxwell on the Tor-talk list has found that NIST approved 
curves, which is to say NSA approved curves, were not generated by the 
claimed procedure, which is a very strong indication that if you use 
NIST curves in your cryptography, NSA can read your encrypted data.


As computing power increases, NSA-resistant RSA keys have become 
inconveniently large, so we have to move to EC keys.
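
For reference, the commonly cited comparable-strength figures from NIST SP
800-57 Part 1 (approximate, and the reason the subject line speaks of "RSA
equivalent key length"):

    # Approximate comparable-strength figures (NIST SP 800-57 Part 1): RSA moduli
    # grow much faster than EC key sizes as the target security level rises.
    COMPARABLE_STRENGTH = {
        # symmetric bits: (RSA/DH modulus bits, EC key bits)
        80:  (1024,  160),
        112: (2048,  224),
        128: (3072,  256),
        192: (7680,  384),
        256: (15360, 512),
    }
    for sym, (rsa, ec) in COMPARABLE_STRENGTH.items():
        print(f"{sym:>3}-bit security ~ RSA-{rsa:<5} ~ EC-{ec}")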


NIST approved curves are unlikely to be NSA resistant.

Therefore, everyone should use Curve25519, which we have every reason to 
believe is unbreakable.
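
A minimal Curve25519 key-agreement sketch, assuming the pyca/cryptography
package is installed (libsodium/NaCl expose the same primitive under a
different API):

    # X25519 Diffie-Hellman over Curve25519 using the pyca/cryptography package
    # (assumes `pip install cryptography`).
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

    alice_priv = X25519PrivateKey.generate()
    bob_priv = X25519PrivateKey.generate()

    # Each side transmits only its public key; the raw shared secret should be
    # run through a KDF before being used as a symmetric key.
    alice_shared = alice_priv.exchange(bob_priv.public_key())
    bob_shared = bob_priv.exchange(alice_priv.public_key())
    assert alice_shared == bob_shared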

___
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography


Re: [Cryptography] RSA recommends against use of its own products.

2013-09-29 Thread Peter Gutmann
Phillip Hallam-Baker hal...@gmail.com writes:

Quite, who on earth thought DER encoding was necessary or anything other than
incredible stupidity?

At least some X.500/LDAP folks thought they could do it.  Mind you, we're
talking about people who believe in X.500/LDAP here...

Peter.

___
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography


Re: [Cryptography] RSA equivalent key length/strength

2013-09-29 Thread Viktor Dukhovni
On Mon, Sep 30, 2013 at 10:07:14AM +1000, James A. Donald wrote:

 Therefore, everyone should use Curve25519, which we have every
 reason to believe is unbreakable.

Superseded by the improved Curve1174.

http://cr.yp.to/elligator/elligator-20130527.pdf 

-- 
Viktor.
___
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography


Re: [Cryptography] NIST about to weaken SHA3?

2013-09-29 Thread James A. Donald

On 2013-09-30 13:12, Christoph Anton Mitterer wrote:

https://www.cdt.org/blogs/joseph-lorenzo-hall/2409-nist-sha-3


This makes NIST seem somehow like liars

If one lie, all lies.


___
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography


Re: [Cryptography] NIST about to weaken SHA3?

2013-09-29 Thread Viktor Dukhovni
On Mon, Sep 30, 2013 at 05:12:06AM +0200, Christoph Anton Mitterer wrote:

 Not sure whether this has been pointed out / discussed here already (but
 I guess Perry will reject my mail in case it has):
 
 https://www.cdt.org/blogs/joseph-lorenzo-hall/2409-nist-sha-3

I call FUD.  If progress is to be made, fight the right fights.

The SHA-3 specification was not weakened; the blog confuses the
effective security of the algorithm with the *capacity* of the
sponge construction.

The actual NIST Proposal strengthens SHA-3 relative to the authors'
most performant proposal (http://eprint.iacr.org/2013/231.pdf
section 6.1) by rounding up the capacity of the sponge construction
to 256 bits for both SHA3-224 and SHA3-256, and rounding up to 512
bits for both SHA3-384 and SHA3-512 (matching the proposal in
section 6.2).

The result is that the 256-capacity variant gives 128-bit security
against both collision and first preimage attacks, while the 512-bit
capacity variant gives 256-bit security.  This removes the asymmetry
in the security properties of the hash.  Yes, this is a performance
trade-off, but it seems entirely reasonable.  Do you really need
256 bits of preimage resistance with 128-bit ciphersuites, or 512
bits of preimage resistance with 256-bit ciphersuites?

SHA2-256's ~256 bits of preimage resistance was not a design
requirement; the requirement was 128 bits of collision resistance, and
the stronger preimage resistance is an artifact of the construction.
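
To put numbers on the trade-off: Keccak's permutation state is 1600 bits, the
rate is 1600 minus the capacity, and the generic ("flat") sponge claim is
capacity/2 bits of security against generic attacks. A small sketch:

    # Rate/capacity arithmetic for the proposals discussed above: state is 1600
    # bits, rate = 1600 - capacity, generic sponge security = capacity / 2 bits.
    STATE_BITS = 1600

    variants = [
        ("SHA3-256, c=256 (NIST proposal)",      256),
        ("SHA3-256, c=512 (as submitted, c=2n)", 512),
        ("SHA3-512, c=1024 (as submitted)",      1024),
    ]
    for name, capacity in variants:
        rate = STATE_BITS - capacity
        print(f"{name:38} rate={rate:4} bits/permutation, "
              f"generic security={capacity // 2} bits")
    # c=256 gives 128-bit collision *and* preimage resistance while absorbing
    # 1344 bits per permutation call instead of 1088 -- the performance gain.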

For a similar sentiment see:

http://crypto.stackexchange.com/questions/10008/why-restricting-sha3-to-have-only-two-possible-capacities

-- 
Viktor.
___
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography


Re: [Cryptography] RSA recommends against use of its own products.

2013-09-29 Thread James A. Donald

On 2013-09-29 23:13, Jerry Leichter wrote:

BTW, the *idea* behind DER isn't inherently bad - but the way it ended up is 
another story.  For a comparison, look at the encodings Knuth came up with in 
the TeX world.  Both dvi and pk files are extremely compact binary 
representations - but correct encoders and decoders for them are plentiful.



DER is unintelligible and incomprehensible.  There is, however, an open-source 
compiler for ASN.1.


Does it not produce correct encoders and decoders for DER?  (I have 
never used it)

___
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography