I think you missed my point. I wasn't saying that I *liked* canonicalization 
and normalization, but rather that you *need* one of them, or something 
*equivalent*, in order to get the same set of bits out at either end. Like you, 
I was saying that you need *something* that makes your bit stream stable. I'm 
actually in favor of the JOSE approach, which is exactly to keep the original 
bits around by protecting them with Base64url over the network so that they 
don't get munged. As an implementor, I find this fantastic: it makes my code 
much simpler for both generation and consumption. I have the JSON objects where 
I need them and the raw streams where I need them, and I can easily keep them 
separate with a reasonable assurance that they won't be confused with each 
other. For heaven's sake, don't do canonicalization. But at the same time, 
don't assume that a JSON blob is going to be kept pristine as a string by any 
JSON-processing system.

So we're actually in violent agreement on this point. In my view, JOSE's 
approach of keeping the original bits around is the desirable one by far.

 -- Justin


On Jan 29, 2015, at 8:53 AM, Phillip Hallam-Baker 
<[email protected]> wrote:

On Thu, Jan 29, 2015 at 8:14 AM, Justin Richer 
<[email protected]> wrote:
Relying on side-effects of a handful of contemporary implementations is 
dangerous at best and absolutely foolhardy at worst, especially when it comes 
to security systems. You *need* to have a formal canonicalization, 
normalization, or serialization in order for these things to work in practice.

Otherwise, you're betting on luck, and that's just daft.

-1E200

Canonicalization is the stupidest idea in computer security. It is never ever 
necessary and never ever implemented reliably.

A digital signature signs a sequence of bits. So if you ever want to check a 
signature again, make sure you keep hold of your original sequence of bits. 
Simple!
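
To make that concrete, a small, hypothetical illustration: a JSON 
parse/re-serialize round trip generally does not reproduce the original bytes, 
so a signature over the originals will not verify against the re-serialized 
form:

    import hashlib, json

    original = b'{"amount":1.10,"currency":"USD"}'    # made-up signed message
    roundtrip = json.dumps(json.loads(original)).encode()

    print(original)    # b'{"amount":1.10,"currency":"USD"}'
    print(roundtrip)   # b'{"amount": 1.1, "currency": "USD"}'

    # Different bytes, different hash: a signature over `original` will not
    # verify against `roundtrip`.
    print(hashlib.sha256(original).digest() == hashlib.sha256(roundtrip).digest())  # False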

I see people say that canonicalization is 'essential' in every discussion of 
signatures. What I have never seen is an example of a reasonable thing to do 
that goes wrong if you don't have C14N.

And by reasonable, I do not mean 'take a cert, store it in an X.500 directory, 
reassemble it'. DER encoding is the stupidest stupid of all the steaming piles 
of stupid in ASN.1. BER meets the needs of X.509 just as well, as was proved by 
the fact that the Web ran quite happily on BER-encoded certs until some 
spoilsport let on what we had been doing.


The reason I am proposing JSON Container is precisely to avoid the need for 
canonicalization. Canonicalization did not work for XML digital signatures, and 
it won't work for JSON.

My proposal has three parts:

1) A blob of data where the only requirement is that it must be a valid JSON 
encoding. Changing this is permitted. Want to add another signature? Go ahead! 
So not only are syntactic differences allowed, semantic changes are allowed as 
well.

2) A separator marker to unambiguously define the end of (1) and the start of 
(3).

3) The sequence of bits that was signed.


A signature in (1) cannot refer to any part of (1); it can only reference (3). 
By default it covers the whole of (3), but it could cover a well-defined range 
inside (3) instead.

That is it: no JSON canonicalization or unique serialization required. This is 
a proposal for wrapping arbitrary content in a wrapper of JSON metadata, which 
might include signatures. There is no reason for either the signature or the 
data to be in a canonical form.
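
As a rough, hypothetical sketch of that framing (the separator string, field 
names, and the use of a bare digest in place of a real signature are all 
invented here for illustration; the draft would define the actual format):

    import hashlib, json

    SEPARATOR = b"\r\n---CONTENT---\r\n"     # hypothetical marker for part (2)

    # Part (3): the exact bits to be protected.
    content = b"any payload bytes at all"

    # Part (1): freely editable JSON metadata that only *refers* to (3),
    # here via a SHA-256 digest standing in for a real signature.
    metadata = json.dumps({
        "content-type": "application/octet-stream",
        "sha256": hashlib.sha256(content).hexdigest(),
    }).encode()

    container = metadata + SEPARATOR + content

    # A verifier splits on the separator and checks the trailing bytes
    # exactly as received; no canonical form of (1) or (3) is ever needed.
    meta_bytes, _, body = container.partition(SEPARATOR)
    meta = json.loads(meta_bytes)
    assert hashlib.sha256(body).hexdigest() == meta["sha256"]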

If you want signed metadata, you would first wrap the content in a JSON 
Container carrying the metadata, and then wrap that whole container in a second 
container carrying the signature, as in the continuation of the sketch below.
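
Continuing the same hypothetical framing (again with a digest standing in for a 
signature):

    import hashlib, json

    SEPARATOR = b"\r\n---CONTENT---\r\n"     # same hypothetical marker
    content = b"any payload bytes at all"

    # Container #1: attach the metadata to the content.
    inner_meta = json.dumps({"content-type": "text/plain"}).encode()
    inner = inner_meta + SEPARATOR + content

    # Container #2: the outer metadata protects all of container #1,
    # so the inner metadata is now covered as well.
    outer_meta = json.dumps({"sha256": hashlib.sha256(inner).hexdigest()}).encode()
    outer = outer_meta + SEPARATOR + inner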
