Hi Guys,

Pardon me if you think I was being hyperbolic; the discussion got derailed by
the bogus claims about hash functions' vulnerability.

FYI: Using ES6 serialization methods for JSON primitive types is headed for 
standardization in the IETF.
https://www.ietf.org/mail-archive/web/jose/current/msg05716.html
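
As an illustration of what "ES6 serialization of JSON primitive types" means in
practice (a hedged sketch; the outputs below are ordinary ES6 JSON.stringify()
behavior, which is what the draft builds on):

  // ES6 defines a deterministic Number-to-string algorithm which
  // JSON.stringify() uses, so a given value always serializes identically:
  JSON.stringify(1.50);        // "1.5"       (trailing zeros dropped)
  JSON.stringify(1e+30);       // "1e+30"
  JSON.stringify(0.000001);    // "0.000001"
  JSON.stringify(1/3);         // "0.3333333333333333"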

This effort is backed by one of the main authors behind the current de-facto 
standard for Signed and Encrypted JSON, aka JOSE.
If this, in your opinion, is a bad idea, now is the right time to shoot it 
down :-)

This effort also relies on JSON.parse() and JSON.stringify() honoring object 
"Creation Order".

JSON.canonicalize() would be a "Sorting" alternative to "Creation Order", 
offering certain advantages, the most important one being that the deployment 
impact is limited to JSON serializers.
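
A minimal sketch of the difference (JSON.canonicalize() is of course
hypothetical at this point, and the replacer trick below only approximates
sorting for a flat object):

  const obj = JSON.parse('{"b": 2, "a": 1}');

  // "Creation Order": JSON.stringify() emits properties in the order in which
  // they were created/parsed:
  JSON.stringify(obj);                           // '{"b":2,"a":1}'

  // "Sorting": the proposed JSON.canonicalize(obj) would emit them in sorted
  // key order, which for this flat object can be approximated today with a
  // property-list replacer:
  JSON.stringify(obj, Object.keys(obj).sort());  // '{"a":1,"b":2}'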

The ["completely broken"] sample code was only submitted as a proof-of-concept. 
I'm sure you JS gurus can do this way better than I :-)

Creating an alternative based on [1,2,3] seems like a rather daunting task.

Thanx,
Anders
https://github.com/cyberphone/json-canonicalization

[1] http://wiki.laptop.org/go/Canonical_JSON
[2] https://gibson042.github.io/canonicaljson-spec/
[3] https://gist.github.com/mikesamuel/20710f94a53e440691f04bf79bc3d756

On 2018-03-17 22:29, Mike Samuel wrote:


On Fri, Mar 16, 2018 at 9:42 PM, Anders Rundgren <[email protected]> wrote:

    Scott A:
    https://en.wikipedia.org/wiki/Security_level
    "For example, SHA-256 offers 128-bit collision resistance"
    That is, the claims that there are cryptographic issues w.r.t. Unicode 
Normalization are (fortunately) incorrect.
    Well, if you actually do normalize Unicode, signatures would indeed break, 
so you don't.

    Richard G:
    Is the [highly involuntary] "inspiration" for the JSON.canonicalize() 
proposal:
    https://www.ietf.org/mail-archive/web/json/current/msg04257.html
    Why not fork your go library? Then there would be three implementations!

    Mike S:
    Wants to build a 2000+ line standalone JSON canonicalizer working on string 
data.
    That's great, but I think it will be a hard sell getting these guys to 
accept the Pull Request:
    https://developers.google.com/v8/
    JSON.canonicalize(JSON.parse("json string data to be canonicalized")) would 
IMHO do the same job.
    My (working) code example was only provided to show the principle, as well 
as to enable testing/verification.
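
    A minimal sketch of that call pattern (JSON.canonicalize() is obviously
    hypothetical here, and the expected output assumes the sorting rules of
    the proposal):

    const jsonText = '{"b": 2, "a": 1}';  // JSON text received "on the wire"
    const canonical = JSON.canonicalize(JSON.parse(jsonText));
    // expected: '{"a":1,"b":2}'. Parse once, then let the serializer produce
    // the canonical form, instead of canonicalizing the raw string data.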


I don't know where you get the 2000+ line number.
https://gist.github.com/mikesamuel/20710f94a53e440691f04bf79bc3d756 comes in at 
80 lines.
That's roughly twice as long as your demonstrably broken example code, but far 
shorter than the number you provided.

If you're being hyperbolic, please stop.
If that was a genuine guesstimate, but you just happened to be off by a factor 
of 25, then I have less confidence that you can weigh the design complexity 
tradeoffs when comparing yours to other proposals.


    On my part, I added canonicalization to my ES6 JSON-compliant, Java-based 
JSON tools.  A single line did 99% of the job:
https://github.com/cyberphone/openkeystore/blob/jose-compatible/library/src/org/webpki/json/JSONObjectWriter.java#L928
    for (String property : canonicalized ? new TreeSet<String>(object.properties.keySet()) : object.properties.keySet()) {


    Other mentioned issues like HTML safety, embedded nulls etc. would apply to 
JSON.stringify() as well.
    JSON.canonicalize() would inherit all the features (and weaknesses) of 
JSON.stringify().
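
    For example (a hedged illustration of the inherited behavior, shown with
    plain JSON.stringify() since JSON.canonicalize() does not exist yet):

    JSON.stringify({ x: "</script>" });  // '{"x":"</script>"}'  (no HTML escaping)
    JSON.stringify({ y: "a\u0000b" });   // '{"y":"a\u0000b"}'   (NUL escaped, but still carried along)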


Please, when you attribute a summary to me, don't ignore the summary that I 
myself wrote of my arguments.

You're ignoring the context, and that ties into one argument of mine that you 
left out: JSON.canonicalize is not generally useful because it undoes safety 
precautions.  It should probably not be used as a wire or storage format, and 
it is entirely unsuitable for embedding into other commonly used web 
application languages.

You also make no mention of backwards-compatibility concerns when this depends 
on things like toJSON, which is hugely important when dealing with long-lived 
hashes.

When I see that you've summarized my own thoughts incorrectly, even though I 
provided you with a summary of my own arguments, I lose confidence that you've 
correctly summarized others' positions.


_______________________________________________
es-discuss mailing list
[email protected]
https://mail.mozilla.org/listinfo/es-discuss
