Thanks for the review. Very good comments on validity; these are things that 
the working group should discuss. My comments on your three validity 
comments are below.

* Validity - Hey guys, it is less than 30 years from now to 2050.  You need
to be able to encode dates after that point.

[John] This comes from RFC 7925, which only allows UTCTime (YYMMDDHHMMSSZ) and 
not GeneralizedTime (YYYYMMDDHHMMSSZ). I agree that it would be good to also 
support GeneralizedTime. GeneralizedTime could be represented as an int (1 + 8 
bytes) or a byte string (1 + 5 bytes). This comment indicates that the COSE WG 
might want to support a superset of RFC 7925, which should then be reflected in 
the new charter.
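For reference, the CBOR unsigned-integer size can be worked out mechanically. The helper below is illustrative (not from the draft) and reproduces the 1 + 8 byte figure for a GeneralizedTime value written as a decimal integer:

```python
def cbor_uint_size(n: int) -> int:
    # CBOR major type 0: 1 initial byte, plus 0/1/2/4/8 extra bytes
    # depending on the magnitude of the value (RFC 8949).
    if n < 24:
        return 1
    if n < 2**8:
        return 2
    if n < 2**16:
        return 3
    if n < 2**32:
        return 5
    return 9

# GeneralizedTime 20501231235959Z written as a decimal integer:
t = 20501231235959
assert cbor_uint_size(t) == 9  # 1 header byte + 8 value bytes
```

A 4-byte Unix-time value, by contrast, fits in 1 + 4 bytes until 2106.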

* Validity - What is the size difference between this encoding and just
using the Unix time function?

[John] There is no significant size difference. The current encoding is 
actually slightly less efficient (by a fraction of a single bit), but both 
encodings take 4 bytes. The current encoding was chosen because I felt 
unixtime is easy to implement incorrectly. Most people would probably get the 
number of days in the different months correct, but leap years are tricky: 
2000 and 2020 are leap years, but 2100 is not, and even the POSIX committee 
got it wrong. Many non-UNIX systems and programming languages do not have a 
"Unix time function", and many existing time functions like mktime() in C++ 
use the local time zone, which increases the risk that people implement 
unixtime wrongly. Any incorrectly implemented time encoding/decoding would be 
a security problem (at least in the naive version). It would be good to have 
a discussion in the group on which option is the easiest to implement.
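The leap-year rule itself fits in a few lines; a naive "divisible by 4" check gets exactly the 2100 case wrong (Python sketch):

```python
def is_leap(year: int) -> bool:
    # Gregorian rule: divisible by 4, except century years,
    # unless the century year is divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

assert is_leap(2000)      # century divisible by 400: leap
assert is_leap(2020)      # ordinary leap year
assert not is_leap(2100)  # century not divisible by 400: not leap
```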

* Validity - Is there an intent that all dates can be compared in a
compressed form or only in an uncompressed form?

[John] We have not considered that, but both the current encoding and Unix 
time are strictly increasing, so it is easy to check whether a compressed 
value < current time. Unix time additionally makes it easy to calculate the 
distance between two compressed values, while the current encoding does not.
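A small sketch of the comparison property for Unix time (illustrative; calendar.timegm is used here because, unlike mktime(), it interprets the time tuple as UTC rather than local time):

```python
import calendar

# Two Unix timestamps (seconds since 1970-01-01 UTC), one day apart.
t1 = calendar.timegm((2030, 1, 1, 0, 0, 0, 0, 0, 0))
t2 = calendar.timegm((2030, 1, 2, 0, 0, 0, 0, 0, 0))

# Order-preserving: compressed values can be compared directly.
assert t1 < t2
# Distance between two compressed values is meaningful (in seconds).
assert t2 - t1 == 86400
```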

John

-----Original Message-----
From: Jim Schaad <[email protected]>
Date: Saturday, 26 September 2020 at 05:34
To: "[email protected]" 
<[email protected]>
Cc: "[email protected]" <[email protected]>
Subject: Review draft-mattsson-cose-cbor-cert-compress-01
Resent from: <[email protected]>
Resent to: <[email protected]>, <[email protected]>, 
<[email protected]>, John Mattsson <[email protected]>, 
<[email protected]>
Resent date: Saturday, 26 September 2020 at 05:34

Here is a first review of the document

* I do seem to have a problem with the statement "maximum compression that
can be expected with CBOR".   From what I have looked at, the compression is
almost completely due to the use of dictionaries against a specific profile,
with occasional re-encoding, rather than being based on CBOR.  Among other
things, I think that the title of this document is currently misleading.

*  I do not believe that the COSE WG is the place for discussions about
what should be placed in subject/issuer names.  It does make sense to look
at how to compress once those decisions have been made.

* Serial Number - are you going to take advantage of the compression
available from a signed integer to a binary value by stripping the one
possible leading byte?

* Validity - Hey guys, it is less than 30 years from now to 2050.  You need
to be able to encode dates after that point.

* Validity - What is the size difference between this encoding and just
using the Unix time function?

* Validity - Is there an intent that all dates can be compared in a
compressed for or only in an uncompressed form?

* Subject - "An EUI-64 mapped from a 48-bit MAC address is encoded as the MAC
address in a CBOR byte string"

* subjectPublicKeyInfo - Tell me about the mapping table.

* extensions - move all of the text after the first paragraph into its own
section.  You are going to need to add some restriction text on extensions
so that the encoding process will work.

* signatureValue - when computing the signature is the length of the array
changed?

* signatureValue - Need to talk about deterministic encoding here

* CDDL - making the signatureValue come before the signatureAlgorithm makes
it a pain to use only a prefix for the purpose of signature validation and
computation

* One point which Hannes and I would disagree on is that I think you should
upgrade your version of ASN.1 and put in all of the appropriate constraints.
This could include things like the path length in the basicConstraints
extension being required to be set to zero.  

* It would be quite reasonable to create a list of checks that a certificate
must go through before it can be compressed.  One example of that would be
that the certificate to-be-signed needs to be validated as really being DER
encoded and not have some mistaken BER encoding in it.  This of course
includes all of the extensions.

* I do not understand the rules around encoding subjectAltName at all.

* What happens in the future if the profile for IoT certificates is modified
to include things that would be useful?  The first things that pop to mind
are the extended certificate types that are required both for EST and for
ANIMA.

Jim



_______________________________________________
COSE mailing list
[email protected]
https://www.ietf.org/mailman/listinfo/cose
