On Tue, Mar 29, 2011 at 12:48 PM, Todd Heberlein <[email protected]> wrote:
>> In particular, the beginning of the OCTET_STRING_t's buffer begins with two
>> bytes (decimal values 12 and 21). Am I supposed to skip these? For example,
>> the following code where I skip these first two bytes seems to work, but it
>> seems like a big hack:
>
> OK, it seems that the second byte (21 in this case) is the number of
> characters encoded (I haven't tried non-ASCII characters). The OCTET_STRING_t
> for bundle_version begins 12, 5 (where the string resolves to "1.0.2", five
> characters). I still don't know what the 12 means. And does this mean that an
> OCTET_STRING can encode at most 256 bytes?
You've probably already figured this out, but no, an OCTET_STRING can hold as many bytes as it wants. You can store up to 127 bytes using the simple, single-byte length field. But if the highest-order bit is 1, then the length field is either a "length-of-length" field or an "indefinite length" field (depending on the particular "length" value), in which case the data is terminated by an end-of-contents sequence. Again, I think using lber is the best way to handle this data :) I can't remember why, but once upon a time I wrote an encoder/decoder and had to learn all the subtleties.
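
For what it's worth, here's a rough, untested sketch (not from any particular library, just how I remember the BER rules) of what a decoder does with the length octets. The 12 at the start of Todd's buffer would be the tag byte (UNIVERSAL 12 is UTF8String), and the length octets follow it:

/* Minimal sketch of BER length decoding.  "buf" points at the first
 * length octet, i.e. just past the tag byte. */
#include <stddef.h>
#include <stdint.h>

/* Returns the content length in bytes, -1 for the indefinite form
 * (content runs until the 0x00 0x00 end-of-contents octets), or -2 on
 * truncated/oversized input.  *used is set to the number of length
 * octets consumed. */
static long ber_decode_length(const uint8_t *buf, size_t avail, size_t *used)
{
    if (avail < 1)
        return -2;                      /* nothing to read */

    uint8_t first = buf[0];
    *used = 1;

    if ((first & 0x80) == 0)
        return first;                   /* short form: length is 0..127 */

    size_t n = first & 0x7F;            /* long form: count of length octets */
    if (n == 0)
        return -1;                      /* indefinite form */
    if (n > sizeof(long) || avail < 1 + n)
        return -2;                      /* too big for us, or truncated */

    long len = 0;
    for (size_t i = 0; i < n; i++)
        len = (len << 8) | buf[1 + i];  /* big-endian length value */
    *used = 1 + n;
    return len;
}

So for "1.0.2" you'd see tag 12, then a single short-form length octet of 5, then the five character bytes.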
