Yeah, I do wish JSON had a binary literal type.  This is obviously a bug
in my JSON-RPC code, but also an issue we need to solve for the UI.
When we send binary to the webUI, what is our intent?  I think that
displaying it as base64 encoded text is not generally what the user
wants.  I think displaying a link that will allow them to download the
file is generally a better idea.  Perhaps the Param should indicate how
it should be handled in the webUI.

What the UI should display for binary data is a good question to ask. In the specific case of certificates I think displaying them in PEM format is pretty reasonable; a user could cut-n-paste that and have it be useful. I think it would also be good to have a link, as you suggest, for the user to click on that would perform a download and store the data in a file (probably with a choice of PEM format or DER binary).
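
Just to make the PEM suggestion concrete: PEM is nothing more than the DER bytes base64 encoded, wrapped at 64 columns, and framed with BEGIN/END markers. A rough sketch (the function name is mine, not anything in the tree):

    import base64

    def der_to_pem(der_bytes, label='CERTIFICATE'):
        # PEM framing: base64 of the DER data, wrapped at 64 columns,
        # between BEGIN/END lines naming the object type.
        b64 = base64.b64encode(der_bytes).decode('ascii')
        lines = [b64[i:i + 64] for i in range(0, len(b64), 64)]
        return ('-----BEGIN %s-----\n' % label
                + '\n'.join(lines)
                + '\n-----END %s-----\n' % label)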

As for the other binary data we have yet to deal with, but for which we think there might be a future need, I think we'll just have to deal with that in the UI on a case-by-case basis depending on what the data is. Would keytabs be one possible example of such a future need for binary data available via the UI?

The Python JSON encoder class does give us the option to hook into the
encoder, check if the object is a str object, and base64 encode it.
But that doesn't help us at the opposite end. How would we know when
unmarshaling that a given string is supposed to be base64 decoded back
into binary data? We could prepend a special string and hope that string
never gets used by normal text (yuck). Keeping a list of what needs
base64 decoding is not an option within JSON because at the time of
decoding we have no information available about the context of the JSON
objects.
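
To illustrate the decode-side ambiguity: once the base64 text has been through json.dumps/loads there is nothing left to distinguish it from any other string (the values below are made up):

    import base64, json

    blob = base64.b64encode(b'\x00\x01\x02').decode('ascii')
    wire = json.dumps({'data': blob})
    decoded = json.loads(wire)
    # decoded['data'] is just the string 'AAEC'; nothing in the JSON
    # tells the receiver it should be base64 decoded back into bytes.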

I think sending it as a dict with a special key would work, something like:

   {'__base64__': b64encode(my_str)}


Yes, that's a good idea and one I hadn't thought of. My understanding of what you're proposing is this:

For JSON *only* we pre-scan the object to be JSON encoded and any place we discover a str object we replace that str object in its container with a dict {'__base64__': base64_encoded_value}.

We then send this through the JSON transport.

On the JSON receiving end after JSON decodes back into a Python object we scan the object and every place we find a dict with a '__base64__' key we replace that dict with the base64 decoded value of the '__base64__' key.

I have started to develop a prototype of this code and it seems to work. The functions modify the data "in place" replacing objects within their containers.
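
To make the idea concrete, here is a rough sketch of the pair of scan functions. The names and the use of bytes are my own choices for illustration, and unlike the actual prototype (which replaces values inside their containers in place) this version simply rebuilds the structure, because that is shorter to show:

    import base64

    def json_encode_binary(val):
        # Pre-scan before json.dumps(): binary values become
        # {'__base64__': <base64 text>} dicts, containers are walked
        # recursively, everything else passes through untouched.
        if isinstance(val, dict):
            return dict((k, json_encode_binary(v)) for k, v in val.items())
        if isinstance(val, (list, tuple)):
            return [json_encode_binary(v) for v in val]
        if isinstance(val, bytes):
            return {'__base64__': base64.b64encode(val).decode('ascii')}
        return val

    def json_decode_binary(val):
        # Post-scan after json.loads(): any dict whose only key is
        # '__base64__' is replaced with the decoded binary value.
        if isinstance(val, dict):
            if list(val) == ['__base64__']:
                return base64.b64decode(val['__base64__'])
            return dict((k, json_decode_binary(v)) for k, v in val.items())
        if isinstance(val, list):
            return [json_decode_binary(v) for v in val]
        return val

A round trip of json_decode_binary(json.loads(json.dumps(json_encode_binary(obj)))) should then hand back the original binary values.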

I should have enough working by the end of the day to see if this solves the exception generating the backtrace on the Services page.

However, even though this would get us over the hump with regard to passing binary data through JSON (which I think we should do), I continue to believe we should not be passing certificates and certificate requests as DER binary data. Based on prior responses there seems to be consensus that certificates should be passed as PEM encoded strings. That's something which can be patched later.

So my general plan is to get binary data working in JSON, then later patch things so that certs and CSRs are PEM encoded. This gives us two good things: the ability to pass binary data in a general way, which is probably worth supporting in the framework even though we won't be using it once certs are passed in PEM format but which may be useful down the road for other items, and a consistent PEM format for certs. How does this sound?
--
John Dennis <jden...@redhat.com>

