On Jun 30, 10:16 am, Christopher Smith <[email protected]> wrote:
> You could always extend the compiler, but I bet you could get away with a 
> simple preprocessor that aliases types and represents those larger integers 
> as raw bytes.

I guess. This is an interesting and general problem; practically every
system like protobuf needs to solve it. Python itself, for example,
solves it for pickle by letting you write custom methods that reduce
your classes to well-known types like tuples. That's a fairly decent
solution, though it involves a lot of copying and (again) the creation
of a custom translation layer. At least it's not a post-pickle layer;
the translation happens during the pickling and unpickling process.
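
To make the pickle comparison concrete, here's a minimal sketch of
that mechanism: __reduce__ tells pickle how to express an instance as
well-known types (a callable plus a tuple of plain ints), so no custom
wire layout is needed. The class name here is invented for
illustration.

```python
import pickle


class BigPair:
    """Hypothetical class holding two arbitrary-precision integers."""

    def __init__(self, a, b):
        self.a = a
        self.b = b

    def __reduce__(self):
        # Reduce to (constructor, args) -- both well-known types, so
        # pickle can serialize them without knowing our internals.
        return (BigPair, (self.a, self.b))


original = BigPair(2 ** 200, -(3 ** 100))
restored = pickle.loads(pickle.dumps(original))
assert restored.a == original.a and restored.b == original.b
```

Note the copying this implies: the values are duplicated into the
tuple on the way out and into a fresh instance on the way in.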

I don't think the custom translation can be avoided. But I do think it
can be better integrated into the system.

I would like to see options on types or fields in Protobuf that let
you specify the name of the type a particular language should use to
represent the value. Each of the base types would have an interface
the language type was expected to implement in order to translate the
value back and forth.
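
As a rough sketch of what that interface might look like (nothing like
this exists in protobuf today; the names IntAdapter, to_wire, and
from_wire are invented for illustration), the generated code would
delegate to a user-named type implementing a small translation
contract:

```python
from abc import ABC, abstractmethod


class IntAdapter(ABC):
    """Hypothetical interface a language type would implement so the
    runtime can translate a field value to and from the wire form."""

    @abstractmethod
    def to_wire(self) -> bytes: ...

    @classmethod
    @abstractmethod
    def from_wire(cls, data: bytes) -> "IntAdapter": ...


class BigInt(IntAdapter):
    """Arbitrary-precision integer backed by Python's native int."""

    def __init__(self, value: int):
        self.value = value

    def to_wire(self) -> bytes:
        # Little-endian two's complement, sized to fit the value.
        return self.value.to_bytes((self.value.bit_length() + 8) // 8,
                                   "little", signed=True)

    @classmethod
    def from_wire(cls, data: bytes) -> "BigInt":
        return cls(int.from_bytes(data, "little", signed=True))


n = BigInt(-(2 ** 200))
assert BigInt.from_wire(n.to_wire()).value == n.value
```

The point is that the translation lives inside the serialization
machinery rather than as a layer bolted on afterward.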

Protobuf's integer encoding can already represent integers of
arbitrary precision; it's just that not every language has an
arbitrary-precision integer type. My idea would solve this problem by
requiring you to specify the (for example) C++ type to use when
deserializing a large integer. If you didn't, the protobuf compiler
would generate an error.
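
The encoding claim is easy to see: protobuf's varint scheme is
base-128 with a continuation bit, so the scheme itself round-trips a
non-negative integer of any size. This sketch demonstrates that
(note: actual protobuf implementations cap varints at the declared
field width, e.g. 10 bytes for int64; that limit comes from the
in-memory type, not the encoding):

```python
def encode_varint(n: int) -> bytes:
    """Encode a non-negative int as a base-128 varint."""
    assert n >= 0
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        if n:
            out.append(byte | 0x80)  # high bit set: more bytes follow
        else:
            out.append(byte)
            return bytes(out)


def decode_varint(data: bytes) -> int:
    """Decode a base-128 varint back into a Python int."""
    n = 0
    for shift, byte in enumerate(data):
        n |= (byte & 0x7F) << (7 * shift)
        if not byte & 0x80:
            return n
    raise ValueError("truncated varint")


big = 2 ** 300 + 12345
assert decode_varint(encode_varint(big)) == big  # no width limit
```

So the wire format is not the obstacle; the generated in-memory
representation is, which is exactly what a per-language type option
would address.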
