Re: Serializing Large Collections: SerializePartial ParsePartial

2008-11-12 Thread bmadigan
I think the idea is to break up very large data sets into smaller packets so they can be 'streamed'. When I think of something like seismic data, stream-based event handling makes the most sense. Can the data points be processed individually somehow, or do you need access to all of them (in …
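
A minimal sketch of the chunked/streamed approach being suggested here, assuming a hypothetical DataChunk message type (e.g. a repeated packed double field of sample points); nothing below is a built-in "streaming" mode of protocol buffers, it is just the usual varint length-prefix framing done with CodedOutputStream/CodedInputStream so a reader can recover message boundaries one chunk at a time:

    // Sketch only: assumes a hypothetical generated type DataChunk from
    //   message DataChunk {
    //     repeated double sample = 1 [packed = true];
    //   }
    #include <string>
    #include <google/protobuf/io/coded_stream.h>
    #include "datachunk.pb.h"  // hypothetical generated header

    // Write one chunk with a varint length prefix.
    void WriteChunk(const DataChunk& chunk,
                    google::protobuf::io::CodedOutputStream* out) {
      out->WriteVarint32(chunk.ByteSize());
      chunk.SerializeToCodedStream(out);
    }

    // Read one chunk; returns false at end of stream or on parse error.
    bool ReadChunk(google::protobuf::io::CodedInputStream* in,
                   DataChunk* chunk) {
      google::protobuf::uint32 size;
      if (!in->ReadVarint32(&size)) return false;  // clean end of stream
      google::protobuf::io::CodedInputStream::Limit limit = in->PushLimit(size);
      bool ok = chunk->ParseFromCodedStream(in) && in->ConsumedEntireMessage();
      in->PopLimit(limit);
      return ok;
    }

With framing like this, each chunk can be handed to an event handler as it arrives, so the consumer never needs the whole collection in memory at once.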

Re: Serializing Large Collections: SerializePartial ParsePartial

2008-10-22 Thread Kenton Varda
The Partial serialize and parse routines actually do something completely unrelated: they allow the message to be missing required fields. So, that doesn't help you. I'm afraid protocol buffers are not designed for storing very large collections in a single message. Instead, you should be …
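
To make the "missing required fields" point concrete, a small hedged illustration, assuming a hypothetical Record message with one required field; the SerializeToString/SerializePartialToString methods are the standard C++ protobuf API, everything else is made up for the example:

    // Sketch only: assumes a hypothetical generated type Record from
    //   message Record {
    //     required int32 id   = 1;
    //     optional string tag = 2;
    //   }
    #include <string>
    #include "record.pb.h"  // hypothetical generated header

    void Example() {
      Record r;
      r.set_tag("no id set");  // deliberately leave required 'id' unset

      std::string out;
      // Fails: the normal path checks IsInitialized() and the required
      // field 'id' is missing.
      bool ok = r.SerializeToString(&out);                  // ok == false
      // Succeeds: the Partial variant skips the required-field check.
      bool partial_ok = r.SerializePartialToString(&out);   // partial_ok == true
      (void)ok; (void)partial_ok;
    }

So "Partial" only relaxes the initialization check; it does nothing about message size, which is why it doesn't help with very large collections.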

Re: Serializing Large Collections: SerializePartial ParsePartial

2008-10-22 Thread bruce . weertman
OK, that makes sense. Thanks for the quick reply. I work at a seismic earthquake data center. We're looking at using protocol buffers as a means of internally moving around processed chunks of data. Seems to work pretty well, as long as the chunks aren't too large (which is a problem one way or …