…if you really need to parse larger messages, but it is generally not
recommended. Additionally, ByteSize() returns a 32-bit integer, so
there's an implicit limit on the size of data that can be serialized.

You can certainly use protocol buffers in large data sets, but it's not
recommended to have your entire data set be represented by a single
message. Instead, see if you can break it up into smaller messages.

On Mon, May 17, 2010 at 1:05 PM, sanikumbh wrote:

> I wanted to get some opinion on large data sets and protocol buffers.
> The Protocol Buffers project page by Google says that for data > 1
> megabyte one should consider something different, but they don't
> mention what would happen if one crosses this limit. Are there any
> known failure modes when it comes to crossing it?
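The limit alluded to in the (truncated) reply above is presumably the
parse-time cap on CodedInputStream, 64 MB by default, which can be raised
explicitly. A minimal C++ sketch, assuming a hypothetical generated type
MyMessage and a buffer already read into memory; the two-argument
SetTotalBytesLimit() matches the protobuf 2.x API current at the time of
this thread (later releases take a single argument):

    #include <string>
    #include <google/protobuf/io/coded_stream.h>
    #include <google/protobuf/io/zero_copy_stream_impl_lite.h>
    #include "my_message.pb.h"  // hypothetical generated header

    // Parse a message larger than the default 64 MB limit.
    bool ParseLarge(const std::string& buffer, MyMessage* msg) {
      google::protobuf::io::ArrayInputStream raw(buffer.data(),
                                                 buffer.size());
      google::protobuf::io::CodedInputStream coded(&raw);
      // Raise both the hard limit and the warning threshold to 256 MB.
      coded.SetTotalBytesLimit(256 << 20, 256 << 20);
      return msg->ParseFromCodedStream(&coded) &&
             coded.ConsumedEntireMessage();
    }

Note that ByteSize() still returns a 32-bit int, so the serialization side
stays bounded regardless of this setting, as the reply points out.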
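The "break it up into smaller messages" advice usually means writing the
data set as a stream of length-prefixed records rather than one container
message. A sketch of the writing side, again against the C++ 2.x API;
Record is a hypothetical generated type, and the varint-length-then-payload
framing is the conventional pattern (C++ has no built-in writeDelimitedTo
like Java's):

    #include <google/protobuf/io/coded_stream.h>
    #include "record.pb.h"  // hypothetical generated type `Record`

    // Append one record, prefixed with its serialized size as a varint.
    bool WriteDelimited(const Record& rec,
                        google::protobuf::io::CodedOutputStream* out) {
      out->WriteVarint32(rec.ByteSize());
      return rec.SerializeToCodedStream(out) && !out->HadError();
    }

Each record stays well under the megabyte guideline even when the file as
a whole grows arbitrarily large.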
On Tue, Oct 6, 2009 at 9:34 AM, Brenden Matthews <bren...@diddyinc.com> wrote:

> Hi,
>
> In the documentation here:
>
> http://code.google.com/apis/protocolbuffers/docs/techniques.html#large-data
>
> it specifies that if you are dealing in messages larger than a
> megabyte each, it may be time to consider an alternate strategy.
>
> My question is: does this apply to messages which are large because
> they themselves …
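For completeness, the reading side of the record-stream approach sketched
earlier: pull one varint-prefixed record at a time, pushing a limit so
each parse stops at the record boundary. Memory then scales with a single
record even when the combined contents would make one message enormous.
Record is the same hypothetical type as above:

    #include <google/protobuf/io/coded_stream.h>
    #include "record.pb.h"  // hypothetical generated type `Record`

    // Read the next length-prefixed record; returns false on a clean
    // end of stream or a malformed record.
    bool ReadDelimited(google::protobuf::io::CodedInputStream* in,
                       Record* rec) {
      google::protobuf::uint32 size;
      if (!in->ReadVarint32(&size)) return false;  // end of stream
      google::protobuf::io::CodedInputStream::Limit limit =
          in->PushLimit(size);
      bool ok = rec->ParseFromCodedStream(in) &&
                in->ConsumedEntireMessage();
      in->PopLimit(limit);
      return ok;
    }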