Hi,
I generate a huge number of identical messages and save them one by one to a
file. Each message is generated and then written on the fly, so I never keep a
large array of messages in memory, only one at a time. Everything works
fine. The largest message written is about 2K (serialized string).
On Mar 6, 2011, at 12:19 , ksamdev wrote:
libprotobuf ERROR google/protobuf/io/coded_stream.cc:147] A protocol
message was rejected because it was too big (more than 67108864
bytes). To increase the limit (or to disable these warnings), see
CodedInputStream::SetTotalBytesLimit() in google/p
How come? I explicitly track the largest message written to the file
with: http://goo.gl/SAKlU
Here is an example of output I get:
[1 ProtoBuf git.hist]$ ./bin/write data.pb && echo "---===---" && ./bin/read
data.pb
Saved: 100040 events
Largest message size written: 1815 bytes
---===---
File has:
I think I found the source of the problem: CodedInputStream keeps an internal
counter of how many bytes have been read so far through the same object.
In my case there are a lot of small messages saved in the same file. I do
not read them all at once and therefore do not care about large messages.