I am writing a Java server and C++ client and trying to use Protocol
Buffers for encoding the socket communication. I have written the Java
server and tested it with a Java-based client, and all is well.
However, I am having some difficulty with the C++ portion.
I've wrapped the socket code in my own CopyingInputStream and
CopyingOutputStream sub-classes and am using those with the
CopyingInputStreamAdaptor and CopyingOutputStreamAdaptor classes.
These classes are further wrapped in CodedInputStream and
CodedOutputStream instances for reading/writing the serialized data.
When I send messages I encode the message type and size as varints
then the message itself. When reading the server response in the C++
code I read the message type and size then pass the size to
CodedInputStream::PushLimit(). My server response is very small, 17
bytes so the initial read gets all the data (I verified that by
stepping through the code in the debugger).
In this case the type and size are each encoded as 1 byte so that
leaves 15 bytes for the message. What I am finding is that even after
the call to PushLimit() with the value 15 the CodedInputStream is
trying to read more bytes. Again, stepping through the code in the
debugger, I can see that the 15 bytes are already buffered. I've pored
through the C++ docs, and the description of limits states that
PushLimit() will prevent CodedInputStream from reading any additional
bytes beyond the limit.
This is a problem as the socket is doing a blocking read so it is
waiting for bytes that will not be sent.
Am I misunderstanding the behavior of PushLimit()?