I'm sending a message with roughly 150k repeated items in it; the total
size is about 3.3 MB, and it's taking me about 100 ms to serialize it and
send it out.

Can I expect to do any better than this? What could I look into to
improve this?
- I have "option optimize_for = SPEED;" set in my proto file
- I'm compiling with -O3
- I'm sending my message in batches of 1000
- I'm using C++, on ubuntu, x64
- I'm testing all on one machine (e.g. client and server are on one
machine)

My message looks like:

message NodeWithNeighbors
{
        required Id nodeId = 1;
        repeated IdConfidence neighbors = 2;
}

message GetNeighborsResponse
{
        repeated NodeWithNeighbors nodesWithNeighbors = 1;
}

message IdConfidence
{
        required bytes id = 1;
        required float confidence = 2;
}

Where "bytes id" is used to send 16byte IDs (uuids).

I'm writing each message (batch) out like this:

        CodedOutputStream codedOutputStream(&m_ProtoBufStream);

        // Write out the size of the message as a varint length prefix
        codedOutputStream.WriteVarint32(message.ByteSize());
        // Ask the message to serialize itself to our stream adapter, which
        // ultimately calls Write on us, and we in turn call Write on our
        // composed stream
        message.SerializeWithCachedSizes(&codedOutputStream);
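
(On the receiving side the same framing is decoded roughly like this; a
simplified sketch, assuming an input stream adapter I'll call
m_ProtoBufInputStream, which isn't part of the code above:)

        CodedInputStream codedInputStream(&m_ProtoBufInputStream);

        // Read back the varint length prefix written above
        uint32_t messageSize = 0;
        if (!codedInputStream.ReadVarint32(&messageSize))
            return false;

        // Restrict parsing to exactly one message's worth of bytes
        CodedInputStream::Limit limit = codedInputStream.PushLimit(messageSize);
        GetNeighborsResponse message;
        if (!message.ParseFromCodedStream(&codedInputStream))
            return false;
        codedInputStream.PopLimit(limit);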

In my stream implementation I'm buffering in 16 KB chunks, and calling send
on the socket once I have 16 KB buffered.
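
Roughly, the effect is the same as something like this (a simplified
sketch, not my exact code: a CopyingOutputStream wrapped in a
CopyingOutputStreamAdaptor with a 16 KB block size, with made-up names):

#include <sys/socket.h>

#include <google/protobuf/io/zero_copy_stream_impl_lite.h>

using google::protobuf::io::CopyingOutputStream;
using google::protobuf::io::CopyingOutputStreamAdaptor;

// Pushes whatever protobuf hands us straight out on the socket.
// The 16 KB buffering is done by the adaptor, not by this class.
class SocketOutputStream : public CopyingOutputStream
{
public:
    explicit SocketOutputStream(int fd) : m_Fd(fd) {}

    bool Write(const void* buffer, int size) override
    {
        const char* data = static_cast<const char*>(buffer);
        while (size > 0)
        {
            ssize_t sent = send(m_Fd, data, size, 0);
            if (sent <= 0)
                return false;  // let protobuf see the error
            data += sent;
            size -= static_cast<int>(sent);
        }
        return true;
    }

private:
    int m_Fd;
};

// Usage (the adaptor is a ZeroCopyOutputStream, so it can be handed to
// CodedOutputStream as in the write code above):
//   SocketOutputStream socketStream(fd);
//   CopyingOutputStreamAdaptor m_ProtoBufStream(&socketStream, 16 * 1024);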

Thanks!

- Alex