Thank you for your reply :). For the first one, I think your answer is quite
clear.

But for the second one — yes, I want to produce the serialization of Req.

Let me explain again :). Assume my application works like this:
0. The server app wants to send 1,000,000 pageids to the client.

1. If the server app serializes all 1,000,000 pageids at once, it will
cost 1GB of memory.

2. But the server app can only allocate 100MB of memory, so obviously it
can't send all 1,000,000 pageids to the client in one message.

3. Meanwhile, the server app's protobuf is very clever: it can work out
that with 100MB it can hold at most 10,000 pageids. So protobuf tells
the server, "Hi server, if you only have 100MB of memory, I can hold
only 10,000 pageids."

4. So the server app knows this, and it serializes only 10,000 pageids
into memory instead of 1,000,000.

I hope that clarifies it. If protobuf doesn't implement this, do you
have any ideas about how to achieve it?


On Jun 3, 12:40 am, Jason Hsueh <jas...@google.com> wrote:
> On Tue, Jun 1, 2010 at 6:21 AM, bnh <baoneng...@gmail.com> wrote:
> > I'm using protobuf as the protocol for a distributed system. But now
> > I have some questions about protobuf:
>
> > a. Does protobuf provide an interface for a user-defined allocator?
> > Sometimes I find 'malloc' costs too much. I've tried TCmalloc, but I
> > think I can optimize the memory allocation according to my
> > application.
>
> No, there are no hooks for providing an allocator. You'd need to override
> malloc the way TCmalloc does if you want to use your own allocator.
>
> > b. Does protobuf provide a way to serialize a class/object
> > partially [or do you have any ideas about it]? My application is
> > very sensitive to memory usage. For example, a class:
>
> > class Req {
> >   int userid;
> >   vector<PageID> pageid;
> > };
>
> > I want to pack 1000 pageids into the Req. But if I pack all of them,
> > the Req's size is about 1GB [hypothetically]. Since I only have
> > 100MB of memory, I plan to pack as many pageids as possible until
> > the memory usage of Req is about 100MB ['serialize the object
> > partially according to memory usage'].
>
> Are you talking about producing the serialization of Req, with a large
> number of PageIds, or parsing such a serialization into an in-memory object?
> For the former, you can serialize in smaller pieces, and just concatenate
> the serializations:
> http://code.google.com/apis/protocolbuffers/docs/encoding.html#optional
> For the latter, there is no way for you to tell the parser to stop parsing
> when memory usage reaches a certain limit. However, you can do this yourself
> if you split the serialization into multiple pieces.
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To post to this group, send email to proto...@googlegroups.com.
To unsubscribe from this group, send email to 
protobuf+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/protobuf?hl=en.
