You need to destroy the CodedOutputStream before reading sizes from the
underlying ZeroCopyOutputStream. CodedOutputStream requests buffer space
from the ZeroCopyOutputStream in chunks, and the stream's ByteCount()
reflects the total space handed out so far, not the bytes actually used.
Upon destruction, CodedOutputStream "returns" the unused portion to the
stream implementation via BackUp(), and only then is ByteCount() accurate.
Likewise, you need to flush (or destroy) the GzipOutputStream before
checking your underlying ZeroCopyOutputStream - same sort of story there.

On Thu, Dec 1, 2011 at 5:24 PM, Vinay Bansal <[email protected]> wrote:

> Hey,
>
> I have run into one more problem.
>
> Whenever I try to find out the number of bytes written by the streams,
> CodedOutputStream::ByteCount() gives the right answer, but
> ZeroCopyOutputStream::ByteCount() always returns 0, and GzipOutputStream
> always gives a fixed number like 63356 rather than the actual number of
> bytes written by the stream.
>
> I would really appreciate the help.
>
> Thanks
>
> On Dec 1, 3:10 pm, Vinay Bansal <[email protected]> wrote:
> > Hey Oliver,
> >
> > I figured it out just a minute before your reply.
> >
> > Thanks anyway
> >
> > Cheers,
> > -Vinay
> >
> > On Dec 1, 2:28 pm, Oliver Jowett <[email protected]> wrote:
> > > On Thu, Dec 1, 2011 at 3:22 PM, Vinay Bansal <[email protected]>
> wrote:
> > > >  void write() {
> > > >    int fd = open("myfile", O_WRONLY | O_APPEND);
> > > >    google::protobuf::io::ZeroCopyOutputStream *out = new
> > > > google::protobuf::io::FileOutputStream(fd);
> > > >    google::protobuf::io::GzipOutputStream *gzipOut = new
> > > > google::protobuf::io::GzipOutputStream(out, options);
> > > >    google::protobuf::io::CodedOutputStream *codedOut = new
> > > > google::protobuf::io::CodedOutputStream(gzipOut);
> > > >    codedOut->WriteVarint32(message.ByteSize());
> > > >    message.SerializeToCodedStream(codedOut);
> > > >    delete codedOut;  // flushes coded bytes into gzipOut
> > > >    delete gzipOut;   // finishes the gzip stream
> > > >    delete out;       // flushes buffered bytes to fd
> > > >    close(fd);
> > > >  }
> >
> > > If you're doing that for every (small) message, then the compressor is
> > > never going to have a good chunk of data to work with; the compression
> > > dictionary will be reset for every message, plus a gzip header gets
> > > written each time. (For comparison, try a command-line gzip of a file
> > > containing only a single message - that's essentially what you're
> > > doing here)
> >
> > > You want to open the file once, create one GzipOutputStream, then
> > > write many messages to it.
> >
> > > Oliver
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/protobuf?hl=en.
