Thanks for your reply :). For the first one, I think your answer is quite
clear.
But for the second one, I want to produce the serialization of Req.
Let me explain again :). Assume my application is like this:
0. the server app wants to send 1,000,000 page ids to the client
1. if the server app sends 1,000,000 pages
This really needs to be handled in the application since protobuf has no
idea which fields are expendable or can be truncated. What I was trying to
suggest earlier was to construct many Req protobufs and serialize those
individually. i.e., instead of 1 Req proto with 1,000,000 page ids,
construct
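The approach suggested above — many small Req messages serialized individually instead of one giant one — can be sketched in stdlib-only Python. The chunk size and the `fake_serialize` stand-in (in place of a real Req's `SerializeToString()`) are illustrative assumptions, not part of the protobuf API:

```python
import struct

def chunk_page_ids(page_ids, chunk_size=10000):
    """Split a large list of page ids into fixed-size chunks, so each
    chunk can become its own small Req message instead of one huge one."""
    for i in range(0, len(page_ids), chunk_size):
        yield page_ids[i:i + chunk_size]

def frame_chunks(chunks, serialize):
    """Serialize each chunk independently and prefix each payload with a
    4-byte big-endian length, so the receiver can read one record at a
    time without holding all 1,000,000 ids in memory at once."""
    out = bytearray()
    for chunk in chunks:
        payload = serialize(chunk)
        out += struct.pack(">I", len(payload)) + payload
    return bytes(out)

# Stand-in for Req.SerializeToString(): join ids with commas.
fake_serialize = lambda ids: ",".join(map(str, ids)).encode()

framed = frame_chunks(chunk_page_ids(list(range(25)), chunk_size=10),
                      fake_serialize)
```

The receiver peels off one length-prefixed record at a time, parses it as a small Req, processes it, and discards it before reading the next.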
Ah, one option I missed is using an implementation of
io::ZeroCopyOutputStream like io::FileOutputStream, which uses a fixed size
buffer and flushes data to the file (socket) when the buffer is full. Then
serializing a large message won't consume a lot of memory. Perhaps this is
what you really
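To illustrate why a fixed-size buffer keeps memory bounded, here is a toy Python model of what `io::FileOutputStream` does (the class name and counters here are invented for the sketch; the real C++ class writes through a file descriptor, not a generic sink):

```python
class FixedBufferWriter:
    """Toy model of a fixed-buffer output stream: accumulate writes in a
    fixed-size buffer and flush to the underlying sink whenever the
    buffer fills, so memory use stays bounded no matter how large the
    total serialized output grows."""

    def __init__(self, sink, buffer_size=8192):
        self.sink = sink          # any object with a .write(bytes) method
        self.buffer_size = buffer_size
        self.buffer = bytearray()
        self.flush_count = 0      # how many full-buffer flushes happened

    def write(self, data):
        self.buffer += data
        while len(self.buffer) >= self.buffer_size:
            self._flush_block()

    def _flush_block(self):
        self.sink.write(bytes(self.buffer[:self.buffer_size]))
        del self.buffer[:self.buffer_size]
        self.flush_count += 1

    def close(self):
        if self.buffer:           # flush any trailing partial block
            self.sink.write(bytes(self.buffer))
            self.buffer.clear()
```

Serializing a large message through such a stream costs roughly `buffer_size` bytes of memory rather than the full serialized size.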
On Jun 3, 2010, at 14:18 , Nader Salehi wrote:
I was told that coded streams have issues when they are larger than
2GB. Is it true, and, if so, what are the issues?
If you have a single object that is 2GB in size, there are 32-bit
integers that will overflow. However, provided that you
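The overflow point can be made concrete. This stdlib-only sketch simulates storing a byte count in a signed 32-bit integer (as internal size fields effectively do); the helper name is invented for illustration:

```python
INT32_MAX = 2**31 - 1

def to_int32(n):
    """Simulate storing a byte count in a signed 32-bit int: values
    above INT32_MAX wrap around into the negative range."""
    n &= 0xFFFFFFFF
    return n - 2**32 if n > INT32_MAX else n

# A single 3 GB object: its byte size no longer fits in 32 bits.
three_gb_size = to_int32(3 * 1024**3)   # wraps negative
```

This is why a *single* object approaching 2 GB is a problem, while a file made of many small, individually framed messages is not.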
On Jun 3, 2010, at 15:29 , Nader Salehi wrote:
It is not a single object; I am writing into a coded output stream
file which could grow to much larger than 2GB (it's more like 100GB).
I also have to read from this file.
Is there a performance hit in the above-mentioned scenario?
No, this
Note that writing a 100GB file using CodedStream is probably a bad idea
because:
- Readers will have to read the entire file sequentially; they will not be
able to seek to particular parts.
- One bit of corruption anywhere in the file could potentially render the
entire rest of the file
Please don't use reflection to reach into private internals of classes you
don't maintain. We have public and private for a reason. Furthermore,
this access may throw a SecurityException if a SecurityManager is in use.
On Mon, May 31, 2010 at 11:25 AM, David Dabbs dmda...@gmail.com wrote:
On Fri, May 28, 2010 at 11:25 AM, Hering hering.ch...@computer.org wrote:
Hi,
I am trying to build versions 2.3.0 and 2.3.0rc2 with Sun Studio 12
Update 1 compiler on a Solaris SPARC box without success. The
compilation goes well but the linking phase fails for both versions:
$ (
On Thu, Jun 3, 2010 at 3:15 PM, Nader Salehi sal...@alumni.usc.edu wrote:
To be clear, I do not encode the entire file! Each file contains many
small messages, each of which is stored as a length delimited record.
It is just that there are quite a few messages bundled in one file.
Right,
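The length-delimited record format described here can be sketched in Python using protobuf's base-128 varint framing (the same convention Java's `writeDelimitedTo` uses for the length prefix); the function names are illustrative, not protobuf API:

```python
def encode_varint(n):
    """Encode a non-negative int in protobuf's base-128 varint format:
    7 payload bits per byte, high bit set on all but the last byte."""
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)
        else:
            out.append(b)
            return bytes(out)

def read_records(stream):
    """Read length-delimited records sequentially from a binary stream,
    yielding one record's bytes at a time (never the whole file)."""
    while True:
        length, shift = 0, 0
        while True:
            byte = stream.read(1)
            if not byte:
                return                      # clean end of stream
            length |= (byte[0] & 0x7F) << shift
            if not byte[0] & 0x80:
                break
            shift += 7
        yield stream.read(length)
```

Each yielded chunk would then be handed to `Message.ParseFromString()`; only one record is in memory at a time, which is why the total file size (even 100 GB) is not itself a problem.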
Updates:
Status: Accepted
Comment #1 on issue 194 by ken...@google.com: Failed to compile
protobuf::test with Visual Studio 2010: RepeatedFieldBackInsertIterator
should be Assignable and Copy Constructible
http://code.google.com/p/protobuf/issues/detail?id=194
I think we simply
On 06/03/2010 03:14 PM, Kenton Varda wrote:
On Fri, May 28, 2010 at 11:25 AM, Hering hering.ch...@computer.org
mailto:hering.ch...@computer.org wrote:
Hi,
I am trying to build versions 2.3.0 and 2.3.0rc2 with Sun Studio 12
Update 1 compiler on a Solaris SPARC box without
Updates:
Status: Accepted
Owner: ken...@google.com
Comment #3 on issue 192 by ken...@google.com: C++0x conformance issue:
using reserved keyword 'nullptr' as a name of variable
http://code.google.com/p/protobuf/issues/detail?id=192
Thanks, will include in next release.
--
Updates:
Status: Accepted
Comment #8 on issue 188 by ken...@google.com: protobuf fails to link after
compiling with LDFLAGS=-Wl,--as-needed because of missing -lpthread
http://code.google.com/p/protobuf/issues/detail?id=188
Thanks! Will make sure this gets into the next release.
--
Comment #10 on issue 103 by ken...@google.com: Protobuf 2.1.0 missing some
sort of pthread linking?
http://code.google.com/p/protobuf/issues/detail?id=103
Could this actually be the same problem solved by comment 7 in issue 188?
--
You received this message because you are subscribed to the
Thanks for your reply :).
Actually, my application is such that I CAN allocate enough RAM for the
in-memory message object. What I really want is to restrict the size of
the messages which are sent/received by the client/server. Right now
ZeroCopyOutputStream can work.
But actually my app is like an RPC
It is not a single object; I am writing into a coded output stream
file which could grow to much larger than 2GB (it's more like 100GB).
I also have to read from this file.
Is there a performance hit in the above-mentioned scenario?
Nader
On 6/3/2010 15:03 Evan Jones writes:
On Jun 3, 2010,
To be clear, I do not encode the entire file! Each file contains many
small messages, each of which is stored as a length delimited record.
It is just that there are quite a few messages bundled in one file.
I'm assuming that Evan's assessment still stands?
Cheers,
Nader
On 6/3/2010 15:05
Thanks for your reply.
Yes, actually I know this requirement will add a great deal of complication.
So I just want to know whether you designers have some ideas about how to
implement it.
Actually, I have tested the performance of protobuf, Thrift, and some
other open-source products. protobuf's
Status: New
Owner: ken...@google.com
Labels: Type-Defect Priority-Medium
New issue 195 by jmccaskey: common.h should not have using namespace std;
http://code.google.com/p/protobuf/issues/detail?id=195
I have a project I'm considering using protocol buffers for, but this
project doesn't
Comment #11 on issue 103 by l...@dashjr.org: Protobuf 2.1.0 missing some
sort of pthread linking?
http://code.google.com/p/protobuf/issues/detail?id=103
It does seem to be fixed with that patch.
--