yes, i won't know the size beforehand, so i have to store the buffers in an
array and then merge them into a single buffer once the upload is complete.
doing this in js is a bit slower than using node-buffertools, but not by much
(20-30% for buffers < 1024 bytes, and the difference shrinks as the buffers
get bigger). it's good enough for what i need, so i'll do it in js without a
dependency on node-buffertools.
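for anyone following along, the pattern is roughly this (a sketch only - the onData/onEnd names and the stream wiring are illustrative, not from my actual code; note also that later node versions added Buffer.concat(list[, totalLength]), which does this merge natively):

```javascript
// collect incoming chunks without copying, merge once at the end
var chunks = [];

function onData(chunk) {
  chunks.push(chunk); // just store a reference; no per-chunk copy
}

function onEnd() {
  // first pass: compute the total length
  var total = 0;
  for (var i = 0; i < chunks.length; i++) total += chunks[i].length;
  // one allocation, then copy each chunk in at a running offset
  // (new Buffer matches the node of this era; modern code would
  // use Buffer.alloc or Buffer.concat instead)
  var merged = new Buffer(total);
  var off = 0;
  for (var j = 0; j < chunks.length; j++) {
    chunks[j].copy(merged, off);
    off += chunks[j].length;
  }
  return merged;
}
```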
also, there is an issue with node-buffertools when you pass in a large
number of arguments using Function.prototype.apply: it blows the stack over
a certain number of args. i added a method to node-buffertools that accepts
an array instead of using the arguments list, which gets around this
limitation. might be useful to somebody:
// coalesce an Array of Buffers into one Buffer (legacy v8/node 0.x API)
Handle<Value> Coalesce(const Arguments& args) {
  HandleScope scope;
  Local<Array> aa = Local<Array>::Cast(args[0]);
  int count = aa->Length();

  // first pass: total size of all buffers in the array
  size_t size = 0;
  for (int index = 0; index < count; ++index) {
    size += Buffer::Length(aa->Get(index)->ToObject());
  }

  // second pass: copy each buffer into the destination
  Buffer& dst = *Buffer::New(size);
  uint8_t* s = (uint8_t*) Buffer::Data(dst.handle_);
  for (int index = 0; index < count; ++index) {
    Local<Object> b = aa->Get(index)->ToObject();
    const uint8_t* data = (const uint8_t*) Buffer::Data(b);
    size_t blen = Buffer::Length(b);
    memcpy(s, data, blen);
    s += blen;
  }
  return scope.Close(dst.handle_);
}
btw - i am not doing any checks here to ensure the members of the array are
actually buffers, as i wanted to see the raw speed...
here is the js function to do the same thing:
function coalesce(buffers) {
  var len = buffers.length;
  var blen = 0, off = 0, i = len, j = 0;
  // first pass: total length
  while (i--) {
    blen += buffers[j++].length;
  }
  var bb = new Buffer(blen);
  // second pass: copy each buffer in at the running offset
  // (buf.copy returns the number of bytes copied)
  i = len, j = 0;
  while (i--) {
    off += buffers[j++].copy(bb, off);
  }
  return bb;
}
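a quick sanity check (the function is repeated here so the snippet runs on its own; the buffer contents are arbitrary):

```javascript
// same logic as the coalesce() above, repeated so this is standalone
function coalesce(buffers) {
  var blen = 0;
  for (var i = 0; i < buffers.length; i++) blen += buffers[i].length;
  var bb = new Buffer(blen);
  var off = 0;
  // buf.copy returns the number of bytes copied
  for (var j = 0; j < buffers.length; j++) off += buffers[j].copy(bb, off);
  return bb;
}

var parts = [new Buffer([0x66, 0x6f, 0x6f]), new Buffer('bar')];
var whole = coalesce(parts);
// whole.length === 6, whole.toString() === 'foobar'
```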
On Wednesday, February 29, 2012 1:43:33 AM UTC, mscdex wrote:
>
> On Feb 28, 7:32 pm, billywhizz <[email protected]> wrote:
> > ok. understood. anyone know the fastest way to concatenate multiple
> > buffers into a single buffer? i need to concatenate multiple binary
> > buffers into one binary buffer when doing a file upload. the only way
> > i can think of is to do the following:
>
> If I know the total length beforehand, I always create a Buffer
> beforehand the size of the total length and just copy each incoming
> chunk to this Buffer, increasing a byte pointer along the way.
>
> Otherwise, using the array method is probably the next best fit for
> most cases.
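the known-length approach mscdex describes can be sketched like this (the makeCollector name and shape are illustrative - the point is one allocation up front, e.g. when a Content-Length header gives you the total, then copying at an advancing offset):

```javascript
// preallocate once when the total size is known in advance,
// then copy each incoming chunk in at a running byte offset
function makeCollector(totalLength) {
  var buf = new Buffer(totalLength);
  var off = 0;
  return {
    push: function (chunk) {
      chunk.copy(buf, off);
      off += chunk.length;
    },
    result: function () {
      return buf;
    }
  };
}
```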