How are you delivering the eventual result to whatever is consuming it?  I 
assume you'll be passing it to some other software, which will need it all to 
be in a single contiguous memory block (SCMB) at that point.

The only way to avoid copying the data into the result SCMB would be to 
allocate more space than you could possibly need and fill it up.  That would 
mean using the same allocated memory for a 10 KB download as for a 150 MB 
download -- ugh.  OTOH, you'd only do a single allocation of that huge block 
of memory and you'd re-use it for each download.  (If you can receive more 
than one of these at a time you'll need N different [say] 256 MB blocks of 
memory.)
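A minimal sketch of that preallocate-and-reuse idea (the 256 MB cap and the class/member names here are illustrative, not from any real project):

```csharp
using System;
using System.IO;

// Hypothetical reader that reuses one oversized buffer across downloads.
// MaxDownloadSize is an assumed worst case -- tune it to your real traffic.
public class ReusableBuffer
{
    const int MaxDownloadSize = 256 * 1024 * 1024;
    readonly byte[] _buffer = new byte[MaxDownloadSize]; // allocated once, reused
    int _length;

    public void Fill(Stream source)
    {
        _length = 0;
        int read;
        while ((read = source.Read(_buffer, _length, _buffer.Length - _length)) > 0)
            _length += read;
    }

    // The valid data is _buffer[0.._length); no trimming copy is needed if the
    // consumer will accept an (array, length) pair instead of an exact-size array.
    public byte[] Buffer { get { return _buffer; } }
    public int Length { get { return _length; } }
}
```

If you can receive N downloads concurrently, you'd need N such instances, which is exactly the "ugh" part.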

Otherwise you'll have to move the data into a SCMB at some point.  What you 
want to do is to avoid copying any of it more than once.  Build an array of 
byte arrays (or a List<byte[]>) and start a new byte[] when the previous one 
has filled up.  When done, you'll know how many total bytes you have; allocate 
a SCMB of that size and copy each of the pieces into it, one at a time until 
done.  You won't have any "trailing null bytes" unless they're being sent to 
you (and then they're part of the data, aren't they?) -- when you copy each of 
the byte arrays into the final SCMB, you'll only copy the amount of data that's 
been stored into the particular array.
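The chunk-list approach above might look something like this sketch (chunk size and names are mine, chosen for illustration):

```csharp
using System;
using System.Collections.Generic;
using System.IO;

public static class ChunkedReader
{
    const int ChunkSize = 64 * 1024; // illustrative; pick what suits your traffic

    // Read the whole stream into fixed-size chunks, then do the one and only
    // copy into an exact-size result array once the total is known.
    public static byte[] ReadAll(Stream source)
    {
        var chunks = new List<byte[]>();
        byte[] current = new byte[ChunkSize];
        int lastChunkUsed = 0;      // bytes stored so far in the current chunk
        int read;
        while ((read = source.Read(current, lastChunkUsed,
                                   current.Length - lastChunkUsed)) > 0)
        {
            lastChunkUsed += read;
            if (lastChunkUsed == current.Length)
            {
                chunks.Add(current);          // chunk is full; start a new one
                current = new byte[ChunkSize];
                lastChunkUsed = 0;
            }
        }

        // Every chunk in the list is full, so the total size is exact.
        int total = chunks.Count * ChunkSize + lastChunkUsed;
        var result = new byte[total];         // the final SCMB
        int offset = 0;
        foreach (var chunk in chunks)
        {
            Buffer.BlockCopy(chunk, 0, result, offset, chunk.Length);
            offset += chunk.Length;
        }
        // Only the used portion of the partial last chunk is copied --
        // this is why no "trailing null bytes" can appear in the result.
        Buffer.BlockCopy(current, 0, result, offset, lastChunkUsed);
        return result;
    }
}
```

Each received byte is copied exactly once: from its chunk into the final array.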

I think you'd want to intentionally cause each of the byte arrays to be 
allocated on the LOH, or even allocate them in unmanaged memory, as they won't 
be freed until you've built the final SCMB and copied everything into it.
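Getting LOH placement is easy: arrays of 85,000 bytes or more land on the LOH automatically, so a chunk size at or above that threshold suffices. For the unmanaged route, a hedged sketch of the round trip (names are mine; in a real reader you'd accumulate socket reads into such blocks):

```csharp
using System;
using System.Runtime.InteropServices;

public static class UnmanagedBuffer
{
    // Park bytes in unmanaged memory (never GC-tracked, never compacted),
    // then copy them back out. Illustrative only.
    public static byte[] RoundTrip(byte[] input)
    {
        IntPtr p = Marshal.AllocHGlobal(input.Length);
        try
        {
            Marshal.Copy(input, 0, p, input.Length);   // managed -> unmanaged
            var output = new byte[input.Length];
            Marshal.Copy(p, output, 0, input.Length);  // unmanaged -> managed
            return output;
        }
        finally
        {
            Marshal.FreeHGlobal(p);  // must free explicitly; the GC won't
        }
    }
}
```

The price of unmanaged storage is that extra copy back into managed memory, so it only pays off if GC pressure from the big arrays is the real problem.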

Good luck...

At 04:58 PM 7/6/2006, dave wanta wrote:
>I wish I could. But basically these are large blobs (usually GIS images, but
>can be regular images and sometimes large zips).
>
>Unfortunately I don't have control over the project, and only have this
>little box I can work in (read data from a socket, and optimize the
>memory ).
>
>I was using a MemoryStream, simply because it was quick and easy. I'm moving
>to a dynamically expanding byte array, but the Trimming part is what I
>was/am worried about.
>
>Cheers!
>Dave
>
>
>
>----- Original Message -----
>From: "David Lanouette" <[EMAIL PROTECTED]>
>To: <ADVANCED-DOTNET@DISCUSS.DEVELOP.COM>
>Sent: Thursday, July 06, 2006 3:37 PM
>Subject: Re: [ADVANCED-DOTNET] trim byte array
>
>
>> Dave,
>>
>> Do you really need to keep all the data around?  Could you read a few meg,
>> then process that data, then read more data and process that, etc... ?
>>
>> I've found it rare to "have to" deal with such large collections of data
>all
>> at once.
>>
>> Basically, I'm saying that maybe you could look at the problem from a
>> different point of view.
>>
>> HTH.
>>
>> ______________________________
>> - David Lanouette
>> - [EMAIL PROTECTED]
>>
>> "Excellence, then, is not an act, but a habit" - Aristotle
>>
>>
>>
>>
>>
>> > -----Original Message-----
>> > From: Discussion of advanced .NET topics.
>> > [mailto:[EMAIL PROTECTED] On Behalf Of dave wanta
>> > Sent: Thursday, July 06, 2006 4:26 PM
>> > To: ADVANCED-DOTNET@DISCUSS.DEVELOP.COM
>> > Subject: Re: [ADVANCED-DOTNET] trim byte array
>> >
>> > That would be a good work-around. But, I don't have write
>> > access (gotta love all of these constraints. ;-) )
>> >
>> > Cheers!
>> > Dave
>> > ----- Original Message -----
>> > From: "Barry Kelly" <[EMAIL PROTECTED]>
>> > To: <ADVANCED-DOTNET@DISCUSS.DEVELOP.COM>
>> > Sent: Thursday, July 06, 2006 3:20 PM
>> > Subject: Re: [ADVANCED-DOTNET] trim byte array
>> >
>> >
>> > dave wanta <[EMAIL PROTECTED]> wrote:
>> >
>> > > It's from a socket, so I don't know the size until i get the
>> > > terminating characters.
>> >
>> > You could probably consider streaming it to a temporary file,
>> > then, since I expect a download of many 100s of MB wouldn't
>> > be immediate. You wouldn't want many of those arrays hanging
>> > around during concurrent requests / responses etc.
>> >
>> > -- Barry
>> >
>> > --
>> > http://barrkel.blogspot.com/


J. Merrill / Analytical Software Corp

===================================
This list is hosted by DevelopMentor®  http://www.develop.com

View archives and manage your subscription(s) at http://discuss.develop.com
