I'm sure this has been touched on before, but I can't seem to find a
definitive answer...
Can 'blob' (or MSSQL 'image') fields be read from the database in
multiple passes using a smaller buffer?
I have a 'file repository' in MSSQL that users will upload files into
and then read back from the database. Files will vary in size, but I
will know each file's size as it is inserted, so I can track the image
length. Can Perl DBI be set up to retrieve files in multiple passes
using a roughly 64K buffer instead of forcing {LongReadLen} to the full
size of the image data? From what I gather, forcing the buffer to the
full size eats memory dramatically, and I'm not entirely certain Perl
will release that memory afterward, or that Windows will.
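Something like the sketch below is what I had in mind: read the image
column in 64K slices with TEXTPTR/READTEXT so LongReadLen only ever has
to cover one slice. The table and column names (files, file_data,
file_id, file_len) are made up, and I'm not certain DBD::ODBC will
swallow this kind of multi-statement batch, so treat it as a rough idea
rather than working code:

    #!/usr/bin/perl
    # Sketch: pull an 'image' column down in 64K slices via READTEXT so
    # only one slice is ever held in memory. Hypothetical schema.
    use strict;
    use DBI;

    my $dbh = DBI->connect('dbi:ODBC:FileRepo', 'user', 'pass',
                           { RaiseError => 1, AutoCommit => 1 });

    my $chunk = 64 * 1024;            # 64K read buffer
    $dbh->{LongReadLen} = $chunk;     # only needs to cover one slice
    $dbh->{LongTruncOk} = 0;

    my ($file_id, $out_path) = (42, 'out.bin');
    my ($file_len) = $dbh->selectrow_array(
        'SELECT file_len FROM files WHERE file_id = ?', undef, $file_id);

    open my $out, '>', $out_path or die "open $out_path: $!";
    binmode $out;

    for (my $offset = 0; $offset < $file_len; $offset += $chunk) {
        my $want = $file_len - $offset;
        $want = $chunk if $want > $chunk;

        # One round trip per slice: grab a text pointer, READTEXT a range.
        my $sql = sprintf(q{
            SET NOCOUNT ON
            DECLARE @p varbinary(16)
            SELECT @p = TEXTPTR(file_data) FROM files WHERE file_id = ?
            READTEXT files.file_data @p %d %d
        }, $offset, $want);

        my ($piece) = $dbh->selectrow_array($sql, undef, $file_id);
        print $out $piece;
    }

    close $out;
    $dbh->disconnect;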
Basically just curious... I would like to keep the memory footprint as
small as possible, but time to market is a constraint, so if I need to
force LongReadLen I will.
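For the fallback, I assume it would just be something like this (same
made-up table and column names), sizing LongReadLen to the stored file
length and then undef'ing the scalar right away so at least Perl can
reuse the space internally, even if the process footprint on Windows
never shrinks back down:

    # Fallback sketch: LongReadLen sized to the whole file, one fetch.
    use strict;
    use DBI;

    my $dbh = DBI->connect('dbi:ODBC:FileRepo', 'user', 'pass',
                           { RaiseError => 1 });

    my $file_id = 42;
    my ($file_len) = $dbh->selectrow_array(
        'SELECT file_len FROM files WHERE file_id = ?', undef, $file_id);

    $dbh->{LongReadLen} = $file_len;   # buffer sized to this one file
    $dbh->{LongTruncOk} = 0;

    my ($data) = $dbh->selectrow_array(
        'SELECT file_data FROM files WHERE file_id = ?', undef, $file_id);

    open my $out, '>', 'out.bin' or die "open out.bin: $!";
    binmode $out;
    print $out $data;
    close $out;

    undef $data;    # free the scalar for reuse within Perl, though the
                    # memory may not be handed back to the OS
    $dbh->disconnect;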
Anyone have ideas, or dealt with this before?
Thanks in advance...
--
Bill McClintock - Web/Application Development
Worldcom - Colorado Springs (GOG)
vnet:622.0054 local:(719)265.0054
EMail:[EMAIL PROTECTED]
AOLIM:bm0054