I've been looking at the way LONGs are handled by DBI and it seems that the values are always read in full and returned in scalars (except for a single Oracle-specific example I found).
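
To make that concrete, this is the pattern I mean; a minimal sketch assuming the usual LongReadLen/LongTruncOk handle attributes, with $dsn, $user and $pass as placeholders:

  use DBI;

  my $dbh = DBI->connect($dsn, $user, $pass, { RaiseError => 1 });

  # Every LONG is read in full into a scalar, bounded only by LongReadLen.
  $dbh->{LongReadLen} = 1024 * 1024;  # must exceed the largest value you expect
  $dbh->{LongTruncOk} = 0;            # fail instead of silently truncating

  my $sth = $dbh->prepare("select somelong, someotherlong from sometable");
  $sth->execute;
  while (my ($long1, $long2) = $sth->fetchrow_array) {
      # both values are already completely in memory at this point
  }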

Isn't there a better way to handle blobs? One would assume that databases that implement blobs provide an interface that lets the application read individual blobs in smaller chunks as the data is needed.
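
The closest thing I can see in the DBI docs themselves is blob_read, which is marked as not fully defined and apparently driver-dependent, so the following is only my guess at the intended usage (the LongReadLen=0 trick, the 4096-byte chunk size and the end-of-data test are all assumptions):

  $dbh->{LongReadLen} = 0;    # assumed: keep fetch from pulling the LONG itself

  my $sth = $dbh->prepare("select somelong from sometable");
  $sth->execute;
  while ($sth->fetch) {
      my $offset = 0;
      while (defined(my $chunk = $sth->blob_read(0, $offset, 4096))) {
          last unless length $chunk;
          process_chunk($chunk);      # process_chunk() is just a placeholder
          $offset += length $chunk;
      }
  }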

I think it ought to be possible to retrieve a couple of iterators from a "select somelong, someotherlong from sometable" query and then read chunks as needed; if the iterators are never used, the long data is never transferred to the application.
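
Something along these lines, where fetchrow_longs and the read method are names I've made up purely to show the shape of the interface, not anything that exists in DBI today:

  my $sth = $dbh->prepare("select somelong, someotherlong from sometable");
  $sth->execute;
  while (my ($it1, $it2) = $sth->fetchrow_longs) {   # hypothetical method
      # at this point only some kind of locator has crossed the wire
      while (defined(my $chunk = $it1->read(64 * 1024))) {
          last unless length $chunk;
          print $out $chunk;          # e.g. spool to an already-open file handle
      }
      # $it2 is never touched, so someotherlong is never transferred
  }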

In some situations you want to store objects in a long field that are larger than what you have room for in memory; how would that work with the current implementation?

--
Regards Flemming Frandsen - http://dion.swamp.dk
PartyTicket.Net co founder & Yet Another Perl Hacker
