The Situation

I have a CGI application on a platform where I cannot write files to the filesystem. The platform belongs to a very large maker of networking hardware; let's call them Sisqo for the sake of the argument. The restriction on writing files is part of their deployment/infosec policy.

Now, the application has to implement a file storage and retrieval system (amongst other things). This gets very interesting when you can't actually write files. So we use Oracle... a file is a record containing some metadata and a BLOB field.
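To make the "file as a database record" idea concrete, here is a minimal sketch of the record shape. The table and column names are my assumptions, not the actual schema; the sketch models the record in plain Perl so it runs anywhere, with the assumed Oracle DDL in a comment. The real thing would use DBI insert/select statements against that table instead of a hash.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Assumed Oracle table for this sketch (names are illustrative only):
#   CREATE TABLE stored_files (
#       file_id    NUMBER PRIMARY KEY,
#       file_name  VARCHAR2(255),
#       mime_type  VARCHAR2(100),
#       byte_size  NUMBER,
#       content    BLOB
#   );

my %store;    # file_id => record (stands in for the table)

sub store_file {
    my ($id, $name, $mime, $data) = @_;
    $store{$id} = {
        file_name => $name,
        mime_type => $mime,
        byte_size => length($data),
        content   => $data,       # this field plays the role of the BLOB column
    };
    return $store{$id};
}

sub fetch_file {
    my ($id) = @_;
    return $store{$id};
}
```

Usage: `store_file(1, 'report.pdf', 'application/pdf', $bytes)` followed by `fetch_file(1)` round-trips the metadata and content together, which is all the application needs from "a file".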
Since we can't write to the filesystem, I've got some code to shove headers at the browser to make it play nicely, and some more code to stream the BLOB from the database to the browser. The current solution seems to be "just make the database BLOB read buffer big enough", but with 50 MB files in the database, the admins (understandably) don't like me leaving 60 MB processes around for 10 or 20 minutes. For those who are interested, I can make the streaming code available... I should turn it into a module at some point... but I digress.

Oracle seems to have this lovely property that if you don't explicitly close a statement, the cursor hangs around forever, eventually overflowing the open-cursor and process limits. The next most common answer at this point is "just explicitly ->finish your statements and close your connections". The problem is that while the download is in progress, the database statement is still open; so when the user hits the Stop button on the browser to abort the download, Apache kills the process, and the program exits without a chance to ->finish and ->disconnect.

So, with that situation in mind: is it NORMAL for Oracle to just leak and leak and leak? And secondly, is there anything I can do with DBI to make sure that things get cleaned up properly?

Thanks in advance for the assistance.

Adam Kennedy
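For what it's worth, one way I'd sketch the chunked-streaming-with-cleanup pattern is below. This is a hedged illustration, not your actual code: `$read_chunk` stands in for a piecewise LOB read (e.g. DBD::Oracle's `ora_lob_read` on a LOB locator), `$cleanup` stands in for the `$sth->finish` / `$dbh->disconnect` calls, and the 64 KB chunk size is an assumption. The key point is trapping SIGPIPE, which is what the process receives when the browser's Stop button closes the socket, so control falls through to the cleanup code instead of the process dying with the cursor still open.

```perl
#!/usr/bin/perl
use strict;
use warnings;

use constant CHUNK_SIZE => 64 * 1024;    # 64 KB per read, not 50 MB at once

# stream_blob($out, $read_chunk, $cleanup)
#   $out        - output filehandle (the browser, via STDOUT in CGI)
#   $read_chunk - callback: ($offset, $length) -> data, '' at end of LOB
#                 (offset is 1-based, matching Oracle LOB semantics)
#   $cleanup    - callback that finishes the statement / drops the cursor
sub stream_blob {
    my ($out, $read_chunk, $cleanup) = @_;

    # A closed client socket turns the print below into SIGPIPE;
    # convert it into a die so the eval catches it and cleanup still runs.
    local $SIG{PIPE} = sub { die "client disconnected\n" };

    my $sent = 0;
    eval {
        my $offset = 1;
        while ( defined( my $chunk = $read_chunk->($offset, CHUNK_SIZE) ) ) {
            last unless length $chunk;               # end of LOB
            print {$out} $chunk or die "write failed: $!\n";
            $sent   += length $chunk;
            $offset += length $chunk;
        }
    };
    my $err = $@;
    $cleanup->();    # always runs: finish the statement, free the cursor
    return ($sent, $err);
}
```

Whether this fully covers Apache hard-killing the child (SIGTERM/SIGKILL rather than SIGPIPE) is exactly the part of the question I'd want a DBI expert to confirm, since SIGKILL can't be trapped at all.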
