On Nov 28, 2007 4:25 PM, Oleg Broytmann <[EMAIL PROTECTED]> wrote:
> On Sat, Oct 27, 2007 at 05:12:33AM +0200, Bart wrote:
> > http://www.initd.org/tracker/pysqlite/wiki/SnippetsBlobs
> > ...works for me when I hand it 500MB of data.
>
> 1. The code
>
> import sys
> infile = open(sys.argv[1], "rb")
> data = infile.read()
> infile.close()
>
> class Images(SQLObject):
>     image = BLOBCol()  # or PickleCol, no difference...
>
> Images.createTable()
> image = Images(image=data)
>
> Images._connection.cache.clear()
> image = Images.get(1)
>
> outfile = open("test.dat", "wb")
> outfile.write(image.image)
> outfile.close()
>
> works for me for files up to 100M, but for bigger files OS just kills
> python process due to MemoryError.
> But if I have enough memory I don't think there would be any problem
> with bigger data. At least I don't see a difference between 100M and 500M.
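For reference, the quoted snippet needs an import and a connection before it
runs on its own. A minimal self-contained version, assuming SQLObject's usual
connectionForURI/sqlhub setup and an SQLite backend (the database path and
file names here are only placeholders), would look roughly like this:

import sys

from sqlobject import SQLObject, BLOBCol, connectionForURI, sqlhub

# Placeholder connection URI; any backend SQLObject supports would do.
sqlhub.processConnection = connectionForURI("sqlite:/tmp/test.db")

class Images(SQLObject):
    image = BLOBCol()  # or PickleCol

Images.createTable(ifNotExists=True)

# The whole file is read into one string -- this is where MemoryError
# hits for very large inputs.
infile = open(sys.argv[1], "rb")
data = infile.read()
infile.close()

Images(image=data)

# Clear the row cache so the following get() really round-trips
# through the database.
Images._connection.cache.clear()
image = Images.get(1)

outfile = open("test.dat", "wb")
outfile.write(image.image)
outfile.close()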
A MemoryError seems to signal that the underlying malloc decided you're out
of system memory - that's your machine, unrelated to the code or the problem.

> 2. The difference between the snippet and the way SQLObject uses Binary
> is that the snippet uses parameter(s) and SQLObject generates query
> strings. Well, query strings work for 100M binary files...

You meant *don't* work for...? As I recall, the query size limit is/was as
low as 64KB for one of the DB-API interfaces, which would mean that,
SQLObject-wise, the limiting factor for BLOB support is the specific
interface and backend, and therefore unpredictable.

> 3. Are you sure you really want to transfer 500M in one piece to and
> from a database? Wouldn't it be better to use files and store filenames in
> the DB?

500MB was just a test case to be very sure it was being handled as a blob.
My data is actually on the order of 1 to 3MB for the most complex data I've
handled (largely pickling overhead), and yes, I'm sure. In my app the data
is a write-once, read-often thing, so this is basically just a cleaner
variation of the explicitly file-based solution, if slightly slower. The
file solution would open me up to a list of potential path-access, security,
dangling-reference and cleanup problems I don't want to think about or code
around just now. It's more of a hack than a solution when you're storing
pickled data and not *files* files.

I figured I could use SQLObject as a quick-and-dirty object persistence
system that would also save me from writing a lot of dull database code, but
it sounds a lot like my objects are too big for its BLOB support to handle
in the current design, so I'll have to design my way around it, or code for
pysqlite2 directly. It's a pity, but I suppose I'll live :)

Well, thanks anyway,
--Bart
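A minimal sketch of the "code for pysqlite2 directly" route mentioned above,
using parameter binding so the blob travels as a bound value instead of being
spliced into the SQL text (which is why no query-string length limit applies).
The database and file names are placeholders, and the import assumes the
DB-API module is available as sqlite3; with pysqlite2 it would be
"from pysqlite2 import dbapi2 as sqlite3":

import sqlite3  # with pysqlite2: from pysqlite2 import dbapi2 as sqlite3

conn = sqlite3.connect("blobs.db")  # placeholder database file
conn.execute(
    "CREATE TABLE IF NOT EXISTS images (id INTEGER PRIMARY KEY, image BLOB)")

# Read the data and bind it as a parameter; the driver hands the bytes to
# SQLite separately from the SQL text, so the statement itself stays tiny.
infile = open("input.dat", "rb")    # placeholder input file
data = infile.read()
infile.close()

conn.execute("INSERT INTO images (image) VALUES (?)", (sqlite3.Binary(data),))
conn.commit()

# Read it back out.
row = conn.execute("SELECT image FROM images WHERE id = ?", (1,)).fetchone()
outfile = open("test.dat", "wb")
outfile.write(row[0])
outfile.close()
conn.close()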