On 17/02/12 05:09, Frank Chang wrote:
> Is it possible to find a faster alternative to sqlite3_blob_open?
sqlite3_blob_open is the fast alternative! However, you are reading the
entire blob contents, so it won't be much different from just using a regular
SQL query.
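
To make that concrete, here is a minimal sketch comparing the two read paths.
It assumes a hypothetical database test.db containing a table
docs(id INTEGER PRIMARY KEY, data BLOB) with a row whose rowid is 1; the file,
table and column names are illustrative only:

#include <stdio.h>
#include <stdlib.h>
#include <sqlite3.h>

int main(void)
{
    sqlite3 *db;
    sqlite3_blob *blob;
    sqlite3_stmt *stmt;

    if (sqlite3_open("test.db", &db) != SQLITE_OK) return 1;

    /* 1. Incremental blob API: open a handle on row 1 and read all of it. */
    if (sqlite3_blob_open(db, "main", "docs", "data", 1 /* rowid */,
                          0 /* read-only */, &blob) == SQLITE_OK) {
        int n = sqlite3_blob_bytes(blob);
        char *buf = malloc(n);
        if (buf && sqlite3_blob_read(blob, buf, n, 0) == SQLITE_OK)
            printf("blob api: %d bytes\n", n);
        free(buf);
        sqlite3_blob_close(blob);
    } else {
        fprintf(stderr, "blob_open: %s\n", sqlite3_errmsg(db));
    }

    /* 2. Ordinary query: reads the same pages and copies the same bytes. */
    if (sqlite3_prepare_v2(db, "SELECT data FROM docs WHERE id = 1", -1,
                           &stmt, NULL) == SQLITE_OK) {
        if (sqlite3_step(stmt) == SQLITE_ROW)
            printf("query: %d bytes\n", sqlite3_column_bytes(stmt, 0));
        sqlite3_finalize(stmt);
    }

    sqlite3_close(db);
    return 0;
}

Both paths end up reading the same database pages and copying the same bytes,
which is why reading a whole blob times out about the same either way.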
Also, whatever you are using for profiling is rather useless: the
sqlite3_blob_open call is apparently taking 0.08 seconds, while the if
statement at the end comparing against zero is taking almost 4 times as long.
When it comes to blobs of non-trivial size, here are some best practices:
- Store the blob as the last column in a table, or in a separate table. This is
so that SQLite doesn't have to skip over the blob in order to get the other
values in a row (last column), and so that a read will get multiple rows at
once (separate table). See the first sketch after this list.
- Make sure your queries explicitly list which columns are wanted; otherwise
you'll get the blob every time, which consumes more memory and CPU
- Bump up the page size of your database. The default of 1KB isn't that good a
fit for large blobs (the first sketch below sets it via PRAGMA page_size)
- While SQLite will always work with blobs (up to 2GB each), you can also
consider storing larger ones as files and then storing just the filename within
SQLite
- Use the incremental blob API for random access to a blob's contents (see the
second sketch after this list)
- Use zeroblob() on row creation to ensure that space is allocated for the blob
correctly. You may even add extra padding if items can grow later, so that the
row doesn't have to be rewritten when the blob contents are updated (also shown
in the second sketch)
- A VACUUM every now and then after periods of writes will help
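
To illustrate the schema-related points, here is a sketch that reworks the
hypothetical docs table from the earlier example so the blob lives in its own
doc_data table, bumps the page size, and selects only the columns it needs.
Again, all names are assumptions, and PRAGMA page_size only takes effect on a
freshly created database or after a VACUUM:

#include <stdio.h>
#include <sqlite3.h>

int main(void)
{
    sqlite3 *db;
    sqlite3_stmt *stmt;
    char *err = NULL;

    if (sqlite3_open("test.db", &db) != SQLITE_OK) return 1;

    /* Larger pages suit big blobs better than the 1KB default.
       Only effective on a new database or after a VACUUM. */
    sqlite3_exec(db, "PRAGMA page_size = 4096", NULL, NULL, &err);

    /* Keep metadata and blobs in separate tables so scanning the metadata
       never has to skip over (or read) blob pages. */
    sqlite3_exec(db,
        "CREATE TABLE IF NOT EXISTS docs("
        "  id INTEGER PRIMARY KEY, name TEXT, size INTEGER);"
        "CREATE TABLE IF NOT EXISTS doc_data("
        "  doc_id INTEGER PRIMARY KEY REFERENCES docs(id), data BLOB);",
        NULL, NULL, &err);
    if (err) { fprintf(stderr, "%s\n", err); sqlite3_free(err); }

    /* Name the wanted columns explicitly; if the blob lived in the same
       table, SELECT * would drag it along with every row. */
    if (sqlite3_prepare_v2(db, "SELECT id, name, size FROM docs", -1,
                           &stmt, NULL) == SQLITE_OK) {
        while (sqlite3_step(stmt) == SQLITE_ROW)
            printf("%lld %s\n",
                   (long long)sqlite3_column_int64(stmt, 0),
                   (const char *)sqlite3_column_text(stmt, 1));
        sqlite3_finalize(stmt);
    }

    sqlite3_close(db);
    return 0;
}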
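And here is a sketch of the zeroblob/incremental I/O advice, using the same
hypothetical doc_data table: the row is created with space pre-allocated (plus
padding), the contents are written through a read/write blob handle, and a few
bytes are later read back at an arbitrary offset without pulling the whole blob
into memory:

#include <stdio.h>
#include <sqlite3.h>

int main(void)
{
    sqlite3 *db;
    sqlite3_stmt *stmt;
    sqlite3_blob *blob;
    const int blob_size = 1024 * 1024;   /* reserve 1MB, padding included */

    if (sqlite3_open("test.db", &db) != SQLITE_OK) return 1;

    /* Create the row with zeroblob() so the space is allocated up front;
       over-allocating a little means later growth won't force a rewrite. */
    if (sqlite3_prepare_v2(db,
            "INSERT OR REPLACE INTO doc_data(doc_id, data) "
            "VALUES (1, zeroblob(?))", -1, &stmt, NULL) != SQLITE_OK) return 1;
    sqlite3_bind_int(stmt, 1, blob_size);
    sqlite3_step(stmt);
    sqlite3_finalize(stmt);

    /* Open a read/write blob handle on that row and write at an offset. */
    if (sqlite3_blob_open(db, "main", "doc_data", "data", 1 /* rowid */,
                          1 /* read/write */, &blob) == SQLITE_OK) {
        const char chunk[] = "hello, blob";
        sqlite3_blob_write(blob, chunk, (int)sizeof(chunk), 0);
        sqlite3_blob_close(blob);
    }

    /* Random access read: pull 5 bytes starting at offset 7 without
       reading the rest of the blob into memory. */
    if (sqlite3_blob_open(db, "main", "doc_data", "data", 1,
                          0 /* read-only */, &blob) == SQLITE_OK) {
        char buf[6] = {0};
        sqlite3_blob_read(blob, buf, 5, 7);
        printf("read back: %s\n", buf);
        sqlite3_blob_close(blob);
    }

    sqlite3_close(db);
    return 0;
}

Because the row already has enough zero-filled space, later blob writes update
pages in place instead of forcing SQLite to move and rewrite the row.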
Roger