Re: [sqlite] SQLite on limited Memory
"Tony Fenleish" <[EMAIL PROTECTED]> wrote: > You meantion avoid using sqlite3_get_table. I've been using sqlite 2.8.15 > and have postponed upgrading just because "it's been working". Are there > better memory handling and speed improvements for embedded devices in sqlite > 3, or are most of the improvements for machines with lots of ram and large > processors? > SQLite version 3 contains a number of enhancements designed specifically for embedded devices with limited memory. The database files for version 3 are about 33% smaller. Version 3 uses less stack and heap space. And the code footprint is about the same size. When malloc() fails in version 3.3.x, just just get an SQLITE_NOMEM error for that one function call. When malloc() fails in version 2, you get the SQLITE_NOMEM forever more. SQLite version 3 supports AUTOVACUUM. There are probably other embedded-device specific enhancments in version 3, but those are the ones that jump to mind. -- D. Richard Hipp <[EMAIL PROTECTED]>
Re: [sqlite] SQLite on limited Memory
You mention avoiding sqlite3_get_table. I've been using sqlite 2.8.15 and
have postponed upgrading just because "it's been working". Are there
better memory handling and speed improvements for embedded devices in
sqlite 3, or are most of the improvements for machines with lots of ram
and large processors?

From: Christian Smith <[EMAIL PROTECTED]>
Subject: Re: [sqlite] SQLite on limited Memory
Date: Tue, 25 Apr 2006 15:06:34 +0100 (BST)

> Otherwise, avoid loading the full result set into memory as you process
> it. I.e. avoid sqlite3_get_table().
>
> Christian
Re: [sqlite] SQLite on limited Memory
On Mon, 24 Apr 2006, Tony Fenleish wrote:

> I've been developing an app on a linux embedded device with 128 MB of
> RAM, although the available RAM for queries is less than that. Now that
> the databases are getting a little larger (120 MB), I'm having problems
> with some queries that have large results. I watch my memory drop
> completely, and then my sqlite client is killed. Now, as I understand
> it, the entire database is not opened into memory, just the amount of
> data that is needed.
>
> According to documentation, 256 bytes of memory is needed for every
> 1 MB of data. Does anyone have insights as to the memory limitations of
> SQLite?

IIRC, the 256 bytes of memory per 1 MB of data is used to track dirty
pages for rollback. The memory footprint of SQLite is controlled
primarily by:

- Page cache. Defaults to up to 2000 pages. At 1K per page, that's up to
  2 MB of cached page data. This number can be reduced as needed.

- Temporary tables, kept in memory or in disk-based tables. If
  memory-based temporary store is used, then any temporary tables used
  for sorting, for example, will be memory resident. Temporary tables
  can instead be moved to temporary disk files, where they compete with
  the page cache just like the main database. This is set using
  "PRAGMA temp_store=". See http://www.sqlite.org/pragma.html for
  details.

Otherwise, avoid loading the full result set into memory as you process
it. I.e. avoid sqlite3_get_table(). If you are sorting large queries,
switch to using disk-based temporary store.

Christian

--
/"\
\ / ASCII RIBBON CAMPAIGN - AGAINST HTML MAIL
 X  - AGAINST MS ATTACHMENTS
/ \
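[Editor's note: the advice above can be sketched concretely. The thread concerns the C API (stepping a prepared statement with sqlite3_step() instead of calling sqlite3_get_table()), but the same idea is shown here with Python's stdlib sqlite3 module; cache_size and temp_store are real pragmas, while the table and row counts are made up for illustration.]

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Shrink the page cache from the default 2000 pages to 100
# (at 1K pages, roughly 100 KB instead of 2 MB of cached data).
con.execute("PRAGMA cache_size = 100")

# Force temporary tables (e.g. sorting workspace) onto disk
# instead of into memory: 0 = default, 1 = FILE, 2 = MEMORY.
con.execute("PRAGMA temp_store = 1")

con.execute("CREATE TABLE readings(id INTEGER PRIMARY KEY, value REAL)")
con.executemany("INSERT INTO readings(value) VALUES (?)",
                [(i * 0.5,) for i in range(10000)])

# Stream the result set row by row (the cursor is an iterator)
# rather than materialising every row at once, as
# sqlite3_get_table() / fetchall() would.
total = 0.0
for (value,) in con.execute("SELECT value FROM readings ORDER BY value"):
    total += value
print(total)
con.close()
```

The sort in the query above is exactly the case where disk-based temporary store matters: with temp_store set to FILE, the sorting workspace spills to a temporary file instead of consuming scarce RAM.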
[sqlite] SQLite on limited Memory
I've been developing an app on a linux embedded device with 128 MB of
RAM, although the available RAM for queries is less than that. Now that
the databases are getting a little larger (120 MB), I'm having problems
with some queries that have large results. I watch my memory drop
completely, and then my sqlite client is killed. Now, as I understand it,
the entire database is not opened into memory, just the amount of data
that is needed.

According to documentation, 256 bytes of memory is needed for every 1 MB
of data. Does anyone have insights as to the memory limitations of
SQLite?