> My application needs to run a lot of queries against a large sqlite
> database (>100G) with many small tables (>10,000). The performance of
> the queries is acceptable when the database is on an SSD, but can be
> 50 times or more slower on a regular or network drive.
 
> Because some users have slow disks and plenty of RAM (e.g. on a
> high-mem cluster system), I would like to create an in-memory database
> to cache frequently accessed tables. For example, if the user tells me
> they can allocate 20G of RAM for caching, I will create a :memory:
> database and use it to cache tables until it reaches 20G in size. The
> question is: how do I know the size of an in-memory database so that I
> can keep it under the specified size?

What OS is it that does not have a block disk cache?

Have you tried simply allocating a large page cache?
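A minimal sketch of that suggestion, assuming the on-disk database is opened normally: `PRAGMA cache_size` accepts a negative value meaning KiB, so a roughly 20 GiB page cache could be requested like this (the path is a placeholder; memory is only consumed as pages are actually read, not allocated up front):

```python
import sqlite3
import os
import tempfile

# Negative cache_size value = size in KiB; here roughly 20 GiB.
CACHE_KIB = 20 * 1024 * 1024

# Stand-in for the real >100G database file from the question.
path = os.path.join(tempfile.mkdtemp(), "big.db")
conn = sqlite3.connect(path)
conn.execute("PRAGMA cache_size = -%d" % CACHE_KIB)
```

Since SQLite fills the cache lazily, frequently accessed tables end up resident in RAM without any explicit copying into a :memory: database.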

_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users