On Thu, 24 Sep 2009 15:30:40 -0500, John Lynch
<johntly...@gmail.com> wrote:

>Sqlite Users,
>
>I am using SQLite for a very large customer database.  I have
>3 separate DB files, with one table in each file.  The files
>are cust, acct, and web.
>
>Each DB table has ~200 million rows.  When I attach web and
>acct to cust with Perl DBI/DBD::SQLite, the process runs out
>of memory on Windows (2 GB 32-bit process limit).  The memory
>usage grows out of control while querying all the tables
>joined together via the attach.
>
>When I instead query a single flattened table of the same
>data (i.e. one built with CREATE TABLE ... AS SELECT * FROM
>... via the same attach), I have no problems.
>
>
>Any ideas on how to reduce the memory consumption when using attach?

1) Limit the size of the page cache with the appropriate
PRAGMAs. The cache takes page_size (bytes) * cache_size
(pages), plus some overhead, for each database, including
attached ones.
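
For example, with Perl DBI (file names taken from your post;
the 2000-page figure is just an illustration, and cache_size
applies per database, so set it for each attached schema):

  use DBI;

  my $dbh = DBI->connect("dbi:SQLite:dbname=cust.db", "", "",
                         { RaiseError => 1 });
  $dbh->do("ATTACH DATABASE 'acct.db' AS acct");
  $dbh->do("ATTACH DATABASE 'web.db' AS web");

  # Each database (main and every attached one) has its own
  # page cache. With the default 1 KB pages, 2000 pages is
  # roughly 2 MB per database.
  $dbh->do("PRAGMA main.cache_size = 2000");
  $dbh->do("PRAGMA acct.cache_size = 2000");
  $dbh->do("PRAGMA web.cache_size = 2000");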

2) Optimize your queries; prevent full Cartesian products.
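
For instance (the table and column names here are guesses,
since I don't know your schema), join on an indexed key so
SQLite never has to build the cross product of three 200M-row
tables:

  my $sth = $dbh->prepare(q{
      SELECT c.name, a.balance, w.last_login
      FROM cust c
      JOIN acct.acct a ON a.cust_id = c.cust_id
      JOIN web.web   w ON w.cust_id = c.cust_id
  });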

3) Limit the size of the resultset to what you really need.

e.g. http://www.sqlite.org/cvstrac/wiki?p=ScrollingCursor
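
The idea on that page, sketched in DBI terms (assumes an
indexed integer key; the names are placeholders):

  # Walk the result in windows of 100 rows, remembering where
  # the previous window ended, instead of using OFFSET (which
  # rescans from the start) or fetching everything at once.
  my $sth = $dbh->prepare(q{
      SELECT cust_id, name FROM cust
      WHERE cust_id > ?
      ORDER BY cust_id
      LIMIT 100
  });
  my $last_id = 0;
  while (1) {
      $sth->execute($last_id);
      my $rows = $sth->fetchall_arrayref;
      last unless @$rows;
      $last_id = $rows->[-1][0];
      # ... process this window of rows ...
  }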

4) Don't try to collect the full resultset into a Perl
array.
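
In DBI terms, prefer a fetch loop over the slurping calls;
something like:

  # Loads the whole resultset into Perl at once; with ~200M
  # rows this alone can exhaust a 2 GB process:
  #   my $all = $dbh->selectall_arrayref($sql);

  # Streaming alternative: only one row is held in Perl
  # memory at a time.
  my $sth = $dbh->prepare($sql);
  $sth->execute;
  while (my $row = $sth->fetchrow_arrayref) {
      # ... process $row ...
  }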

>Thank You

HTH
-- 
  (  Kees Nuyt
  )
c[_]