Why are you doing that?

Richard

On Fri, Mar 23, 2012 at 2:26 AM, nick name <[email protected]> wrote:

> In one of my management scripts (which runs continuously, after setting up
> a web2py environment), I copy a complete sqlite database directory from
> another server (copied_db.sqlite, and *.table), and open them with
> "DAL('sqlite://copiedfile.sqlite', auto_import=True, path='/tmp/copy_path')".
>
> I copy the files back and forth, and recreate the DAL object every 10-20
> seconds or so. Note that on Linux this means the sqlite file opened each
> time is a new file (the new copy replaces the old one), even if it has the
> same name and same contents as before -- this use case differs from
> standard web2py use, in which the DAL always refers to the same file on
> disk (same inode, same everything).
>
> I noticed that after a few hours, memory consumption becomes huge (goes
> from 7M to 400M in less than an hour, _and_ continues to grow, despite the
> files not getting larger). At some point, the sqlite dbapi connect starts
> failing as well.
>
> Is there a pool or cache that keeps DAL objects alive behind the scenes?
>
> I have so far been unable to find what keeps the DALs alive; where
> should I look? The culprit could be anything that indirectly keeps a
> reference to the connection or the database, e.g. a Field, a Table,
> anything like that. Any ideas where to look in web2py? My code doesn't
> store any of these objects, but maybe something in web2py does?
>
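A hidden reference like the one suspected above can be hunted with the standard library alone. A minimal sketch using `weakref` and `gc` -- `FakeDAL` is a hypothetical stand-in for the real DAL instance, and `holder` simulates a module-level cache keeping it alive:

```python
import gc
import weakref

class FakeDAL:
    """Hypothetical stand-in for a DAL instance."""
    pass

db = FakeDAL()
ref = weakref.ref(db)   # weak reference: does not keep db alive by itself

holder = [db]           # simulate a hidden reference, e.g. a cache in web2py
del db                  # drop our own reference

gc.collect()
print(ref() is None)    # False: something still holds the object

# gc.get_referrers lists the objects (lists, dicts, frames, ...) that
# still point at it -- inspect these to find the cache holding the DAL
referrers = gc.get_referrers(ref())
print(any(r is holder for r in referrers))

holder.clear()          # release the hidden reference
gc.collect()
print(ref() is None)    # True: the object is gone once nothing refers to it
```

Running a check like this after each copy/reopen cycle would show whether the old DAL (or its connection) is still reachable, and `gc.get_referrers` points at whatever container is retaining it.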
