> I think I know the answer to this so I'm looking for some
> confirmation. What I do is to prepare a relatively large and complex
> query, then run that same query forever, never calling finalize. My
> assumption is that I will have no memory leakage.

In principle this should be fine, and if it isn't then it would be a
bug somewhere, either in the SQLite code (very unlikely) or the
DBD::SQLite driver for Perl's DBI interface (also unlikely).

Simon wrote:

> When you are finished with the query you should either end on a
> _reset() or do a _finalize().  Or (harmlessly) do both.  If you do
> not do one of those, you may find that when you _close() some memory
> is not released for the statement and/or the database (I'm not sure
> which, but either way it's bad).

On the Perl/DBI side of things these actions are usually taken care of
automatically when the objects holding the relevant resources go out of
scope.
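That scope-based cleanup isn't unique to DBD::SQLite; most SQLite
bindings finalize the underlying statement when the owning object is
destroyed. As a hedged illustration, here is the same idea in Python's
built-in sqlite3 module rather than Perl (the helper name and query are
made up for this example):

```python
import sqlite3

def count_master_rows(path):
    """Hypothetical helper: count the rows in sqlite_master."""
    # When `cur` and `conn` go out of scope (or close() is called), the
    # binding finalizes the underlying SQLite statement for us -- the
    # same automatic cleanup DBD::SQLite performs when a Perl statement
    # handle is destroyed.
    with sqlite3.connect(path) as conn:
        cur = conn.execute("SELECT count(*) FROM sqlite_master")
        (n,) = cur.fetchone()
    conn.close()  # the `with` block commits but does not close
    return n
```

An empty in-memory database has no schema objects, so
count_master_rows(":memory:") returns 0.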

On Sat Oct 04, 2014 at 03:16:23PM -0700, Jim Dodgen wrote:
> It might be that this is more of a Perl DBI question. The order of
> the statements I use is as follows:
> 
> prepare        << done once
> 
> execute        << done many times, followed by:
> fetchrow_array << until exhausted
> 
> finish         << never done
>
> I just don't see how the execute/fetchrow_array activity is going to
> leave a handle or some other resource dangling.

The above steps are exactly (but not exclusively) what the Perl DBI was
intended to support. Although I haven't specifically measured the
memory use, I do the above quite a lot without a problem.
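The steps Jim lists translate directly to other SQLite bindings as
well. A sketch of the same prepare-once / execute-many /
fetch-until-exhausted loop, written in Python's built-in sqlite3 module
(the table, data, and query are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO t (name) VALUES (?)", [("a",), ("b",), ("c",)])

# "prepare" -- done once: keep one parameterized SQL string. The sqlite3
# module caches the compiled statement per connection, so re-executing
# the same string reuses it rather than recompiling.
query = "SELECT name FROM t WHERE id >= ?"

results = []
for lower in (1, 2, 3):            # "execute" -- done many times
    cur = conn.execute(query, (lower,))
    while True:                    # "fetchrow_array" -- until exhausted
        row = cur.fetchone()
        if row is None:            # exhausting the rows resets the statement
            break
        results.append(row[0])

conn.close()                       # releases all statement resources
```

As in the Perl case, nothing needs to be finalized by hand between
executions; closing the connection at the end releases everything.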

By the way, the final call to fetchrow_array() (the one that returns
undef) implicitly calls finish() internally.

-- 
Mark Lawrence
_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
