The problem
-----------

I'm using SQLite's "ANALYZE" command to try to improve query
performance.

However, I'm finding that the analyze call itself can take a long time
to run for large numbers of rows (e.g. >20000). This is especially the
case if a lot of data has changed between calls to analyze. It's
taking tens of seconds to run.

The typical problematic pattern is as follows: start up a new blank
install of my app. "ANALYZE" is run at startup, which of course is
instantaneous since no data exists yet. The user then does his stuff,
which results in lots of new rows being added. At the next restart,
"ANALYZE" is run again, and this time it takes ages.

Interestingly, on subsequent restarts, if the user has added no new
data, the next analyze is very quick, suggesting this is an
incremental cost.
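The pattern is easy to reproduce. Here's a minimal sketch (table name
and row counts are made up for illustration) that populates an indexed
table and times the subsequent ANALYZE, using Python's sqlite3 module:

```python
import sqlite3
import time

# Populate an indexed table, then time how long ANALYZE takes.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE INDEX idx_items_name ON items(name)")
conn.executemany("INSERT INTO items (name) VALUES (?)",
                 ((f"name{i % 500}",) for i in range(20000)))
conn.commit()

start = time.perf_counter()
conn.execute("ANALYZE")  # scans every index to rebuild sqlite_stat1
print(f"ANALYZE took {time.perf_counter() - start:.3f}s")

# The gathered statistics end up in sqlite_stat1, one row per index:
for tbl, idx, stat in conn.execute("SELECT tbl, idx, stat FROM sqlite_stat1"):
    print(tbl, idx, stat)
```

Scaling the insert loop up to real-world row counts shows the cost
growing with the amount of indexed data, since ANALYZE has to scan the
indexes in full.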


One solution: Manually force the query analyzer to pick statements
------------------------------------------------------------------

This is the solution described in the SQLite docs (where you tell the
query planner to ignore certain fields for the purposes of index
selection). This is a last resort for me.
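For reference, the hint the SQLite docs describe is the unary "+"
operator: prefixing a column in a WHERE clause with "+" leaves the
query's meaning unchanged but stops that term from being used to drive
an index. A small sketch (table and index names are mine):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER, b INTEGER)")
conn.execute("CREATE INDEX idx_t_a ON t(a)")

# Normally the planner uses idx_t_a for an equality on column a:
plan = conn.execute("EXPLAIN QUERY PLAN SELECT * FROM t WHERE a = 1").fetchall()
print(plan)  # plan mentions idx_t_a

# With unary +, the "a = 1" term is disqualified from index use,
# so the planner falls back to a full table scan:
plan = conn.execute("EXPLAIN QUERY PLAN SELECT * FROM t WHERE +a = 1").fetchall()
print(plan)  # plan is a scan of t, no index
```

The appeal is that you decide the plan yourself, so stale or missing
statistics stop mattering; the downside is you have to annotate every
affected query by hand.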

Possible Solution 2: Have hard coded results
--------------------------------------------

I've heard of other people doing this, but I don't know the details.
The idea is to set up your DB to reflect a "typical" data set, run
"ANALYZE", and hard-code the resulting statistics into your app's
database (the sqlite_stat1 table). You never actually run ANALYZE in
the wild.
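As I understand it, the mechanics would look roughly like the sketch
below (the function names are mine, not anything official): run
ANALYZE once offline against representative data, capture the
sqlite_stat1 rows, and replay them into fresh installs. The docs
suggest that after writing sqlite_stat1 by hand, running "ANALYZE
sqlite_master" makes the connection reload the statistics.

```python
import sqlite3

def capture_stats(conn):
    """Run ANALYZE once (offline, on representative data) and return
    the resulting sqlite_stat1 rows."""
    conn.execute("ANALYZE")
    return conn.execute("SELECT tbl, idx, stat FROM sqlite_stat1").fetchall()

def install_stats(conn, rows):
    """Overwrite sqlite_stat1 with canned rows; nothing large is
    scanned, since this runs against a fresh, near-empty install."""
    conn.execute("ANALYZE")  # cheap here; ensures sqlite_stat1 exists
    conn.execute("DELETE FROM sqlite_stat1")
    conn.executemany("INSERT INTO sqlite_stat1 VALUES (?, ?, ?)", rows)
    # Per the SQLite docs, analyzing sqlite_master forces the
    # connection to reload the (now hand-written) statistics:
    conn.execute("ANALYZE sqlite_master")
    conn.commit()
```

This is only how I imagine it working; whether it behaves well in
practice is exactly what I'm asking below.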

So my questions:

1) Will this work?
2) Is it "dangerous"? E.g. could this completely confuse the query
optimiser, or is it the case that as long as the hard-coded values are
"realistic" it doesn't matter that they don't reflect the actual
contents of the table (which is what the docs imply)?
3) Does anyone have experience of trying this?


Thanks,

John
_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users