The ability to manually pre-populate the sqlite_stat1 table has been around for ages, but not many developers use it, probably because doing so is tedious and developers have more important problems to work on.
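For anyone who has not seen it done, here is a minimal sketch of what manual pre-population looks like, using Python's stdlib sqlite3 module. The table name, index name, and stat values are made up for illustration:

```python
import sqlite3

# In-memory database standing in for a newly created application file.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE t1(a INTEGER PRIMARY KEY, b TEXT)")
db.execute("CREATE INDEX t1b ON t1(b)")

# ANALYZE creates the sqlite_stat1 table (empty here, since t1 has no rows).
db.execute("ANALYZE")

# Claim that t1 holds about 1,000,000 rows and that each distinct value
# of b matches about 10 of them.  sqlite_stat1 is one of the few
# sqlite_* tables that applications are permitted to write.
db.execute("INSERT INTO sqlite_stat1(tbl, idx, stat) "
           "VALUES('t1', 't1b', '1000000 10')")
db.commit()

# The planner reads sqlite_stat1 when the schema is (re)loaded; the
# documented idiom "ANALYZE sqlite_master" forces that reload.
db.execute("ANALYZE sqlite_master")

rows = db.execute("SELECT tbl, idx, stat FROM sqlite_stat1").fetchall()
print(rows)
```

The tedium the paragraph above alludes to comes from having to craft and maintain a plausible stat string for every index by hand.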
A few weeks ago, we got a report of a performance problem from a support customer. Looking at their database, we found that they had created all of the tables, and most but not all of the indexes, added exactly 1 row to each table, then run ANALYZE. After that they filled the database with content and started trying to use it. Unfortunately, the degenerate sqlite_stat1 content created from the one-row-per-table sample data seriously confused the query planner. Rerunning ANALYZE fixed the problem. They would have been better off to have never run ANALYZE at all, I think.

That episode demonstrated an important point: it is hard to know when to run ANALYZE. Furthermore, application developers ought not need to become SQLite experts in order to get SQLite to function well in their application. SQLite should just figure out the right thing to do for itself. It should "just work".

The PRAGMA optimize command is our effort to move further in the direction of a "smart" SQLite that always automatically "does the right thing" with respect to gathering and using database statistics. Rather than try to teach application developers a bunch of arcane rules (which are really only heuristics) for when and how to run ANALYZE, we just ask them to run "PRAGMA optimize" right before closing the database connection. The application developer has much less to worry about. They don't need to understand ANALYZE, how it works, or why it is important. The "PRAGMA optimize" command will take care of running ANALYZE for them, and do so in a way that is minimally invasive (PRAGMA optimize only rarely runs ANALYZE in practice, and when it does, it will only ANALYZE individual tables, not the entire database).

We have been dog-fooding this technique as we are able. The Fossil system has been modified (https://www.fossil-scm.org/fossil/artifact/616c0d?ln=1706) to run "PRAGMA optimize" after every command and/or web-page visit, and that has worked out quite well.
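The recommended pattern is simple enough to show concretely. A minimal sketch (schema and data invented for illustration; PRAGMA optimize requires SQLite 3.18 or later, and on older versions an unknown pragma is simply a no-op):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE t1(a INTEGER PRIMARY KEY, b TEXT)")
db.execute("CREATE INDEX t1b ON t1(b)")
db.executemany("INSERT INTO t1(b) VALUES (?)",
               [(str(i % 100),) for i in range(1000)])
db.commit()

# Ordinary application queries run here; the planner takes note of
# which tables it would have liked better statistics for.
db.execute("SELECT a FROM t1 WHERE b = '7'").fetchall()

# Just before closing, let SQLite decide whether any ANALYZE is
# worthwhile.  This is usually cheap: when it does act, it analyzes
# only the individual tables that need it, not the whole database.
db.execute("PRAGMA optimize")

# Whether sqlite_stat1 exists afterward depends on what optimize
# decided; either outcome is fine for the application.
stat_tables = db.execute("SELECT count(*) FROM sqlite_master "
                         "WHERE name = 'sqlite_stat1'").fetchone()[0]
db.close()
print(stat_tables)
```

The point of the pattern is precisely that the application does not inspect or second-guess the result; it just issues the pragma and closes.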
But I'd like to get more real-world feedback on the use of "PRAGMA optimize" and how well it works (or doesn't work) for a wider spectrum of developers before I start pushing it as a recommended solution.

On 10/4/17, David Raymond <david.raym...@tomtom.com> wrote:
> http://www.sqlite.org/optoverview.html section 6.2 "Manual Control Of
> Query Plans Using SQLITE_STAT Tables"
>
> SQLite provides the ability for advanced programmers to exercise control
> over the query plan chosen by the optimizer. One method for doing this is
> to fudge the ANALYZE results in the sqlite_stat1, sqlite_stat3, and/or
> sqlite_stat4 tables. That approach is not recommended except for the one
> scenario described in the next paragraph.
>
> For a program that uses an SQLite database as its application file-format,
> when a new database instance is first created the ANALYZE command is
> ineffective because the database contains no data from which to gather
> statistics. In that case, one could construct a large prototype database
> containing typical data during development and run the ANALYZE command on
> this prototype database to gather statistics, then save the prototype
> statistics as part of the application. After deployment, when the
> application goes to create a new database file, it can run the ANALYZE
> command in order to create the statistics tables, then copy the
> precomputed statistics obtained from the prototype database into these
> new statistics tables. In that way, statistics from large working data
> sets can be preloaded into newly created application files.
>
> -----Original Message-----
> From: sqlite-users [mailto:sqlite-users-boun...@mailinglists.sqlite.org]
> On Behalf Of Jens Alfke
> Sent: Wednesday, October 04, 2017 12:06 PM
> To: SQLite mailing list
> Subject: Re: [sqlite] PRAGMA optimize; == no busy handler?
>
>> So why not work around the whole process ? On your development system,
>> create a database with plausible data in. Run ANALYZE.
>> Then copy all the tables named "sqlite_stat*" into a new database.
>
> That’s an interesting idea. I’ve been assuming that ANALYZE was dependent
> on the exact contents of the database, but if its results can be canned
> and applied to new databases, that’s a good optimization.
>
> I’m just a bit wary because this isn’t documented anywhere, and generally
> one is warned against writing to `sqlite_*` tables. I’d rather have some
> official blessing of this technique before I consider using it.
>
> —Jens

--
D. Richard Hipp
d...@sqlite.org
_______________________________________________
sqlite-users mailing list
email@example.com
http://mailinglists.sqlite.org/cgi-bin/mailman/listinfo/sqlite-users
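[As a footnote to the thread: the stat-copying technique quoted above can be sketched in a few lines of Python. Everything here is illustrative; an in-memory "prototype" stands in for a large development database built with representative data.]

```python
import sqlite3

SCHEMA = """
CREATE TABLE t1(a INTEGER PRIMARY KEY, b TEXT);
CREATE INDEX t1b ON t1(b);
"""

# Prototype database filled with typical data during development.
proto = sqlite3.connect(":memory:")
proto.executescript(SCHEMA)
proto.executemany("INSERT INTO t1(b) VALUES (?)",
                  [(str(i % 50),) for i in range(5000)])
proto.execute("ANALYZE")
canned_stats = proto.execute(
    "SELECT tbl, idx, stat FROM sqlite_stat1").fetchall()
proto.close()

# Newly created application database: same schema, no data yet.
# Running ANALYZE on the empty database creates the (empty) statistics
# tables, which are then filled with the canned prototype statistics.
new_db = sqlite3.connect(":memory:")
new_db.executescript(SCHEMA)
new_db.execute("ANALYZE")
new_db.executemany(
    "INSERT INTO sqlite_stat1(tbl, idx, stat) VALUES (?, ?, ?)",
    canned_stats)
new_db.commit()
copied = new_db.execute("SELECT tbl, idx FROM sqlite_stat1").fetchall()
print(copied)
```

In a real application the canned statistics would be shipped as constants (or a resource file) rather than regenerated, per the documentation section David quotes.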