Yes, this is what I have read on the net too. But my tests show that without 
the sqlite_stat* tables, my queries are ridiculously slow (probably due to 
unwanted table scans, etc.).
Real-life data... can't I simply take my real-life database and extract the 
data from the sqlite_stat* tables?
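Something like this is what I have in mind (just a sketch; customer.db and 
template.db are placeholder names, and I'm assuming sqlite_stat1 accepts 
ordinary DELETE/INSERT statements, which I believe it does):

  -- in the sqlite3 shell, opened on a copy of the real-life database:
  ANALYZE;                              -- rebuild statistics from real data
  ATTACH 'template.db' AS t;            -- the empty database I ship
  ANALYZE t;                            -- ensure t.sqlite_stat1 exists
  DELETE FROM t.sqlite_stat1;           -- clear whatever ANALYZE put there
  INSERT INTO t.sqlite_stat1
       SELECT * FROM main.sqlite_stat1; -- copy the real-life statistics over
  DETACH t;

A fresh connection to template.db should then pick the copied statistics up 
when it loads the schema.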
 
Btw, this command produces nothing even though the table does contain several 
rows: ".dump sqlite_stat2"

Thanks
 
> From: [email protected]
> Date: Mon, 7 Feb 2011 16:44:00 +0000
> To: [email protected]
> Subject: Re: [sqlite] Regarding "Manual Control Of Query Plans"
> 
> 
> On 7 Feb 2011, at 4:38pm, Sven L wrote:
> 
> > So, with this in mind, it makes sense to precompute the sqlite_stat* 
> > tables. Right?
> 
> Which you do by running ANALYZE, but since it needs real-life data to work on 
> there's no point doing it until your customer has put some data in. I don't 
> write this type of application any more, but I might put it in a maintenance 
> routine -- some obscure menu option near the Config Preferences or something. 
> Run it as part of your yearly maintenance procedure, after you've run 'PRAGMA 
> integrity_check'.
> 
> Simon.
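PS: if I go the maintenance-routine route, I take it the yearly job boils down 
to something like this (run from the application or the sqlite3 shell):

  PRAGMA integrity_check;   -- verify the database first
  ANALYZE;                  -- then refresh the statistics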
                                          