To be honest, it isn't my DB, but I do have access to it... Either way, do I need to change autovacuum_analyze_scale_factor/autovacuum_analyze_threshold on the original table? The value will be too high/low for one of the two tables. For example, suppose my original table has 30,000 rows and its TOAST table has 100,000,000 rows. I want to analyze after every 50K changed records in the TOAST table (by the way, does that sound reasonable?), which is a scale factor of 0.0005 (0.05% of 100M). With that value, every 0.0005 × 30,000 = 15 updates/deletes on the original table would trigger an analyze of the original table, which is very often... Doesn't that seem a bit problematic?
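One way around the shared-setting problem described above is PostgreSQL's per-table storage parameters, which override the global autovacuum GUCs for a single table. The autoanalyze trigger condition is `n_mod_since_analyze > autovacuum_analyze_threshold + autovacuum_analyze_scale_factor × reltuples`, so setting the scale factor to zero leaves a fixed row-count threshold. A minimal sketch, assuming a hypothetical table name `big_table`:

```sql
-- Per-table override: analyze big_table after a fixed 50K row changes,
-- regardless of table size, without touching any other table's settings.
-- Trigger: n_mod_since_analyze >
--   autovacuum_analyze_threshold + autovacuum_analyze_scale_factor * reltuples
ALTER TABLE big_table SET (
    autovacuum_analyze_scale_factor = 0,
    autovacuum_analyze_threshold    = 50000
);
```

This decouples the two tables' settings, so the scale factor chosen for the large table no longer forces frequent analyzes of the small one.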
On Wed, Feb 13, 2019 at 18:13, Alvaro Herrera <alvhe...@2ndquadrant.com> wrote:

> On 2019-Feb-13, Mariel Cherkassky wrote:
>
> > Hey,
> > I have a very big toasted table in my db (9.2.5).
>
> Six years of bugfixes missing there... you need to think about an update.
>
> > Autovacuum doesn't gather statistics on it because the
> > analyze_scale/threshold are default, and as a result autoanalyze is
> > never run and the statistics are wrong:
>
> analyze doesn't process toast tables anyway.
>
> I think the best you could do is manually vacuum this table.
>
> --
> Álvaro Herrera    https://www.2ndQuadrant.com/
> PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services
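For the manual vacuum Alvaro suggests, note that a VACUUM of the main table also processes its associated TOAST table. A sketch, again assuming the hypothetical table name `big_table`:

```sql
-- VACUUM on the main table also vacuums its TOAST table;
-- VERBOSE prints per-table activity, including the TOAST relation.
VACUUM (VERBOSE) big_table;

-- To target the TOAST table directly (superuser), look up its name first:
SELECT reltoastrelid::regclass
FROM pg_class
WHERE relname = 'big_table';
-- then, e.g.:  VACUUM pg_toast.pg_toast_<oid>;
```

The parenthesized VACUUM option syntax is available in 9.2, the version mentioned in the thread.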