I have two tables in the database that are used almost every time
someone visits the pages. The first is a database-side parameter list
that stores parameters from session to session. The other is a table
that handles the posting activity of all the rooms and chatters.
The first is required for all authentication with the system; when
entries are missing, you are challenged by the system to prove your
identity. This table is based on a randomized order (its unique number
changes pseudo-randomly), and it sees a reduction in entries every hour
on the hour to keep its information fresh.
The other table follows a sequential order and carries more columns of
information. However, this table clears its entries nightly, and with
current settings it deletes roughly a day's traffic, sitting at about
50K rows.
The difference is as follows: without the use of VACUUM every hour, the
parameter table still performs very well, showing no loss in service or
degradation. Since people authenticate more than they post, it is safe
to assume it removes more rows daily than the posting table does.
The posting table, on the other hand, often drags the system down in
performance when a day's maintenance has been skipped, and that
maintenance includes running VACUUM ANALYZE and checking plans with
EXPLAIN.
This seems to indicate that the daily delete is actually a very wise
step to take, even if the information itself is not needed for very
long.
A VACUUM FULL will correct the issue, but it puts the site out of
commission for roughly 20 minutes while the drive crunches the
information.
My question is: should the purging of rows be done more often than once
a day for both tables? Is that why performance seems to take a hit,
because too many rows were purged between runs for VACUUM to keep track
of accurately?
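To make the question concrete, here is a rough sketch of the more
aggressive schedule I am wondering about. The table names ("posting",
"parameters"), the "posted" timestamp column, and the "forum" database
are placeholders for illustration, not my actual schema:

```shell
# Hypothetical crontab sketch; all object names below are placeholders.

# Hourly: plain VACUUM ANALYZE on the high-churn posting table.
# Unlike VACUUM FULL, plain VACUUM does not take an exclusive lock,
# so the site stays up while dead-row space is marked reusable.
0 * * * *  psql -d forum -c "VACUUM ANALYZE posting;"

# Hourly, after the parameter table's purge, same treatment there.
5 * * * *  psql -d forum -c "VACUUM ANALYZE parameters;"

# Nightly at 03:00: delete the day's traffic, then vacuum so the
# planner's statistics reflect the much smaller table.
0 3 * * *  psql -d forum -c "DELETE FROM posting WHERE posted < now() - interval '1 day';"
10 3 * * * psql -d forum -c "VACUUM ANALYZE posting;"
```

The idea would be that frequent cheap vacuums keep the free-space
bookkeeping current, so the 20-minute VACUUM FULL is no longer needed,
but I would appreciate confirmation that this is the right direction.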
Creator/Designer Ethereal Realms