I have been working on external replication for PostgreSQL 9.2 for a little while
(with too many interruptions blocking my progress!).

Does anyone know of a good utility to aggressively analyze
and recover PostgreSQL databases?

The standard reply I see is "make regular backups", but that still means the
worst-case data loss is the full backup interval: everything written since
the last backup is gone.

Our MariaDB/MySQL/XtraDB/InnoDB friends have aria_chk and some other tools
to "recover" as much as possible up to the moment of failure.

While full replication is the ultimate safeguard, in a "split brain" situation
I could still see a hardware failure causing loss of data back to the last
replication exchange or the last backup.

After a data crash, I want the recovery tool to HELP me recover as much data as possible and get back into operation. What I do not want is a pile of manual command-line file copies and deletes to "guess" my way back to an operational state (some data loss is inevitable).

I could make a daily snapshot of the system catalogs to assist such a recovery
tool in restoring the database.
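
Roughly what I have in mind (just a sketch in Python with psycopg2; the
connection string, output directory, and catalog list are placeholders, not a
worked-out design):

import datetime
import os

import psycopg2

# Catalogs a low-level recovery tool would most likely need, e.g. to map
# relfilenodes back to tables and columns back to their types.
CATALOGS = ["pg_database", "pg_namespace", "pg_class", "pg_attribute", "pg_type"]

def snapshot_catalogs(dsn, outdir):
    """Dump each catalog to a CSV file in a date-stamped directory."""
    target = os.path.join(outdir, datetime.date.today().isoformat())
    os.makedirs(target)
    conn = psycopg2.connect(dsn)
    try:
        cur = conn.cursor()
        for cat in CATALOGS:
            with open(os.path.join(target, cat + ".csv"), "w") as f:
                # Names come from the fixed list above, so plain string
                # interpolation is safe here.
                cur.copy_expert(
                    "COPY (SELECT * FROM %s) TO STDOUT WITH CSV HEADER" % cat, f)
    finally:
        conn.close()

if __name__ == "__main__":
    snapshot_catalogs("dbname=mydb", "/var/backups/pg_catalog_snapshots")

Run from cron once a day, this would at least give a recovery tool yesterday's
relfilenode-to-table mapping to work from after a crash.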

Who has ideas on this?
