Hi

I have been grappling with a problem for some time and would appreciate
some advice.  We have a web-based public health application with a
PostgreSQL backing store, designed for use by public sector ministries
of health in a significant number of African, Asian and other countries
(http://dhis2.org).  "Traditionally" it is hosted as a national data
warehouse application, with users dispersed among district offices and
sometimes health facilities around the country.

Particularly in many countries in Africa, the public sector typically has
limited data centre infrastructure for reliably hosting the application
in-house, so a good number have opted to use a global cloud service
(infrastructure as a service) to ensure maximum availability of the
application.  Others have managed to make use of in-country resources
such as national ISPs and mobile companies.  There are many cost-benefit
and governance considerations behind these decisions which I don't need
to go into here.

While ministries have been prepared to do this, there are important
reasons to ensure that a backup of the database is also maintained within
the ministry.  So we attempt to pull the nightly snapshot backup from the
hosted database each night.  In the past I have attempted this somewhat
simplistically with rsync over ssh, but it is a very inefficient
approach, particularly over weak internet connections.

What are people's thoughts on a more optimal solution?  I would like to
use a more incremental approach to replication.  This does not have to be
"live" replication; triggering asynchronously once every 24 hours is
sufficient.  Also, only a subset of tables is required (the rest consist
of generated data).
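
Something along the following lines is the kind of approach I have in
mind: dump only the needed tables, keep the dump uncompressed on the
server so that successive dumps differ only where the data actually
changed, then pull it with rsync --inplace so only the changed blocks
cross the wire.  Table names, user and paths are illustrative:

    # on the hosted server: dump only the tables we need, in plain format
    pg_dump -U dhis -Fp -t datavalue -t organisationunit -t dataelement \
        dhis2 > /var/backups/dhis2/partial.sql

    # from the ministry side: transfer only the blocks that changed
    rsync -avz --inplace -e ssh \
        backup@db.example.org:/var/backups/dhis2/partial.sql \
        /srv/backups/dhis2/

But I suspect there are better tools for this, hence the question.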

Appreciate any advice.

Regards
Bob
