Hi Gerald

We tested this when I was in Sierra Leone, and we found serious
problems with bandwidth when getting the data back to Sierra Leone.

So you are going to have to think carefully about when and how often to
sync.  Currently your database files are very small as you don't have much
data on your cloud server, but that will soon grow.  I suspect "at least
twice a day" may be unrealistic.

The way I typically do it is to first create an account on the backup
server.  Make sure that the account running your dhis instance can log in
to the backup server without a password, by creating an ssh key pair and
installing the public key on the backup server account.  Then you can
simply rsync the backups directory (eg /var/lib/dhis2/dhis/backups) to
a directory on the backup server using cron.   In fact, if you look in
/usr/bin/dhis2-backup you will see that the commands to do this are
already there, just commented out.  This would sync with the backup
server after taking the nightly backup.
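The steps above can be sketched roughly like this.  The hostname
backup.example.org, the account names, and the destination directory are
all placeholders; adjust them and the schedule to your own setup:

```shell
# On the production server, as the account running the dhis instance:
# generate a key pair with no passphrase so cron can use it non-interactively
ssh-keygen -t ed25519 -N "" -f ~/.ssh/id_ed25519

# install the public key on the backup server account (placeholder host)
ssh-copy-id backup@backup.example.org

# push the backups directory across; -a preserves attributes, -z compresses
rsync -avz /var/lib/dhis2/dhis/backups/ \
    backup@backup.example.org:/srv/dhis2-backups/

# crontab entry to run the sync after the nightly backup, eg at 03:30:
# 30 3 * * * rsync -avz /var/lib/dhis2/dhis/backups/ backup@backup.example.org:/srv/dhis2-backups/
```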

This simple (and slightly lazy) setup has worked fine, and continues to
work, in a number of places.  But there are a couple of reasons you might
want to do something different.

(i)  You might want to pull from the backup server rather than push to it,
particularly as the backup server might not be as reliably online as the
production server.  This requires a slight variation on the above, but
uses the same principle of creating an ssh key pair and letting rsync do
the work.
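The pull variation might look like this, run from the backup server
instead.  Again the hostnames and paths are placeholders; this time the
key pair lives on the backup server and its public key is installed on
the production server account:

```shell
# On the backup server: pull last night's backups from production.
# Assumes an ssh key pair whose public key is installed on the
# production account (dhis@production.example.org is a placeholder).
rsync -avz dhis@production.example.org:/var/lib/dhis2/dhis/backups/ \
    /srv/dhis2-backups/

# crontab entry on the backup server, timed after the nightly backup:
# 0 4 * * * rsync -avz dhis@production.example.org:/var/lib/dhis2/dhis/backups/ /srv/dhis2-backups/
```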

(ii) rsync is a really great and simple tool, but it is sadly quite slow.
If you are bandwidth stressed and your database is growing, it might not
be the best solution; it works fine when bandwidth is not a critical
issue.  The trouble is it doesn't really take into account the
incremental nature of the data, ie. you back up everything every time
(besides the ephemeral tables like analytics, aggregated etc).  In which
case you need to start thinking smarter, and maybe a little bit more
complicated.  One approach I have been considering (but not yet tried)
is to make a copy of the metadata export every night and then just pull
all the datavalues with a lastupdated greater than the last time you
pulled.  That is going to reduce the size of the backup quite
considerably.  In theory it is probably even possible to do this through
the api rather than directly through psql, which might be fine if you
choose the time of day/night carefully.  I'd probably do it with psql
at the backend.
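A rough sketch of that incremental idea with psql (untested, as I say;
the datavalue table and its lastupdated column are standard dhis2, but
the database name, file paths, and the way the last-sync timestamp is
stored are just placeholders):

```shell
#!/bin/sh
# Incremental datavalue export -- a sketch, not tested in production.
# Keeps the timestamp of the last successful export in a state file
# and pulls only the rows updated since then.

STATE=/var/lib/dhis2/dhis/lastsync        # hypothetical state file
OUT=/var/lib/dhis2/dhis/backups/datavalues-$(date +%F).csv

# default to the beginning of time on the first run
LAST=$(cat "$STATE" 2>/dev/null || echo "1970-01-01")
NOW=$(date -u +"%Y-%m-%d %H:%M:%S")

# export only datavalues changed since the last run
psql -d dhis2 -c \
  "\copy (SELECT * FROM datavalue WHERE lastupdated > '$LAST') TO '$OUT' CSV HEADER"

# record the new high-water mark only if the export succeeded
[ $? -eq 0 ] && echo "$NOW" > "$STATE"
```

The exported CSV could then be rsynced as above; being only a day's worth
of changes, it should be far smaller than a full dump.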

So there are a few options, the first being the simplest and also the
crudest.  Any other thoughts?

Cheers
Bob

On 18 December 2014 at 05:07, gerald thomas <gerald17...@gmail.com> wrote:
>
> Dear All,
> Sierra Leone wants to finally migrate to an online server (External
> server hosted outside the Ministry) but we would like to create a daily
> backup of that server locally in case anything goes wrong.
> My questions:
>
> 1.  We need help with a script that can sync between the
> External Server and the Local Server (at least twice a day)
>
> 2. Is there something we should know from past experiences about
> hosting servers on the cloud
>
> Please feel free to share anything and I will be grateful to learn new
> things about dhis2
>
> --
> Regards,
>
> Gerald
>
> _______________________________________________
> Mailing list: https://launchpad.net/~dhis2-devs
> Post to     : dhis2-d...@lists.launchpad.net
> Unsubscribe : https://launchpad.net/~dhis2-devs
> More help   : https://help.launchpad.net/ListHelp
>
_______________________________________________
Mailing list: https://launchpad.net/~dhis2-users
Post to     : dhis2-users@lists.launchpad.net
Unsubscribe : https://launchpad.net/~dhis2-users
More help   : https://help.launchpad.net/ListHelp