I suggest

1) Use the backend database's built-in backup utilities (such as
PostgreSQL's pg_dump), using cron to tar.gz the files and put them in
their respective locations,

if available; otherwise:

2) Use web2py cron with export_to_csv_file, then tarfile to compress,
and then save the file in its correct location, all from within
web2py.
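A minimal sketch of option 1 as a crontab entry (the database name
"mydb", the backup directory, and the schedule are all illustrative
assumptions; note that % must be escaped as \% inside crontab):

```shell
# Nightly at 02:30: dump the database and gzip it into the backup
# directory, stamping the file with the date (crontab entry).
30 2 * * * pg_dump mydb | gzip > /var/backups/pg/mydb-$(date +\%Y\%m\%d).sql.gz
```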
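And a sketch of option 2, wrapping the export-and-compress step in a
helper you could call from a web2py cron task. The helper takes any
callable that writes CSV to an open file object; in web2py that would
be db.export_to_csv_file (the function name and paths here are my own
illustration, not web2py API):

```python
import os
import tarfile
import time

def backup_to_targz(export_csv, backup_dir):
    """Write a CSV dump via export_csv(file_object), then tar.gz it.

    export_csv is any callable that writes CSV to an open file object;
    in web2py this would be db.export_to_csv_file.
    Returns the path of the created archive.
    """
    stamp = time.strftime("%Y%m%d-%H%M%S")
    csv_path = os.path.join(backup_dir, "db-%s.csv" % stamp)
    tar_path = os.path.join(backup_dir, "db-%s.tar.gz" % stamp)
    with open(csv_path, "w") as f:
        export_csv(f)  # e.g. db.export_to_csv_file(f) in a web2py model
    with tarfile.open(tar_path, "w:gz") as tar:
        tar.add(csv_path, arcname=os.path.basename(csv_path))
    os.remove(csv_path)  # keep only the compressed copy
    return tar_path
```

You could then schedule it from the application's cron file
(applications/yourapp/cron/crontab) so the whole backup runs inside
the web2py environment.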

On Dec 7, 3:52 pm, Thadeus Burgess <[email protected]> wrote:
> What would the preferred method of data backup be?
>
> * Use web2py cron, using export_to_csv_file, and then tarfile to
> compress and then save the file in its correct location, all from
> within web2py?
> * Use the backend database built-in backup utilities (such as postgres
> pg_dump), using cron to tar.gz the files and put them in their
> respective locations.
> * Use a filesystem level backup, backing up the actual database files on disk.
> * Use online backup with PITR (as noted in the postgres manual:
> http://www.postgresql.org/docs/8.1/static/backup-online.html)
>
> -Thadeus

--

You received this message because you are subscribed to the Google Groups "web2py-users" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to [email protected].
For more options, visit this group at http://groups.google.com/group/web2py?hl=en.

