Hi,

I'm saving and loading snapshots of a database (controlled and
accessed via Django models) to and from source control. I've written
routines using django.core.management.call_command and the
serialization framework to dump and load the database contents
periodically, usually just before a commit to the repository or when a
working copy is checked out or brought up to date.

While this is almost good enough for my needs, I can't help wondering
whether there is a more elegant way to keep changes to the database in
sync with a (non-binary) dumped copy in the working copy. Something
lazier and more "pay as you go", without the lengthy start-up and
shutdown times, would be great, but I'm not quite sure of the right way
to go about it.

The emphasis for this app is on data integrity rather than raw speed
(the data sets are reasonably small; the biggest tables are only a
couple of thousand rows at most).

Any ideas or suggestions?

David Moss

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Django users" group.
To post to this group, send email to django-users@googlegroups.com
To unsubscribe from this group, send email to 
django-users+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/django-users?hl=en
-~----------~----~----~----~------~----~------~--~---
