On Fri, Sep 4, 2009 at 10:37 AM, Russell Keith-Magee <[email protected]> wrote:

>
> On Fri, Sep 4, 2009 at 4:57 PM, Joshua Russo <[email protected]> wrote:
> > My goal here is to create a backup and recovery scheme, where recovery is
> > possible on any computer.
> > I've been performing incremental updates to an application that people
> > have started to use. The incremental updates seem to have created a
> > problem for the dump and load data functions when trying to reload into
> > a fresh database. I tried to use dumpdata to create an initial .json
> > file, but I received a duplication error on load, I think from the unique
> > index (not the primary key) on the content type. I believe this is
> > because the tables (and thus content types) are created in a different
> > order when doing a syncdb from scratch, as opposed to incrementally on
> > the old production machine.
>
> > Firstly, does anyone have a more elegant solution to the dump and load
> > data problem specifically?
>
> You've pretty much correctly diagnosed the problem that you are
> experiencing. It's well known to the core, and is logged as #7052.
>
> I've had a solution in mind for a while, but never got around to
> implementing it. However, just before v1.1 was released, a patch was
> offered that implements my proposed solution. I expect this patch (or
> something like it) will be in v1.2.
>
> However, that said...
>
> > Second, what do you use for a data backup and recovery scheme for your
> > Django data?
>
> I'm unclear why normal MySQL backup tools aren't the obvious first
> solution here. Yes, Django does provide data dumping and loading
> tools, but they are primarily designed for small fixtures for tests
> and initial database setup.
>
> I'm not saying that loaddata and dumpdata _can't_ be used on large
> data sets, but when you're talking about backing up the entire database,
> the database is going to be able to provide much better tools than
> Django will. Trying to build a backup and recovery system without
> using the tools closest to the data seems like a lot of extra work to
> me.
>
> You also appear to be conflating two separate problems. Talking about
> the fragility of fixtures when the database changes isn't a problem
> specific to fixtures - any data dumping scheme will experience the
> same problems. Some data loading schemes might be more robust to
> simple changes, but for anything other than trivial changes, you're
> going to need a much more sophisticated approach than "dump the data
> and reload it". This is a schema evolution issue, not a backup issue.
>

The reason I was looking at dumpdata instead of a MySQL backup is that it
was more obvious to automate. I'm going to take a closer look at the MySQL
backup though.
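
To convince myself that mysqldump is just as easy to automate, I put
together a rough sketch of the nightly job I have in mind. This is only an
illustration for my setup; the database name, credentials, and backup
directory are placeholders, and the password would really belong in a
~/.my.cnf file rather than in the script:

    #!/usr/bin/env python
    # Nightly backup sketch: run mysqldump and write a timestamped .sql file.
    # DB_NAME, DB_USER, DB_PASSWORD and BACKUP_DIR are placeholders.
    import os
    import subprocess
    from datetime import datetime

    DB_NAME = "myapp"
    DB_USER = "myapp_user"
    DB_PASSWORD = "secret"            # better kept in ~/.my.cnf
    BACKUP_DIR = "/var/backups/myapp"

    def backup():
        if not os.path.isdir(BACKUP_DIR):
            os.makedirs(BACKUP_DIR)
        stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
        outfile = os.path.join(BACKUP_DIR, "%s-%s.sql" % (DB_NAME, stamp))
        out = open(outfile, "wb")
        try:
            # mysqldump writes the full schema and data as SQL on stdout
            subprocess.check_call(
                ["mysqldump", "-u", DB_USER, "-p" + DB_PASSWORD, DB_NAME],
                stdout=out)
        finally:
            out.close()
        return outfile

    if __name__ == "__main__":
        print(backup())

Run from cron, that gives a dated .sql file each day without anyone having
to remember to do it.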

I have a special case. It's a small app on a computer connected (via
crossover) to only one other. The backup process will include copying onto a
pen drive. There is also a strong likelihood that if they do need to restore
the database, I won't be here. I'm a volunteer in the Peace Corps. The app is
only supposed to be a stopgap until the national government brings in their
own app, but that depends on other factors that could potentially take years.
Thus I need to create a simple yet foolproof process for "disaster recovery".

That said, I see that my only real option is the MySQL backup and restore.
This is how I ended up moving the data anyway.
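
For the pen-drive step, the idea is just to copy the newest dump onto the
mounted drive, with the restore command left in the file as a note for
whoever is here after me. Again, this is only a sketch; the backup directory
and mount point are placeholders for my machine:

    #!/usr/bin/env python
    # Copy the newest .sql dump onto the pen drive.
    # Restoring on another machine is then just:
    #   mysql -u <user> -p <dbname> < <backup-file>.sql
    # BACKUP_DIR and PEN_DRIVE are placeholders for my setup.
    import glob
    import os
    import shutil

    BACKUP_DIR = "/var/backups/myapp"   # where the nightly dumps land
    PEN_DRIVE = "/media/pendrive"       # mount point of the pen drive

    def copy_latest_dump():
        dumps = sorted(glob.glob(os.path.join(BACKUP_DIR, "*.sql")))
        if not dumps:
            raise RuntimeError("no dumps found in %s" % BACKUP_DIR)
        latest = dumps[-1]              # timestamped names sort oldest-to-newest
        shutil.copy2(latest, PEN_DRIVE)
        return latest

    if __name__ == "__main__":
        print(copy_latest_dump())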
