There was once a dump-and-import mode of appcfg.py, perhaps deprecated by now, that did the whole thing in one go.

I have been using the updated approach with a bulkloader.yaml, and I wrote a shell script with a list of my kinds and a for loop to handle each one.
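The loop can be sketched roughly like this. This is a minimal dry-run sketch, not my exact script: the app ID, the kind list, and the remote_api URL are placeholders you would substitute, and the `echo` keeps it from actually running until you remove it.

```shell
#!/bin/sh
# Dry-run sketch: dump each kind to its own CSV with appcfg.py.
# APP_ID and KINDS are placeholders -- substitute your own values.
APP_ID="myapp"
KINDS="auth_group auth_user my_model"

for KIND in $KINDS; do
  # Remove the leading "echo" to actually execute the download.
  echo appcfg.py download_data \
    --config_file=bulkloader.yaml \
    --kind="$KIND" \
    --filename="$KIND.csv" \
    --url="http://$APP_ID.appspot.com/_ah/remote_api"
done
```

One file per kind is still one-by-one, but the loop makes it a single command to run.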

In my bulkloader.yaml I convert keys to strings on export, then apply the datastore.Key method on import so that IDs are preserved. As an example, here is my yaml for auth_group:

- kind: auth_group
  connector: csv
  connector_options:
    # TODO: Add connector options here--these are specific to each connector.
  property_map:
    - property: __key__
      external_name: key
      export_transform: str
      import_transform: datastore.Key

    - property: description
      external_name: description
      # Type: Text Stats: 6 properties of this type in this kind.
      import_transform: db.Text

    - property: role
      external_name: role
      # Type: String Stats: 6 properties of this type in this kind.
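For the return trip, a similar sketch can re-import each CSV into the local dev server through its remote_api endpoint. Again this is a hedged dry-run example, not my literal script: the port, the kind list, and the `echo` guard are all assumptions to adapt.

```shell
#!/bin/sh
# Dry-run sketch: upload each previously exported CSV into the local
# dev_appserver. KINDS is a placeholder list -- use your own kinds.
KINDS="auth_group auth_user my_model"

for KIND in $KINDS; do
  # Remove the leading "echo" to actually execute the upload.
  echo appcfg.py upload_data \
    --config_file=bulkloader.yaml \
    --kind="$KIND" \
    --filename="$KIND.csv" \
    --url="http://localhost:8080/_ah/remote_api"
done
```

Because the yaml round-trips `__key__` through str on export and datastore.Key on import, the re-imported entities keep their original IDs, so references between kinds survive.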


On Friday, March 16, 2012 10:54:09 AM UTC-7, fabien wrote:
>
> Hello,
>
> I wish to get a local copy of a running GAE database, so that I can 
> experiment on my PC without messing up actual user data. I can't figure out 
> the proper way to do that. If I export databases, either through appadmin 
> or "./appcfg download_data":
> - it seems that I need to export each table individually; that's 
> surprisingly tedious for such a common task as backing up everything
> - when I import those tables back, also one by one, ids are discarded, so 
> all relationships are lost
>
> Is there a (1) simple and (2) working way to do this?
>
