As long as you have the remote_api set up for your app, these two commands
should do what you want:

# python2.5 bulkloader.py --dump --num_threads=1 \
    --url=http://localhost:8080/remote_api \
    --filename=myApp.dump --app_id=myApp \
    --rps_limit=1000 --http_limit=1000

# python2.5 bulkloader.py --restore --batch_size=100 \
    --url=http://myApp.appspot.com/remote_api \
    --filename=myApp.dump

Of course, you need to replace "myApp" with your actual App ID.

See here for info on setting up the remote_api:

http://code.google.com/appengine/articles/remote_api.html
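For the Python runtime, enabling remote_api is usually just one handler stanza in app.yaml. A sketch (the script path shown is the standard SDK location for the handler, but double-check the article above against your SDK version):

```yaml
# app.yaml -- expose /remote_api, restricted to app admins.
# $PYTHON_LIB expands to the SDK's library directory at deploy time.
handlers:
- url: /remote_api
  script: $PYTHON_LIB/google/appengine/ext/remote_api/handler.py
  login: admin
```

The `login: admin` line matters: without it, anyone who finds the URL can read and write your datastore.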

You might need to experiment with batch_size, rps_limit, and http_limit to
find optimal settings.

Your entities will be loaded into the live datastore with the same key_names
they had in the local datastore, so any existing entities with matching
key_names will be overwritten.

On Mon, Mar 15, 2010 at 12:22 AM, JoeM <[email protected]> wrote:

> Hi All,
>
> Can I take the datastore file that results from hosting a Google
> appengine server on my local machine and directly insert that data
> into Google's real online datastore by an operation that is simple and
> direct ?
>
> --
> You received this message because you are subscribed to the Google Groups
> "Google App Engine" group.
> To post to this group, send email to [email protected].
> To unsubscribe from this group, send email to
> [email protected].
> For more options, visit this group at
> http://groups.google.com/group/google-appengine?hl=en.
>
>
