Hello Shawn,

Thanks a lot for your valuable suggestions, and sorry for the delay in replying. Could you tell me some more details about the GAE Log and Remote APIs? Is there a link pointing to more details about using these APIs?
I came across another link for this purpose: http://gbayer.com/big-data/app-engine-datastore-how-to-efficiently-export-your-data/ . I guess you are referring to the first approach mentioned in this link, right? I tried out the new approach mentioned in the link: we basically have to schedule a backup job that copies the Datastore to Cloud Storage, and from there we can use tools like gsutil to download the data.

I was wondering if there is an automatic way to back up the Datastore to Cloud Storage. Do you know of any such method? I know we can use task queues, cron jobs, etc. I tried setting up a cron job, but it simply didn't work. Do let me know if you can figure out a way to automatically schedule a cron job or a task queue to back up the Datastore to Google Cloud Storage.

Thanks and Regards,
Rohith

On Friday, August 22, 2014 8:04:37 PM UTC+5:30, Shawn Lee wrote:
>
> Hi Rohith,
>
> We have an experimental idea of doing periodic backups in our app. It
> involves a combination of GAE APIs. These are the steps we used:
>
> 1) Identify the entities that need to be backed up, because downloading
> all entities would incur unnecessary costs. To do this, we created
> identifying log messages for creation/modification/deletion of the
> entities in our app. This allows us to look through the logs and
> retrieve entities that were recently modified.
>
> 2) Create a cron job on the local machine that looks through the app
> engine logs for, say, the past 5 minutes. The logs can be retrieved
> remotely using the GAE Log API.
>
> 3) Once the entities are identified, you can download them via the app
> engine's Remote API, which allows you to remotely access the Datastore
> and download the individual entities.
>
> 4) Save the data of these downloaded entities in .csv or any other
> format that you require.
>
> 5) The csv file can also be uploaded to a local host so that you can
> view it in a local application.
> This can be done using the bulkloader in the Python SDK, which can
> also be applied to Java applications.
>
> Hope this might be of some help. We are still experimenting with this
> idea, but we are able to periodically retrieve the entities over
> 5-minute intervals and upload them to a local machine. We can then view
> the data on the local server as though we were on the live server.
>
> Best regards,
> Shawn
>
> On Friday, 22 August 2014 01:54:14 UTC+8, Rohith D Vallam wrote:
>>
>> Hello,
>>
>> I have some data stored in the Google Datastore, in two entities. I
>> was wondering if there is a way to back up these Datastore entries
>> periodically to my local machines. I searched the internet and could
>> find some answers related to using task queues, cron jobs, etc., but
>> none of them offered a complete, working solution to this problem. It
>> would be great if someone could share their ideas about how to
>> periodically back up the Datastore to local machines.
>>
>> Thanks and Regards,
>> Rohith
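For the automated-backup question Rohith raises, App Engine's (experimental, 2014-era) Datastore Admin supported scheduled backups driven from cron.yaml: a cron entry hits the backup.create handler, which writes the backup to a Cloud Storage bucket. A minimal sketch, where `Entity1`, `Entity2`, and `your-backup-bucket` are placeholders for your own kinds and bucket:

```yaml
cron:
- description: daily automated Datastore backup to Cloud Storage
  url: /_ah/datastore_admin/backup.create?name=NightlyBackup&kind=Entity1&kind=Entity2&filesystem=gs&gs_bucket_name=your-backup-bucket
  schedule: every 24 hours
  target: ah-builtin-python-bundle
```

Note that Datastore Admin must be enabled for the app, and each kind to back up is listed as its own `kind=` query parameter. Once the backups land in the bucket, `gsutil cp -R gs://your-backup-bucket ./local-backups` (bucket name again a placeholder) pulls them down to the local machine.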

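Step 4 in Shawn's list (saving the downloaded entities in .csv format) can be sketched in plain Python. The entity dicts below are made-up stand-ins; in practice they would come from the Remote API, converted to plain dicts first:

```python
import csv

def entities_to_csv(entities, path):
    """Write a list of entity dicts to a CSV file.

    Datastore entities of the same kind may have differing properties,
    so the header row is the union of keys across all entities, and
    missing values are left blank.
    """
    if not entities:
        return
    fieldnames = sorted({key for entity in entities for key in entity})
    with open(path, "w") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(entities)

# Example usage with hypothetical entity data:
entities_to_csv(
    [{"name": "a", "modified": "2014-08-22"},
     {"name": "b", "count": 3}],
    "backup.csv")
```

`csv.DictWriter` fills in absent keys with an empty string by default, which keeps the rows aligned even when entities carry different property sets.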