Hi!
I am using the new appcfg.py bulkloader to restore data to my SDK
datastore. The command I am issuing is as follows:
python appcfg.py upload_data --application=<appname> \
    --url=http://localhost:8080/remote_api \
    --filename backup.csv --batch_size=1 --num_threads=1
After uploading ~7K entities, it fails with the following error:
  File "C:\Program Files\Google\google_appengine\google\appengine\tools\adaptive_thread_pool.py", line 150, in WorkOnItems
    status, instruction = item.PerformWork(self.__thread_pool)
  File "C:\Program Files\Google\google_appengine\google\appengine\tools\bulkloader.py", line 693, in PerformWork
    transfer_time = self._TransferItem(thread_pool)
  File "C:\Program Files\Google\google_appengine\google\appengine\tools\bulkloader.py", line 850, in _TransferItem
    self.request_manager.PostEntities(self.content)
  File "C:\Program Files\Google\google_appengine\google\appengine\tools\bulkloader.py", line 1294, in PostEntities
    datastore.Put(entities)
  File "C:\Program Files\Google\google_appengine\google\appengine\api\datastore.py", line 282, in Put
    'datastore_v3', 'Put', req, datastore_pb.PutResponse(), rpc)
  File "C:\Program Files\Google\google_appengine\google\appengine\api\datastore.py", line 186, in _MakeSyncCall
    rpc.check_success()
  File "C:\Program Files\Google\google_appengine\google\appengine\api\apiproxy_stub_map.py", line 474, in check_success
    self.__rpc.CheckSuccess()
  File "C:\Program Files\Google\google_appengine\google\appengine\api\apiproxy_rpc.py", line 149, in _WaitImpl
    self.request, self.response)
  File "C:\Program Files\Google\google_appengine\google\appengine\ext\remote_api\remote_api_stub.py", line 223, in MakeSyncCall
    handler(request, response)
  File "C:\Program Files\Google\google_appengine\google\appengine\ext\remote_api\remote_api_stub.py", line 349, in _Dynamic_Put
    'datastore_v3', 'Put', put_request, put_response)
  File "C:\Program Files\Google\google_appengine\google\appengine\ext\remote_api\remote_api_stub.py", line 155, in MakeSyncCall
    self._MakeRealSyncCall(service, call, request, response)
  File "C:\Program Files\Google\google_appengine\google\appengine\ext\remote_api\remote_api_stub.py", line 175, in _MakeRealSyncCall
    raise pickle.loads(response_pb.exception().contents())
RequestTooLargeError: The request to API call datastore_v3.Put() was too large.
[INFO ] Unexpected thread death: Thread-3
[INFO ] An error occurred. Shutting down...
[ERROR ] Error in Thread-3: The request to API call datastore_v3.Put() was too large.
[INFO ] 6570 entites total, 6366 previously transferred
[INFO ] 0 entities (1621932 bytes) transferred in 126.5 seconds
[INFO ] Some entities not successfully transferred
I watched the Google I/O talk on data migration, and I have played
around with the batch size and number of threads to see if I could get
past this error. However, nothing I've tried has helped, and it seems
to me that a batch size and thread count of 1 should cause the fewest problems.
Is there any way to find out which entity it's getting stuck on?
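Since my batch size is already 1, I'm guessing a single oversized entity is blowing past the datastore's per-request size limit (roughly 1 MB, as I understand it). In case it's useful, here's a rough sketch of how I could scan the CSV for the offending row before re-running the upload (the filename and the 1 MB threshold are just my assumptions):

```python
import csv

def find_large_rows(path, limit=1000000):
    """Return (row_number, approximate_size) for CSV rows bigger than limit.

    The sum of the field lengths is only a lower bound on the serialized
    entity size, but it should be enough to spot an outlier.
    """
    offenders = []
    with open(path, newline="") as f:
        for rowno, row in enumerate(csv.reader(f), start=1):
            size = sum(len(field) for field in row)
            if size > limit:
                offenders.append((rowno, size))
    return offenders

if __name__ == "__main__":
    # Assumes the backup file from the upload_data command above.
    for rowno, size in find_large_rows("backup.csv"):
        print("row %d is roughly %d bytes" % (rowno, size))
```

If that turns up a row near the limit, I could split or trim that entity and retry.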
Thanks in advance.