I have made many attempts to import a large set of entities
(1,000,000) from a .csv file (14 MB) into the local datastore of the
development server.
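
For reference, the upload is done with the bulkloader (appcfg.py
upload_data from the Python SDK) against the remote_api servlet of the
local server, roughly along these lines; kind name, file names, port
and servlet path here are placeholders, the exact invocation in my
setup differs:

  appcfg.py upload_data \
    --config_file=bulkloader.yaml \
    --filename=entities.csv \
    --kind=Word \
    --url=http://localhost:8888/remote_api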

All entities are of one very simple kind which has just one string
property. The values are rather small too:

key name values: 1 - 1000000, e.g. "1000"
property values: strings with exactly 6 characters, e.g. "qttrqg"
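
To make this concrete, each entity looks roughly like the one created
below with the low-level datastore API (the kind and property names
are placeholders, not necessarily the ones I actually use):

  import com.google.appengine.api.datastore.DatastoreService;
  import com.google.appengine.api.datastore.DatastoreServiceFactory;
  import com.google.appengine.api.datastore.Entity;

  public class SampleEntity {
      // One imported entity: key name "1000" plus a single
      // string property with a 6-character value.
      public static void storeSample() {
          DatastoreService datastore =
              DatastoreServiceFactory.getDatastoreService();
          Entity entity = new Entity("Word", "1000");
          entity.setProperty("value", "qttrqg");
          datastore.put(entity);
      }
  }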

After the bulkloader has been running for some time, info messages
(describing errors) appear on the command line:
[INFO    ] [Thread-9] Backing off due to errors: 1.0 seconds

There are no further messages in the bulkloader log explaining those
"errors", nor does the development server output any error message.

While searching for a clue I noticed that the development server's JVM
was using over 512 MB of memory, with consumption rising rapidly while
the bulkloader runs. After raising the heap limit (VM argument -Xmx)
first to 1024 MB and then to 2 GB, the same errors still appear, just
later.
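
For reference, the limit was raised via the VM arguments of the dev
server's launch configuration in Eclipse, e.g.:

  -Xmx2048m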

In my last attempt the errors started when the JVM was using about 1.4
GB of memory.
Some bulkloader numbers after stopping the server:
[INFO    ] 406400 entities (37193549 bytes) transferred in 1291.9
seconds
The local datastore file (local_db.bin) has a size of 117 MB.

Can somebody confirm this behaviour? Is the development server simply
not able to handle this amount of data?

Google App Engine Java SDK 1.4.2.v201102111811, running the server
inside Eclipse.
