Hello all,
I'm working on a project that reads GTFS archives as part of its functionality. When we import a particular data set with ~60,000 entries, my dev server locks up around the 27,000th entry with a java.lang.OutOfMemoryError (Java heap space). I'd like to know if anyone has successfully increased their heap space. I searched and found general instructions, but after implementing them I'm still failing around the same entry, which leads me to believe that my command-line args are not taking effect. Here is what I have so far:
appengine-sdk-java/bin/dev_appserver.sh:
#!/bin/bash
# Launches the development AppServer
[ -z "${DEBUG}" ] || set -x # trace if $DEBUG env. var. is non-empty
SDK_BIN=`dirname $0 | sed -e "s#^\\([^/]\\)#${PWD}/\\1#"` # sed makes the path absolute
SDK_LIB=$SDK_BIN/../lib
SDK_CONFIG=$SDK_BIN/../config/sdk
java -Xms1024m -Xmx1024m -ea -cp "$SDK_LIB/appengine-tools-api.jar" \
    com.google.appengine.tools.KickStart \
    com.google.appengine.tools.development.DevAppServerMain "$@"
Is this correct?
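One thing I plan to try is checking from inside a servlet whether the heap setting actually took. This is just a quick diagnostic sketch (the servlet name is made up, not part of our app):

    import java.io.IOException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical diagnostic servlet: reports the max heap of the JVM
    // that is actually serving requests.
    public class HeapCheckServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            // maxMemory() reflects -Xmx for the JVM this code runs in
            long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
            resp.setContentType("text/plain");
            resp.getWriter().println("Max heap: " + maxMb + " MB");
        }
    }

If that reports the JVM default rather than ~1024 MB, the flags aren't reaching the server JVM. I've also read that KickStart spawns a separate JVM for DevAppServerMain, so -Xmx on the outer java command may not propagate; if I understand the docs right, it would need to be passed through instead, e.g. dev_appserver.sh --jvm_flag=-Xmx1024m <war-dir>. Can anyone confirm?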
Also, the way we get our data is through a custom import function that chunks the data into groups of 20, processes each chunk, and stores it in the datastore. We are using only one instance of the persistence manager, per the recommendation to use a singleton class. Each batch hits a servlet on our app and then does its work. Should we close the persistence manager after every batch is completed? I'm wondering if keeping it open is a memory leak.
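If I'm reading the App Engine docs right, the singleton recommendation applies to the PersistenceManagerFactory, with a short-lived PersistenceManager opened and closed per request. Something like the sketch below is what I have in mind (GtfsEntry and storeBatch are placeholders for our real types):

    import java.util.List;
    import javax.jdo.JDOHelper;
    import javax.jdo.PersistenceManager;
    import javax.jdo.PersistenceManagerFactory;

    public final class PMF {
        // One factory per app: constructing a PersistenceManagerFactory
        // is expensive, so it is the thing to keep as a singleton.
        private static final PersistenceManagerFactory pmf =
            JDOHelper.getPersistenceManagerFactory("transactions-optional");

        private PMF() {}

        public static PersistenceManagerFactory get() {
            return pmf;
        }
    }

    // In the batch servlet: a fresh PM per batch, closed in finally so
    // its level-1 cache of managed objects can be garbage collected.
    void storeBatch(List<GtfsEntry> batch) {
        PersistenceManager pm = PMF.get().getPersistenceManager();
        try {
            pm.makePersistentAll(batch); // persist the whole chunk at once
        } finally {
            pm.close();
        }
    }

If we never close the PM, every persisted object presumably stays referenced in its cache, which would explain memory growing across batches. Does that sound right?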
Thanks,
-Mike