Hi,

I'm trying to upload a large data set (8 million entities). I'm
testing the upload like this:

C:\Users\magnus\Desktop\cities>appcfg.py upload_data
    --config_file=cityloader.py --filename=allCountriescsv.csv
    --kind=City c:\Users\magnus\Code\blirdetsol.se
    --db_filename=progress.sql --dry_run --batch_size=5 --rps_limit=10

However, after processing a number of lines I get:

Application: xxxxxx; version: xxx.
Uploading data records.
[INFO    ] Logging to bulkloader-log-20100305.084929
[INFO    ] Throttling transfers:
[INFO    ] Bandwidth: 250000 bytes/second
[INFO    ] HTTP connections: 8/second
[INFO    ] Entities inserted/fetched/modified: 10/second
[INFO    ] Opening database: progress.sql
[INFO    ] Running in dry run mode, skipping remote_api setup
[INFO    ] Connecting to xxxxxxx/remote_api
[INFO    ] Starting import; maximum 5 entities per post
......................................................................................................................................................
[ERROR   ] [Thread-11] ProgressTrackerThread:
Traceback (most recent call last):
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\bulkloader.py", line 1431, in run
    self.PerformWork()
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\bulkloader.py", line 2202, in PerformWork
    self.UpdateProgress(item)
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\bulkloader.py", line 2238, in UpdateProgress
    self.db.UpdateState(item.progress_key, item.state)
  File "C:\Program Files (x86)\Google\google_appengine\google\appengine\tools\bulkloader.py", line 2015, in UpdateState
    (new_state, key))
OperationalError: unable to open database file
[INFO    ] Unexpected thread death: Thread-11
[INFO    ] An error occurred. Shutting down...
[WARNING ] data source thread hung while trying to exit

Any ideas on this problem? Can't the SQLite progress database handle
this amount of data?
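In case it helps narrow things down, here's a quick standalone check I put together (file name `probe_sqlite.py` is just my own, not part of the SDK) to see whether SQLite itself can open a database file at a given path. "unable to open database file" usually points at the path or directory rather than at the data volume:

```python
import os
import sqlite3
import tempfile

def can_open_sqlite(path):
    """Try to open (or create) a SQLite database at `path`.

    Returns True if SQLite can open the file, False if it raises the
    same OperationalError the bulkloader logged.
    """
    try:
        conn = sqlite3.connect(path)
        # Force an actual write so a read-only location also fails here.
        conn.execute("CREATE TABLE IF NOT EXISTS probe (id INTEGER)")
        conn.close()
        return True
    except sqlite3.OperationalError:
        return False

if __name__ == "__main__":
    # A writable temp directory should succeed...
    print(can_open_sqlite(os.path.join(tempfile.gettempdir(), "probe.sql")))
    # ...while a missing directory reproduces the bulkloader's error.
    print(can_open_sqlite(os.path.join("no_such_dir", "probe.sql")))
```

Running it from the same directory as the upload (with `progress.sql` as the path) should tell me whether the problem is file access rather than the size of the data.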

//Magnus

-- 
You received this message because you are subscribed to the Google Groups
"Google App Engine" group.
