On Sun, Mar 11, 2012 at 09:26:44AM +0000, Barbie wrote:
> Andreas has already alerted me to this, but even recreating the SQLite
> DB from scratch has the same problem. The only thing I can think of is
> that the DB is so big now that SQLite has problems storing the data.
>
> This is why I want to look at an alternative method of getting at the
> data. I'm going to look at creating an API that returns the records in
> CSV or JSON format for a given range of IDs. This will allow you to
> maintain your own DB as you wish, rather than have to continually
> download several Gig repeatedly.
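[For anyone following along: the range-of-IDs approach Barbie describes could be consumed incrementally, along these lines. This is only a sketch under assumptions — the endpoint, batch size, and record shape are hypothetical, since no such API exists yet; it just shows the "resume from the last ID you have" bookkeeping.]

```python
# Hypothetical sketch of mirroring reports via an API that returns
# records for a given ID range. Endpoint, batch size, and the JSON
# record shape are all assumptions, not an existing interface.
import json

def id_ranges(last_seen, newest, batch=2500):
    """Yield (start, end) ID ranges covering last_seen+1 .. newest."""
    start = last_seen + 1
    while start <= newest:
        end = min(start + batch - 1, newest)
        yield (start, end)
        start = end + 1

def merge_batch(local_db, payload):
    """Merge one JSON batch (a list of records with an "id" key)
    into a local store keyed by report ID."""
    for record in json.loads(payload):
        local_db[record["id"]] = record
    return local_db

# Example: resume from report 10, catch up to report 10007 in batches.
ranges = list(id_ranges(10, 10007, batch=2500))
# ranges == [(11, 2510), (2511, 5010), (5011, 7510), (7511, 10007)]
```

The point is that a client only ever asks for IDs it hasn't seen, so each sync transfers a small delta instead of the whole multi-gigabyte database.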
That would be great - it's the way I use the existing SQLite database
anyway.

As a temporary measure though, would it be possible to generate a
database for the first ten million reports, one for the second ten
million, one for the third ten million, and so on?

-- 
David Cantrell | Official London Perl Mongers Bad Influence

Longum iter est per praecepta, breve et efficax per exempla.
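[For anyone following along: carving the existing database into per-range files could be done with SQLite's ATTACH, roughly as below. This is a sketch under assumptions — the table name "reports" and ID column are guesses at the cpanstats schema, and the demo uses a tiny range rather than ten million rows.]

```python
# Hypothetical sketch: copy one ID range of reports into a fresh
# SQLite database using ATTACH. The "reports" table and "id" column
# are assumptions about the actual cpanstats schema.
import sqlite3

def split_by_id(src_path, dest_path, lo, hi):
    """Copy rows with lo <= id <= hi from src_path into a new
    database at dest_path, as a standalone "reports" table."""
    conn = sqlite3.connect(src_path)
    conn.execute("ATTACH DATABASE ? AS part", (dest_path,))
    conn.execute(
        "CREATE TABLE part.reports AS "
        "SELECT * FROM reports WHERE id BETWEEN ? AND ?",
        (lo, hi),
    )
    conn.commit()
    conn.execute("DETACH DATABASE part")
    conn.close()

# e.g. split_by_id("cpanstats.db", "cpanstats-part1.db", 1, 10_000_000)
```

Each resulting file is independently downloadable, so a consumer who only needs recent reports could fetch just the last part.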