Hi there! I am embarking on re-engineering an application using Solr/Lucene (if you'd like to see the current manifestation, go to fictionfinder.oclc.org). The database for this application consists of approximately 1.4 million "work" records of varying size, plus another database of 1.9 million bibliographic records. I fear that loading all of this through HTTP will take several days, perhaps a week. Does any of you have a way to do a large batch load of the DB?

Roger Thompson
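[Editor's note: for context, below is a minimal sketch of the kind of batched loading usually suggested for a case like this, written against the SolrJ client API. The Solr URL, field names, batch size, and record loop are all placeholder assumptions, not anything from the original thread; sending documents in batches of about a thousand per request and committing once at the end is typically far faster than one HTTP request per document.]

    import java.util.ArrayList;
    import java.util.List;

    import org.apache.solr.client.solrj.SolrClient;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;
    import org.apache.solr.common.SolrInputDocument;

    public class BatchIndexer {
        public static void main(String[] args) throws Exception {
            // Placeholder core URL; point this at your own Solr instance.
            try (SolrClient client =
                     new HttpSolrClient.Builder("http://localhost:8983/solr/works").build()) {
                List<SolrInputDocument> batch = new ArrayList<>();
                // Stand-in loop; in practice, iterate over the source records.
                for (int i = 0; i < 1_400_000; i++) {
                    SolrInputDocument doc = new SolrInputDocument();
                    doc.addField("id", Integer.toString(i));   // hypothetical fields
                    doc.addField("title", "work " + i);
                    batch.add(doc);
                    // Send many documents per HTTP request instead of one at a time.
                    if (batch.size() == 1000) {
                        client.add(batch);
                        batch.clear();
                    }
                }
                if (!batch.isEmpty()) {
                    client.add(batch);
                }
                // A single commit at the end avoids per-batch commit overhead.
                client.commit();
            }
        }
    }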