Thanks very much to everyone who replied to my plea for help. Elliot -- Thanks for pointing me to the right list.
Jason: I'm just kicking the tires. I was sure we'd need a much bigger server if we decide to use Evergreen in production, but I thought 16GB of RAM and 200GB of disk would be okay for testing. And at this point it's just 160k bib records; I haven't even gotten to items or patrons yet. Thanks for sharing your git project. I'm not a git user, but I'll give it a try if I don't find a more familiar solution.

Dan: Thanks -- I'll probably give your suggestion a try, but 550,000 records 1,000 at a time will certainly take a while.

Three things:

-- The process I started three days ago to import 160,000 records using the method on the Evergreen site is still running.

-- Maybe an unfair comparison, but we use VuFind as an alternative interface to Horizon, and a full import of all 550k records takes about 45 minutes.

-- It's surprising to me that there isn't a faster method. We're looking seriously at Evergreen as a replacement for Horizon, but this would be a problem.

I'll try Dan's and then Jason's methods (again, thank you very much) and hope that they're significantly faster. If I had the time and ability (unfortunately I have neither), I'd take a shot at it myself.

Thanks again.

Joe

Joe Thornton
Manager, Automation Services
Upper Hudson Library System
28 Essex Street
Albany, NY 12206
518-437-9880 x230

On Tue, Jun 4, 2013 at 3:26 PM, Joe Thornton <[email protected]> wrote:

> I'm new to Evergreen and to this list so I apologize in advance if this
> issue has been discussed already (I did look).
>
> I installed Evergreen successfully on a test server with 16GB RAM and
> about 200GB of disk -- in two partitions.
>
> We have:
>
> Debian 7
> Postgres 9.1 (not on a remote server)
> Evergreen 2.4
>
> To migrate bib records from our SirsiDynix Horizon database I used this
> document:
> http://docs.evergreen-ils.org/2.4/_migrating_your_bibliographic_records.html
>
> The process was interrupted a few times by serious errors, but eventually
> I ended up with 550k bib records in the staging_records_import table.
>
> The real problems started when I ran SELECT staging_importer();
>
> The first time it stopped after many hours because it ran out of disk
> space. Postgres was using the smaller partition for data, so I changed it to
> use the larger partition (~135GB) and restarted the job. This time it ran
> over the weekend and then ran out of disk space again.
>
> Although this seems very strange to me, I started it again and this time
> the staging_records_import table has about 160k records in it.
>
> I started SELECT staging_importer(); yesterday (about 24 hours ago) and
> it's still running and has used more than 50GB of disk so far.
>
> Am I missing a step (or steps), or is this normal?
>
> Thanks,
>
> Joe Thornton
> Manager, Automation Services
> Upper Hudson Library System
> 28 Essex Street
> Albany, NY 12206
> 518-437-9880 x230
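[Editor's note: the "1,000 at a time" approach mentioned above could be sketched roughly as below. This is a minimal, hypothetical illustration of the batching logic only, using stand-in record IDs; it is not Evergreen code, and the `batched` helper and placeholder data are assumptions, not part of any Evergreen or migration API.]

```python
# Hypothetical sketch: splitting a large record set into batches of 1,000
# for incremental loading. The record source here is a stand-in; a real
# run would iterate over actual bib records instead of integers.
from itertools import islice

def batched(iterable, size=1000):
    """Yield successive lists of at most `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

record_ids = range(550_000)          # stand-in for 550k bib records
batches = list(batched(record_ids, 1000))
print(len(batches))                  # 550 batches
print(len(batches[0]))               # 1000 records in each
```

Loading one batch per transaction like this would at least bound the disk and transaction-log growth per step, at the cost of many round trips.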
