> > In the system that I'm developing, I have about 25000 (persons) x 8
> > (exams) x 15 (answers per exam) = 3000000 records to process, and it
> > is VERY SLOW.
>
> If you need to import large quantities of data, look at the COPY
> command, which tends to be faster.
By way of example of the level of improvement COPY gives: a 3000000 row
table (350MB dump file -> 450MB table) can be loaded via COPY in 7
minutes. Inserting each row individually (say, using a Perl program to
read the file and DBD::Pg to insert, committing every 10000 rows) takes
about 75 minutes. I used a PII 266MHz/192MB machine and PostgreSQL 7.1b5
for these results. PostgreSQL 7.0.2 is slower (20-30% or so), but should
still show a similar level of improvement with COPY.
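
For reference, a minimal sketch of a COPY load (the table and file names
here are made up; USING DELIMITERS is the 7.x syntax for setting the
field separator):

    -- server-side load of a tab-delimited dump file into a hypothetical
    -- "answers" table; COPY runs as one statement, so there is no
    -- per-row parse, plan, or client round trip
    COPY answers FROM '/tmp/answers.dat' USING DELIMITERS '\t';

The file path is read by the backend, so it must be readable by the
server process; from psql you can use \copy instead to read a file on
the client side.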
Good loading
Mark