I am running PostgreSQL as the database backend, with scripts that take a
constant stream of incoming data and insert it into the database in a
fairly complex way, involving a couple of stored procedures.
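Roughly, each run goes through a PL/pgSQL function shaped like the sketch
below (table, column, and function names here are placeholders, not my
actual schema):

```sql
-- Hypothetical sketch of the insert path, not the real procedure.
CREATE OR REPLACE FUNCTION insert_incoming(p_src integer, p_payload text)
RETURNS void AS $$
BEGIN
    -- bump a per-source running total, then store the raw row
    UPDATE totals SET n = n + 1 WHERE src = p_src;
    INSERT INTO incoming (src, payload, received_at)
    VALUES (p_src, p_payload, now());
END;
$$ LANGUAGE plpgsql;
```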

But the performance of the database is worse than I expected. After the
script has run about 100 times, insertion slows down dramatically, and it
only returns to its regular fast speed after I run a VACUUM ANALYZE.
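For reference, the maintenance step that restores the speed is nothing
more than:

```sql
-- Reclaims dead row versions and refreshes the planner's statistics
-- for the tables the scripts insert into.
VACUUM ANALYZE;
```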

How can I redesign the system to avoid this bottleneck? And why does
PostgreSQL slow down so much after a stretch of these complex
operations?


Thanks


-- 
Wei Weng
Network Software Engineer
KenCast Inc.



---------------------------(end of broadcast)---------------------------
TIP 6: Have you searched our list archives?

http://archives.postgresql.org
