We are going live with an application in a few months that is a complete rewrite of an existing application. We are moving from an existing proprietary database to PostgreSQL. We are looking for some insight/suggestions as to how folks test PostgreSQL in such a situation.

We really want to run it through the wringer before going live. I'm throwing together a test suite that consists mostly of perl scripts. I'm wondering what other approaches, if any, folks have taken in a similar situation. I know there's nothing like a real live test with real users, and that will happen, but we want to do some semi-automated load testing first.
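One option I've been looking at for the load-testing piece (just a sketch of what I have in mind, not something we've settled on) is PostgreSQL's bundled pgbench tool; the database name and custom script below are placeholders:

```shell
# Initialize pgbench's sample tables at scale factor 10 (~1M rows in accounts)
pgbench -i -s 10 testdb

# Run 10 concurrent clients, 1000 transactions each, using the built-in workload
pgbench -c 10 -t 1000 testdb

# Replay an application-specific workload instead (custom.sql is hypothetical)
pgbench -c 10 -t 1000 -f custom.sql testdb
```

The -f option is what would let us approximate our real query mix rather than the stock TPC-B-style transaction.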

Has anyone used any profiling tools (e.g. gprof) with any success?
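My understanding (from the docs, not from having done it yet) is that profiling the backend with gprof means building the server with profiling enabled, roughly:

```shell
# Build a profiling-enabled server; --enable-profiling compiles with -pg
./configure --enable-profiling
make && make install

# Each backend writes a gmon.out on exit; analyze it against the binary
# (install path and output location are assumptions about our setup)
gprof /usr/local/pgsql/bin/postgres gmon.out > profile.txt
```

I'd be curious whether the per-backend gmon.out files were actually usable for anyone under concurrent load.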

We've got a failover cluster design and would like any insights here as well.

We're also trying to decide whether a single database with multiple schemas or multiple databases is the better solution. We've done some research on this through the archives, and the answer seems to depend on the database/application design. Still, we welcome any general ideas on this issue as well.
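To make the question concrete, here are the two layouts as I understand them (all names are made up for illustration):

```shell
# Option 1: one database, multiple schemas
createdb appdb
psql -d appdb -c "CREATE SCHEMA branch_a"
psql -d appdb -c "CREATE SCHEMA branch_b"
# a session picks its schema via the search path
psql -d appdb -c "SET search_path TO branch_a; SELECT now()"

# Option 2: one database per unit
createdb appdb_branch_a
createdb appdb_branch_b
```

The trade-off as I see it: schemas allow cross-schema joins within a single connection, while separate databases give harder isolation but require separate connections. Corrections welcome.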

I've not provided any specifics on hardware or the application, as we really want high-level input at this time.

Thanks for any pointers or suggestions.

Until later, Geoffrey
