I am Kishore, a freelance developer of J2EE applications.
We switched to PostgreSQL recently because of the advantages it has over commercial databases. All went well until recently, when we began working on an application that needs to maintain a huge database.
I describe the problem we are facing below. Could you please take a look at the case and help me configure PostgreSQL?
We have only two significant tables: one contains 97% of the data and the other 2.8%. All the remaining tables together hold the last 0.2% and exist only to support these two big tables. Currently the first table holds 9 million records and the second 0.2 million.
We insert into the bigger table almost every second, throughout the application's lifetime. In addition, we receive at least 200,000 records a day in one batch at a fixed time.
We are facing a critical situation because of database performance. Even a basic query such as select count(*) from bigger_table takes about 4 minutes to return.
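(For reference, since count(*) in PostgreSQL always performs a full sequential scan, one workaround we have read about on this list is reading the planner's row estimate from the system catalog instead; this is only approximate and is refreshed by VACUUM/ANALYZE, but it returns instantly. A sketch, assuming our table is named bigger_table:)

```sql
-- Exact count: scans all 9 million rows, hence the ~4 minute runtime.
SELECT count(*) FROM bigger_table;

-- Approximate count: reads the statistics row kept by the planner.
-- reltuples is updated by VACUUM and ANALYZE, so it lags reality slightly.
SELECT reltuples AS approx_rows
FROM pg_class
WHERE relname = 'bigger_table';
```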
The following is the system configuration:
Database  : PostgreSQL 7.3
OS        : Red Hat Linux
Processor : Athlon
Memory    : 2 GB
We expect to maintain at least 200 active connections throughout the day.
Can anyone please suggest the best configuration to satisfy the above requirements?
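(As a starting point, here is the kind of postgresql.conf sketch we have pieced together from list archives for a 2 GB machine on 7.3; the exact values are assumptions on our part, not tested recommendations, and in 7.3 shared_buffers and effective_cache_size are given in 8 kB pages, sort_mem in kB:)

```
# postgresql.conf sketch for PostgreSQL 7.3, 2 GB RAM, ~200 connections
max_connections      = 200
shared_buffers       = 32768     # 32768 * 8 kB = 256 MB (requires raising SHMMAX)
sort_mem             = 8192      # 8 MB per sort; multiplied by concurrent sorts
effective_cache_size = 131072    # ~1 GB, hint about OS file cache size
checkpoint_segments  = 16        # fewer checkpoints under heavy insert load
wal_buffers          = 64
```

We would also welcome advice on VACUUM ANALYZE scheduling, since 7.3 has no automatic vacuuming and our insert volume is high.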
Thanks in advance.