Hello to all,

I am new to this group and to PostgreSQL. I am working on
a time-critical project that uses PostgreSQL. We have
optimized everything else in the project, but PostgreSQL
seems to be the bottleneck. To work around this we run the
database operations in a separate thread, but with a large
volume of data in the database the inserts are still very
slow (e.g. inserting 100 records across 5 tables takes
nearly 3 minutes).
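
To make the question concrete, here is a minimal sketch of
the kind of insert loop we run over ODBC. The DSN,
credentials, table, and column names below are
placeholders, error checking is stripped out, and
autocommit is assumed to be left at its default (on):

  #include <stdio.h>
  #include <sql.h>
  #include <sqlext.h>

  /* Simplified sketch: insert 100 rows, one statement at a
   * time, over ODBC. "pgtest", "user", "pass", "table1" and
   * its columns are placeholders.                          */
  int main(void)
  {
      SQLHENV  env;
      SQLHDBC  dbc;
      SQLHSTMT stmt;
      SQLCHAR  query[256];
      int      i;

      SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &env);
      SQLSetEnvAttr(env, SQL_ATTR_ODBC_VERSION,
                    (SQLPOINTER)SQL_OV_ODBC3, 0);
      SQLAllocHandle(SQL_HANDLE_DBC, env, &dbc);
      SQLConnect(dbc, (SQLCHAR *)"pgtest", SQL_NTS,
                 (SQLCHAR *)"user", SQL_NTS,
                 (SQLCHAR *)"pass", SQL_NTS);

      /* Autocommit is on, so every INSERT runs as its own
       * transaction.                                       */
      for (i = 0; i < 100; i++) {
          sprintf((char *)query,
                  "INSERT INTO table1 (id, val) "
                  "VALUES (%d, 'row %d')", i, i);
          SQLAllocHandle(SQL_HANDLE_STMT, dbc, &stmt);
          SQLExecDirect(stmt, query, SQL_NTS);
          SQLFreeHandle(SQL_HANDLE_STMT, stmt);
          /* ...same pattern for the other four tables... */
      }

      SQLDisconnect(dbc);
      SQLFreeHandle(SQL_HANDLE_DBC, dbc);
      SQLFreeHandle(SQL_HANDLE_ENV, env);
      return 0;
  }

Is this row-at-a-time pattern over ODBC expected to be this
slow, or does it point to something in our setup?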

Running VACUUM ANALYZE helps a bit, but the improvement is
not significant. We are using the default PostgreSQL
settings (i.e. we have not changed postgresql.conf).

One more point: when we tried to load a pg_dump of nearly
60K records across 7 tables, it took more than 10 hours.

System config:

OS: Red Hat Linux 7.2
RAM: 256 MB
PostgreSQL: 7.1.3
Connection: ODBC

Thanks to all, and please bear with me even if this is a
silly doubt.
 
Vivek