Hi:

I have an application in which a process needs to read data from a stream and 
store the records for further analysis and reporting. The data in the stream is 
in the form of variable-length records with clearly defined fields - so it can 
be stored in a database or in a file. The only caveat is that the rate of 
records coming in on the stream could be several thousand records per second.

The design choice I am currently faced with is whether to use a Postgres 
database or a flat file for this purpose. My application already maintains a 
Postgres (8.3.4) database for other reasons, so that seemed like the 
straightforward thing to do. However, I am concerned about the performance 
overhead of writing several thousand records per second to the database. The same 
database is being used simultaneously for other activities as well, and I do not 
want those to be adversely affected by this operation (especially the query 
times). The advantage of running complex queries to mine the data in various 
ways is very appealing, but the performance concerns are making me 
wonder if just using a flat file to store the data would be a better approach.
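
To give a sense of what the write path might look like if I go the database 
route, here is a rough sketch (assuming psycopg2, a made-up stream_records 
table, and a placeholder read_stream() feeding the records) that buffers 
incoming records and loads them in batches with COPY instead of issuing 
individual INSERTs - just one way to keep the per-row overhead down, not 
something I have settled on:

    import io
    import psycopg2

    BATCH_SIZE = 5000  # flush to the database every N records

    conn = psycopg2.connect("dbname=mydb user=myuser")  # hypothetical connection string
    cur = conn.cursor()

    def flush(buf):
        # COPY the buffered rows in one round trip instead of
        # several thousand individual INSERTs, then commit once
        buf.seek(0)
        cur.copy_from(buf, 'stream_records',
                      columns=('ts', 'field_a', 'field_b'))  # made-up columns
        conn.commit()

    buf = io.StringIO()
    count = 0
    for record in read_stream():  # read_stream() stands in for whatever feeds the records
        # record is assumed to be a tuple of string fields
        buf.write('\t'.join(record) + '\n')
        count += 1
        if count >= BATCH_SIZE:
            flush(buf)
            buf = io.StringIO()
            count = 0
    if count:
        flush(buf)  # load whatever is left over

Batching like this would at least keep the commit rate down, which is the part 
I suspect would otherwise drag the other queries with it.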

Does anybody have experience with high-frequency writes to a Postgres database?

- Jay
