Re: [HACKERS] Postgres as Historian
Thanks for all your responses, and my apologies for putting the question in the wrong list. I think OLAP is the answer for my requirements.

Regards,
Hardik

On Wed, Aug 4, 2010 at 5:40 AM, Greg Smith wrote:
> Hardik Belani wrote:
>> For this i can create a table with number and time (may be time offset
>> instead of timestamp) as columns. But still it will require me to store huge
>> number of rows in the order of few millions. Data is read only and only
>> inserts can happen. But I need to perform all kinds of aggregation to get
>> various statistics. for example: daily avg, monthly avg etc..
>
> You've unfortunately asked on the wrong list about this. pgsql-hackers is
> intended mainly for discussion related to the source code of PostgreSQL, so
> this is off-topic for it. The people who like to argue about the best way
> to implement aggregates and the like are on the pgsql-performance list.
> You'd be more likely to get detailed responses if you asked this question
> there. That group loves to talk about how to design things for other
> people.
>
> --
> Greg Smith  2ndQuadrant US  Baltimore, MD
> PostgreSQL Training, Services and Support
> g...@2ndquadrant.com  www.2ndQuadrant.us
[HACKERS] Postgres as Historian
We are using Postgres as the RDBMS for our product. A requirement is coming for a feature that will need us to store data about various data points (mainly numbers) on a time scale. A measurement is taken every few seconds or minutes and needs to be stored for statistical analysis, so integer/float values must be recorded roughly every minute.

For this I can create a table with the number and the time (possibly a time offset instead of a timestamp) as columns. But it will still require storing a huge number of rows, on the order of a few million. The data is read-only; only inserts happen. I also need to perform all kinds of aggregation to get various statistics, for example daily and monthly averages.

We are already using Postgres for our product, so staying with it adds no extra installation requirement and is therefore a bit easier. Would you recommend Postgres for this kind of requirement, and will it provide the performance? Or do you recommend another database meant for such workloads? I am also searching for a good historian database in case Postgres is not a good fit.

Thanks,
Hardik
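To make the question concrete, here is a minimal sketch of what such a table and a daily-average query could look like in Postgres. The table, column, and index names are illustrative assumptions, not anything from the original post:

```sql
-- Hypothetical measurements table: one row per sample.
CREATE TABLE measurement (
    point_id   integer          NOT NULL,  -- which data point was sampled
    sampled_at timestamptz      NOT NULL,  -- when the sample was taken
    value      double precision NOT NULL
);

-- Index to support time-range scans and per-point aggregation.
CREATE INDEX measurement_point_time_idx
    ON measurement (point_id, sampled_at);

-- Daily average per data point; date_trunc buckets timestamps by day.
SELECT point_id,
       date_trunc('day', sampled_at) AS day,
       avg(value)                    AS daily_avg
FROM   measurement
GROUP  BY point_id, date_trunc('day', sampled_at)
ORDER  BY point_id, day;
```

Monthly averages are the same query with 'month' passed to date_trunc. For an insert-only table with millions of rows, one common approach is to roll raw samples up into per-day summary tables on a schedule, so the statistics queries scan the small summaries instead of the raw data.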
[HACKERS] Trigger function in a multi-threaded environment behavior
We have a multi-threaded environment on Linux where multiple threads perform database operations (insert, update, select, and at times delete as well) in transaction mode (which may span stored procedures) through unixODBC. As is, this works fine.

If we introduce Postgres triggers (trigger functions) on some of the tables to track insert/update/delete operations (this is done by keeping a counter field that the trigger function updates for every insert, update, and delete performed on a set of tables), then one or more threads get stuck waiting on locks while executing their queries, to the extent that sometimes the tables cannot be updated even from pgAdmin. We are using Postgres v8.4 and unixODBC v2.2.14.

When using Postgres triggers in a multi-threaded application like this, do we have to take care of table-level or row-level locks ourselves inside the trigger function?

Thanks,
Hardik
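To illustrate where the contention likely comes from, a counter-maintaining trigger of the kind described might look like the sketch below (all table and function names are made up). The key point is that the UPDATE of the counter takes a row-level lock on the same counter row for every firing, and that lock is held until the surrounding transaction commits or rolls back, so concurrent transactions touching the same tracked table serialize behind each other:

```sql
-- Hypothetical counter table: one row per tracked table.
CREATE TABLE change_counter (
    table_name text PRIMARY KEY,
    changes    bigint NOT NULL DEFAULT 0
);

-- Trigger function: bump the counter for the table that fired the trigger.
CREATE OR REPLACE FUNCTION bump_change_counter() RETURNS trigger AS $$
BEGIN
    -- This UPDATE takes a row lock on the single counter row.
    -- Every concurrent transaction that modifies the same tracked
    -- table blocks here until the earlier transaction finishes.
    UPDATE change_counter
       SET changes = changes + 1
     WHERE table_name = TG_TABLE_NAME;
    RETURN NULL;  -- AFTER trigger; the return value is ignored
END;
$$ LANGUAGE plpgsql;

-- Attach to a tracked table (EXECUTE PROCEDURE is the 8.4-era syntax).
CREATE TRIGGER track_changes
    AFTER INSERT OR UPDATE OR DELETE ON some_tracked_table
    FOR EACH ROW EXECUTE PROCEDURE bump_change_counter();
```

If this matches your setup, the usual workaround is to avoid the single hot row: for example, have the trigger INSERT a delta row into an append-only log table and sum it up periodically, so concurrent writers never contend for the same counter row.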