Hi all,
I bet you get tired of the same ole questions over and over.

I'm currently working on an application that will poll
thousands of cable modems per minute, and I would like
to use PostgreSQL to maintain state between polls of
each device.  This requires a very heavy amount of
in-place updates on a reasonably large table (100k-500k
rows, ~7 columns, mostly integer/bigint).  Each row
will be refreshed every 15 minutes, or at least that's
how fast I can poll via SNMP.  I hope I can tune the
DB to keep up.
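
To make that concrete, the workload looks something like the
sketch below.  The table and column names are purely illustrative,
not my actual schema:

```sql
-- Hypothetical state table; names and types are illustrative only.
CREATE TABLE modem_state (
    modem_id     bigint PRIMARY KEY,
    ifin_octets  bigint,
    ifout_octets bigint,
    snr          integer,
    tx_power     integer,
    rx_power     integer,
    last_poll    timestamptz
);

-- Each poll cycle performs one in-place update per modem:
UPDATE modem_state
   SET ifin_octets = $1, ifout_octets = $2, snr = $3,
       tx_power = $4, rx_power = $5, last_poll = now()
 WHERE modem_id = $6;
```

So the table as a whole is rewritten roughly every 15 minutes.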

The app is threaded and will likely have well over 100
concurrent DB connections.  Temp tables for storage
aren't a preferred option, since this is designed as a
shared-nothing approach and I will likely have
several polling processes.

Here are some of my assumptions so far . . . 

Vacuum hourly if not more often
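
By that I mean something along these lines, run from cron or a
maintenance thread (table name as in the sketch above, which is
just an assumed name):

```sql
-- Reclaim dead row versions left behind by the constant updates
-- and refresh planner statistics at the same time.
VACUUM ANALYZE modem_state;
```

Given that every row is updated every 15 minutes, even hourly may
be too infrequent to keep the dead-tuple bloat under control.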

I'm getting 1700 tx/sec from MySQL, and I would REALLY
prefer to use PG.  I don't need to match that number,
just get close.

Is there a global temp table option?  In memory tables
would be very beneficial in this case.  I could just
flush it to disk occasionally with an insert into blah
select from memory table.
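
Roughly what I have in mind is a sketch like this (again assuming
the hypothetical modem_state table; a real global in-memory table
would have to be visible across connections, which a plain temp
table isn't):

```sql
-- Hypothetical: stage hot rows in a session-local temp table,
-- then flush the batch to the real table periodically.
CREATE TEMP TABLE modem_state_buf (LIKE modem_state);

-- ... the poller writes into modem_state_buf ...

-- Periodic flush to disk:
INSERT INTO modem_state SELECT * FROM modem_state_buf;
TRUNCATE modem_state_buf;
```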

Any help or creative alternatives would be greatly
appreciated.  :)


Writing software requires an intelligent person,
creating functional art requires an artist.
-- Unknown
