Hi list

I'm fairly new to Postgres, so bear with me. Googling and searching
the list, I didn't find anything that resembled my problem.

I have a large table with ca. 10 million inserts per day (fairly simple
data: a timestamp, a couple of ids and a varchar message).

I run a query every couple of minutes that looks at the new entries
since the last run and retrieves them for further processing (using a
WHERE eventtime > '2006-02-24 14:00:00') to limit it to the most
recent entries.
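
In full, the query looks roughly like this (the table name msgs and
the column names are placeholders, not my real schema):

  SELECT *
  FROM msgs
  WHERE eventtime > '2006-02-24 14:00:00';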

These queries run around 40-50 seconds (largely due to some LIKE
'%msg%' conditions thrown in for good measure). Postgres performs a
seq scan on those queries :-(
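
With the LIKE thrown in, the slow version is something like this (the
pattern is just an example), and EXPLAIN shows the seq scan:

  EXPLAIN
  SELECT *
  FROM msgs
  WHERE eventtime > '2006-02-24 14:00:00'
    AND msg LIKE '%some pattern%';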

My idea is to limit the search to only the last n entries because I
found that a

SELECT * from table ORDER BY eventtime DESC LIMIT 1000

is very fast. Because the inserts are in chronological order, I can
store the sequential id of the highest row from the last query and
subtract it from the current highest id to determine that number.
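
Something like this is what I have in mind for the bookkeeping
(assuming the sequential column is called id):

  -- after each run, remember the highest id processed so far
  SELECT max(id) FROM msgs;
  -- next run: n = current max(id) minus the stored max(id)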

Is there a way to limit the expensive query to only those last 1000 (or
whatever) results?

I have tried to nest SELECTs, but my SQL-fu is too limited to get
anything through the SQL processor :-)
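
For illustration, this is roughly the shape of what I'm after (again
with the placeholder names from above):

  SELECT *
  FROM (SELECT *
        FROM msgs
        ORDER BY eventtime DESC
        LIMIT 1000) AS recent
  WHERE msg LIKE '%some pattern%';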

thanks
Jens-Christian Fischer

