I have a table of around 200 million rows, occupying around 50 GB on disk.
Writes to it are slow, so I would like to partition it better.
The table is roughly:
    id: integer               # unique, from a sequence
    external_id: varchar(255) # unique, used to interface with external systems, not updated (only selected)
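
For orientation, here is a minimal sketch of how such a table could be
partitioned with declarative partitioning (hash partitioning requires
PostgreSQL 11 or later; the table name items and the choice to
hash-partition on id are assumptions for illustration, not taken from the
question):

    -- Hypothetical schema; hash partitioning spreads inserts evenly
    -- across partitions, which can reduce per-index write contention.
    CREATE TABLE items (
        id          integer      NOT NULL,  -- from a sequence
        external_id varchar(255) NOT NULL   -- stable key for external systems
    ) PARTITION BY HASH (id);

    CREATE TABLE items_p0 PARTITION OF items FOR VALUES WITH (MODULUS 4, REMAINDER 0);
    CREATE TABLE items_p1 PARTITION OF items FOR VALUES WITH (MODULUS 4, REMAINDER 1);
    CREATE TABLE items_p2 PARTITION OF items FOR VALUES WITH (MODULUS 4, REMAINDER 2);
    CREATE TABLE items_p3 PARTITION OF items FOR VALUES WITH (MODULUS 4, REMAINDER 3);

One caveat: a UNIQUE constraint on a partitioned table must include the
partition key, so the standalone uniqueness of external_id could not be
enforced by a single index under this scheme.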
I have a large table (200 million rows) with a column (url character
varying(255)) that I need to be unique.
Currently I enforce this via a UNIQUE btree index on (lower(url::text)).
The index is huge, and I would like to make it much smaller; accesses to
the table via this key are a tiny portion of the overall workload.
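
One common way to shrink such an index (a sketch of the general technique,
not necessarily what the asker settled on; the table name urls is assumed)
is to enforce uniqueness on a fixed-width hash of the normalized URL rather
than on the URL text itself:

    -- Each index entry is a 32-character hex digest instead of up to
    -- 255 characters of URL text, so the index shrinks substantially.
    CREATE UNIQUE INDEX urls_url_md5_key ON urls (md5(lower(url)));

    -- Lookups must use the same expression for the planner to use the index:
    SELECT *
    FROM urls
    WHERE md5(lower(url)) = md5(lower('http://example.com/some/page'));

The trade-off is that an md5 collision between two distinct URLs would be
rejected as a duplicate insert; for uniqueness enforcement on real-world
URLs that risk is generally treated as negligible.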
I'm frequently getting these errors in my console:
    4/11/09 2:25:04 PM  org.postgresql.postgres[192]  ERROR: could not read directory "pg_xlog": Invalid argument
    4/11/09 2:25:56 PM  org.postgresql.postgres[192]  ERROR: could not read directory "pg_xlog": Invalid argument
    4/11/09 2:36:03 P