Re: [GENERAL] Partitioning a table with a billion rows
1. Remove the foreign key constraint on table searchcache that references product.
2. Take a backup of the data in searchcache.
3. Create the partitions of table product.
4. Re-add the constraint on table searchcache (if necessary, drop and recreate searchcache after taking the backup).

Thanks and Regards,
Sachin Kotwal
NTT-DATA-OSS Center (Pune)

--
Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
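A minimal SQL sketch of the steps above. The constraint name product_id_refs is taken from the error message in the original post; the referencing column name product_id and the backup table name are assumptions for illustration:

```sql
-- 1. Drop the foreign key on searchcache that points at product.
ALTER TABLE searchcache DROP CONSTRAINT product_id_refs;

-- 2. Back up the searchcache data.
CREATE TABLE searchcache_backup AS SELECT * FROM searchcache;

-- 3. Create the partitions of product and move the rows
--    (see the original post for the INHERITS-based child tables).

-- 4. Re-add the constraint on searchcache.
ALTER TABLE searchcache
  ADD CONSTRAINT product_id_refs
  FOREIGN KEY (product_id) REFERENCES product (id);
```

One caveat with step 4: in inheritance-based partitioning, a foreign key that references the parent table only sees rows physically stored in the parent, not rows in child tables, so re-validating the constraint will fail for ids that were moved into partitions. The reference may need to be enforced differently (for example with triggers) once product is partitioned.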
[GENERAL] Partitioning a table with a billion rows
I have a big table that is referenced by other tables, and I need to split it into several parts.

1. I create a child table:
   CREATE TABLE product_part0 () INHERITS (product);
2. I move some rows into product_part0:
   INSERT INTO product_part0 SELECT * FROM ONLY product ORDER BY id LIMIT 200;
3. Now I need to remove the duplicates from the master table:
   DELETE FROM ONLY product WHERE id IN (SELECT id FROM product_part0);
   This fails with:
   ERROR: update or delete on table "product" violates foreign key constraint "product_id_refs" on table "searchcache"
   DETAIL: Key (id)=(13375) is still referenced from table "searchcache".

So, how do I delete the duplicate rows?
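One way to make the DELETE in step 3 succeed is to drop the foreign key first and do the move and the delete in a single transaction. This is only a sketch: the constraint name product_id_refs comes from the error message, and how to re-enforce the reference afterwards is left open, because a plain foreign key back to product will not see rows that now live in the child table:

```sql
BEGIN;

-- Temporarily drop the FK that blocks the DELETE on the parent.
ALTER TABLE searchcache DROP CONSTRAINT product_id_refs;

-- Move a batch of rows into the partition, then remove them from
-- the parent so they are not duplicated across parent and child.
INSERT INTO product_part0
  SELECT * FROM ONLY product ORDER BY id LIMIT 200;
DELETE FROM ONLY product
  WHERE id IN (SELECT id FROM product_part0);

-- Note: re-adding the same FK here would fail validation, since a
-- foreign key does not follow inheritance into child tables; the
-- reference must be enforced another way (e.g. triggers).
COMMIT;
```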