Hi,

Actually, I am using an FDW in which the data is written into a single file. So when I do concurrent inserts, the data is written into the file simultaneously, and this is causing data corruption. Is a TABLE LOCK the only option available?
--Harsha

On Wed, Jan 21, 2015 at 3:22 PM, Albe Laurenz <laurenz.a...@wien.gv.at> wrote:
> sri harsha wrote:
> > Is there any way to stop concurrent inserts from happening on a single table?
> >
> > Query 1 : INSERT INTO TABLE_A SELECT * FROM TABLE1;
> > Query 2 : INSERT INTO TABLE_A SELECT * FROM TABLE2;
> > Query 3 : SELECT * FROM TABLE_A;
> >
> > Assume I have the above queries. Query 1 and Query 3 can occur concurrently.
> > If one insert is taking place, the other should wait. How do I achieve this?
>
> Is that a typo, and you really mean "Query 1 and Query 2" above?
>
> Why would you want to prevent concurrent inserts?
> Maybe you don't really need to hobble your performance like that.
>
> If you really want that, it is easy with table locks.
> Your INSERT could look like this:
>
> BEGIN;
> LOCK table_a IN EXCLUSIVE MODE;
> INSERT INTO table_a ...
> COMMIT;
>
> > How do the concurrent inserts take place in Postgres? Is data stored
> > temporarily for both queries separately and then written into the table?
>
> No, it is written to the table concurrently.
> A table is organized in pages of 8KB each, and there is no problem with
> writing to different pages concurrently.
>
> Yours,
> Laurenz Albe
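[Editor's note: as an alternative to the table-level lock quoted above, PostgreSQL advisory locks can serialize just the cooperating writers without blocking readers. A minimal sketch; the lock key 42 is an arbitrary value the application would agree on, not anything from the thread:

```sql
-- Sketch: serialize only sessions that take the same advisory lock.
-- 42 is an arbitrary, application-chosen lock key (assumption).
BEGIN;
SELECT pg_advisory_xact_lock(42);          -- waits until any other holder commits
INSERT INTO table_a SELECT * FROM table1;  -- only one such writer runs at a time
COMMIT;                                    -- transaction-level advisory lock is released here
```

Unlike LOCK table_a IN EXCLUSIVE MODE, this blocks only sessions that request the same advisory key, so plain SELECTs and non-cooperating writers are unaffected; whether that is safe for the single-file FDW depends on whether all writers go through this code path.]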