On 7/23/25 01:50, sivapostg...@yahoo.com wrote:
Tried in PostgreSQL 11.11 and PostgreSQL 15.2 on Windows 10.


The above command succeeds when
1.  The trigger on Table1 is disabled, with all other constraints on.
2.  The number of rows is 16000 or less, with the trigger enabled.  We haven't tried with a higher number of rows.

Why not?


The above command goes into an infinite loop when
1.  We try to transfer all 85000 rows at once, with the trigger and other constraints on table1 enabled.  We waited 1.5 hrs the first time and 2.5 hrs the second time before cancelling the operation.

Try with the triggers disabled.
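For example, something along these lines (table name taken from your description; DISABLE TRIGGER USER turns off all user-defined triggers on the table without touching the internal constraint triggers):

    -- disable the user triggers on table1, run the load, then re-enable them
    ALTER TABLE table1 DISABLE TRIGGER USER;
    -- run the COPY / data transfer here
    ALTER TABLE table1 ENABLE TRIGGER USER;

That will tell you whether it is the trigger, rather than COPY itself, that is eating the time.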


I read in the documentation that the fastest way to transfer data is to use the COPY command.  And I couldn't find any limit on the amount of data that can be transferred with it.  One could easily transfer millions of rows using this command.

It is. I have used it for much larger datasets than 85000 rows and it completed in less time. As an example, using DuckDB I took the NYC taxi dataset yellow_tripdata_2023-09.parquet, transformed it, and loaded it using COPY in 5.4 secs for ~2.8 million rows.
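For reference, a bare-bones COPY would look something like this (the table, columns, and file path are placeholders, since the actual command isn't shown in this thread):

    COPY table1 (col1, col2, col3)
        FROM '/path/to/data.csv'
        WITH (FORMAT csv, HEADER true);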


FYI, BEGIN in plpgsql is not the same as in SQL. In plpgsql it represents a block. I don't think you need the BEGIN/END around the UPDATE and INSERT queries. See https://www.postgresql.org/docs/current/plpgsql-structure.html for more information.
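Roughly, the shape would be something like this (table and column names here are invented for illustration, not your actual function):

    CREATE OR REPLACE FUNCTION table1_sync() RETURNS trigger AS $$
    BEGIN  -- opens the plpgsql block; it does not start a transaction
        UPDATE table2 SET amount = NEW.amount WHERE id = NEW.id;
        IF NOT FOUND THEN
            INSERT INTO table2 (id, amount) VALUES (NEW.id, NEW.amount);
        END IF;
        RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;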

Any (other) suggestions to transfer the data successfully would be really appreciated.

Happiness Always
BKR Sivaprakash



--
Adrian Klaver
adrian.kla...@aklaver.com

