Chris Hoover wrote:
Has anyone had problems with memory exhaustion and dblink? We were trying to use dblink to convert our databases to our new layout, and had our test server lock up several times when trying to copy a table that was significantly larger than our memory and swap.
Basically we were doing an insert into <table> select * from dblink('dbname=olddb','select * from large_table') as t_large_table(table column listing);


Does anyone know of a way around this?


dblink just uses libpq, and libpq reads the entire result set into memory, so there is no direct way around that, as far as I know. You could, however, use a cursor and fetch/manipulate rows in more reasonably sized batches.
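A rough sketch of the cursor approach using dblink's own cursor functions (dblink_connect, dblink_open, dblink_fetch); the connection name, cursor name, batch size, and target table are illustrative, and the column list must match your actual table definition:

```
-- Open a named connection and a cursor on the remote side
-- (connection/cursor names here are just examples).
SELECT dblink_connect('oldconn', 'dbname=olddb');
SELECT dblink_open('oldconn', 'cur_large', 'SELECT * FROM large_table');

-- Fetch one batch (10000 rows here) and insert it locally.
-- Repeat this statement, e.g. from a script or PL/pgSQL loop,
-- until dblink_fetch returns zero rows.
INSERT INTO new_table
  SELECT * FROM dblink_fetch('oldconn', 'cur_large', 10000)
    AS t(table column listing);

-- Clean up once a fetch comes back empty.
SELECT dblink_close('oldconn', 'cur_large');
SELECT dblink_disconnect('oldconn');
```

Only one batch is ever held in libpq's memory at a time, so the local server never has to materialize the whole table.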

HTH,

Joe

---------------------------(end of broadcast)---------------------------
TIP 7: don't forget to increase your free space map settings
