Re: large numbers of inserts out of memory strategy

2017-12-01 Thread Steven Lembark
On Thu, 30 Nov 2017 08:43:32 -0600 Ted Toth wrote:
> What is the downside of using a DO block? I'd have to do a nextval on
> each sequence before I could use currval, right? Or I could do 'select
> last_value from '.

You are creating a piece of code that has to be parsed,
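The alternative being discussed is to skip the DO block entirely and emit plain INSERTs that reference `currval()` on the serial sequences, since `currval()` is session-local and valid immediately after an INSERT that consumed `nextval()`. A minimal sketch of a generator in the OP's style; the table, column, and sequence names (`thing`, `thing_data`, `thing_id_seq`) are illustrative guesses, not taken from the thread:

```python
def thing_inserts(rec):
    """Emit plain INSERT statements for one 'thing', linking the child
    row via currval() instead of a PL/pgSQL variable in a DO block.
    Table/column/sequence names here are assumptions for illustration."""
    return [
        "INSERT INTO thing (source, serial) VALUES ('%s', %d);"
        % (rec["source"], rec["serial"]),
        # currval() needs no explicit nextval() first: the INSERT above
        # already advanced the sequence in this session.
        "INSERT INTO thing_data (thing_id, payload) "
        "VALUES (currval('thing_id_seq'), '%s');" % rec["payload"],
    ]

stmts = thing_inserts({"source": "legacy", "serial": 42, "payload": "blob"})
```

Each 'thing' becomes a handful of small, independently parseable statements rather than one ever-growing code block. (Real code would also need proper SQL escaping, e.g. via a driver's parameter binding.)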

Re: large numbers of inserts out of memory strategy

2017-11-30 Thread Christopher Browne
On 30 November 2017 at 05:22, Peter J. Holzer wrote:
> On 2017-11-29 08:32:02 -0600, Ted Toth wrote:
>> Yes I did generate 1 large DO block:
>>
>> DO $$
>> DECLARE thingid bigint; thingrec bigint; thingdataid bigint;
>> BEGIN
>> INSERT INTO thing
>>

Re: large numbers of inserts out of memory strategy

2017-11-30 Thread Ted Toth
On Thu, Nov 30, 2017 at 4:22 AM, Peter J. Holzer wrote:
> On 2017-11-29 08:32:02 -0600, Ted Toth wrote:
>> Yes I did generate 1 large DO block:
>>
>> DO $$
>> DECLARE thingid bigint; thingrec bigint; thingdataid bigint;
>> BEGIN
>> INSERT INTO thing
>>

Re: large numbers of inserts out of memory strategy

2017-11-30 Thread Peter J. Holzer
On 2017-11-29 08:32:02 -0600, Ted Toth wrote:
> Yes I did generate 1 large DO block:
>
> DO $$
> DECLARE thingid bigint; thingrec bigint; thingdataid bigint;
> BEGIN
> INSERT INTO thing
> (ltn,classification,machine,source,thgrec,flags,serial,type) VALUES
>

Re: large numbers of inserts out of memory strategy

2017-11-29 Thread Steven Lembark
> I'm pretty new to postgres so I haven't changed any configuration
> settings and the log is a bit hard for me to make sense of :(

Diving into the shark tank is a helluva way to learn how to swim :-)
Are you interested in finding docs on how to deal with the tuning?

-- Steven Lembark

Re: large numbers of inserts out of memory strategy

2017-11-29 Thread Steven Lembark
> > what tools / languages are you using?
>
> I'm using python to read binary source files and create the text files
> containing the SQL. Then I'm running psql -f .

Then chunking the input should be trivial. There are a variety of
techniques you can use to do things like disable indexes during
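Since the SQL is already being generated from Python, chunking can happen at generation time: group the statements into transaction-sized batches so psql never has to hold one enormous script. A minimal sketch (the batch size of 1000 is arbitrary, not from the thread):

```python
def chunk_statements(statements, chunk_size=1000):
    """Yield lists of SQL statements, each wrapped in BEGIN/COMMIT,
    so the input to psql is many small transactions instead of one
    giant script. chunk_size is an arbitrary illustrative value."""
    chunk = []
    for stmt in statements:
        chunk.append(stmt)
        if len(chunk) >= chunk_size:
            yield ["BEGIN;"] + chunk + ["COMMIT;"]
            chunk = []
    if chunk:  # flush the final partial batch
        yield ["BEGIN;"] + chunk + ["COMMIT;"]

stmts = ["INSERT INTO t VALUES (%d);" % i for i in range(2500)]
batches = list(chunk_statements(stmts, 1000))
```

Each batch can then be written to its own file (or streamed straight to psql's stdin), keeping per-parse memory bounded regardless of the total row count.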

Re: large numbers of inserts out of memory strategy

2017-11-29 Thread Ted Toth
On Tue, Nov 28, 2017 at 9:59 PM, Tom Lane wrote:
> Brian Crowell writes:
>> On Tue, Nov 28, 2017 at 12:38 PM, Tomas Vondra
>> wrote:
>>> So what does the script actually do? Because psql certainly is not
>>> running pl/pgsql

Re: large numbers of inserts out of memory strategy

2017-11-28 Thread Tom Lane
Brian Crowell writes:
> On Tue, Nov 28, 2017 at 12:38 PM, Tomas Vondra
> wrote:
>> So what does the script actually do? Because psql certainly is not
>> running pl/pgsql procedures on its own. We need to understand why
>> you're getting OOM in the

Re: large numbers of inserts out of memory strategy

2017-11-28 Thread Tomas Vondra
On 11/28/2017 07:26 PM, Ted Toth wrote:
> On Tue, Nov 28, 2017 at 12:01 PM, Tomas Vondra
> wrote:
>>
>> ...
>>
>> That is, most of the memory is allocated for SPI (2.4GB) and PL/pgSQL
>> procedure (500MB). How do you do the load? What libraries/drivers?
>>
>
> I'm

Re: large numbers of inserts out of memory strategy

2017-11-28 Thread Ted Toth
On Tue, Nov 28, 2017 at 12:01 PM, Tomas Vondra wrote:
>
> On 11/28/2017 06:54 PM, Ted Toth wrote:
>> On Tue, Nov 28, 2017 at 11:22 AM, Tomas Vondra
>> wrote:
>>> Hi,
>>>
>>> On 11/28/2017 06:17 PM, Ted Toth wrote:
>>>> I'm writing a

Re: large numbers of inserts out of memory strategy

2017-11-28 Thread Rob Sargent
On 11/28/2017 10:50 AM, Ted Toth wrote:
> On Tue, Nov 28, 2017 at 11:19 AM, Rob Sargent wrote:
>> On Nov 28, 2017, at 10:17 AM, Ted Toth wrote:
>>> I'm writing a migration utility to move data from non-rdbms data
>>> source to a postgres db. Currently I'm

Re: large numbers of inserts out of memory strategy

2017-11-28 Thread Tomas Vondra
Hi,

On 11/28/2017 06:17 PM, Ted Toth wrote:
> I'm writing a migration utility to move data from non-rdbms data
> source to a postgres db. Currently I'm generating SQL INSERT
> statements involving 6 related tables for each 'thing'. With 100k or
> more 'things' to migrate I'm generating a lot of

Re: large numbers of inserts out of memory strategy

2017-11-28 Thread Rob Sargent
> On Nov 28, 2017, at 10:17 AM, Ted Toth wrote:
>
> I'm writing a migration utility to move data from non-rdbms data
> source to a postgres db. Currently I'm generating SQL INSERT
> statements involving 6 related tables for each 'thing'. With 100k or
> more 'things' to migrate

large numbers of inserts out of memory strategy

2017-11-28 Thread Ted Toth
I'm writing a migration utility to move data from a non-rdbms data source to a postgres db. Currently I'm generating SQL INSERT statements involving 6 related tables for each 'thing'. With 100k or more 'things' to migrate I'm generating a lot of statements, and when I try to import using psql
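One bulk-load approach raised for situations like this is to emit COPY blocks instead of per-row INSERT statements: psql streams `COPY ... FROM stdin` data line by line, so memory stays flat no matter how many rows follow. A sketch of rendering rows in COPY's tab-separated text format; the table and column names are borrowed from the snippets above but the schema is assumed:

```python
def copy_block(table, columns, rows):
    """Render rows as a COPY ... FROM stdin block in text format:
    tab-separated values, \\N for NULL, terminated by \\. on its own
    line. Values here are assumed not to need further escaping."""
    lines = ["COPY %s (%s) FROM stdin;" % (table, ", ".join(columns))]
    for row in rows:
        lines.append("\t".join("\\N" if v is None else str(v) for v in row))
    lines.append("\\.")
    return "\n".join(lines)

block = copy_block("thing", ["machine", "thgrec"], [("m1", 1), ("m2", None)])
```

A production version would escape embedded tabs, newlines, and backslashes per COPY's text-format rules, but the shape of the output is the same.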