Re: [GENERAL] Smaller multiple tables or one large table?

2012-06-16 Thread Gabriele Bartolini
Hi Benedict, On 15/06/12 20:58, Benedict Holland wrote: The tables would have to be specified with a table pk constraint falling between two ranges. A view would then be created to manage all of the small tables, with triggers handling insert and update operations. Select would have to be
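[Archive note] A minimal sketch of the scheme described above, assuming two hypothetical range tables t_low/t_high with an integer pk id and a cutoff of 1000000 — the names and the boundary are illustrative, not from the thread:

    -- Two small tables; CHECK constraints encode the pk ranges.
    CREATE TABLE t_low  (id integer PRIMARY KEY CHECK (id < 1000000),  payload text);
    CREATE TABLE t_high (id integer PRIMARY KEY CHECK (id >= 1000000), payload text);

    -- A view presents them as one table for SELECTs.
    CREATE VIEW t AS
        SELECT * FROM t_low
        UNION ALL
        SELECT * FROM t_high;

    -- An INSTEAD OF trigger (available since 9.1) routes inserts
    -- to the table whose range matches the new pk.
    CREATE FUNCTION t_route_insert() RETURNS trigger AS $$
    BEGIN
        IF NEW.id < 1000000 THEN
            INSERT INTO t_low  VALUES (NEW.*);
        ELSE
            INSERT INTO t_high VALUES (NEW.*);
        END IF;
        RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER t_insert INSTEAD OF INSERT ON t
        FOR EACH ROW EXECUTE PROCEDURE t_route_insert();

An update trigger would follow the same pattern, deleting and re-inserting when a changed pk crosses the range boundary.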

[GENERAL] any solution for doing a data file import spawning it on multiple processes

2012-06-16 Thread h...@101-factory.eu
hi there, I am trying to import large data files into pg. For now I used the xargs linux command to spawn the file line by line and use the maximum available connections. We use pgpool as connection pool to the database, and so try to maximize the concurrent data import of the
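[Archive note] The server-side half of that approach is just COPY run on several connections at once; a rough sketch, assuming the input has already been split into chunk files (the paths and table name are placeholders, not from the thread):

    -- Run by worker session 1 (e.g. its own psql process);
    -- sessions 2..N do the same with their own chunk file.
    -- Concurrent COPYs into the same table are allowed, though
    -- they will contend for I/O and for any indexes on the table.
    COPY import_data FROM '/data/incoming/chunk_01.csv' WITH (FORMAT csv);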

Re: [GENERAL] any solution for doing a data file import spawning it on multiple processes

2012-06-16 Thread Edson Richter
On 16/06/2012 12:04, h...@101-factory.eu wrote: hi there, I am trying to import large data files into pg. For now I used the xargs linux command to spawn the file line by line and use the maximum available connections. We use pgpool as connection pool to the database, and so

Re: [GENERAL] any solution for doing a data file import spawning it on multiple processes

2012-06-16 Thread h...@101-factory.eu
thanks, I thought about splitting the file, but that did not work out well. So: we receive 2 files every 30 seconds and need to import them as fast as possible. We do not run Java currently but maybe it's an option. Are you willing to share your code? Also I was thinking of using Perl for it. Henk

Re: [GENERAL] any solution for doing a data file import spawning it on multiple processes

2012-06-16 Thread Bosco Rama
h...@101-factory.eu wrote: thanks, I thought about splitting the file, but that did not work out well. So: we receive 2 files every 30 seconds and need to import them as fast as possible. We do not run Java currently but maybe it's an option. Are you willing to share your code? Also I

Re: [GENERAL] any solution for doing a data file import spawning it on multiple processes

2012-06-16 Thread Edson Richter
On 16/06/2012 12:59, h...@101-factory.eu wrote: thanks, I thought about splitting the file, but that did not work out well. So: we receive 2 files every 30 seconds and need to import them as fast as possible. We do not run Java currently but maybe it's an option. Are you willing to share your

[GENERAL] v9.1.3 WITH with_query UPDATE

2012-06-16 Thread Bill House
Hello all, Would someone please point me to (or supply) some working examples of UPDATE commands using the WITH clause syntax as described in the manual (pdf version page 1560) and referring to Section 7.8 (pdf version page 104)? I have looked around a lot and haven't seen much on this. I have
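[Archive note] For readers landing here from the archives, a minimal working shape of the WITH-clause UPDATE syntax the manual describes; the table and column names are invented for illustration:

    -- The CTE selects the target rows; the UPDATE joins against it.
    WITH stale AS (
        SELECT id
        FROM   orders
        WHERE  placed_at < now() - interval '1 year'
    )
    UPDATE orders o
    SET    archived = true
    FROM   stale s
    WHERE  o.id = s.id;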

Re: [GENERAL] v9.1.3 WITH with_query UPDATE

2012-06-16 Thread Yeb Havinga
On 2012-06-16 19:11, Bill House wrote: Would someone please point me to (or supply) some working examples of UPDATE commands using the WITH clause syntax as described in the manual (pdf version page 1560) and referring to Section 7.8 (pdf version page 104)?

Re: [GENERAL] any solution for doing a data file import spawning it on multiple processes

2012-06-16 Thread h...@101-factory.eu
thanks all, I will be looking into it. Kind regards, Henk. On 16 Jun 2012, at 18:23, Edson Richter edsonrich...@hotmail.com wrote: On 16/06/2012 12:59, h...@101-factory.eu wrote: thanks, I thought about splitting the file, but that did not work out well. So we receive 2 files

Re: [GENERAL] v9.1.3 WITH with_query UPDATE

2012-06-16 Thread Vibhor Kumar
On Jun 16, 2012, at 1:11 PM, Bill House wrote: md5sum may be duplicated and I am trying to mark the column del of the redundant records leaving one unmarked. Here is one variation of the syntax I have tried on one group: WITH batch AS (select * from files_test where
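[Archive note] The quoted query is cut off in the archive; one plausible completed form of the dedupe it describes, assuming files_test also has a unique id column to pick a survivor with (that column is an assumption, not shown in the thread):

    -- Keep the lowest id per md5sum; mark every other copy.
    WITH batch AS (
        SELECT md5sum, min(id) AS keep_id
        FROM   files_test
        GROUP  BY md5sum
        HAVING count(*) > 1          -- only groups with duplicates
    )
    UPDATE files_test f
    SET    del = true
    FROM   batch b
    WHERE  f.md5sum = b.md5sum
    AND    f.id <> b.keep_id;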

Re: [GENERAL] v9.1.3 WITH with_query UPDATE

2012-06-16 Thread Bill House
On 06/16/2012 01:27 PM, Vibhor Kumar wrote: On Jun 16, 2012, at 1:11 PM, Bill House wrote: md5sum may be duplicated and I am trying to mark the column del of the redundant records leaving one unmarked. Here is one variation of the syntax I have tried on one group: WITH batch AS