On Tue, 31 Dec 2013 02:23:30 +,
Brent Wood brent.w...@niwa.co.nz wrote:
This should help... In each temporary table convert the time parts to
a timestamp, then create an index on each of these, then join on the
timestamp.
[...]
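A sketch of the approach Brent describes, assuming hypothetical staging tables `tmp_a`/`tmp_b` with separate text columns `obs_date` and `obs_time` (names invented for illustration):

```sql
-- On each staging table: build a real timestamp, index it, join on it.
ALTER TABLE tmp_a ADD COLUMN ts timestamp;
UPDATE tmp_a SET ts = (obs_date || ' ' || obs_time)::timestamp;
CREATE INDEX ON tmp_a (ts);

ALTER TABLE tmp_b ADD COLUMN ts timestamp;
UPDATE tmp_b SET ts = (obs_date || ' ' || obs_time)::timestamp;
CREATE INDEX ON tmp_b (ts);

-- Load the target table via a join on the indexed timestamp.
INSERT INTO target
SELECT ts, a.val_a, b.val_b
FROM tmp_a a
FULL JOIN tmp_b b USING (ts);
```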
Thank you, these were very useful suggestions, and led to
From: pgsql-general-ow...@postgresql.org [pgsql-general-ow...@postgresql.org]
on behalf of Seb [splu...@gmail.com]
Sent: Tuesday, December 31, 2013 2:53 PM
To: pgsql-general@postgresql.org
Subject: [GENERAL] bulk loading table via join of 2 large staging tables
Hi,
I have two large CSV files that need to be merged and loaded into a
single table of a database in Postgresql 9.3. I thought I'd do this by
first staging the data in these files in two temporary tables:
---cut here---start--
CREATE
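The statement above is truncated; a minimal sketch of the staging step it describes (the column names here are hypothetical) might look like:

```sql
-- Hypothetical staging of the two CSV files into temporary tables.
CREATE TEMPORARY TABLE tmp_a (obs_date text, obs_time text, val_a numeric);
CREATE TEMPORARY TABLE tmp_b (obs_date text, obs_time text, val_b numeric);

-- From psql, bulk-load each file:
\copy tmp_a FROM 'file_a.csv' WITH (FORMAT csv, HEADER true)
\copy tmp_b FROM 'file_b.csv' WITH (FORMAT csv, HEADER true)
```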
Quick thoughts:
On both tables:
Convert your date-time varchar fields into a single epoch/integer field.
Create an index of that epoch/integer field.
David J.
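In SQL, the epoch conversion David suggests could look like this (column names again hypothetical):

```sql
-- Collapse the varchar date-time parts into one epoch integer, then index it.
ALTER TABLE tmp_a ADD COLUMN epoch bigint;
UPDATE tmp_a
SET epoch = extract(epoch FROM (obs_date || ' ' || obs_time)::timestamp)::bigint;
CREATE INDEX ON tmp_a (epoch);
-- Repeat for tmp_b, then join the two tables on epoch.
```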
On Mon, Dec 30, 2013 at 07:53:06PM -0600, Seb wrote:
Given that the process involves a full join, I'm not sure I can do this
in chunks (say, by breaking the files into smaller pieces). Any
suggestions would be greatly appreciated.
First, what I would probably do is merge the two files outside
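One way to merge the two files outside the database, assuming both share a common key in their first column, is Unix `sort` plus `join` (sample file names and data invented for illustration; note `join` does an inner join by default, while GNU `join`'s `-a1 -a2` options approximate the full join Seb mentions):

```shell
# Sample inputs standing in for the two large CSV files.
printf 'k1,10\nk2,20\n' > file_a.csv
printf 'k1,aa\nk2,bb\n' > file_b.csv

# Sort both files on the join key (field 1, comma-delimited), then join.
sort -t, -k1,1 file_a.csv > a.sorted
sort -t, -k1,1 file_b.csv > b.sorted
join -t, a.sorted b.sorted > merged.csv

cat merged.csv   # k1,10,aa
                 # k2,20,bb

# merged.csv can then be bulk-loaded in a single pass, e.g. from psql:
#   \copy target_table FROM 'merged.csv' WITH (FORMAT csv)
```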