On Mon, 20 Aug 2018 at 16:23, Adrian Klaver wrote:

> On 08/20/2018 08:56 AM, Nicolas Paris wrote:
> > > Can I split a large file into multiple files and then run copy using
> > > each file.
> >
> > AFAIK, copy command locks the table[1] while there is no mention of this
> > in the documentation[2].
>
> [1] is from Postgres 7.1 (17 years ago). I suspect the conditions have changed.

1. The tables have no indexes at the time of load.
2. The CREATE TABLE and COPY are in the same transaction.

So I guess that's pretty much it. I understand the long time it takes, as some
of the tables have 400+ million rows. Also, the env is a container, and since
this is currently a POC system,
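The two conditions above match PostgreSQL's bulk-load optimizations: when the table is created (or truncated) in the same transaction as the load, COPY can skip WAL at wal_level = minimal, and COPY ... FREEZE writes the tuples already frozen. A minimal sketch, with hypothetical table and file names:

```sql
BEGIN;
CREATE TABLE big_table (id integer, val integer);  -- no indexes yet
-- Same transaction as the CREATE TABLE, so the FREEZE option is accepted
-- (and, with wal_level = minimal, the load can bypass WAL entirely).
COPY big_table FROM '/data/big_table.csv' WITH (FORMAT csv, FREEZE);
COMMIT;
CREATE INDEX ON big_table (id);  -- build indexes after the bulk load
```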
On Mon, 20 Aug 2018 at 12:53, Ravi Krishna wrote:

> > What is the goal you are trying to achieve here?
> > To make pg_dump/restore faster?
> > To make replication faster?
> > To make backup faster?
>
> None of the above.
>
> We got csv files from an external vendor which are 880GB in total size, in 44
> files. Some of the large tables had COPY running for
From: Ravi Krishna
Sent: Monday, August 20, 2018 8:24:35 PM
To: pgsql-general@lists.postgresql.org
Subject: [External] Multiple COPY on the same table

Can I split a large file into multiple files and then run copy using each file?
The table does not contain any serial or sequence column which may need
serialization. Let us say I split a large file into 4 files. Will the
performance boost by close to 4x?

ps: Pls ignore my previous post which
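The approach asked about — split the file and run one COPY per chunk — can be sketched as below. The file, table, and database names (big_table.csv, big_table, mydb) are hypothetical, and GNU coreutils is assumed; the psql part is shown commented out since it needs a running server:

```shell
# Sketch only: split a large CSV into 4 chunks and COPY each chunk in its
# own session. Names are hypothetical; assumes GNU coreutils (split -d).
set -e

# Stand-in for the large vendor file: 1,000 rows of "id,val".
seq 1 1000 | awk '{print $1 "," $1 * 2}' > big_table.csv

# Split into 4 equal chunks with numeric suffixes: chunk_00 .. chunk_03.
split -l 250 -d big_table.csv chunk_

echo "created $(ls chunk_* | wc -l) chunks"   # created 4 chunks

# With a running server, each chunk would get its own backend:
#   for f in chunk_*; do
#     psql -d mydb -c "\copy big_table FROM '$f' WITH (FORMAT csv)" &
#   done
#   wait
# COPY FROM takes only RowExclusiveLock (like INSERT), so the four
# sessions can load the same table concurrently.
```

Whether this approaches a 4x speedup depends on where the bottleneck is: each session parses its own chunk on its own CPU, but if the single COPY was already I/O-bound, parallel sessions gain much less.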
On 08/20/2018 08:56 AM, Nicolas Paris wrote:

> Can I split a large file into multiple files and then run copy using
> each file.

AFAIK, copy command locks the table[1] while there is no mention of this
in the documentation[2].

> Will the performance boost by close to 4x??

You might be interested in the pbBulkInsert tool[3] that allows