On 06/06/10 14:51, Ron Mayer wrote:
Jon Schewe wrote:
OK, so if I want the 15-minute speed, I need to give up safety (OK in
this case, as this is just research testing), or see if I can tune
Postgres better.
Depending on your app, one more possibility would be to see if you
can re-factor the a
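For reference, the speed-versus-safety trade-off mentioned above usually comes down to two settings: synchronous_commit, which at worst loses the most recently committed transactions after a crash, and fsync, which can corrupt the whole cluster if turned off. A minimal sketch of the safer, session-level variant (table and file names are invented):

BEGIN;
-- Only this transaction skips waiting for the WAL flush; a crash may lose
-- the last few commits but cannot corrupt the database.
SET LOCAL synchronous_commit TO off;
COPY my_table FROM '/tmp/data.txt';
COMMIT;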
On 06/01/2010 10:03 AM, Torsten Zühlsdorff wrote:
Hello,
I have a set of unique data of about 150,000,000 rows. Regularly I
get a list of data which contains many times more rows than the ones
already stored, often around 2,000,000,000 rows. Within these rows
are many duplicates, and often the set of already stored data.
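One common way to attack a load like this (only a sketch, with invented names: a main table events(id, payload) with a unique key on id) is to COPY the incoming file into a staging table and then insert just the rows that are not already stored:

CREATE TEMPORARY TABLE staging (id bigint, payload text);
COPY staging FROM '/tmp/incoming.txt';

-- Collapse duplicates inside the file and skip rows that already exist.
INSERT INTO events (id, payload)
SELECT DISTINCT s.id, s.payload
FROM staging s
WHERE NOT EXISTS (SELECT 1 FROM events e WHERE e.id = s.id);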
Cédric Villemain wrote:
I think you need to have a look at pgloader. It does COPY with error
handling. Very effective.
Thanks for this advice. I will have a look at it.
Greetings from Germany,
Torsten
Scott Marlowe wrote:
I have a set of unique data of about 150,000,000 rows. Regularly I get a
list of data which contains many times more rows than the ones already
stored, often around 2,000,000,000 rows. Within these rows are many
duplicates, and often the set of already stored data.
I wa
On Sun, Jun 6, 2010 at 6:02 AM, Torsten Zühlsdorff wrote:
> Scott Marlowe wrote:
> Thank you very much for your example. Now I've got it :)
>
> I've tested your example on a small set of my rows. While testing I've
> stumbled over a difference in SQL formulation. Using EXCEPT seems to be a
> littl
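Scott's example is not quoted above, but the two formulations being compared were presumably along these lines (same invented names as before). They behave the same as long as equal ids always carry equal payloads, though the planner may pick different plans for them:

-- Formulation 1: set difference
INSERT INTO events (id, payload)
SELECT id, payload FROM staging
EXCEPT
SELECT id, payload FROM events;

-- Formulation 2: anti-join
INSERT INTO events (id, payload)
SELECT DISTINCT s.id, s.payload
FROM staging s
WHERE NOT EXISTS (SELECT 1 FROM events e WHERE e.id = s.id);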
Since you have lots of data, you can use parallel loading.
Split your data into several files and then do:

CREATE TEMPORARY TABLE loader1 ( ... )
COPY loader1 FROM ...

Use a TEMPORARY TABLE for this: you don't need crash recovery, since if
something blows up, you can COPY it again... and it wil
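A slightly fuller sketch of that idea, with invented file and table names: in each of N parallel sessions, COPY one chunk into its own temporary table, then merge it into the real table while skipping rows that already exist. If the same new row can appear in more than one chunk, the unique constraint will reject the second copy, so either split the input on the key or serialise the final INSERT step.

-- Session 1 of N; the input has been split into /tmp/part1 ... /tmp/partN.
CREATE TEMPORARY TABLE loader1 (id bigint, payload text);
COPY loader1 FROM '/tmp/part1';

INSERT INTO events (id, payload)
SELECT DISTINCT l.id, l.payload
FROM loader1 l
WHERE NOT EXISTS (SELECT 1 FROM events e WHERE e.id = l.id);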
Yes, the "other" reason is that I am not issuing a single SQL command,
but importing data from plain ASCII files through the Python-based
framework into the database.

The difference between your measurement and my measurement is the upper
bound of the potential improvement for my system (which has, on