Johann and Kenji-san, thank you!
DB2 has the CPYFRMIMPF command (I think it's the same as COPY on
Postgres), so I will try it.
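
For reference, this is roughly what the COPY route looks like on the
Postgres side (a minimal sketch using psycopg2 directly, outside the
DAL; the connection string, the table name "mytable", and the file
name "data.csv" are placeholders I made up):

    import psycopg2

    conn = psycopg2.connect("dbname=mydb user=me")
    cur = conn.cursor()
    with open("data.csv") as f:
        # COPY ... FROM STDIN loads the whole file in one statement
        # instead of issuing one INSERT per row.
        cur.copy_expert("COPY mytable FROM STDIN WITH CSV HEADER", f)
    conn.commit()
    conn.close()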
On Feb 14, 6:48 am, kenji4569 <[email protected]> wrote:
> I use gluon.scheduler to import large csv files in a production
> environment, and execute db.commit() every 100 records in the task.
> This works well for me.
>
> Kenji
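
(If I read Kenji's approach right, the task would be something like
the sketch below. This is my own guess, not his code: the table
"mytable", its fields, and the file path are all made up, and it
assumes the usual web2py "db" object from the models.)

    import csv
    from gluon.scheduler import Scheduler

    def import_csv(filepath):
        # Insert row by row, committing every 100 records so the
        # transaction stays small and progress survives a crash.
        with open(filepath) as f:
            for i, row in enumerate(csv.reader(f), 1):
                db.mytable.insert(name=row[0], qty=row[1])
                if i % 100 == 0:
                    db.commit()
        db.commit()  # commit the last partial batch

    scheduler = Scheduler(db, dict(import_csv=import_csv))

The import can then be queued with
scheduler.queue_task('import_csv', pvars=dict(filepath='/tmp/data.csv'))
and a worker process picks it up in the background.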
>
> On Feb 14, 4:34 pm, Johann Spies <[email protected]> wrote:
>
> > On 14 February 2012 00:54, Omi Chiba <[email protected]> wrote:
>
> > > I have a problem with the performance of the CSV import, and I
> > > assume it generates an INSERT statement for every record, so there
> > > will be 8000 statements if you have 8000 records in the csv file.
>
> > > Can we use the bulk_insert method instead, so there is always only
> > > one INSERT statement? That should improve the performance
> > > significantly.
>
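
(For what it's worth, bulk_insert is already in the DAL; a minimal
sketch, with the table "mytable" and its fields made up:

    rows = [dict(name='a', qty=1), dict(name='b', qty=2)]
    db.mytable.bulk_insert(rows)
    db.commit()

If I remember correctly, though, on most SQL adapters bulk_insert just
loops over individual inserts internally, so it may not help much
outside Google App Engine.)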
> > As I understand it, the database (at least PostgreSQL) uses the COPY
> > statement to import csv files. That is much quicker than a series of
> > individual inserts.
>
> > Regards
> > Johann
>
> > --
> > Because experiencing your loyal love is better than life itself,
> > my lips will praise you. (Psalm 63:3)