Hello.
I'm trying to insert millions of records into the database with the
->insert_bulk method. I do it in chunks of 5000 records to keep memory
usage down:
my $ars   = $c->BalanceArchiveRS;
my $str   = $c->schema->storage;
my @names = qw/user date event amount reason reason_type game/;

my $i = 1;
my @buf;
while (my $row = $rs->next) {    # $rs: the source resultset, set up elsewhere
    push @buf, [
        $row->{user}, "$row->{y}-$row->{m}-$row->{d}", $row->{event},
        $row->{amsum}, $row->{reason}, $row->{reason_type},
        $row->{game},
    ];
    next if $i++ % 5000;
    $str->insert_bulk($ars->result_source, \@names, \@buf);
    @buf = ();
}
$str->insert_bulk($ars->result_source, \@names, \@buf) if @buf;
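(As an aside, the accumulate-and-flush logic above can be factored into a small
reusable helper. This is just a minimal pure-Perl sketch with no DBIC
dependency; the name make_chunker and the callback interface are hypothetical,
not anything DBIx::Class provides.)

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical helper: returns two closures sharing a buffer. The first
# adds one row and invokes $flush whenever the buffer reaches $size rows;
# the second flushes whatever partial chunk remains at the end.
sub make_chunker {
    my ($size, $flush) = @_;
    my @buf;
    return (
        sub {                          # add one row
            push @buf, $_[0];
            if (@buf == $size) {
                $flush->([@buf]);      # pass a copy of the full chunk
                @buf = ();
            }
        },
        sub {                          # flush the final partial chunk
            $flush->([@buf]) if @buf;
            @buf = ();
        },
    );
}

# Demonstration with a tiny chunk size: record the size of each flushed chunk.
my @seen;
my ($add, $done) = make_chunker(3, sub { push @seen, scalar @{ $_[0] } });
$add->($_) for 1 .. 7;
$done->();
print join(',', @seen), "\n";   # prints "3,3,1"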
Everything goes fine until the last line, which flushes the remaining
buffer into the database:

$str->insert_bulk($ars->result_source, \@names, \@buf) if @buf;

That final buffer has fewer than 5000 rows, and I get this error:
[12/03/09 05:44:37] balance_history.plx [emergency]: This service has died:
DBIx::Class::Storage::DBI::insert_bulk(): DBI Exception: DBD::Pg::st
bind_param_array failed: Arrayref for parameter 1 has 178 elements but
parameter 6 has 5000 [for Statement "INSERT INTO "balance_archive"
("amount", "date", "event", "game", "reason", "reason_type", "user") VALUES
(?, ?, ?, ?, ?, ?, ?)" with ParamValues: 1='2.1', 2='2009-2-25', 3='2',
4='6', 5='1', 6=undef, 7='9588407'] at
./script/maintenance/balance_history.plx line 45
It seems like the $sth remembers the value counts (Storage::DBI calls
prepare_cached to get the $sth handle) and refuses to insert a different
number of records with the cached $sth.
If I call

$str->disable_sth_caching(1);

before the last buffer flush, everything works, but that feels like a
hack to me.
Is there any good solution for this problem?
Thanks.
_______________________________________________
List: http://lists.scsys.co.uk/cgi-bin/mailman/listinfo/dbix-class
IRC: irc.perl.org#dbix-class
SVN: http://dev.catalyst.perl.org/repos/bast/DBIx-Class/
Searchable Archive: http://www.grokbase.com/group/[email protected]