I've been asked to investigate the feasibility of adding access to
Sybase's bulk-load API in DBD::Sybase. This is an API that allows you to
load table rows in a minimally logged manner, and is of course much
faster than normal INSERT statements.

I have this API available in Sybase::CTlib, and I have a number of users
who mix DBI and Sybase::CTlib to get the necessary functionality.

The API consists of an init call (blk_init()), calls to bind the columns
to be loaded (blk_bind()), and blk_rowxfer() to load one or more bound
rows. blk_done() is called to commit the loaded rows.
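
For concreteness, the sequence might look roughly like this from Perl.
This is only a sketch: the wrapper names below mirror the C
Bulk-Library routines, the actual Sybase::CTlib method names and
signatures may differ, and it obviously needs a live connection:

    use Sybase::CTlib;

    # Hypothetical wrappers mirroring the C bulk-library API names;
    # the real bindings may spell these differently.
    my $dbh = Sybase::CTlib->ct_connect($user, $password, $server);

    $dbh->blk_init($table, $num_cols);   # start a bulk op on $table
    while (my @row = get_next_row()) {
        # bind each column value for this row, then transfer the row
        $dbh->blk_bind($_ + 1, $row[$_]) for 0 .. $#row;
        $dbh->blk_rowxfer();
    }
    $dbh->blk_done();                    # commit the loaded rows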

I suspect that other databases have similar APIs, and that access to
these APIs might be useful as a general case.

I can of course implement this as a bunch of private DBD::Sybase calls,
but after thinking about this a little I thought that it might be
possible to integrate it into the prepare()/execute()/commit() sequence.

Maybe something like:

$sth = $dbh->prepare("BULK INSERT <tablename> -- attributes to be defined");
while(<>) {
   @row = split('\|');   # or whatever...
   $sth->execute(@row);
}
$sth->commit;

Obviously the driver (or DBI) would have to recognize the BULK INSERT
statement and switch to the bulk load API. That's similar to what I do
now with the "EXEC ..." statement that generates RPC calls instead of
SQL language commands to the server.
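
The recognition step itself is cheap: prepare() can match the statement
text against a prefix before falling through to the normal SQL path.
A minimal sketch (the dispatch function and its return values are
invented here, not existing DBD::Sybase code):

    sub prepare_kind {
        my ($statement) = @_;
        # Dispatch on the statement prefix, the same way DBD::Sybase
        # already spots "EXEC ..." and turns it into an RPC call.
        return 'bulk' if $statement =~ /^\s*BULK\s+INSERT\s+(\S+)/i;
        return 'rpc'  if $statement =~ /^\s*EXEC\b/i;
        return 'sql';
    }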

Does this make sense, or is this too Sybase-specific to be of general
DBI interest?

Michael
-- 
Michael Peppler                              Data Migrations, Inc.
[EMAIL PROTECTED]                       http://www.peppler.org/
Sybase T-SQL/OpenClient/OpenServer/C/Perl developer available for short
or long term contract positions - http://www.peppler.org/resume.html

