On 7/16/21 4:54 PM, Tony Whyman wrote:
So I guess that, as long as the string is under 32K and you are not using a
segmented blob, it is OK to use SQL_VARYING rather than the inline blob
facility.
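
For illustration, this is roughly the shape I have in mind: a sketch against
the Firebird 4 OO API, where att, tra, master and status are assumed to be an
existing IAttachment, ITransaction, IMaster and ThrowStatusWrapper, and the
text travels in the message itself as SQL_VARYING:

#include <firebird/Interface.h>
#include <firebird/Message.h>
using namespace Firebird;

// The string is carried inside the batch message as SQL_VARYING,
// so no registerBlob()/addBlob() calls are involved.
FB_MESSAGE(InsMsg, ThrowStatusWrapper,
    (FB_INTEGER, rowId)
    (FB_TIMESTAMP, theDate)
    (FB_VARCHAR(1024), myText)
);

void addOneRow(IMaster* master, IAttachment* att, ITransaction* tra,
               ThrowStatusWrapper& status)
{
    InsMsg msg(&status, master);
    IBatch* batch = att->createBatch(&status, tra, 0,
        "insert into LotsOfData(RowID, theDate, MyText) values (?, ?, ?)",
        SQL_DIALECT_V6, msg.getMetadata(), 0, nullptr);

    msg.clear(&status);
    msg->rowId = 1;
    msg->theDateNull = -1;               // send NULL for brevity
    msg->myText.set("anything up to 1024 characters");
    batch->add(&status, 1, msg.getData());
    // ... add() further rows, then batch->execute(&status, tra) and commit
}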
On the subject of limits, IBatch does seem to have a silent limit that I am
still exploring. I was comparing the time taken to insert 100000 rows using
single inserts against the Batch interface. Using the Batch interface, only
4061 records were written to the table, even though 100000 were added
(IBatch->add).
The number 4061 was confirmed both by a read-back after commit and by
checking the IBatchCompletionState, which reported 4061 for both processed
and updated.
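
For reference, this is how I am interrogating the completion state: a
sketch, where cs is the IBatchCompletionState returned by batch->execute(),
status is a ThrowStatusWrapper and errStatus is an IStatus obtained from
master->getStatus():

unsigned total = cs->getSize(&status);          // messages processed
unsigned updated = 0;
for (unsigned msgNo = 0; msgNo < total; ++msgNo)
{
    int state = cs->getState(&status, msgNo);   // rows affected, or a flag
    if (state == IBatch::EXECUTE_FAILED)
        cs->getStatus(&status, errStatus, msgNo); // per-message error detail
    else if (state != IBatch::SUCCESS_NO_INFO)
        updated += state;
}
// in my test both total and the summed updates come back as 4061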
4061 seems an arbitrary number. My original test table was declared as
Create Table LotsOfData (
RowID integer not null,
theDate TimeStamp,
MyText VarChar(1024),
Primary Key (RowID)
);
and on changing this to
Create Table LotsOfData (
RowID integer not null,
theDate TimeStamp,
MyText VarChar(512),
Primary Key (RowID)
);
I was able to write 8083 rows successfully. I guess that some memory limit
is being hit, and that the maximum number of rows that can be added depends
on the size of each buffer added to the batch: 4061 rows of VarChar(1024)
and 8083 rows of VarChar(512) both come to roughly 4MB of string data, which
is consistent with a fixed-size internal buffer.
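
If it is the batch's internal buffer, the batch parameter block ought to
let me raise it. A sketch of what I plan to try, using
IBatch::TAG_BUFFER_BYTES_SIZE (util is IMaster::getUtilInterface();
sqlInsert and msg are as in the earlier sketch - I have not yet confirmed
the default or the upper limit for this tag):

IXpbBuilder* pb = util->getXpbBuilder(&status, IXpbBuilder::BATCH, nullptr, 0);
pb->insertInt(&status, IBatch::TAG_RECORD_COUNTS, 1);            // per-row counts
pb->insertInt(&status, IBatch::TAG_BUFFER_BYTES_SIZE, 64 << 20); // try 64MB
IBatch* batch = att->createBatch(&status, tra, 0, sqlInsert,
    SQL_DIALECT_V6, msg.getMetadata(),
    pb->getBufferLength(&status), pb->getBuffer(&status));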
The problem I have is that this is a silent failure. I am checking the
status vector returned by each IBatch->add, and no problem appears to
be reported.
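
For completeness, the per-call check I am doing looks like this: a sketch
using a CheckStatusWrapper so that errors come back in the status vector
rather than as exceptions; fillNextRow() is a hypothetical stand-in for my
row setup:

CheckStatusWrapper status(master->getStatus());
for (unsigned row = 0; row < 100000; ++row)
{
    fillNextRow(msg, row);                // hypothetical helper
    batch->add(&status, 1, msg.getData());
    if (status.getState() & IStatus::STATE_ERRORS)
    {
        // never reached in my test: add() reports success even for
        // the rows that are silently dropped later
        break;
    }
}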
Should I report this as a bug?
Definitely yes.