On Tue, Feb 10, 2004 at 03:53:16PM -0800, Dean Arnold wrote:
> I don't think ODBC provides any clear guidance.
> It defines 2 distinct descriptor elements:
> 
> COLUMN_SIZE: the size in bytes
> DISPLAY_SIZE: the size in characters
> 
> And PRECISION is only defined wrt numerics;
> otherwise, the column descriptors only return OCTET_LENGTH
> (aka byte length).

I've dug around in my books and "SQL-99 Complete, Really" says
that SQLDescribeCol (which is the relevant function here) uses
SQL_DESC_OCTET_LENGTH for the 'ColumnSize' of char types.

I've updated the docs to read:

=item C<PRECISION>  (array-ref, read-only)

Returns a reference to an array of integer values for each column.

For numeric columns, the value is the maximum number of digits
(without considering a sign character or decimal point). Note that
the "display size" for floating point types (REAL, FLOAT, DOUBLE)
can be up to 7 characters greater than the precision (for the
sign + decimal point + the letter E + a sign + 2 or 3 digits).

For any character type column, the value is the OCTET_LENGTH,
in other words the number of bytes, not characters.

(More recent standards refer to this as COLUMN_SIZE but we stick
with PRECISION for backwards compatibility.)

=cut
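
For example (just a sketch, with a made-up DSN, table and column names),
a script would see the per-column values like this:

  use DBI;

  # Illustrative only: DSN, credentials, table and column names are made up.
  my $dbh = DBI->connect("dbi:SomeDriver:somedb", "user", "password",
                         { RaiseError => 1 });

  my $sth = $dbh->prepare("SELECT prod_name, unit_price FROM products");
  $sth->execute;

  my $names     = $sth->{NAME};        # column names
  my $precision = $sth->{PRECISION};   # digits for numerics, bytes for char types

  for my $i (0 .. $#$names) {
      printf "%s: PRECISION=%s\n", $names->[$i], $precision->[$i];
  }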

> BTW: is there a possible DBI API inconsistency ?
> 
> type_info() defines COLUMN_SIZE as the size in bytes;
> but column_info() defines COLUMN_SIZE as size in chars.

type_info() is wrong; it should be chars. Fixed. Thanks.
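
Just to illustrate (reusing the made-up $dbh and table from the snippet
above), COLUMN_SIZE shows up in both calls like this, and after the fix
both should report characters:

  # type_info(): COLUMN_SIZE per data type
  my @varchar_info = $dbh->type_info(DBI::SQL_VARCHAR);
  printf "type_info   COLUMN_SIZE=%s\n", $varchar_info[0]{COLUMN_SIZE}
      if @varchar_info;

  # column_info(): COLUMN_SIZE per column
  # (returns undef if the driver doesn't support it)
  if (my $ci = $dbh->column_info(undef, undef, 'products', 'prod_name')) {
      while (my $row = $ci->fetchrow_hashref) {
          printf "column_info COLUMN_SIZE=%s\n", $row->{COLUMN_SIZE};
      }
  }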

Tim.
