On 22/01/15 15:29, Josh Nijenhuis wrote:
Old docs from DBI <= 1.39, I believe:

   The PRECISION attribute contains a reference to an array of integer values 
that represent the defined length or size of the columns in the SQL statement.

     There are two general ways in which the precision of a column is 
calculated. String datatypes, such as CHAR and VARCHAR, return the maximum 
length of the column. For example, a column defined within a table as:

         location        VARCHAR2(1000)

     would return a precision value of 1000.

     Numeric datatypes are treated slightly differently in that the number of 
significant digits is returned. This may have no direct relationship with the 
space used to store the number. Oracle, for example, stores numbers with 38 
digits of precision but uses a variable length internal format of between 1 and 
21 bytes.

     For floating-point types such as REAL, FLOAT, and DOUBLE, the maximum 
"display size" can be up to seven characters greater than the precision due 
to concatenated sign, decimal point, the letter "E", a sign, and two or three 
exponent digits.
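
To make the quoted docs concrete, here is a minimal DBI sketch (the DSN, 
credentials, and table name are hypothetical) that reads PRECISION for the 
column defined above and then checks the "seven characters greater" 
arithmetic with a 15-digit float:

    use DBI;

    my $dbh = DBI->connect('dbi:Oracle:mydb', 'user', 'pass',
                           { RaiseError => 1 });
    my $sth = $dbh->prepare('SELECT location FROM mytable');
    $sth->execute;

    # PRECISION is an array-ref with one integer per selected column;
    # for location VARCHAR2(1000) this prints "location: 1000".
    printf "%s: %d\n", $sth->{NAME}[0], $sth->{PRECISION}[0];

    # The "+7" arithmetic: 15 significant digits plus leading sign,
    # decimal point, "E", exponent sign, and 3 exponent digits.
    my $s = sprintf '%.14E', -1.23456789012345e-300;
    print "$s\n";            # -1.23456789012345E-300
    print length($s), "\n";  # 22, i.e. 15 + 7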

CHANGES in DBI 1.41 (svn rev 130), 22nd February 2004

Clarified that $sth->{PRECISION} is OCTET_LENGTH for char types.

Docs from DBI >= 1.41:

Type: array-ref, read-only

Returns a reference to an array of integer values for each column.

For numeric columns, the value is the maximum number of digits (without considering a 
sign character or decimal point). Note that the "display size" for floating 
point types (REAL, FLOAT, DOUBLE) can be up to 7 characters greater than the precision 
(for the sign + decimal point + the letter E + a sign + 2 or 3 digits).

For any character type column the value is the OCTET_LENGTH, in other words the 
number of bytes, not characters.
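
To see what that wording can mean in practice, here is a hypothetical sketch: 
a VARCHAR(100) column in a MySQL utf8 table (up to 3 bytes per character), 
queried through a driver that follows the OCTET_LENGTH rule. The DSN and 
table are made up, and as noted below not all drivers behave this way:

    use DBI;

    my $dbh = DBI->connect('dbi:mysql:mydb', 'user', 'pass',
                           { RaiseError => 1 });
    my $sth = $dbh->prepare('SELECT name FROM utf8_table');
    $sth->execute;

    # An OCTET_LENGTH-style driver could report 300 (bytes) here
    # for a VARCHAR(100) utf8 column, rather than 100 (characters).
    print $sth->{PRECISION}[0], "\n";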

That is a little strange, as I believed the definition initially came from the 
ODBC spec, which says:

"The maximum column size that the server supports for this data type. For numeric 
data, this is the maximum precision. For string data, this is the length in 
characters."

I can assure you that DBD::ODBC will not return octets unless an ODBC driver is 
broken.

(More recent standards refer to this as COLUMN_SIZE but we stick with PRECISION 
for backwards compatibility.)

I have tried COLUMN_SIZE, but it does not work; I get this error:

Can't get DBI::st=HASH(0x7fcee73ce078)->{COLUMN_SIZE}: unrecognised attribute 
name at ./ut_testdictsync line 19.

That is because of the bit you quoted: "but we stick with PRECISION for 
backwards compatibility".

Martin


On 01/22/15 08:26, Michael Gerdau wrote:
In the "standard" command-line client this session choice would be set in
my.ini under the name default-character-set, if I'm not mistaken.
Changing its value from utf8 to latin1 might "solve" the problem.
On Linux, so it's my.cnf, and all the character-set settings there are
latin1, character-set-server as well.
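
(For reference, the client setting mentioned above would look something like 
this in my.cnf; the section and value shown are only an example:)

    [client]
    default-character-set = latin1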
Even if it were utf8 and switching to latin1 would "solve" the problem:
My possibly naive expectation would be that PRECISION returns the number
of characters and not the number of bytes required to represent these
characters.

Best wishes,
Michael

