Tim Bunce wrote:
On Wed, Jan 28, 2009 at 10:30:49AM -0500, John Scoles wrote:
I think this is on purpose. Oracle numbers have a huge precision (38
decimal digits), so in general you cannot convert them to a Perl number
without losing precision. DBD::Oracle could be smart enough to recognize
the safe cases (integers with at most 9 digits on 32-bit builds, or 18
digits on 64-bit builds), but AFAIK it isn't. Your best bet is probably
explicitly binding with an appropriate type (SQL_INTEGER).
That would be the right way to do it, but DBD::Oracle doesn't support
it at the moment.
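
For reference, the DBI-level binding Peter describes would look something
like the sketch below (the connection details, table, and column are
invented; and, as noted, DBD::Oracle ignored the TYPE hint at the time):

    use DBI qw(:sql_types);

    my ($user, $pass) = ('scott', 'tiger');    # placeholder credentials
    my $dbh = DBI->connect('dbi:Oracle:orcl', $user, $pass);

    my $sth = $dbh->prepare('SELECT id FROM some_table');  # invented table
    $sth->execute;

    # Ask the driver to hand column 1 back as a native integer.
    $sth->bind_col(1, \my $id, { TYPE => SQL_INTEGER });

    while ($sth->fetch) {
        print "$id\n";
    }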
Where would the patch for this go?

Patch how SQL_INTEGER is handled on the way out to Perl, or add the
smarts to DBD::Oracle to see the safe cases??
bind_col() could set the fbh->fetch_func pointer for that column to
point to a new fetch_func_integer() function. Or something like that.
Ok fair enough, but how to handle the nasty little part of the safe cases??

Eh?

Tim.
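
For what it's worth, the "safe cases" Peter alludes to are easy enough to
state: an integer of at most 9 decimal digits always fits a 32-bit IV, and
one of at most 18 digits always fits a 64-bit IV. A sketch of the test at
the Perl level (the helper name is invented; the real check would live in
the driver's C fetch code):

    use Config;

    # 9 digits are always safe for 32-bit IVs, 18 for 64-bit IVs.
    my $max_digits = $Config{ivsize} >= 8 ? 18 : 9;

    sub is_safe_integer {    # hypothetical helper
        my ($str) = @_;
        return $str =~ /\A-?[0-9]{1,$max_digits}\z/;
    }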

As the person who started this thread, I am now backing away. The reason is that although it seemed I'd like integers back from Oracle queries when the columns were integers, I'd forgotten that "integer" is just a synonym for number(38). All the integers I'm after are actually primary keys: some limited to very small numbers, but some which may be very, very large.

What John is referring to is the comment from Peter suggesting that, as Oracle numbers can be very large, they may not fit into Perl integer types, and so DBD::Oracle would have to look at the numbers to decide whether they fit into Perl integers or not. This was the point where I took two steps back very quickly, as it meant I could not rely on what was returned from DBD::Oracle, whereas what is returned now, although an inconvenience, is at least consistent.

As I have shown in a previous post in this thread, my actual issue was that I was converting Oracle-returned data to JSON using JSON::XS, and was annoyed with JSON::XS putting double quotes (for strings) around fields I knew were integers. Adding 0 to them turned them into Perl integers (as far as JSON::XS was concerned), but that fails horribly when the numbers are large: suddenly the round trip to JSON and back loses precision.
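
To illustrate the failure mode Martin describes (the 20-digit key below is
made up, and the exact lossy output varies by platform):

    use JSON::XS;

    my $key  = '12345678901234567890';     # e.g. a 20-digit primary key
    my $json = JSON::XS->new;

    # As fetched (a string): quoted, but exact.
    print $json->encode([ $key ]), "\n";       # ["12345678901234567890"]

    # After adding 0: numeric, but forced through a double.
    print $json->encode([ $key + 0 ]), "\n";   # e.g. [1.23456789012346e+19]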

As a result, I'm not pushing this in any way. The idea in DBI of saying "this column is an integer, so I want it as an integer" looks OK on the surface, but once the numbers get large things get rather inconsistent. Not really wishing to complicate this any more, I note that some other modules that have to deal with big numbers turn them into Math::BigInt objects (from what I remember, not being near my notes right now).

Martin
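
A sketch of that approach: Math::BigInt keeps the full 38 digits intact, at
the cost of handing back an object rather than a plain scalar (the value
below is invented):

    use Math::BigInt;

    # A 38-digit Oracle NUMBER, kept exact as a Math::BigInt object:
    my $key = Math::BigInt->new('12345678901234567890123456789012345678');
    print $key->bstr, "\n";    # all 38 digits, no precision loss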
Changing from this:

    char *p = (char *)row_data;
    sv_setpvn(sv, p, (STRLEN)datalen);

to something like this:

    double dnum;
    (void) OCINumberToReal(fbh->imp_sth->errhp, (CONST OCINumber *)row_data,
                           (uword)sizeof(dnum), (dvoid *)&dnum);
    sv_setnv(sv, dnum);

John
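
One caveat with routing the value through a C double, tying back to
Martin's precision point: a double carries only 53 bits of mantissa, so
integers beyond roughly 15-16 decimal digits still can't round-trip
exactly. A quick Perl illustration:

    # 2**53 is the point past which a double can no longer represent
    # every integer exactly:
    printf "%.0f\n", 2**53;        # 9007199254740992
    printf "%.0f\n", 2**53 + 1;    # typically also 9007199254740992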