On Saturday, September 27, 2003, at 10:39 PM, Yusuf W. wrote:

> Now, I've got to convince my project's software
> architect that a bigint would be better than a decimal.
>
> Does anyone know where I could get some documentation
> on how int and decimal are implemented, so I could
> prove to him that ints are better? Can people suggest
> good points to make in order to prove it?

Print out Tom's reply and give it to him. Saying 'one of the people who develops the thing says so' ought to carry some weight. I would hope...

> Thanks in advance.

> --- Tom Lane <[EMAIL PROTECTED]> wrote:
> > "Yusuf W." <[EMAIL PROTECTED]> writes:
> > > For the application that I'm working on, we want to
> > > use data types that are database independent. (Some
> > > databases have decimal, but not big int.)
> >
> > Most databases have bigint, I think.
> >
> > > Anyhow, we are planning on using decimal(19,0) for
> > > primary keys instead of a big int. Would there be a
> > > performance difference in using a bigint over
> > > using decimals?

> > You'll be taking a very large performance hit, for very
> > little benefit that I can see.  How hard could it be to
> > change the column declarations if you ever move to a
> > database without bigint?  There's not normally much need
> > for apps to be explicitly aware of the column type names.
> >
> > 			regards, tom lane
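For what it's worth, you can see the same flavor of overhead outside the database: native machine-word integers compare and hash much more cheaply than arbitrary-precision decimal values. The sketch below is a rough Python illustration of that cost, not a Postgres benchmark — the dict standing in for a primary-key index is entirely hypothetical:

```python
# Rough illustration (not a Postgres benchmark): key lookups with
# native ints vs. arbitrary-precision Decimal values. The dicts
# below are a stand-in for a primary-key index.
import timeit
from decimal import Decimal

N = 10_000
int_keys = list(range(N))
dec_keys = [Decimal(i) for i in int_keys]

# Hypothetical "indexes": one keyed by int, one by Decimal.
int_index = {k: None for k in int_keys}
dec_index = {k: None for k in dec_keys}

# Time N membership probes against each index, repeated 10 times.
t_int = timeit.timeit(lambda: [k in int_index for k in int_keys], number=10)
t_dec = timeit.timeit(lambda: [k in dec_index for k in dec_keys], number=10)

print(f"int lookups:     {t_int:.4f}s")
print(f"Decimal lookups: {t_dec:.4f}s")
```

On most builds the Decimal probes come out measurably slower, which is the same general argument Tom is making about numeric vs. bigint keys.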




Andrew Rawnsley
The Ravensfield Digital Resource Group, Ltd.
(740) 587-0114

---------------------------(end of broadcast)---------------------------
TIP 5: Have you checked our extensive FAQ?

