"Yusuf W." <[EMAIL PROTECTED]> writes:
> For the application that I'm working on, we want to
> use data types that are database-independent.  (most
> databases have decimal, but not bigint).

Most databases have bigint, I think.

> Anyhow, we are planning on using decimal(19,0) for our
> primary keys instead of a bigint.  Would there be a
> performance difference in using a bigint over using decimals?

You'll be taking a very large performance hit, for very little benefit
that I can see.  How hard could it be to change the column declarations
if you ever move to a database without bigint?  There's not normally
much need for apps to be explicitly aware of the column type names.
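
A minimal sketch of the two declarations, in PostgreSQL syntax (the
table and column names are invented for the example):

    -- bigint: a fixed-width 8-byte integer; key comparisons and index
    -- lookups use native machine arithmetic
    CREATE TABLE orders (
        order_id bigint PRIMARY KEY
    );

    -- the decimal equivalent: variable-length storage, and every
    -- comparison goes through software decimal arithmetic, which is
    -- where the performance hit comes from
    CREATE TABLE orders_decimal (
        order_id decimal(19,0) PRIMARY KEY
    );

    -- if you ever do port to a database without bigint, the change is
    -- one line of DDL per column:
    ALTER TABLE orders ALTER COLUMN order_id TYPE decimal(19,0);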

                        regards, tom lane
