Mark Dilger wrote:
> Tom Lane wrote:
>> Mark Dilger <[EMAIL PROTECTED]> writes:
>>> pgsql=# select chr(14989485);
>> Is there a principled rationale for this particular behavior as
>> opposed to any other?
>>
>> In particular, in UTF8 land I'd have expected the argument of chr()
>> to be interpreted as a Unicode code point, not as actual UTF8 bytes
>> with a randomly-chosen endianness.
>>
>> Not sure what to do in other multibyte encodings.
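For concreteness: 14989485 is 0xE4B8AD, which read as big-endian bytes is
the UTF-8 encoding of U+4E2D (decimal code point 20013). Under the
code-point interpretation Tom describes, that character would instead be
requested as chr(20013). A minimal C sketch of that interpretation
follows; code_point_to_utf8 is a hypothetical illustration, not the
actual PostgreSQL source.

#include <stdio.h>
#include <stdint.h>

/* Hypothetical sketch (not PostgreSQL source): encode a Unicode code
 * point as UTF-8.  Returns the number of bytes written (1-4), or 0 if
 * the input is not a valid code point.  This is the semantics Tom
 * describes: the argument is a code point, not a byte pattern. */
static int
code_point_to_utf8(uint32_t cp, unsigned char *out)
{
    if (cp >= 0xD800 && cp <= 0xDFFF)
        return 0;                       /* surrogates are not characters */
    if (cp <= 0x7F)
    {
        out[0] = (unsigned char) cp;
        return 1;
    }
    if (cp <= 0x7FF)
    {
        out[0] = (unsigned char) (0xC0 | (cp >> 6));
        out[1] = (unsigned char) (0x80 | (cp & 0x3F));
        return 2;
    }
    if (cp <= 0xFFFF)
    {
        out[0] = (unsigned char) (0xE0 | (cp >> 12));
        out[1] = (unsigned char) (0x80 | ((cp >> 6) & 0x3F));
        out[2] = (unsigned char) (0x80 | (cp & 0x3F));
        return 3;
    }
    if (cp <= 0x10FFFF)
    {
        out[0] = (unsigned char) (0xF0 | (cp >> 18));
        out[1] = (unsigned char) (0x80 | ((cp >> 12) & 0x3F));
        out[2] = (unsigned char) (0x80 | ((cp >> 6) & 0x3F));
        out[3] = (unsigned char) (0x80 | (cp & 0x3F));
        return 4;
    }
    return 0;                           /* beyond U+10FFFF */
}

int
main(void)
{
    unsigned char buf[4];
    int len = code_point_to_utf8(20013, buf);   /* U+4E2D */

    for (int i = 0; i < len; i++)
        printf("%02X ", buf[i]);
    printf("\n");                       /* prints: E4 B8 AD */
    return 0;
}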
"Not sure what to do in other multibyte encodings" was pretty much my
rationale for this particular behavior. I standardized on network byte
order because there are only two endianesses to choose from, and the
other seems to be a more surprising choice.
I looked around on the web for a standard for how to convert an integer
into a valid multibyte character and didn't find anything. Andrew,
Supernews has said upthread that chr() is clearly wrong and needs to be
fixed. If so, we need some clear definition what "fixed" means.
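By contrast, here is a minimal C sketch of the byte-pattern behavior
described above (again hypothetical code, not the actual oracle_compat.c
source; dropping leading zero bytes is an assumption on my part): the
argument's bytes are taken in network byte order and emitted verbatim,
with no check that the result is valid in the server encoding. Under
this rule chr(14989485) yields the bytes E4 B8 AD.

#include <stdio.h>
#include <stdint.h>

/* Hypothetical sketch of the byte-pattern interpretation: the
 * argument's bytes, taken in network (big-endian) order with leading
 * zero bytes dropped, are emitted as-is.  Nothing here verifies that
 * the bytes form a valid character in the server encoding. */
static int
int_to_bytes_be(uint32_t n, unsigned char *out)
{
    int len = 0;

    for (int shift = 24; shift >= 0; shift -= 8)
    {
        unsigned char b = (unsigned char) ((n >> shift) & 0xFF);

        if (len == 0 && b == 0 && shift > 0)
            continue;                   /* skip leading zero bytes */
        out[len++] = b;
    }
    return len;
}

int
main(void)
{
    unsigned char buf[4];
    int len = int_to_bytes_be(14989485, buf);   /* 0x00E4B8AD */

    for (int i = 0; i < len; i++)
        printf("%02X ", buf[i]);
    printf("\n");   /* prints: E4 B8 AD -- happens to be UTF-8 for U+4E2D */
    return 0;
}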
Since chr() is defined in oracle_compat.c, I decided to look at what Oracle
might do. See
It looks to me like they are doing the same thing that I did, though I don't
have Oracle installed anywhere to verify that. Is there a difference?