Currently, CHAR is correctly interpreted as CHAR(1), but VARCHAR is incorrectly interpreted as VARCHAR(<infinity>). Is there a reason for that, besides the fact that it of course makes much more sense than VARCHAR(1)?

Additionally, neither CHAR nor VARCHAR seems to complain about overly long input; they just truncate it silently.

I'm asking because I need to decide whether the bit types should imitate this incorrect behaviour, or whether they should start out behaving correctly.

-- 
Peter Eisentraut      [EMAIL PROTECTED]      http://yi.org/peter-e/
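
[For reference, a minimal psql sketch of the behaviour being described; the table names are made up for illustration and the exact output may vary by version:]

    -- CHAR with no length is taken as CHAR(1), per the standard;
    -- VARCHAR with no length currently acts as if it were unbounded.
    CREATE TABLE t (a CHAR, b VARCHAR);

    -- With explicit lengths, overlong input is silently truncated
    -- instead of raising an error:
    CREATE TABLE u (c CHAR(3), v VARCHAR(3));
    INSERT INTO u VALUES ('abcdef', 'abcdef');
    SELECT c, v FROM u;   -- both come back as 'abc', with no error raised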
