Markus Kuhn writes:

> I just want to write in maximally portable ISO C99 code
> 
> #if __STDC_ISO_10646__
>   printf("%lc", 0x201c);
> #else
>   putchar('"');
> #endif
> 
> and it should do the right thing depending on the locale (including
> transliteration to " for ISO 8859-1, which glibc 2.1.96 fails to do at
> the moment unfortunately).

This piece of code is far from portable. The printf line has
undefined behaviour, because the standards don't say what happens
when you output (wchar_t) 0x201c and the conversion to a multibyte
sequence fails.

> But gettext is overkill for monolingual applications that want to run in
> different character encodings (and there are *many* of these).

If you only want the translation, not the message catalog lookup, you
can do it like this:

  #if __STDC_ISO_10646__
    const wchar_t wstr[] = { 0x201c };
    char* str;
    size_t length;
    if (iconv_string (nl_langinfo (CODESET), "wchar_t",
                      (const char *) wstr, (const char *) (wstr + 1),
                      &str, &length) == 0)
      fwrite (str, 1, length, stdout);
  #else
    putchar('"');
  #endif

[iconv_string() is from libiconv/extras.]

But then it is not useful to check __STDC_ISO_10646__. The following
is more portable:

    const uint16_t wstr[] = { 0x201c };
    char* str;
    size_t length;
    if (iconv_string (nl_langinfo (CODESET), "UCS-2",
                      (const char *) wstr, (const char *) (wstr + 1),
                      &str, &length) == 0)
      fwrite (str, 1, length, stdout);

> I do not want to have to be forced to use (the non-standard!) 
> gettext

gettext is standard now: it's part of the LI18NUX specification.

Bruno
-
Linux-UTF8:   i18n of Linux on all levels
Archive:      http://mail.nl.linux.org/lists/