On Fri, Mar 30, 2007 at 03:58:21PM +0200, Jan Willem Stumpel wrote:
> Marcin 'Qrczak' Kowalczyk wrote:
> 
> > There is still some software I have installed here which
> > doesn’t work with UTF-8. I switched from ekg to gaim and from
> > a2ps to paps because of this. UTF-8 support in some quite
> > popular programs still relies on unofficial patches: mc, pine,
> > fmt. There is still work to do.
> 
> Yes.. for instance texmacs and maxima. And a2ps -- doomed to be
> replaced by paps. But these examples are becoming rarer and rarer.
> 
> mc, for instance, is quite alright nowadays (well, in Debian it is).
> 
> Of course your point is quite correct. Until even a few years ago,
> UTF-8 was only practicable for hardy pioneers. But it is different
> now.

I agree. It’s amazing how much software I still fight with because it
doesn’t support UTF-8 correctly. Even bash/readline breaks in the
presence of nonspacing characters and long lines.
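
For illustration (this is not from the original mail), here is a minimal
C sketch of what the readline complaint hinges on, assuming a UTF-8
locale and the POSIX wcwidth() interface: a nonspacing (combining)
character occupies zero terminal columns, so a line editor that equates
bytes or characters with columns miscounts the cursor position when it
redraws a long line.

#define _XOPEN_SOURCE 700   /* for wcwidth() on glibc */
#include <locale.h>
#include <stdio.h>
#include <wchar.h>

int main(void)
{
    setlocale(LC_ALL, "");            /* assumes a UTF-8 locale */

    /* "é" spelled as base letter + combining mark: two wide
     * characters, three bytes in UTF-8, one terminal column. */
    wchar_t base   = L'e';
    wchar_t accent = 0x0301;          /* COMBINING ACUTE ACCENT */

    printf("wcwidth('e')    = %d\n", wcwidth(base));    /* 1 */
    printf("wcwidth(U+0301) = %d\n", wcwidth(accent));  /* 0 */
    return 0;
}

Any program that counts bytes (or even characters) to decide where the
cursor goes gets this wrong as soon as combining marks or double-width
CJK characters appear on the line.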

My point was that, had the mistake of introducing ISO-8859 support not
been made (i.e. if bytes 128-255 had remained treated as “unprintable”
at the time), there would have been both much more incentive to get
UTF-8 working quickly and much less of an obstacle (the tendency of
applications to treat those bytes as single textual characters).
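
As a hedged sketch of that obstacle (again not from the original mail),
compare the legacy 8-bit view, where every byte in 128-255 is one
printable character, with proper UTF-8 decoding of the same string:

#include <locale.h>
#include <stdio.h>
#include <string.h>
#include <wchar.h>

int main(void)
{
    setlocale(LC_ALL, "");                 /* assumes a UTF-8 locale */

    const char *s = "na\xc3\xafve";        /* "naïve": 6 bytes, 5 characters */

    /* Codepage-era assumption: one byte == one printable character. */
    printf("byte count:      %zu\n", strlen(s));   /* 6 */

    /* UTF-8 view: bytes 128-255 are pieces of multibyte sequences. */
    mbstate_t st = {0};
    const char *p = s;
    size_t count = 0, n;
    while ((n = mbrtowc(NULL, p, strlen(p), &st)) != 0
           && n != (size_t)-1 && n != (size_t)-2) {
        p += n;
        count++;
    }
    printf("character count: %zu\n", count);       /* 5 */
    return 0;
}

An application written for ISO-8859-1 reports 6 here and happily prints
the 0xC3 and 0xAF bytes as “Ã” and “¯” — the familiar mojibake.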

Obviously there were plenty of people who wanted internationalization
even back in 1996 and earlier. I’m just saying they should have done
it correctly in a way that supports multilingualization rather than
taking the provincial path of ‘codepages’ some 5 years after
UCS/Unicode had obsoleted them.

Rich
