Gustavo Niemeyer wrote:
> Given the fact that files have an 'encoding' parameter, and that
> any unicode strings with characters not in the 0-127 range will
> raise an exception if being written to files, isn't it reasonable
> to respect the 'encoding' attribute whenever writing data to a
> file?

In general, files don't have an encoding parameter; sys.stdout is an exception.

The reason this works for print but not for write is that
I considered "print unicodeobject" important, and wanted to
implement that. file.write is an entirely different code path,
so it doesn't currently handle Unicode objects; it only
accepts strings (or, more generally, buffers).
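A common workaround from that era was to avoid relying on the file object at all and instead wrap the underlying byte stream in a codec writer, so that .write() itself performs the encoding. A minimal sketch (using an in-memory BytesIO to stand in for an open binary file; the stream and encoding choice are illustrative assumptions, not part of the original discussion):

```python
import codecs
import io

# Stand-in for a real file opened in binary mode.
raw = io.BytesIO()

# Wrap it so that .write() accepts text and encodes it, instead of
# depending on a (usually absent) .encoding attribute on the file.
writer = codecs.getwriter("utf-8")(raw)

writer.write(u"caf\u00e9\n")   # non-ASCII text is encoded, not rejected
print(raw.getvalue())          # b'caf\xc3\xa9\n'
```

This sidesteps the question entirely: the caller chooses the encoding explicitly rather than hoping the file object carries one.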

> This difference may become a really annoying problem when trying to
> internationalize programs, since it's usual to see third-party code
> dealing with sys.stdout, instead of using 'print'.

Apparently, it hasn't been important enough for anybody to analyse
this and offer a patch. In any case, it would be quite unreliable to
pass unicode strings to .write even *if* .write respected .encoding,
since most files don't have an .encoding attribute. Even sys.stdout
does not always have one: only when it is a terminal, and only if we
managed to find out what the encoding of the terminal is.
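Because the attribute's presence cannot be assumed, code that wants to use it has to probe for it. A small sketch of that defensive pattern (the helper name and fallback encoding are my own illustrative choices):

```python
import io
import sys

def stream_encoding(stream, default="utf-8"):
    """Return the stream's declared encoding, or a fallback.

    Terminals usually expose an .encoding attribute; plain byte
    streams and many file-like objects do not, so we probe with
    getattr rather than assuming the attribute exists.
    """
    return getattr(stream, "encoding", None) or default

print(stream_encoding(sys.stdout))    # e.g. 'utf-8' on most terminals
print(stream_encoding(io.BytesIO())) # no .encoding, falls back to default
```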

Regards,
Martin
_______________________________________________
Python-Dev mailing list
http://mail.python.org/mailman/listinfo/python-dev