Mikhail V wrote:
And decimal is objectively way more readable than hex standard character set,
regardless of how strong your habits are.

That depends on what you're trying to read from it. I can
look at a hex number and instantly get a mental picture
of the bit pattern it represents. I can't do that with
decimal numbers.
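A quick sketch of why that works: each hex digit corresponds to exactly
one 4-bit nibble, so the bit pattern can be read off digit by digit,
while decimal digits don't line up with bit boundaries at all.

```python
# Each hex digit maps to exactly one 4-bit group:
# 0xA5 -> 'A' = 1010, '5' = 0101
n = 0xA5
print(f"{n:#x} -> {n:08b}")  # 0xa5 -> 10100101

# The decimal form of the same number gives no such hint.
print(n)  # 165
```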

This is the reason hex exists: it's used when the bit
pattern represented by a number matters more than its
numerical value. That's the case with Unicode code
points. Their numerical value is irrelevant, but the
bit pattern conveys useful information, such as which
page and plane a character belongs to, whether it fits
in one or two bytes, etc.
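To illustrate (the helper name here is my own, not anything from the
Unicode standard): the plane is simply the part of the code point above
the low 16 bits, so it can be read straight off the leading hex digits,
and anything at or below U+FFFF fits in two bytes (the BMP).

```python
def describe(cp):
    """Read structural facts straight off a code point's bits."""
    plane = cp >> 16           # hex digits above the low four give the plane
    fits_two_bytes = cp <= 0xFFFF  # i.e. inside the Basic Multilingual Plane
    return hex(cp), plane, fits_two_bytes

# U+0041 'A': plane 0, fits in two bytes
print(describe(0x0041))   # ('0x41', 0, True)

# U+1F600 (an emoji): plane 1, needs more than two bytes
print(describe(0x1F600))  # ('0x1f600', 1, False)
```

None of this structure is visible in the decimal forms 65 and 128512.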

Python-ideas mailing list
Code of Conduct: http://python.org/psf/codeofconduct/