On 12 October 2016 at 23:50, Thomas Nyberg <tomuxi...@gmail.com> wrote:
> Since when was decimal notation "standard"?
That depends on what planet you live on. I live on planet Earth. And you?

> opposite. For unicode representations, byte notation seems standard.
How does this make it a good idea?
Consider the Unicode table as an array of glyphs.
Now the index into that array is suddenly represented in some
obscure character set. How is this index any different from the
index of any other array, or from a natural number? Think about it...
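
A quick illustration at the interpreter, using nothing but the
builtins chr(), ord() and hex() (my own sketch, not anything from
the standard library docs):

    # The Unicode "table" is indexed by a plain integer: chr() maps an
    # index (code point) to a glyph, ord() maps a glyph back to its index.
    print(chr(1025))      # 'Ё' -- index written in decimal
    print(chr(0x0401))    # 'Ё' -- the same index written in hex
    print(ord('Ё'))       # 1025
    print(hex(ord('Ё')))  # '0x401'

The index is one and the same natural number; hex is only a way of
spelling it.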

>> 2. Mixing of two notations (hex and decimal) is a _very_ bad idea,
>> I hope no need to explain why.
> Still not sure which "mixing" you refer to.

Still not sure? The two notations named in the parentheses: mixing those two systems.
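
To spell out the mixing I mean, another quick interpreter sketch
(again my own example, only standard builtins assumed):

    # The same character gets written in both systems, and a reader who
    # mistakes one system for the other lands on a different glyph.
    print(chr(65))     # 'A'  -- decimal 65
    print(chr(0x65))   # 'e'  -- hex 0x65 is decimal 101, a different glyph
    print('\u0041')    # 'A'  -- the same 'A' again, now spelled in hex

Two notations for the same index, side by side in one code base: that
is the mixing I am talking about.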