On 12 October 2016 at 23:58, Danilo J. S. Bellini wrote:
> Decimal notation is hardly
> readable when we're dealing with stuff designed in base 2 (e.g. due to the
> visual separation of distinct bytes).
Hmm, what keeps you from separating the logical units so that each is
represented by a decimal number, like 001 023 255 ...?
Do you really think this is less readable than its hex equivalent?
If so, you are probably working with hex numbers exclusively, but I doubt that.
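To make the comparison concrete, here is a small sketch (my own illustrative
data, not from the thread) that renders the same byte sequence both ways:
fixed-width decimal groups as proposed above, and conventional hex pairs.

```python
# Hypothetical example: the same bytes in the two notations under discussion.
data = bytes([1, 23, 255, 16, 128])

decimal = " ".join(f"{b:03d}" for b in data)  # zero-padded decimal, 3 digits per byte
hexa = " ".join(f"{b:02x}" for b in data)     # conventional hex, 2 digits per byte

print(decimal)  # 001 023 255 016 128
print(hexa)     # 01 17 ff 10 80
```

The fixed width keeps the visual separation of distinct bytes in both cases;
only the digit alphabet differs.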
> I agree that mixing representations for the same abstraction (using decimal
> in some places, hexadecimal in other ones) can be a bad idea.
"Can be"? It is indeed a horrible idea. Also not only for same abstraction
but at all.
> makes me believe "decimal unicode codepoint" shouldn't ever appear in string
I use this site to look the chars up:
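For context, Python itself already mixes the two conventions: the built-ins
ord() and chr() work with decimal integers, while string escape sequences are
hex-only. A quick sketch (using the euro sign, my own choice of example):

```python
ch = "\u20ac"          # the euro sign; string escapes accept only hex

print(ord(ch))         # 8364  -- decimal codepoint, as ord() returns an int
print(hex(ord(ch)))    # 0x20ac
print(chr(8364))       # chr() takes the decimal integer back to the character
```

So "decimal unicode codepoint" does already appear in everyday Python usage,
just not inside string literals.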
That is rather peculiar: three negative replies already, but with no strong
arguments for why it would be bad to stick to decimal only; only some
"others do it so" and "tradition" arguments.
The "base 2" argument could work at some grade but if stick to this
criteria why not speak about octal/quoternary/binary then?
Please note, I am talking only about readability _of the character set_.
This does not include your habit issues; rather, it is an objective
criterion for choosing this or that character set.
And decimal is objectively far more readable than the standard hex character
set, regardless of how strong your habits are.
Python-ideas mailing list
Code of Conduct: http://python.org/psf/codeofconduct/