On 13 October 2016 at 01:50, Chris Angelico <ros...@gmail.com> wrote:
> On Thu, Oct 13, 2016 at 10:09 AM, Mikhail V <mikhail...@gmail.com> wrote:
>> On 12 October 2016 at 23:58, Danilo J. S. Bellini
>> <danilo.bell...@gmail.com> wrote:
>>
>>> Decimal notation is hardly
>>> readable when we're dealing with stuff designed in base 2 (e.g. due to the
>>> visual separation of distinct bytes).
>>
>> Hmm, what keeps you from separating the logical units so that each is
>> represented by a decimal number, like 001 023 255 ...?
>> Do you really think this is less readable than its hex equivalent?
>> If so, you are probably working with hex numbers only, but I doubt that.
>
> Way WAY less readable, and I'm comfortable working in both hex and decimal.

Please don't mix up readability and personal habit, which the previous
repliers seem to do as well. Those two things have nothing
to do with each other. If you are comfortable with the old Roman numbering
system, that does not make it readable.
And I am NOT comfortable with hex; like most people, I would
be glad to use a single notation.
Some people think they are cool because they know several
numbering notations ;) but I bet few can actually tell which is more
readable.
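
To make the comparison concrete rather than a matter of taste, here is a
minimal Python sketch (my own illustration, not an existing feature) that
prints the same byte sequence in both notations, so everyone can judge for
themselves:

    # Render the same bytes as zero-padded decimal triplets and as hex pairs.
    data = bytes([1, 23, 255, 147, 32, 28])

    as_decimal = " ".join("%03d" % b for b in data)  # "001 023 255 147 032 028"
    as_hex = " ".join("%02x" % b for b in data)      # "01 17 ff 93 20 1c"

    print("decimal:", as_decimal)
    print("hex:    ", as_hex)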

> You're the one who's non-standard here. Most of the world uses hex for
> Unicode codepoints.

No, I am not the one: many people find it silly to use different notations
for the same thing - the index of an element - and they are quite right about that.
I am not silly; I refuse to use it, and luckily I can. I also know that decimal
is more readable than hex, so my choice is supported by
understanding, not by simple refusal.

>
>> PS:
>> It is rather peculiar: three negative replies already, but with no strong
>> arguments why it would be bad to stick to decimal only - only some
>> "others do it so" and "tradition" arguments.
>
> "Others do it so" is actually a very strong argument. If all the rest
> of the world uses + to mean addition, and Python used + to mean
> subtraction, it doesn't matter how logical that is, it is *wrong*.

This actually supports my proposal perfectly: if everyone uses decimal,
why suddenly use hex for the same thing, the index of an array? I don't see how
your analogy contradicts my proposal; if anything, it supports it.


> quote; if you use 0x93, you are annoyingly wrong,

Please don't make personal assessments here; I can use whatever I want.
Moreover, I find this notation as silly as using different measurement
systems, without any reason, within a single activity, and in my eyes
this is annoyingly wrong and stupid - but I don't call anybody here stupid.

But I do wish you could set your habits aside for a while
and talk about what would be better for future usage.

> everyone has to do the conversion from that to 201C.

Nobody needs to do ANY conversions if decimal is used everywhere,
and as I said, everything else is already decimal: number literals, array
indexes, the value ord() returns - you can imagine more examples.
So it is not only more readable but also more traditional.
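
To illustrate with real builtins (chr() and ord() here are standard Python;
only the point about notation is mine):

    # ord() already hands back a plain integer, printed in decimal by default.
    code = ord("\u201c")           # LEFT DOUBLE QUOTATION MARK
    print(code)                    # -> 8220

    # The same character can be produced without ever writing hex:
    print(chr(8220))               # -> the same quotation mark
    print(chr(8220) == "\u201c")   # -> True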


> How many decimal digits would you use to denote a single character?

For text, three decimal digits would be enough for me personally,
and in the long run, when the world's alphabetical garbage
disappears, two digits would be OK.
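
As a sketch of what I mean (this three-digit rendering is only my
illustration, not an existing escape syntax): three digits cover all of
ASCII and Latin-1, and a larger codepoint simply takes more digits:

    # Hypothetical fixed-width decimal rendering of codepoints.
    def decimal_codes(s, width=3):
        # zfill pads with leading zeros but never truncates longer codes.
        return " ".join(str(ord(c)).zfill(width) for c in s)

    print(decimal_codes("Abc"))       # -> 065 098 099
    print(decimal_codes("A\u201cZ"))  # -> 065 8220 090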

> you have to pad everything to seven digits (\u0000034 for an ASCII
> quote)?

It depends on the case. For input,
a separator or padding are both OK;
I don't have a problem with either. For printing, obviously don't show
leading zeros, but rather spaces.
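
For instance, a reader for either input form could be as simple as this
(purely my own sketch; the seven-digit width just follows the example in
the quoted text above):

    # Hypothetical decoder for decimal escapes: accepts either the
    # separator form ("8220 34") or a fixed-width padded form.
    def from_decimal(text, width=None):
        if width:   # fixed-width padded form, e.g. seven digits per code
            parts = [text[i:i + width] for i in range(0, len(text), width)]
        else:       # separator form
            parts = text.split()
        return "".join(chr(int(p)) for p in parts)

    print(from_decimal("8220 34"))            # left double quote + ASCII quote
    print(from_decimal("00082200000034", 7))  # the same two characters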
But as I said, I find this Unicode only a temporary happening;
it will pass into history at some point and be
used only to study extinct glyphs.

Mikhail
_______________________________________________
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/
