On 13 October 2016 at 12:05, Cory Benfield <c...@lukasa.co.uk> wrote:
> integer & 0x00FFFFFF  # Hex
> integer & 16777215  # Decimal
> integer & 0o77777777  # Octal
> integer & 0b111111111111111111111111  # Binary
> The octal representation is infuriating because one octal digit refers to 
> *three* bits

Correct; that makes it less nice-looking and less friendly to the 8-bit paradigm.
It does not, however, make octal a bad option in general, and according to
my personal suppositions and work on glyph development, the optimal set is
exactly 8 glyphs.

> Decimal is no clearer.

In the same context of alignment problems, yes, correct.
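
To make the alignment problem concrete (my own illustration, not from the
thread): adjacent single-bit masks look arbitrary in decimal, while the hex
and binary columns shift in an obvious pattern.

# Single-bit masks for bits 12..15: the decimal digits share no visual
# pattern, while the hex and binary columns shift predictably.
for bit in range(12, 16):
    mask = 1 << bit
    print(f"{mask:>6d}  {mask:#07x}  {mask:#018b}")
#   4096  0x01000  0b0001000000000000
#   8192  0x02000  0b0010000000000000
#  16384  0x04000  0b0100000000000000
#  32768  0x08000  0b1000000000000000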

> Binary notation seems like the solution, ...

I agree with you; see my last reply to Greg for more thoughts on bitstrings
and the quaternary approach.

> IIRC there’s some new syntax coming for binary literals
> that would let us represent them as 0b1111_1111_1111_1111

Very good. Healthy attitude :)
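
For reference, that syntax became PEP 515 and shipped in Python 3.6:
underscores may group digits in any numeric literal, and the format
mini-language grew a matching '_' option.

# PEP 515 (Python 3.6+): underscores group digits in numeric literals.
mask = 0b1111_1111_1111_1111
assert mask == 0xFFFF == 65_535

# The '_' format option groups binary, octal and hex output every
# four digits on the way back out.
print(f"{mask:_b}")        # 1111_1111_1111_1111
print(f"{2**32 - 1:#_x}")  # 0xffff_ffff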

> less dense and loses clarity for many kinds of unusual bit patterns.

It is not entirely clear to me what exactly you mean about unusual bit
patterns here.

> Additionally, as the number of bits increases life gets really hard:
> masking out certain bits of a 64-bit number requires

It is the editing itself of such a bitmask in hex notation that makes life hard.
Editing it in binary notation makes life easier.
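
A sketch of what I mean (my own example): clearing bits 10..13 of a 64-bit
all-ones mask straddles two hex digits, so the hex literal has to be
recomputed, while the grouped binary form can be edited digit by digit.

before = 0xFFFF_FFFF_FFFF_FFFF    # 64-bit all-ones mask
after = before & ~(0b1111 << 10)  # clear bits 10..13
print(f"{after:#_x}")  # 0xffff_ffff_ffff_c3ff -- two hex digits changed
print(f"{after:_b}")   # ...1111_1100_0011_1111_1111 at the low end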

> a literal that’s at least 66 characters long,

Length is an inherent feature of binary, though it is not a major issue;
see my ideas on it in my reply to Greg.
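
(For the record, your 66-character figure checks out: '0b' plus 64 binary digits.)

print(len(bin(2**64 - 1)))  # 66 characters for a full 64-bit literal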

> Hexadecimal has the clear advantage that each character wholly represents 4 
> bits,

That advantage is brevity, but giving up a little brevity would make it
more readable. So what do you think about base 4?
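
Python has no base-4 literal, so to show what I have in mind, here is a
minimal sketch; to_base4 is a hypothetical helper name I made up, and the
built-in int(s, 4) already parses the result back.

def to_base4(n: int) -> str:
    """Render a non-negative integer in base 4; each quaternary
    digit covers exactly two bits."""
    if n == 0:
        return "0"
    digits = []
    while n:
        n, d = divmod(n, 4)
        digits.append("0123"[d])
    return "".join(reversed(digits))

mask = 0x00FF_FFFF
q = to_base4(mask)
print(q)                  # 333333333333 -- twelve digits for 24 bits
assert int(q, 4) == mask  # round trip via the built-in parser

Half the length of binary, twice the length of hex, and still aligned to
whole bit pairs.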

> This is a very long argument to suggest that your
> argument against hexadecimal literals
> (namely, that they use 16 glyphs as opposed
> to the 10 glyphs used in decimal)
> is an argument that is too simple to be correct.

I didn't understand this, sorry :)))
You're welcome to ask more if you're interested in this.

> Different collections of glyphs are clearer in different contexts.

How many different collections, and how many different contexts?

> while the english language requires 26 glyphs plus punctuation.

It does not *require* them, but of course 8 glyphs would not suffice to
read the language effectively, so one finds a way to extend the glyph set.
Roughly speaking, 20 letters is enough, but this is not an exact science.
And it is quite a hard science.

> But I don’t think you’re seriously proposing we should
> swap from writing English using the larger glyph set
> to writing it in decimal representation of ASCII bytes.

I didn't understand this sentence :)

In general I think we agree on many points. Thank you for the input!
