On 16 October 2016 at 23:23, Greg Ewing <greg.ew...@canterbury.ac.nz> wrote:

>> Those things cannot be easily measured, if at all,

>If you can't measure something, you can't be sure
>it exists at all.

What do you mean, I can't be sure?
I am a fully functional, mentally healthy man :)

>Have you *measured* anything, though? Do you have
>any feel for how *big* the effects you're talking
>about are?

For this case, of course. The difference between
"0010 0011" and
"--k- --kk"
is one I can indeed feel to be big.
Literally, I can read the latter clearly even if I close
my left eye and *fully* defocus my right eye.
That is indeed a big difference, and it tells a lot.
I suppose that for visually impaired people this would be
the only chance to see anything there.
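
For the curious, here is a minimal Python sketch of the
substitution compared above, 0 -> '-' and 1 -> 'k' (the
function name and the nibble grouping are mine, purely
for illustration):

def dash_bits(value, width=8):
    """Render `value` as bits, with '-' for 0 and 'k' for 1."""
    bits = format(value, '0{}b'.format(width))
    glyphs = bits.replace('0', '-').replace('1', 'k')
    # group by 4 bits, as in the "--k- --kk" example above
    return ' '.join(glyphs[i:i+4] for i in range(0, len(glyphs), 4))

print(format(0x23, '08b'))  # 00100011
print(dash_bits(0x23))      # --k- --kk
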
Currently I am experimenting on myself, and of course I plan
to repeat this with experimental subjects. I plan one survey
session at the end of November.

But indeed this is very off-topic, so feel free
to mail me directly about it.

So, back to hex notation, which I suppose is still
not so off-topic.

>> There must be a *very* solid reason
>> for digits+letters against my variant; I wonder what it is.

>The reasons only have to be *very* solid if there
>are *very* large advantages to the alternative you
>propose. My conjecture is that the advantages are [...]

First, I am of the opinion that the *initial* decision
in such a case must be supported by solid reasons,
and not just by "hey, John has already written
them in such a manner, let's take it!".

Second, I totally disagree that there must always be
*very* large advantages for new standards.
If we followed such a principle, we would still
use cuneiform for writing, or bash-like syntax,
since every time someone proposed
a slight improvement, there would be somebody who
says: "but the new one is not *that much* better than the old!".
Actually, in many cases it is better when a standard
keeps evolving, so that everybody is aware of it.

> Here are some reasons in favour of the current
> system:

> * At the point where most people learn to program,
> they are already intimately familiar with reading,
> writing and pronouncing letters and digits.
> * It makes sense to use 0-9 to represent the first
> ten digits, because they have the same numerical
> value.

So you mean they start to learn hex,
see familiar digits, and think: "ooh, it looks like
a number, not so scary."
So less time to learn, yes, +1
(less pain now, more pain later).

But if I am an intelligent adult,
I understand that there are only ten digits,
that I will nevertheless need to extend the set,
and that the glyphs should *all* be optically consistent
and easily readable.
And what is an easily readable set of >=16 glyphs?
Small letters!
That, at least, is the view from the height of my current
knowledge, since I know that digits are not very
readable anyway.
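
To make the "small letters" idea concrete, here is a little
Python sketch that writes integers in base 16 using sixteen
consecutive lowercase letters; the particular alphabet
'a' = 0 .. 'p' = 15 is only an illustrative choice, not a
final glyph set:

def letter_hex(n):
    """Encode a non-negative integer in base 16 with letters only."""
    digits = 'abcdefghijklmnop'  # assumed alphabet: 'a' = 0 ... 'p' = 15
    if n == 0:
        return digits[0]
    out = []
    while n:
        n, r = divmod(n, 16)
        out.append(digits[r])
    return ''.join(reversed(out))

print(letter_hex(0x23))  # cd  (2 -> 'c', 3 -> 'd')
print(letter_hex(255))   # pp  (15 -> 'p' twice)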

> * Using a consecutive sequence of letters makes
> sense because we're already familiar with their
> ordering.

I actually proposed consecutive letters as well,
but that does not make much difference:
being familiar with the ordering of the alphabet will have next to
zero influence on the reading of numbers encoded with letters.
It is just an illusion that it will, since a letter
is a symbol: if I see "z", I don't think of 26.
More probably, the visual weight of the glyph could
play some role, meaning the lighter the glyph, the smaller the number.

Mikhail