So I've been seeing big-picture/philosophical discussions arise on 9fans
for about 25 years.  Usually I just sit back with my bowl of popcorn and
watch.  Every once in a while I'll jump in and present a painfully long
dissertation, and that's today.  For those new to my perspective, I've
been at this stuff for nearly 50 years (I wrote my first code in '76
or '77), and as time has gone by, I find myself becoming more and more
the G.H. Hardy of computing.  For those who don't know that reference,
look up his essay "A Mathematician's Apology."  The semi-serious tl;dr
is "If anyone ever finds a practical application of my work, I'll
consider it a failure."

My launching point today is the statement made in various ways by
several of the contributors that computers are tools.  Of course, I'm
aware that's the case for some people, and we've all done applied
things to put food on the table.  But if that were the whole point
of computing, I'd be bored out of my skull.  In the same way that
a study of physics might result in a better widget for the unwashed
masses but that's not its purpose, so a study of computer science
might result in a similar better widget but that's not its purpose.
Physics is the study of phenomena surrounding elementary particles
and forces.  Computer Science is the study of phenomena surrounding
the property of universality.  It requires no other justification.
We made a really big mistake back in the '60s and '70s when we got
excited about computers and people asked why and what good are they.
We tried to come up with justifications when we would have been
better off just saying, "you wouldn't be interested; they're just
for us nerds."  I continue to be amused that one of our justifications
was always keeping track of recipes in a kitchen.  If you're not
familiar with it, check out the infamous "kitchen computer" from
an old Neiman Marcus catalog.

You would be correct then to infer that I don't really put any
weight on how many users we bring into the community.  This is
the reason why, somewhere between 2005 and 2010, I largely turned my back on
Linux.  It seemed that too many decisions in that community were
about making things "easy" for those who had no UNIX experience
at the expense of those who did.  It seems that much of the open
source world has come to measure success by user adoption and
number of commits to the source control preference of the decade.
Neither of those draws my attention.  If there is any correlation
between quality and popularity, it is negative.  That gets us to
the general topic of the so-called "user."  The garbage promulgated
by the self-appointed UI experts is almost always directly opposed
to what I want.  I have plans in the after-life to use an endless
can of lighter fluid to stoke the fires burning the morons who
put huge trackpads where the heels of my hands belong on a laptop.
What would happen to music if piano and guitar makers applied the
same stupidity to help prevent beginners from making mistakes?  Maybe
there is a place for the record player, but if you damage my piano
to turn it into a record player, you're a force for evil, not good.
Along similar lines, the justifications that are made for most of
the UI commandments (including the precious mouse) are perfect
examples of measuring the wrong thing.  Speed of interface is
irrelevant.  It doesn't matter whether I use ed and use search
to find the bit I want to edit, or whether I use acme and let
the mouse scroll to where I want to edit.  The difference is lost
in the noise because any programmer or writer worth a damn spends
far more time thinking and drawing diagrams on paper than in
typing and clicking.  If you want to respond with "but most
people don't write software or papers" then see my rant on the
piano vs the record player.  Because the computer is a finite
realization of the universal machine Turing identified, it exists
to be programmed.  To not create with it is to use it not as
a piano but as a toaster.

As many of you know, I already have my own private language to
minimize the spikes in blood pressure every time the C standards
committee meets.  Although POSIX does still exist when I teach,
I no longer go anywhere near it when I'm programming.  So it
shouldn't be surprising to learn that in the last several years,
my thoughts have moved away from "what can I contribute to the
Plan 9 world" to "what ideas from Plan 9 would I borrow for
a private system" or "should I just use the Plan 9 kernel and
build my own environment around it."  Of course, I also have
to solve the problem of hardware.  The x86 is a steaming pile
of garbage and one of my objectives is to become an x86-free
zone.  I've also come to the conclusion that it is so horrific
that the only things that stick to it are the similarly disgusting,
like ACPI, UEFI, etc.  What scares me to death is that some
of that same garbage has started to cling to what could
otherwise be reasonable architectures.  At least the Raspberry
Pi is mostly its own world, so most of my Plan 9 work these
days is centered there.  But I am starting to think I might
have to create my own hardware to truly escape from the
breathtaking stupidity that has come to dominate the industry.
Yes, I've even thought about resurrecting a 68000 machine
I wire-wrapped nearly 40 years ago.

Much of my aesthetic is described by the quote from
Saint-Exupery, "Perfection is achieved, not when there is
nothing more to add, but when there is nothing left to take
away."  Now I'm not telling people they shouldn't add for
themselves if they want.  But as I move toward disconnecting
from the parts of the computing world that give me heartburn
to stay in the parts that give me intellectual satisfaction
and fulfillment, I expect to be taking things out, rather
than adding them.  Everyone can have their own reasons
for being part of the community and their own objectives
moving forward.  But for me the reasons I'm here are largely
dominated by the minimalism of the design where I can feel
direct connections to the individuals who created it, the
stability of the code base, and the smallness of the community.

A little while back I found myself trying to articulate what
I really saw as programming while I was walking.  By the time
I got back to the office, I had a phrasing I liked, so I
typed it up in TeX with \magnification=3584 and put the
output on my wall.  The other day I was catching up with
one of our alums who has recently finished her PhD at Penn
and she saw it, said she liked it, and took a picture of it.
It reads, "Programming is the process by which we take an
idea, a concept, existing in nothing but a pattern of firing
neurons and transform it through pure thought into the
definition of a machine, a definition that can be interpreted
and emulated by a universal machine to manipulate the physical
world.  All else taints and compromises the purity and beauty
of programming."
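(For anyone curious about the mechanics of that poster: it was plain
TeX.  The file below is a sketch of what such a one-page poster might
look like, not my actual source; only the \magnification=3584 setting
is from the story above.)

```tex
% Plain TeX poster sketch.  \magnification scales the whole job;
% 3584 means 3.584x normal size, so 10pt type prints at about 36pt.
\magnification=3584
\nopagenumbers        % no page number on a wall poster
\noindent Programming is the process by which we take an idea, ...
\bye
```

Run it with plain TeX (tex poster.tex), not LaTeX, since
\magnification and \nopagenumbers are plain TeX constructs.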

There's your semi-decadal tirade from the old guy to remind
everyone that computing is not really about the nonsense that
pervades the general culture.

BLS


------------------------------------------
9fans: 9fans
Permalink: 
https://9fans.topicbox.com/groups/9fans/Tf84d656c78bbda91-Mf7008f52b2499d146c7fdaa1