"real programming", which the author defines as "to increase the
computational capacity, to begin with a set of operations, and develop
them into new operations that were not obviously implicit in the
original set."

A bit like parallel parking -- a priori, cars don't seem to have anything in their API for moving sideways, towards the passenger side by a little over a car width, but the right combination of inputs produces the desired behavior. I suppose the difference is that the possibility of parallel parking is, through experience and example, far more obvious to the debutant than the possibility of powerful programming.

I'm not going to claim that everyone goes through the same phases, or
goes through them in the same order, but no doubt any programmer will
recognize some of their own history in what lies below.  In each
phase, I failed to understand that it was merely a phase, and that my
knowledge was still limited --- that a few years later, I would be
able to do things I could only dream of at the moment.

Not just any programmer. When Scott McCloud talks about his evolution as a comic book writer, it follows a similar pattern: the naive artist is fixated on surface details and graphic style, and tends to dismiss what has come before on those grounds, whereas the experienced artist, having had to wrestle on his own (I presume the gendered pronoun is about as applicable to comics as to code) with the tradeoffs involved in putting together a piece, is able to recognize the mastery in earlier work despite the drift in fashion.

I had the idea that "knowing how to program" was knowing the details
about all the programming interfaces I could use --- how to use the
PLAY statement to make music, how to use the RND function to generate
random numbers, and so on.  I didn't understand that there was a kind
of programming knowledge that wasn't specific to a particular
language...

The McCloud parallel is that one moves from "how do I draw characters *just like* the ones in Foo" to "how do I tell a *story* with my characters?" (or, respectively, from "how can I use the Foo package to make something that works *just like* Bar" to "how should this computation be structured?")

I didn't really understand that the PLAY statement was really
implemented by other code, not so different from the BASIC code I
could read myself, rather than some kind of magic --- I didn't know
how to take it apart and see what was inside, I didn't know the
language it was written in, and it ran a lot faster than code I could
write myself.

Just as an armillary sphere was a small model[0] of the visible universe, metacircular interpreters help demonstrate that the next turtle down isn't any more magical -- usually faster, often more hassle[1], but no more magical -- than the turtle one currently knows. "What one fool can learn to do, another can."
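
To make that concrete, here is a toy evaluator sketch -- in Python rather than in the language it interprets, so not metacircular in the strict sense, but the same demystifying exercise: the "built-ins" of the guest language are just more code, written with the ordinary tools of the host.

    # Toy evaluator for a tiny Lisp-ish expression language: expressions are
    # numbers, variable names, or nested tuples; environments are dicts.
    import operator

    def evaluate(expr, env):
        if isinstance(expr, (int, float)):      # literal
            return expr
        if isinstance(expr, str):               # variable reference
            return env[expr]
        op, *args = expr
        if op == 'lambda':                      # ('lambda', ('x',), body)
            params, body = args
            return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
        if op == 'if':                          # ('if', test, then, else)
            test, then, alt = args
            return evaluate(then if evaluate(test, env) else alt, env)
        fn = evaluate(op, env)                  # application
        return fn(*(evaluate(a, env) for a in args))

    global_env = {'+': operator.add, '*': operator.mul, '<': operator.lt}

    # ((lambda (x) (* x x)) 7)  =>  49
    print(evaluate((('lambda', ('x',), ('*', 'x', 'x')), 7), global_env))

Nothing up its sleeve; presumably the PLAY statement above was the same sort of thing one turtle further down, just with a sound chip at the bottom.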

[0] I, too, succumb to the temptation of model railroading: http://en.literateprograms.org/Literate_Programming_(Python)

[1] there is a certain symmetry in programming languages -- Algol 60 was chock full of features, and people spent the next couple of decades tossing them out, trying to find the sweet spot where the language gave you the ability to say what you wanted to say, but not so much power that saying it safely took forever and a half. With the passing of the generation of programmers who remembered how well machine code allowed them to laboriously shoot themselves in the foot, we've spent the last couple of decades adding complexity and dynamism. (maybe the hardware people are just now catching up to where software people wished they were in the late 50s -- to which the hardware people might reply that we might all be a lot better off if we didn't make their machines slower almost as fast as they make them faster)

cf. Landin, Histories of Discoveries of Continuations: Belles-Lettres with Equivocal Tenses

Today, I don't need [a language]; even if I were programming in
1980s MBASIC, I could use functional programming techniques to
structure the program.  At the time, this was still beyond my
capacity.

  "The withering away of the statement" is a phenomenon we may live to see.
  --Peter J. Landin, Getting rid of labels, 1965.

As far as I have yet learned, the crux is knowing that programming is more a matter of the workman than of the tools. One technical problem here is that the best applications of programming techniques don't leave obvious traces -- sometimes they remove code rather than add it -- because the heavy work is done in the theory, to avoid any work in the practice.
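
To put the quoted MBASIC remark in concrete terms, here is a minimal sketch (Python standing in for something far less accommodating): the shape of a fold -- seed, combining function, sequence -- carried by nothing fancier than a loop and an accumulator, the kind of structure one could transliterate into line-numbered BASIC, closures or no.

    # A hand-rolled fold: the functional structure lives in how the program
    # is organized, not in any particular language feature.
    def fold(combine, seed, xs):
        acc = seed
        for x in xs:
            acc = combine(acc, x)
        return acc

    def add(a, b): return a + b
    def keep_larger(a, b): return b if b > a else a

    print(fold(add, 0, [3, 1, 4, 1, 5]))                      # 14
    print(fold(keep_larger, float('-inf'), [3, 1, 4, 1, 5]))  # 5

Which is also why the traces are hard to spot from outside: a fold written this way looks like any other loop, and the withering-away happens in how the program is organized rather than in which statements appear on the page.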

In retrospect, I didn't need teachers to teach me things; I needed
teachers to force me to get out of my comfort zone, and partners to
get me unstuck.

In sport, the temptation to "step up one's game" is an easy spur to stretching the comfort zone, but there the next challenges to tackle are usually clearer than they are in programming. In laying out an equestrian cross-country course, a good designer sets it so that there are two approaches to each obstacle -- the quick but challenging line, and the safer but slower line. It might be interesting to come up with some sample programming problems that offered a similar series of decisions in the implementation; then anyone could measure their comfort zone against the height of the fences on offer. (Also, just as it is better to save one's horse and only take the tricky options where they save the most time, such a course would implicitly demonstrate that it is better to save one's brain and only take the tricky options where they count most)

(the closest I can think of, though related more to puissance than to endurance, is "evolution of a Haskell programmer"[2] -- it parallels the quasi-Zen "when I first learned to program, code was code and data was data; when I was studying programming, code was no longer code" etc., all the way up to the punch line)

Unfortunately, I have no idea what range of programming abilities might be an appropriate target. To explain the FizzBuzz buzz[3] to my wife, I took the analogy of asking the new guy to tack up a horse and to walk, trot, and canter. At that point, having observed their style, you still don't necessarily know if they can do any work, but at least you can have some confidence that they're not going to get in a big wreck learning to do it.
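
For anyone who missed the buzz: the problem itself is deliberately trivial -- tack up, then walk, trot, and canter, nothing more. Roughly this (a Python sketch; the problem as usually posed is language-agnostic):

    # FizzBuzz: print 1..100, substituting Fizz for multiples of 3,
    # Buzz for multiples of 5, and FizzBuzz for multiples of both.
    for i in range(1, 101):
        if i % 15 == 0:
            print("FizzBuzz")
        elif i % 3 == 0:
            print("Fizz")
        elif i % 5 == 0:
            print("Buzz")
        else:
            print(i)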

[2] http://www.willamette.edu/~fruehr/haskell/evolution.html

[3] it is interesting that the top-scoring declared majors on the LSAT are phys/maths and philosophy/religion. Given that programmers are among the few people who make inferences and immediately get several million chances a second to discover they've made an error, it seems odd that they would, as a group, do so poorly on the LSAT. My interpretation is that CS students who are hackers don't take the LSAT, and that it is instead the CS students who originally declared their major on the basis of expected earnings who take it. (perhaps they now believe that wheedling their way into law practice will be easier than coaxing an unforgiving machine[4] into doing its job?)

[4] "They say Princes learn no art truly, but the art of horsemanship. The reason is, the brave beast is no flatterer. He will throw a prince as soon as his groom."

Unsurprisingly (in retrospect), reading code made me better at reading
code.

I attribute my early successes in computing to having misspent much of my youth downloading programs from net.sources (eventually net.sources.games) and porting them to work on the mini. I doubt I wrote much of note on my own at this point -- verb/noun adventure game engines and wireframe flight simulators -- but I did get very good at reading other people's code, figuring out why it was not working as intended, and making the necessary patch.

In retrospect, I had some amazing advantages. In the early '80s, my father was working at a company that was building (not cloning) personal computers. So, yeah, the chips themselves were magic, but for everything else, you knew, or at least knew of, the guys who were making it work (and who would tell you of practical jokes, like the time the h/w guys reversed the leads on the monitors one day to see whether the s/w people would notice, or just flip all the signs in the driver code) -- and if you poked around in the filesystem, they had not only the sources for what they were working on, but (depending upon whose schedule was crunched at what time) all sorts of side programs. Failing that, the company library was well stocked with ACM titles.

I think the biggest advantage was not the availability of all that source code (between open source and CiteSeer, I think kids now *should* win big over both of us -- do they?), but how it set what was "normal" and "aspirational" for me. At the time, I'd thought everyone else had the fancy SIGGRAPH displays and Xerox workstations and optimizing Scheme compilers, and that I was deprived because my father worked for a company that was merely producing a little micro with a 512x256 bitmapped display and no mouse, whose most compelling app was a graphical rogue-like game (written as a quick hack by a couple of the engineers, who also provided a small bitmap editor that one of the secretaries used to draw the artwork). Never mind that I could log on to the VAX and read the code and even hack my own small changes (using an editor, and a compiler, written by others at the company) into a private copy. I became convinced that there must be more to programming than what I'd seen thus far. (which, after all, was something even my old man could do) Note that, like any n00b, I was still fixated on flash gear instead of how well these guys were making things work with what they had, but at least I got the idea that what one fool can engineer, another can, and that the frontiers of the subject were a long way from what I knew.

Of course, I don't know what my next phase will look like.  Presumably
certain things I currently struggle with --- to the point of not
knowing how they're even possible --- will get easy, and it will seem
strange that they were ever difficult.  But I don't know what those
things will be.  But I'm pretty sure I'm not done learning yet...

  "You learn something every day ... unless you're careful" --Tom Van Vleck

At this point, I'm taking a detour, as backfill, into the algebra I never learned. (mathematicians' modules are a bit more modular than ours) To see the connection between "real computing" and Lie algebras, refer to the parallel parking example at the top of this mail.
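
For the curious, a sketch of the textbook version of that connection, using the usual simplification of the car to a unicycle with state q = (x, y, \theta), a drive control u_1, and a turn control u_2:

    \dot{q} = g_1(q)\,u_1 + g_2(q)\,u_2, \qquad
    g_1 = \begin{pmatrix} \cos\theta \\ \sin\theta \\ 0 \end{pmatrix}, \quad
    g_2 = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix},

    [g_1, g_2] = \frac{\partial g_2}{\partial q}\, g_1
               - \frac{\partial g_1}{\partial q}\, g_2
               = \begin{pmatrix} \sin\theta \\ -\cos\theta \\ 0 \end{pmatrix}.

The sideways direction is nowhere among the controls, but it appears as their Lie bracket -- an operation not obviously implicit in the original set, which is precisely the parking maneuver (and, per the definition up top, precisely "real programming").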

-Dave
