David MacQuigg wrote:
> What ever happened to the original enthusiasm with Computer Programming
> for Everyone?  If everyone with a high school diploma knew how to write a
> simple program, not only would we be more productive, but we would
> understand the world better.  Instead of loose talk and isolated numbers,
> the news would show us charts.  The general public, not just experts,
> would have seen the very obvious bubble growing in the housing market,
> and could see now where we are on the down side.  What if the average
> real estate agent could show me the price trends on property similar to
> what I am looking at.  Instead, I have to dig out the data myself, and
> plot it in Excel.  Then when I show her the result, she still doesn't see
> the significance.

Many years ago someone said (probably Kirby, and probably on this list) essentially that while "computing" is taught in school as if it were a subset of schoolish "math", it's really more true that schoolish "math" is a subset of "computing". Obviously, real knock-your-socks-off math subsumes *everything* -- in a way, physics is a subset of math -- but that is not what goes on in most K-12 schools either. And even then, the lines between computing and math are starting to blur, as even modern physicists now spend more time with their computer simulations than with their base equations. So, from a practical point of view, I feel computing should be introduced as early as possible in education (perhaps after, say, age seven, once kids grasp the real world at an intuitive level), and learning to do schoolish math (including algebra, trigonometry, logical proofs of correctness, and so on) should flow from that. And, for example, you can then link things like physics, chemistry, and biology (and even English and history) into a computer-based curriculum using simulation and data acquisition.
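
As one small illustration of that last point (a toy example of my own, not something from the original discussion): a kid can rediscover the free-fall formula from algebra class by simulating a dropped ball one small time step at a time, and then checking the simulation against the closed-form result:

# A toy example (mine, with made-up numbers): simulate a ball dropped for two
# seconds in small time steps, then compare with the algebra-class formula
# d = (1/2) * g * t**2.

g = 9.8           # gravitational acceleration in m/s^2
dt = 0.001        # time step in seconds
velocity = 0.0
distance = 0.0

for step in range(int(2.0 / dt)):   # simulate 2 seconds of falling
    velocity += g * dt              # speed picked up this step
    distance += velocity * dt       # distance covered this step

print("simulated distance after 2 s:", round(distance, 2), "m")
print("formula (1/2)*g*t**2 gives  :", round(0.5 * g * 2.0**2, 2), "m")

The point is that the formula stops being something handed down and becomes something the student can check, and that same loop-and-accumulate idea carries over to chemistry, biology, or economics simulations.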

On the larger issue:

David MacQuigg also wrote:
At 06:52 PM 12/8/2008 -0800, Guido van Rossum wrote:
> On Mon, Dec 8, 2008 at 5:10 PM, David MacQuigg <[EMAIL PROTECTED]> wrote:
>> At 03:30 PM 12/8/2008 -0800, michel paul wrote:
>>> I think part of the problem in the past has been the misunderstanding about tech jobs getting outsourced.  I've heard people say there's no point in becoming a programmer, because all the jobs are going overseas.  It's really kind of silly.
>> Stated that way, it does seem circular.  I've heard it stated more convincingly by an EE prof to a class of undergrads.  "If you go into engineering, you will be facing layoffs."  Imagine the effect of that expectation on smart students who see their buddies going into law or medicine, and getting more pay and more respect than engineers.  It's no wonder there are almost no US students in our graduate classes.  I've thought about what I would have said to those students.  It would be more like "If money is your major motivation, find another profession.  If technology is in your blood, stay with it.  Learn everything you can.  The money will come out OK."
> I read this as: Engineering is something where mediocrity doesn't pay. Doctors and lawyers are like cobblers, their output is limited by the number of hours they can work, so there is room for good solid workers who aren't particularly innovative. Engineering at its best is not like that at all. It's a field whose main *point* is to make manual labor redundant. Good engineers do their work because it's their passion. The rest... Well they can always try to earn a living cranking out Java code. ;-)

I'm a bit uncomfortable with the idea that engineering is a field where only the brightest should feel comfortable.  There is plenty of need for good solid workers, and I would like to see our schools and our economy support that.  If we outsource the grunt work, and hope to keep just the top geniuses employed, eventually we lose the top also.  I remember in the 80's thinking the Japanese could never catch up with us in circuit design.  They just didn't have the creative spark.  It wasn't in their culture.

On this general topic of the cultural context of engineering education, here are a few ideas about historical trends, and one speculation based on projecting forward a couple of decades from what Guido said elsewhere.

After WWII, the USA was the only significant manufacturing power. Europe and much of Asia were either in rubble, in social turmoil, or both, and the Southern hemisphere still had little infrastructure. So it could be expected that the manufacturing base in the USA would grow as it made stuff for the world, and, like China today, this would be a good position to be in, having the world depend on it for stuff. But over the decades this unusual situation has shifted, and while the USA still sells a lot of manufactured goods, as the world has rebuilt in some places and developed industrially in others, a more normal situation is reestablishing itself. Culturally, it is true that different places have different strengths and weaknesses; Japan, for example, may struggle with too much conformity. On the other hand, the rest of the world now seems to be grasping more quickly the cooperative nature of developing "free and open source" software, content, and physical design.

Also, before, during, and after WWII, the USA received, for various reasons, a significant influx of educated immigrants, like Einstein and von Braun, and many others (including those the USA scooped up from the ruins of post-WWII Germany). These are the people who helped give the USA atomic energy (and weapons) and who helped put a person on the moon, among many other innovations flowing out of their grasp of math and practical engineering. To an extent, the USA has been riding on this intellectual capital instead of developing a culture that can create educated people as readily as, say, the playful tradition of Germany up until the early 1930s did. Over the last few decades, as these people have aged and died, the USA has lost some of its edge as well. Obviously, the USA can still produce educated people, but, as with manufacturing, the relative dominance has again been lost.

Also, for reasons of basic capitalism, formerly USA-based firms have found it profitable in the short term to exploit a highly valued dollar by moving operations overseas, as well as exploiting the relatively greater social inequality in those countries (as one H1B holder from India put it to me, back in India he could afford a lot of servants on what he was earning and saving). Also, US citizens working as contractors usually commanded a multiple of the prevailing wage for short-term contracts, whereas H1Bs need only be paid the "prevailing wage" (what is left unsaid: the prevailing wage "of an employee, not a contractor"). So, all those factors have made it more profitable for US firms to train foreign nationals in technology, again eroding any edge the USA had resulting from the above two factors.

Where does that leave future students? As the US dollar falls (the current rise is only short-term, as people sell dollar-denominated assets and hold the cash, unsure how to invest), outsourcing will become less profitable, so US manufacturing will get some good news. Similarly, as other countries address the internal inequities of their own rich-poor divides, it will also get harder to outsource or use H1Bs profitably (no more hiring a chauffeur, a maid, and a cook on a programmer's salary, so why bother working for US Americans?). So, in the long term, that is all good news for US students interested in manufacturing. It is my hope that rather than the US standard of living falling significantly, it will just stay static as the rest of the world catches up, with better technology in the USA offsetting other financial losses (your job pays less, but playing games at home is more fun, more educational, and more socially fulfilling, with the Wii as a first example).

But there are two other counter-trends to the good news which are more serious.

One is the collapse of the value of the PhD in the USA, as documented by Dr. David Goodstein, Vice Provost of Caltech. His essential point is that the educational system mines, sorts, and polishes students looking for a few PhD-quality ones, while discarding the rest. He says this emphasis needs to change for two reasons. One is that the discarded students are left mostly scientifically and technically illiterate, which is wasteful and a threat to a democracy dependent on technology. The other is that academia grew exponentially in the USA until the 1970s, creating plenty of jobs for people with PhDs, but that era is over and most PhDs now being created are surplus. When I look at the academic departments I have been part of in the past, and see mostly the same professors who were there in the 1980s and 1990s, this rings all too true. There are just not many new slots compared to the number of science PhDs produced, and industrial R&D is small to begin with. So we see more and more calls for PhDs in K-12 or other situations (but that is not the expectation these people had, so they are often unhappy).

Medicine and Law, on the other hand, by tightly controlling the number of schools producing such professionals, and by continually lobbying for increased restrictions on who can practice, have managed to create an artificial scarcity of doctors and lawyers, which keeps their salaries up. There were many things common in the past, like passing the bar exam without going to law school, or pharmacists prescribing medicines, or midwives delivering babies at home, which are pretty much illegal now. But anyone can practice computer programming. I can take my car to a good mechanic without much of an appointment, but I may need to wait weeks or months to see a competent doctor -- because of this artificial scarcity. This isn't an argument for licensing programmers; I'm just pointing to the historical difference. By the way, there are at least two big tiers of doctors -- family practice and specialist -- and while in my opinion family practice sounds harder, it is the specialists who get the extra training and the big bucks, so there is some room there for the more ambitious.

In any case, when you couple the collapse of the PhD system (a pyramid scheme, in a sense) with outsourcing and H1Bs, it is no wonder that people who thirty years ago would have pursued advanced study in science or engineering are now tempted by law or medicine. The law is a lot like programming (based on precedent, or subroutine calls :-), and medicine these days is more and more science and technology driven. Still, even if we were to quadruple the number of doctors produced per year (please, no more lawyers :-), to 100,000 (up from 25,000 per year), that would not at all accommodate the millions of kids a year interested in science and engineering. And of course, many doctors are unhappy because of insurance reimbursement and other societal issues, and nurses and aides are already in short supply, as the jobs are very stressful with little control or recognition. So, in short, there is nowhere for most of these kids to go to apply their skills in a profitable and pleasant way, at least not on terms anywhere near those people were getting thirty years ago.

The other is an even more serious issue than that. It was predicted in the 1960s, and it echoes Guido's point that "Engineering at its best is not like that at all. It's a field whose main *point* is to make manual labor redundant." To amplify on Guido's point, see:
  "The Triple Revolution":
  http://www.educationanddemocracy.org/FSCfiles/C_CC2a_TripleRevolution.htm
"The fundamental problem posed by the cybernation revolution in the U.S. is that it invalidates the general mechanism so far employed to undergird people’s rights as consumers. Up to this time economic resources have been distributed on the basis of contributions to production, with machines and men competing for employment on somewhat equal terms. In the developing cybernated system, potentially unlimited output can be achieved by systems of machines which will require little cooperation from human beings. As machines take over production from men, they absorb an increasing proportion of resources while the men who are displaced become dependent on minimal and unrelated government measures—unemployment insurance, social security, welfare payments. These measures are less and less able to disguise a historic paradox: That a substantial proportion of the population is subsisting on minimal incomes, often below the poverty line, at a time when sufficient productive potential is available to supply the needs of everyone in the U.S. ... The industrial system was designed to produce an ever-increasing quantity of goods as efficiently as possible, and it was assumed that the distribution of the power to purchase these goods would occur almost automatically. The continuance of the income-through-jobs link as the only major mechanism for distributing effective demand -- for granting the right to consume -- now acts as the main brake on the almost unlimited capacity of a cybernated productive system."

If you want a more modern take on this, see Marshall Brain's sci-fi:
  "Manna"
  http://www.marshallbrain.com/manna1.htm
or his non-fiction:
  "Robotic Nation"
  http://www.marshallbrain.com/robotic-nation.htm

Or you could see the writing of any of a number of other technologists, like Ray Kurzweil:
  "The Law of Accelerating Returns"
   http://www.kurzweilai.net/articles/art0134.html?printable=1

My own take on this:
  "Post-Scarcity Princeton"
  http://www.pdfernhout.net/post-scarcity-princeton.html
The most important point there is: "Capitalism is often it seems all about cost cutting. Why do people have such a hard time thinking about what happens as costs approach zero, even for improvements in quality? Or why do economists have a hard time understanding that many conventional economic equations may produce infinities as costs trend towards zero?"
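
To make that point concrete (a toy illustration of my own, not something from the essay): plenty of textbook ratios keep cost in the denominator, so they grow without bound as the cost of delivering a fixed amount of value heads toward zero:

# Toy illustration (mine): a simple value-to-cost ratio grows without bound
# as the cost of producing a fixed amount of value approaches zero.
value = 100.0
for cost in [10.0, 1.0, 0.01, 0.0001]:
    print("cost =", cost, "-> value/cost =", value / cost)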

But going back to Marshall Brain's non-fiction, he writes in Robotic Nation: "I don't think anyone in 1900 could imagine the B-52 happening in 54 years. Over the next 55 years, the same thing will happen to us with robots. In the process, the entire employment landscape in America will change. Here is why that will happen. ... The arrival of humanoid robots should be a cause for celebration. With the robots doing most of the work, it should be possible for everyone to go on perpetual vacation. Instead, robots will displace millions of employees, leaving them unable to find work and therefore destitute. I believe that it is time to start rethinking our economy and understanding how we will allow people to live their lives in a robotic nation. ..."

Ultimately, money spent on education now is not going to make much of a difference in twenty or thirty years as far as the "competitiveness" that schools and business people are often talking about; see:
  "IBM CEO Sam Palmisano's speech at the Council on Foreign Relations on 'A Smarter Planet'"
  http://www.cfr.org/publication/17696
That is, it won't matter much if, as predicted, following Moore's law and exponential growth, you can buy a computer that can run a human-level AI for about $1000 by 2038 or sooner.
  "When will computer hardware match the human brain?"
  http://www.transhumanist.com/volume1/moravec.htm
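
For what it's worth, the arithmetic behind a date like that is just repeated doubling. Here is a rough back-of-the-envelope sketch; the 2008 price/performance figure, the brain estimate, and the doubling time are all illustrative assumptions of mine (not Moravec's exact numbers), and shifting any of them moves the crossover year by a decade or so:

import math

# Back-of-the-envelope sketch with illustrative (assumed) inputs: how long
# until $1000 of computing reaches a Moravec-style "human brain" figure,
# if price/performance keeps doubling on a fixed schedule?

brain_mips = 1e8          # ballpark brain estimate, ~100 million MIPS (assumed)
pc_2008_mips = 1e4        # assumed $1000 machine in 2008
doubling_years = 1.5      # assumed doubling time

doublings_needed = math.log(brain_mips / pc_2008_mips, 2)
years_needed = doublings_needed * doubling_years
print("doublings needed:", round(doublings_needed, 1))
print("crossover around:", 2008 + int(round(years_needed)))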

I developed this theme here:
http://groups.google.com/group/openmanufacturing/msg/72330a22bcae8928?hl=en
"""
The handwriting is on the wall, not just for compulsory schools, but for other large parts of our social structure they link up with. It's not necessarily a bad message either, if we accept it and try our hardest to make the best of it. It's not like one day the robots and AIs will suddenly take over (I hope). It is more like bit by bit things will continue to change and these things will show up in our lives, and our social network will shape them based on our priorities. For example, luxury cars have moved from anti-lock brakes, then to GPS course routing, then to Electronic Stability Control, and now the big thing is adaptive cruise control using radar to maintain a fixed distance from the next car, and also automatic parallel parking. Soon more safety features will be common to detect swerving lane changes, to drive by radar in fog, to brake fast and swerve to avoid deer, and so on, until before we know it, we decide in about ten or twenty years that it's safer to let the car drive itself than give the keys to our teenagers: "GM: Self Driving cars on the road in 10 years" http://senseofevents.blogspot.com/2008/01/gm-self-driving-cars-on-road-in-10.html
"""

I also list there how various occupations are already being automated and are likely to disappear in the next couple of decades: checkout clerk, cab driver, heart surgeon, airline pilot, nurse, entertainer, athlete, migrant agricultural laborer, librarian, artist, designer, and miner. I could probably list more, but that seems long enough to make the point. What will a student in kindergarten today be expected to do for a profession in twenty years if they need to compete with robots and other automation to make a living? This isn't like the 1920s, when "buggy whip" manufacturers were closing down, or the 1950s, when the profession of "picture tinter" was going away. Back then, there were lots of jobs to go to. Right now, between a previous bailout to the car companies to shift to alternative vehicles and the current bailout proposal, the US Congress will be handing over about US$50 billion to automotive companies so they will *only* cut about one third of their jobs (assuming GM's stated plans are similar to the others' unstated ones). Again, the US is giving out tens of billions of dollars so that only one-third of the total jobs will be cut, instead of all of them. Frankly, those jobs are not coming back anytime soon. While it is true that millions of green jobs can be created -- many, many millions, and should be, IMHO -- even that will not match the job losses from exponentially developing automation. Just as one example, this somewhat charitably funded think tank is already making great progress on robots that can work around humans:
  http://www.willowgarage.com/
While plumbers may hold on the longest, by 2040 we'll probably see even household robot plumbers. Of course, we may not need them if we were to redesign plumbing to be easier to maintain -- but even then, the job goes away. In a similar way, in the video of Amory Lovins' plan to revitalize the US automotive industry, he outlines snap-together car bodies. So, those jobs are going, going, gone. And except for the "Triple Revolution" issues related to the politics of distributing wealth, we are all better off for those jobs being gone, because there are plenty of things people would prefer to do, whether studying math or nature, raising children, playing music, swimming, and so on, including building software and robots just for the fun of it.

Anyway, that's the elephant in the living room, when you extrapolate from Guido's observation. :-)

So why should kids learn programming and advanced computer use?
* Fun.
* A gateway to more fun in science and engineering.
* A way to make sure the robots are friendly (or at least, that enough of the dumber ones are reasonably obedient).
* A way to have confidence in their ability to interact with and control the future world they will live in (a la Computer Programming for Everybody).

All the best to everyone here. I will now go back to lurking. :-)

--Paul Fernhout
_______________________________________________
Edu-sig mailing list
Edu-sig@python.org
http://mail.python.org/mailman/listinfo/edu-sig
