Derek Zahn wrote:
This message is "semi-serious".
The latest SIAI blog laments the apparently dismissive attitude of mainstream media toward the Singularity Summit (and presumably the concept in general, and SIAI itself by extension). Maybe it's not the worst thing that could happen.

Consider the war in Iraq (oops, I just lost half my readers! But this is not a political tirade, it's about AGI). The "reason" for this war, in my opinion, is to establish a base from which the USA can exert social, cultural, economic, and military pressure on people who might use nasty weapons against the USA or its friends. Whether such a project is noble or effective is unimportant. What is important is that the USA is so scared of having our people and stuff blown up that we'll spend a trillion dollars and thousands of lives on a rather speculative strategy for fighting the threat.

Now our little gang is basically saying that AGI is WAY more dangerous than any little nuclear bomb or other WMD. Thank the AGI and Bayes its prophet that they think we're kooks; they'd shut us down in a heartbeat if they didn't!

Can they? As an arbitrary thought experiment, let's say that a beyond-human AGI can be built on a 1,000-PC cluster. Modern computer chips are incredibly complicated devices that can only be produced in massive high-tech fabrication facilities. I could easily imagine the government attempting to regulate these plants and their products like any other hazardous but useful substance, and bombing fabs if they are constructed in North Korea or Iran. Controlling proliferation of radioactive material in this way has been at least somewhat effective, and maybe spending a trillion dollars in an effort to do the same thing to CPUs could seem to powerful people to be a good idea, especially if the threat is not only physical but also spiritual. That wouldn't stop Russia or China, etc., from building AGI, so I suppose we'd also have treaties to prevent AGI development, which we'd secretly cheat on, and so all of us would end up in windowless cinderblock cubicles in Los Alamos.

Now let's follow up on the recent speculation on the AGI list that a cheap laptop is actually enough processing power. In that case, the hardware-restriction policy would be necessary but also too late. AGI work itself can still be banned. What sort of additions to the Patriot Act would be needed to make sure that we are not working on AGI in secret? Also in this case, amusingly, the well-publicised effort to make sure every kid on the planet has a cheap laptop is basically making sure that every kid on the planet has something worse than a nuclear bomb kit. Maybe all those kids are too dumb to figure out how to assemble it.

Next, consider religious fundamentalists. Those people are able to follow a chain of reasoning that leads them to blow up abortion clinics and marketplaces, and to fly airplanes into buildings, to protect their points of view. AGI and the Singularity are much larger threats to their world view than any current target. How attractive a bomb target is the Singularity Summit itself, or an artificial intelligence conference? Thank the AGI and Bayes its prophet that they think we're kooks; they'd kill us if they didn't!

Why do we care whether the world thinks we're kooks or not?

1) We want to beg for money, and people don't give money to kooks. Fair enough, but another approach that good, true ideas with economic value can take is to earn money instead by selling people things of value.

2) If we "raise awareness", perhaps a better-informed "common man" will help make a "positive" Singularity more likely. It's possible. Convincing more people who think technically for a living (scientists, engineers) could also be beneficial (in case we believers don't have the right answers yet and aren't going to find them soon). If those people's opinions are driven by what they see on TV news or in the Wall Street Journal, the scent of kookery is not too helpful.

3) Bloggers and websites are successful in proportion to the number of hits they get, and kooks don't get many hits.

Any other good reasons we should care whether journalists heap scorn on our efforts?

I'm not sure my analysis of the politics of AGI quite squares with yours.

1) First, you distinguish between governments and the reasons they might care, and religious organizations and the reasons they might care. I agree with this distinction.

2) Governments care most about bogeypersons that they can use for their agenda. So where does AGI rate? Well, so long as it doesn't rate, they'll ignore it. But if it did start to get taken seriously, they would not act according to real factors (like the perceived threats that SIAI squeals about, or the perceived benefits of AGI); they would ask, "How can we USE this to our advantage?"

3) The answer they will give to that question will probably be the same as the one they gave to nanotechnology: it is ONLY useful if they can kick off a funding bandwagon called "AGI" and use it to funnel more money to existing corporate interests who are paying their (the politicians') bills. Those corporate interests will then take the money and claim to be doing "AGI" even though, in fact, they will just carry on doing whatever they were doing before.

4) Religious reaction. Unpredictable. I think it could go either way. In fact, I think many religious people will see in the Singularity the unified picture of the world that they were promised, and will find ways to embrace it.

5) One other very powerful interest group is the fiction and fearmongering community - the science fiction folks in Hollywood who want to make money off ideas that can be turned into horror, and the more realistic the horror, the better. This is the community that already takes the idea seriously, and will do so even more when the rest of the world wakes up to it.

I think it is this third group (the fiction and fearmongering community) that is most to be feared.

And this is also the group that a certain element within SIAI - with its naive discussion of unrealistic threats, and its exaggeration of the impossibility of dealing with the real threats - is playing up to. The fearmongers in Hollywood would *love* for that SIAI-based group to get more publicity, because they'd make money hand over fist if that happened.


Richard Loosemore


