The rubber has already hit the road. Automation and computer displacement of jobs is an old story. The real challenge, in my mind, is how and when the world at large will shift to a post-scarcity economics. More importantly, what are the least disruptive and most beneficial steps along the way? Clearly we cannot jump to livable financial and material benefits for all, regardless of employment, in one great leap. So what is the gradation along the way?

- s

On May 29, 2007, at 8:01 AM, Jonathan H. Hinck wrote:

Sorry, me again. I was thinking specifically along the lines of a movement which could present to humanity the (potential) benefits of an automated world where, among other things, wage slavery and its resulting inequities and hardships are abolished and supplanted by machines (to use the most general term) which monitor and take care of humanity.



My concern, as a futurist, is that if the discussion is framed in terms of immediate economic dislocation (as it often seems to be), then there will likely be a less-than-positive reaction from John Q. Public, who will view machines as an immediate threat.



If, therefore, an automated future is going to have any chance in the political arena (where it will end up sooner or later), then perhaps futurists need to pre-empt the Luddites (and pseudo-Luddites) somehow, before the rubber hits the road.



Jon



-----Original Message-----
From: Jonathan H. Hinck [mailto:[EMAIL PROTECTED]
Sent: Tuesday, May 29, 2007 9:15 AM
To: [email protected]
Subject: RE: [singularity] The humans are dead...



Is a broad-based political/social movement to (1) raise consciousness regarding the potential of A.I. and its future implications and, in turn, (2) stimulate public discussion about this whole issue possible at this time? Or is there simply too much disagreement (or, as Ben put it, too much disregard for the "geeks") for this to be possible? (Please, please, thoughts anyone?)



Jon



-----Original Message-----
From: Keith Elis [mailto:[EMAIL PROTECTED]
Sent: Monday, May 28, 2007 9:19 PM
To: [email protected]
Subject: RE: [singularity] The humans are dead...

Ben Goertzel wrote:

> Right now, no one cares what a bunch of geeks and freaks
> say about AGI and the future of humanity.
>
> But once a powerful AGI is actually created by person X, the prior
> mailing list posts of X are likely to be scrutinized, and
> interpreted by people whose points of view are as far from
> transhumanism as you can possibly imagine ... but who
> may have plenty of power in the world...

This understanding of the world we live in is where my posts on this topic originate.

But, Ben is being optimistic. I would suggest the situation is even worse. If person X actually comes up with some exciting result short of real AGI, and this gets widespread media attention, the same scrutiny will begin in earnest. If you're person X, and you haven't been careful, you might not get the chance to complete your work the way you want to.

Regardless, the powers that be won't have to go sifting through thousands of emails either. Think of all the people you've disagreed with over the years. Think of all the people who could be jealous of your achievement. Sour grapes and outright dislike will see to it that your views on *many* things get to the wrong people.

Keith




