Shane Legg <[EMAIL PROTECTED]> wrote on Mon, 25 Sep 2006 23:16:12 +0200:

I think the major problem is one of time scale.  Due to Hollywood everybody
is familiar with the idea of the future containing super powerful
intelligent (and usually evil) computers.  So I think the basic concept
that these things could happen in the future is already out there in the
popular culture.  I think the key thing is that most people, both Joe
six pack and almost all professors I know, don't think it's going to
happen for a really long time --- long enough that it's not going to
affect their lives, or the lives of anybody they know.  As such they
aren't all that worried about it.  Anyway, I don't think the idea is going
to be taken seriously until something happens that really gives the public a
fright.

You're right.  The general public doesn't really care about future
generations.  They are focused solely on themselves.  (A natural
response.)
But this doesn't take away the responsibility.  If YOU are aware that
something is occurring, then it is your responsibility to follow through.
Frightening people doesn't convey anything but FEAR. (False evidence
appearing real.)

On 9/25/06, Bruce LaDuke <[EMAIL PROTECTED]> wrote:
What happens to the oil industry?
What happens to politics because of what happens to the oil industry?  How
will a space elevator by 2012 change the balance of power?  Nanoweapons?
World War III?  China/India industrialization and resulting pollution? As
announced recently what happens when the world warms to its hottest level in
a million years?  When biodiversity reduction goes critical and plankton die
and oxygen fails?

Well, I guess you need to find the experts within those fields to tell
you what's going to happen.
There are no quick solutions; I wish there were.
Nobody at this present moment has all the answers.  The best scenario
is to find the people you "believe" have the right answers to your
questions.

Gregory Johnson <[EMAIL PROTECTED]> wrote:
For the very reasons you point out, the public is not ready for the
singularity and we don't have the time or resources to waste making them
ready for something we are not even sure the shape of yet.

Maybe creating a shape is a good thing.  Defining it may take some
time, but so what?

What would a supersoldier be like?

Why do we need a supersoldier?  If rationality were a popular value,
we wouldn't need a supersoldier.  We would rely on science.
Wouldn't a supersoldier simply be a virus, chemical weapon or
biochemical reaction?  Isn't that what you are talking about?
The bombs never change; they just take different forms.

The point at which the singularity will occur is when the general population
becomes aware of the existence of posthuman supersoldiers and has to
decide if they want to destroy the technology or use it to enhance
their personal lives.

Gregory, this is a very damaging response.
Posthuman has nothing to do with "supersoldiers".
Technology is not there to enhance warlike behavior.  What part of
history made you think that?

Just curious
Anna:)

On 10/21/06, Gregory Johnson <[EMAIL PROTECTED]> wrote:
I had a bit of a cute experience last week.
An associate in a group that advises our government, while housecleaning
old printed materials, found a book in which one of my futurist
essays from back in 1971 appeared.  I thought it was a neat way to review
the subject, to see if the future for 2008-2012 I wrote about was close to
the real thing.

One of the biggest mistakes we can make when proposing the singularity to
non-techies is to even use the term.

Let's just dumb it down.
No sense getting the fear factor up when the singularity is so weak and
defenceless that popular Frankenstein/Terminator/Day After Tomorrow
reactions can jeopardize the entire event horizon.

I think we entered the event horizon the day the ARPANET was switched on,
and we have simply continued on since.

I really think that Kurzweil and Google, to name-drop a few of the players,
have found shelter in the USA military-industrial complex.
It is from here, sheltered from the Luddites and with access to significant
R&D resources, that the singularity will emanate.

For the very reasons you point out, the public is not ready for the
singularity and we don't have the time or resources to waste making them
ready for something we are not even sure the shape of yet.

Perhaps the GMO combination of extreme biological modification
(to endure extreme lengths of time in constant battle without loss of
mental functionality) and enhancement to incorporate high-speed data
management direct to the cortex from the military internet servers
will not only create a supersoldier, but also,
as an accidental side effect, a super-long-lived post-human.

Yes, society would change with the perfection of the supersoldier.
What would a supersoldier be like?
A person who can work perhaps 100 hours non-stop at full operating
efficiency, both mental and physical.
A person able to endure body temperatures from perhaps 40F to 150F,
both internal and external.
A person with regenerative and repair capacity many times that of their
parents.
A person able to withstand biological, chemical and physical agents that
would otherwise kill a traditional human.
A person able to interface with a data stream many times greater than
what our five senses today can deluge us with at once.

The point at which the singularity will occur is when the general
population becomes aware of the existence of posthuman supersoldiers
and has to decide if they want to destroy the technology or use it to
enhance their personal lives.

Of course, as with the internet, once the genie is out of the bottle
it cannot be quashed without extraordinary effort.

I think one simply has to move along and make it happen.
And, of course, sell the technology spin-offs to pay for the R&D
expenditures.

On 9/25/06, Bruce LaDuke <[EMAIL PROTECTED]> wrote:
>
> I really like Shane's observation below that people just don't think
> Singularity is coming for a very long time.  The beginning effects are
> already here.  Related to this, I've got a few additional thoughts to
> share.
>
> We're not looking into singularity yet, but the convergence has already
> started.  Consider that the molecular economy has the potential to bring
> total social upheaval in its own right, without singularity.  For example,
> what happens when an automobile weighs around 400 pounds and is powered
> by a battery that never needs charging.  What happens to the oil industry?
> What happens to politics because of what happens to the oil industry?  How
> will a space elevator by 2012 change the balance of power?  Nanoweapons?
> World War III?  China/India industrialization and resulting pollution? As
> announced recently what happens when the world warms to its hottest level
> in
> a million years?  When biodiversity reduction goes critical and plankton
> die
> and oxygen fails?
>
> I'm sure you know about most of these things and how quickly they are
> moving, but my point is, trouble isn't coming...it's here.  Not only
> should
> we be thinking about these things now, but I think it is our social
> responsibility.  That is, if we want children to grow up and inhabit this
> world with any level of normalcy...or at all.
>
> Any number of things could bring our glorious house crashing down in a
> matter of days or months.  When the Soviet economy crashed, nuclear
> physicists were standing in the soup line over night.  The same could
> easily
> be seen of us in a global economic crash.  Our scholarly/industrial
> existence is really very fragile.  It doesn't take much for our hierarchy
> of
> needs to return to survival.
>
> Our human track record of late in terms of creating advance is really quite
> good, but in terms of dealing with the social impacts of that advance it is
> really very, very poor and immature.  All of our wonderful creations are
> already making quite a big global mess.  So who's to say that our
> continued
> focus on modernist, profit-centric values will result in anything less than
> more and more advance alongside escalating social issues?
>
> In my mind, singularity is no different.  I personally see it providing just
> another tool in the hand of mankind, only one of greater power.  And this
> power holds the potential to fulfill human values and human intention,
> which
> is the piece we really aren't managing well.  Bad intentions and bad
> values,
> combined with a bigger tool, equals bigger trouble.
>
> Given our human track record and factors already outside of our control,
> we
> have a far better chance of destroying what we have now (the rest of the
> way) than we have of realizing singularity.  Not that we shouldn't
> continue
> to seek singularity, but we need a hard look at the values and intentions
> that we're basing these efforts on.
>
> See the Second Enlightenment Conference:  http://www.2enlightenment.com
> Elizabet Sahtouris will be the keynote speaker (http://www.ratical.org/LifeWeb/)
>
> Kind Regards,
>
> Bruce LaDuke
> Managing Director
>
> Instant Innovation, LLC
> Indianapolis, IN
> [EMAIL PROTECTED]
> http://www.hyperadvance.com
>
>
>
>
> ----Original Message Follows----
> From: "Shane Legg" <[EMAIL PROTECTED]>
> Reply-To: [email protected]
> To: [email protected]
> Subject: Re: [singularity] Convincing non-techie skeptics that the
> Singularity isn't total bunk
> Date: Mon, 25 Sep 2006 23:16:12 +0200
>
> I'd suggest looking at Joy's "Why the future doesn't need us" article in
> Wired.
> For some reason, which isn't clear to me, that article was a huge hit,
> drawing
> in people that normally would never read such stuff.  I was surprised when
> various educated but non-techie people I know started asking me about it.
>
> I think the major problem is one of time scale.  Due to Hollywood
> everybody
> is familiar with the idea of the future containing super powerful
> intelligent (and
> usually evil) computers.  So I think the basic concept that these things
> could
> happen in the future is already out there in the popular culture.  I think
> the key
> thing is that most people, both Joe six pack and almost all professors I
> know,
> don't think it's going to happen for a really long time --- long enough
> that
> it's
> not going to affect their lives, or the lives of anybody they know.  As
> such
> they
> aren't all that worried about it.  Anyway, I don't think the idea is going
> to be
> taken seriously until something happens that really gives the public a
> fright.
>
> Shane
>
> -----
> This list is sponsored by AGIRI: http://www.agiri.org/email
> To unsubscribe or change your options, please go to:
> http://v2.listbox.com/member/[EMAIL PROTECTED]
>
>
