On 10/24/2012 9:23 PM, Craig Weinberg wrote:


    Or what if we don't care?  We don't care about slaughtering cattle, which are
    pretty smart as computers go.  We manage not to think about starving children
    in Africa, and they *are* humans.  And we ignore the looming disasters of oil
    depletion, water pollution, and global warming which will beset humans who are
    our children.


Sure, yeah, I wouldn't expect mainstream society to care, except maybe some people. I'm mainly focused on what seems to me an astronomically unlikely prospect: that we will someday find it possible to make a person out of a program, yet won't be able to make just the program itself, with no person attached.

Right. John McCarthy (inventor of LISP) worried and wrote about that problem decades ago. He cautioned that we should not make robots conscious, with emotions like humans, because then it would be unethical to use them like robots.

Especially given that we have never made a computer program that can do anything whatsoever other than reconfigure whatever materials are able to execute the program, I find it implausible that there will be a magical line of code which cannot be executed without an experience happening to someone.

So it's a non-problem for you. You think that only man-born-of-woman or wetware can be conscious and have qualia. Or are you concerned that we are inadvertently offending atoms all the time?

No matter how hard we try, we can never just make a drawing of these functions to check our math without invoking the power of life and death. It's really silly. It's not even good sci-fi, it's just too lame.

I think we can, because although I like Bruno's theory, I think the MGA is wrong, or at least incomplete. I think the simulated intelligence needs a simulated environment, essentially another world, in which to *be* intelligent. And that's where your chalkboard consciousness fails: it needs to be able to interact within a chalkboard world. So it's not just a question of going to a low enough level; it's also a question of going to a high enough level.

Brent
The person I was when I was 3 years old is dead. He died because
too much new information was added to his brain.
         -- Saibal Mitra
