On 10/24/2012 10:19 PM, Craig Weinberg wrote:


On Thursday, October 25, 2012 1:10:24 AM UTC-4, Brent wrote:

    On 10/24/2012 9:23 PM, Craig Weinberg wrote:

        Or what if we don't care?  We don't care about slaughtering cattle,
        which are pretty smart as computers go.  We manage not to think about
        starving children in Africa, and they *are* humans.  And we ignore the
        looming disasters of oil depletion, water pollution, and global warming
        which will beset humans who are our children.


    Sure, yeah, I wouldn't expect mainstream society to care, except maybe some
    people. I am mainly focused on what seems like an astronomically unlikely
    prospect: that we will someday find it possible to make a person out of a
    program, but won't be able to just make the program itself with no person
    attached.

    Right.  John McCarthy (inventor of LISP) worried and wrote about that
    problem decades ago.  He cautioned that we should not make robots conscious,
    with emotions like humans, because then it would be unethical to use them
    like robots.


It's arbitrary to think only of robots, though. It could be anything that represents computation to something: an abacus, a card game, anything. Otherwise it's prejudice based on form.


    Especially given that we have never made a computer program that can do
    anything whatsoever other than reconfigure whatever materials are able to
    execute the program, I find it implausible that there will be a magical line
    of code which cannot be executed without an experience happening to someone.

    So it's a non-problem for you.  You think that only man-born-of-woman or
    wetware can be conscious and have qualia.  Or are you concerned that we are
    inadvertently offending atoms all the time?


Everything has qualia, but only humans have human qualia. Animals have animal qualia, organisms have biological qualia, etc.

So computers have computer qualia. Do their qualia depend on whether they are solid-state or vacuum-tube? Germanium or silicon? PNP or NPN? Do they feel different when they run LISP or C++? Do you have Craig qualia?



    No matter how hard we try, we can never just make a drawing of these
    functions to check our math without invoking the power of life and death.
    It's really silly.  It's not even good sci-fi; it's just too lame.

    I think we can, because although I like Bruno's theory, I think the MGA is
    wrong, or at least incomplete.  I think the simulated intelligence needs a
    simulated environment, essentially another world, in which to *be*
    intelligent.  And that's where your chalkboard consciousness fails.  It
    needs to be able to interact within a chalkboard world.  So it's not just a
    question of going to a low enough level, it's also a question of going to a
    high enough level.


A chalkboard world just involves a larger chalkboard.

Right.  And it involves great chalkboard sex - but none we need worry about.

Brent
