On Aug 3, 12:39 am, meekerdb <meeke...@verizon.net> wrote:
> On 8/2/2011 8:27 PM, Craig Weinberg wrote:

> > Why are you equating how something appears to behave with its
> > capacity to experience human consciousness? Think of the live neurons
> > as a pilot light in a gas appliance. You may not need to heat your
> > water heater with a bonfire that needs to be maintained with wood, and
> > if you have a natural gas utility you can substitute a different fuel,
> > but if that fuel can't ignite, there's not going to be any heat. By
> > your reasoning, the natural gas could be substituted with carbon
> > dioxide, since it looks the same, acts like a gas, etc., so you could
> > infer that it should make the same heat. With the pilot light, you
> > will at least know whether or not the fuel is viable.
>
> OK, so in your analogy to what is the pilot light analogous?

To the presence of a group of live neurons in the executive role of a
silicon brain simulation.

> >> Do I conclude that it experiences consciousness exactly as I do?  No, I
> >> think that it might depend on how its programming is implemented, e.g.
> >> LISP might produce different experience than FORTRAN, or whether there
> >> are asynchronous hardware modules.  I'm not sure how Bruno's theory
> >> applies to this since he looks at the problem from a level where all
> >> computation is equivalent modulo Church-Turing.
>
> > I hear what you're saying, and there's no question that the
> > programming is instrumental both in simulating intelligence and in
> > generating a human level of interior experience artificially. All I'm
> > saying is that LISP or FORTRAN cannot have an experience by itself.
>
> I agree with that.  But can 'it' (a program) have experience when
> running on a computer?  And if so, does it have the same experience when
> it's running under Linux as under MacOS, on a PC or a (physical) Turing
> machine?  The latter is what functionalism asserts and you seem to deny.

I think that the experience that a silicon computer has when it's
powered up is likely to be very similar regardless of what it's
running, just as a TV's experience is probably about the same
regardless of what TV show you are watching on it, because at that low
level of awareness each pixel is a separate 'neuron' that doesn't
talk to the others (except maybe where colors physically interfere and
create on-screen artifacts). They have no coherent sense of the image
that we are watching or the program they are using, because they aren't
a single-ish entity integrating a gazillion smaller entities; they are
a gazillion small primitive entities that we are integrating
perceptually.

Craig
