Hi,

2009/1/29 Günther Greindl <guenther.grei...@gmail.com>

>
> Quentin,
>
> you are, it seems to me, simply reproducing the MGA. You are assuming a
> (material) computer on which the AI+environment run - relatively to us,


No, I'm just assuming the program (for example, written in Java).


>
> this will never be conscious - but it _could_ be conscious relatively to
> other computations in Platonia.
>
> To make an AI conscious relatively to us, you have to keep its
> counterfactual structure and the intricate causal dynamics relating it to
> _our_ environment intact - otherwise it will not be conscious (for us).
>

So when does the AI become a zombie when I run it relative to me? After
how many stub subparts (I'm talking about functions in a program, not about
the physical computer on which the said program runs) have been replaced?
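In Java terms (with purely illustrative names, not code from any actual AI), the replacement I have in mind looks roughly like this: a subpart honours the contract "out = f(in)"; a recorded run logs every (in, out) pair; a stub with the same contract then answers from the log instead of computing.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class StubDemo {
    // A subpart with the contract "out = f(in)".
    interface Subpart extends Function<Integer, Integer> {}

    // The original, computing subpart (an arbitrary stand-in computation).
    static final Subpart gate = in -> in * in + 1;

    // Record a run: log every (in, out) pair the subpart produces.
    static Map<Integer, Integer> record(Subpart real, int[] inputs) {
        Map<Integer, Integer> log = new HashMap<>();
        for (int in : inputs) log.put(in, real.apply(in));
        return log;
    }

    // A stub with the same contract: no computation, pure lookup.
    static Subpart stub(Map<Integer, Integer> log) {
        return in -> log.get(in);
    }

    public static void main(String[] args) {
        int[] run = {0, 1, 2, 3};
        Map<Integer, Integer> log = record(gate, run);
        Subpart replayed = stub(log);
        // On the recorded inputs the stub is indistinguishable from the gate.
        for (int in : run) {
            System.out.println(gate.apply(in).equals(replayed.apply(in))); // prints true
        }
    }
}
```

The stub only works for the recorded run, of course: on any unrecorded input it has nothing to answer, which is exactly the loss of counterfactual structure being discussed.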

Will answer more later.

Regards,
Quentin


> This is not strange - a human who loses parts of his thalamocortical
> structure (assuming he has an accident and is being kept alive by
> medical machines) will also cease to be conscious - see for instance
> Tononi, 2004. (http://www.biomedcentral.com/1471-2202/5/42) (open
> access, good paper). At least, he will cease to be conscious relatively
> to those worlds where he has the accident.
>
> I guess it boils down to the fact that you can't take parts of the
> metaphysics and ignore the implications.
>
> Assuming that computation suffices for AI _automatically_ leads to a
> view which makes computational states conscious relative to some worlds
> but not to others.
>
> That is relative state with a vengeance.
>
> Cheers,
> Günther
>
>
>
>
>
> Quentin Anciaux wrote:
> > Maybe I wasn't clear enough in my explanations, so I'll try to be
> > clearer.
> >
> > Let's suppose we have a "conscious" program (an AI) running in a
> > simulated environment.
> > Let us record the run of the environment+AI.
> >
> > Then restore the state of the program just at the start of the record.
> >
> > I can now selectively replace any subpart of the AI or environment or
> > both with a stub subpart which, instead of doing an actual computation
> > and returning the computed result to the other subparts, simply makes a
> > lookup in the recorded state we made before. In the end I can replace
> > everything with just a lookup (the case where all gates are broken and
> > receive lucky rays in the movie graph); the stub subpart plays the role
> > of the lucky rays.
> >
> > So if, by our assumption, our program was "conscious": if I replace
> > only one subpart, is it still? ... 2? 3? ... everything?
> >
> > Quentin
> >
> > 2009/1/28 Quentin Anciaux <allco...@gmail.com>
> >
> >
> >
> >     2009/1/28 Bruno Marchal <marc...@ulb.ac.be>
> >
> >
> >
> >         Hi  Quentin,
> >
> >          > I was thinking about the movie graph and its conclusions.
> >          > It concludes that it is absurd for consciousness to
> >          > supervene on the movie, hence physical supervenience is
> >          > false.
> >
> >
> >         OK. It is a reductio ad absurdum. It assumes that
> >         consciousness supervenes on the physical activity of a brain
> >         (Phys. Sup.); it shows that this leads to the fact that
> >         consciousness supervenes on a movie "qua computatio", and this
> >         is considered an absurdity, so it concludes that Phys. Sup. is
> >         false.
> >
> >
> >          >
> >          >
> >          > But if I simulate the graph with a program, having for
> >          > example each gate represented by a function like "out =
> >          > f(in)", each function of the simulated graph is in a
> >          > library which is loaded dynamically. I can record a run and
> >          > then, on a new run, I can selectively replace each
> >          > library/function by another one with the same function
> >          > contract but which, instead of computing the out value,
> >          > takes the value from the record. I can do it like in the
> >          > movie graph for each gate/function.
> >          >
> >          > Then it seems that, in the end, consciousness has to
> >          > supervene on the record...
> >
> >         Why? Consciousness supervenes on the computation(s), not on
> >         its physical implementation, be it with the record or with the
> >         original modules.
> >
> >
> >          > then it is the same conclusion as for physical
> >          > supervenience. What is wrong?
> >
> >         The physical supervenience. Consciousness does not supervene
> >         on any implementation "in particular" of a computation. It
> >         supervenes on all (immaterial) computations going through the
> >         (relevant) states. This is in Platonia.
> >
> >         Tell me if I am missing something, but it seems to me there is
> >         no problem here. It is only a problem if you believe in some
> >         physical supervenience.
> >
> >         Best,
> >
> >         Bruno
> >
> >
> >     The problem I see is that the movie graph is used to show that
> >     phys-sup is wrong (having as a premise that consciousness is
> >     Turing-emulable, as we have a "conscious" graph which is the
> >     physical implementation); the argument shows that consciousness
> >     does not supervene on this physical implementation, because we
> >     would then be forced to accept that it also supervenes on the
> >     broken graph + movie. But what I think, with my example, is that
> >     it does not supervene on the particular simulation of the
> >     functional graph, nor does it supervene on the non-functional
> >     lookup-record simulation of the graph.
> >
> >     I understand the point is that it supervenes on all computations,
> >     not on a particular computation... but then I don't see how the
> >     movie graph rules out phys-sup and not any kind of supervenience.
> >
> >     Regards,
> >     Quentin
> >
> >
> >
> >
> >
> >          >
> >          >
> >          > Regards,
> >          > Quentin
> >          >
> >          > --
> >          > All those moments will be lost in time, like tears in rain.
> >          >
> >          > >
> >
> >         
> > http://iridia.ulb.ac.be/~marchal/
> >
> >
> >
> >
> >
> >
> >
> >
> >     --
> >     All those moments will be lost in time, like tears in rain.
> >
> >
> >
> >
> > --
> > All those moments will be lost in time, like tears in rain.
> >


-- 
All those moments will be lost in time, like tears in rain.

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-l...@googlegroups.com
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en
-~----------~----~----~----~------~----~------~--~---
