On 08 May 2015, at 02:15, LizR wrote:

Nicely summarised. I may have comments once I've had a chance to digest your summary (and any subsequent comments).

In the meantime, if you aren't familiar with it, Maudlin's "Olympia" argument is also (possibly) relevant. It uses a similar form of argument to the MGA to arrive at a different conclusion, namely that supervenience of consciousness on a physical machine (brain, computer) isn't possible.


But that is the same conclusion as the MGA's.
Both the MGA and Maudlin's argument show that there is a serious difficulty in maintaining both comp and physical supervenience. Maudlin leans toward abandoning comp; I keep comp and lean toward abandoning materialism. But both arguments show their incompatibility.

Bruno


(In summary, he attempts to show that physical supervenience implies that a machine running an AI programme is conscious if and only if it is capable of supporting counterfactual states, even when it performs physically identical actions to a machine that isn't so capable.)

http://web.csulb.edu/~cwallis/labs/stanford/Computation&consc.pdf

On 8 May 2015 at 11:08, Russell Standish <li...@hpcoders.com.au> wrote:
On Thu, May 07, 2015 at 10:45:12PM +1000, Bruce Kellett wrote:
...

>
> I am sorry, but this just does not follow. The original physical
> functionality is admitted to be still intact -- provided, admittedly,
> by the projected movie, but that is still a physical device,
> operating with a physical film in a physical projector, and
> projecting on to the original (albeit damaged) physical machinery.
> How has the physical element in all of this been rendered redundant?
> The original functionality of the 'brain' has been preserved by the
> movie; the conscious experience is still intact even though much of
> the original functionality has been provided by another external
> physical device. How does this differ from the original "Yes Doctor"
> scenario in which the subject agrees to have his brain replaced by a
> physical device that simulates (emulates) his original brain
> functionality? I submit that it does not.
>
> The only difference between the movie replacing the functionality of
> the original experience and having that functionality replaced by a
> computer would seem to be that the computer can emulate a wider
> range of conscious experiences -- it is 'counterfactually correct'
> in that it can respond appropriately to different external inputs.
> The film, being a static record of one conscious experience, cannot
> do this. But it has been admitted that the film can reproduce the
> original conscious experience with perfect fidelity. And the film is
> every bit as physical as the original 'brain'. So the physical has
> not been shown to be redundant. It cannot be cut away with Occam's
> razor after all. If it were, there would be no conscious experience
> remaining.
>
> I conclude that the MGA fails to establish the conclusions that it
> purports to establish.
>
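
To make the counterfactual point in the quoted summary concrete, here is
a minimal sketch in Python (the names "machine" and "film" and the toy
dynamics are illustrative assumptions, not anything from the thread): a
counterfactually correct machine computes a response to whatever input
arrives, whereas a film merely replays the states recorded on one
particular run. On the recorded history the two coincide frame by frame;
on any other input only the machine has an answer.

    # A counterfactually correct "machine": computes a response to ANY input.
    def machine(stimulus: int) -> int:
        return stimulus * 2 + 1  # toy stand-in for arbitrary brain/AI dynamics

    # A "film": a static record of the machine's states on one particular run.
    filmed_inputs = [3, 1, 4]  # the run that happened to be filmed
    recorded_run = [machine(s) for s in filmed_inputs]

    def film(frame: int) -> int:
        # Replays the recorded state, ignoring whatever the world now supplies.
        return recorded_run[frame]

    # On the filmed history, machine and film agree frame by frame...
    assert [machine(s) for s in filmed_inputs] == [film(i) for i in range(3)]

    # ...but only the machine supports the counterfactual "what if the
    # input had been different?"; the film has no frame for that.
    assert machine(9) == 19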

Thanks for this excellent summary, Bruce. The answer given as to why
the film is supposedly not conscious is that it is absurd. I agree with
you that it is not, prima facie, absurd at this point. Usually, Bruno
then goes on to recount his "stroboscope argument", which is in his
thesis, but not, to my knowledge, in any English-language publication.
Essentially the
idea is that we stop the projector, take the film out and lay it down
on a very large table. Now as an observer, we can run along the table,
seeing the frames of the film in their original order, and it will be
as though the film is projected. But that would mean the conscious
moment would depend on whether the external observer is running or
not.
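
On my reading, the stroboscope point can be put in one small sketch
(representing frames as a Python list is an assumption for illustration
only): "projecting" the film is just an external traversal that reads
the frames without altering them, so the physical record is identical
whether or not anyone runs along the table.

    # The film laid out on the table: a fixed sequence of frames (states).
    frames = ["state_0", "state_1", "state_2"]

    def project(film: list) -> None:
        # "Projection" is an external traversal: the observer running along
        # the table reads each frame in order but changes nothing in them.
        for frame in film:
            pass

    before = list(frames)
    project(frames)
    assert frames == before  # the film itself is the same, projected or not

If the conscious moment supervenes on the film alone, it cannot depend
on whether project() is ever called; if it supervenes on the projection,
it depends on the external observer's running, which is the absurdity
the argument draws out.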

Personally, I think the problem started much earlier, in supposing
that recreating the exact same sequence of physical states
instantiates more conscious moments. It does not. The conscious moment
is exactly the same, and exists in that physical reality. Creating a
recording does not change that fact.

The only problem I see is if the recording were to arise by chance, by
some lucky coincidence of the random motion of molecules, without the
original computation having taken place. Then is that conscious moment
instantiated? Obviously, in a robust ontology, it is, because all
conscious moments are instantiated, but suppose the ontology is not
robust.

Personally, I think the intuition pump has simply run dry at that
point. I don't think the MGA helps.


--

----------------------------------------------------------------------------
Prof Russell Standish                  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics      hpco...@hpcoders.com.au
University of New South Wales          http://www.hpcoders.com.au
----------------------------------------------------------------------------




http://iridia.ulb.ac.be/~marchal/



