Brent Meeker wrote:
> 1Z wrote:
> >
> > Stathis Papaioannou wrote:
> >
> >> Russell Standish writes:
> >>
> >>
> >>> On Sun, Aug 27, 2006 at 09:31:15PM +1000, Stathis Papaioannou wrote:
> >>>
> >>>> It seems to me that the idea of a deterministic machine being
> >>>> conscious is assumed to be preposterous, for no good reason. I
> >>>> believe that I could have acted differently even with identical
> >>>> environmental inputs, which is what the feeling of "free will" is.
> >>>> However, it is possible that I might *not* have been able to act
> >>>> differently: simply feeling that I could have done so is not
> >>>> evidence that it is the case. And even if it were the case, due to
> >>>> true quantum randomness or the proliferation of branches in the
> >>>> multiverse leading to the effect of first person indeterminacy, it
> >>>> does not follow that this is necessary for consciousness to occur.
> >>>
> >>> It is true that Maudlin's argument depends on the absurdity of a
> >>> recording being conscious. If you can accept a recording as being
> >>> conscious, then you would have trouble in accepting the conclusion
> >>> that counterfactuals are relevant.
> >>
> >> That's what I'm disputing. You can have a machine that handles
> >> counterfactuals, like a thermostat, but isn't conscious (not much,
> >> anyway), and a machine that doesn't handle counterfactuals, like a
> >> complex computer or a human with rigidly constrained inputs, but is
> >> conscious.
> >
> >
> > Computers always have counterfactuals, because changing one part of
> > them (whether data or programme) has an effect on the overall
> > behaviour. Changing one part of a recording (e.g. splicing a film)
> > changes only *that* part.
>
> But a branch in a program need not change very much.

Changing one bit of a programme can change everything.
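
To make that concrete, here is a minimal toy sketch in Python (my own
illustration, not anything from Maudlin or the earlier posts): a single
input bit feeding a branch changes the program's entire output, while
"splicing" one frame of a recording changes only that frame.

    def program(bit):
        # One branch: the single bit determines everything that follows.
        if bit:
            return [x * x for x in range(5)]    # squares
        return [x + 100 for x in range(5)]      # a completely different run

    recording = list(range(5))   # a fixed sequence of "frames"
    recording[2] = 99            # splice one frame: only index 2 changes

    print(program(0))   # [100, 101, 102, 103, 104]
    print(program(1))   # [0, 1, 4, 9, 16]  -- flip one bit, everything differs
    print(recording)    # [0, 1, 99, 3, 4]  -- only the spliced frame differs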

> It seems that now you are introducing a new criterion, a degree of
> "counterfactualness" required for consciousness.

Movies have no counterfactualness and are therefore not
(implementations of) programmes. We can say that without knowing how
much counterfactualness *is* required for consciousness.


> Brent Meeker

