Stathis Papaioannou wrote:
> Brent Meeker writes:
>
> > >>>I think it goes against standard computationalism if you say that a 
> > >>>conscious
> > >>>computation has some inherent structural property. Opponents of 
> > >>>computationalism
> > >>>have used the absurdity of the conclusion that anything implements any 
> > >>>conscious
> > >>>computation as evidence that there is something special and 
> > >>>non-computational
> > >>>about the brain. Maybe they're right.
> > >>>
> > >>>Stathis Papaioannou
> > >>
> > >>Why not reject the idea that any computation implements every possible 
> > >>computation
> > >>(which seems absurd to me)?  Then allow that only computations with some 
> > >>special
> > >>structure are conscious.
> > >
> > >
> > > It's possible, but once you start in that direction you can say that only 
> > > computations
> > > implemented on this machine rather than that machine can be conscious. 
> > > You need the
> > > hardware in order to specify structure, unless you can think of a 
> > > God-given programming
> > > language against which candidate computations can be measured.
> >
> > I regard that as a feature - not a bug. :-)
> >
> > Disembodied computation doesn't quite seem absurd - but our empirical 
> > sample argues
> > for embodiment.
> >
> > Brent Meeker
>
> I don't have a clear idea in my mind of disembodied computation except in 
> rather simple cases,
> like numbers and arithmetic. The number 5 exists as a Platonic ideal, and it 
> can also be implemented
> so we can interact with it, as when there is a collection of 5 oranges, or 3 
> oranges and 2 apples,
> or 3 pairs of oranges and 2 triplets of apples, and so on, in infinite 
> variety. The difficulty is that if we
> say that "3+2=5" as exemplified by 3 oranges and 2 apples is conscious, then 
> should we also say
> that the pairs+triplets of fruit are also conscious?

No, they are only subroutines.

>  If so, where do we draw the line?

At specific structures.
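
A toy sketch may make the mapping worry concrete (Python, with a
counting convention I've just made up for illustration): the same
abstract sum 3 + 2 = 5 can be read off either the individual fruit or
the pairs-and-triplets, depending entirely on which interpretation we
choose to apply.

# Toy illustration (my own encoding, nothing canonical): the abstract
# computation "3 + 2 = 5" read off two different physical groupings
# via two different interpretation mappings.

oranges = ["o"] * 6                        # six oranges on the table
apples  = ["a"] * 6                        # six apples on the table
bowl    = ["o", "o", "o", "a", "a"]        # 3 oranges and 2 apples

# Interpretation 1: count individual fruit in the bowl.
x1 = sum(1 for f in bowl if f == "o")      # 3
y1 = sum(1 for f in bowl if f == "a")      # 2

# Interpretation 2: count pairs of oranges and triplets of apples.
x2 = len(oranges) // 2                     # 3 pairs
y2 = len(apples) // 3                      # 2 triplets

# Under either mapping the physical state "implements" 3 + 2 = 5.
assert x1 + y1 == 5
assert x2 + y2 == 5
print("both groupings implement 3 + 2 = 5")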

> That is what I mean
> when I say that any computation can map onto any physical system. The 
> physical structure and activity
> of computer A implementing program a may be completely different to that of 
> computer B implementing
> program b, but program b may be an emulation of program a, which should make 
> the two machines
> functionally equivalent and, under computationalism, equivalently conscious.

So? If the functional equivalence doesn't depend on a baroque
re-interpretation, where is the problem?
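
For what it's worth, here is a minimal sketch of what "program b is an
emulation of program a" could amount to (the two machine models and the
unary encoding are invented for illustration, not taken from anywhere):
the internal states and activity of the two machines differ completely,
yet their input/output behaviour coincides, which is all functional
equivalence asks for.

# Minimal sketch (machine models invented for illustration). Machine A
# runs program a directly on integers; machine B's native program b is
# an emulator of a that represents numbers as unary strings.

program_a = [("add", 3), ("add", 2)]       # abstract program: compute 3 + 2

def run_on_A(program):
    # "Hardware" A: executes the program directly on an integer register.
    acc = 0
    for op, arg in program:
        if op == "add":
            acc += arg
    return acc

def run_on_B(program):
    # "Hardware" B: program b emulates a, but its register is a unary
    # string; concatenation plays the role of addition.
    acc = ""
    for op, arg in program:
        if op == "add":
            acc += "|" * arg
    return len(acc)                        # decode the unary result

assert run_on_A(program_a) == run_on_B(program_a) == 5
print("A running a and B running b are functionally equivalent")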

> Maybe this is wrong, e.g.
> there is something special about the insulation in the wires of machine A, so 
> that only A can be conscious.
> But that is no longer computationalism.

No. But what would force that conclusion on us? Why can't consciousness
attach to features more general than hardware, but less general than one
of your re-interpretations?

> Stathis Papaioannou


