> On 21 Mar 2018, at 22:56, Brent Meeker <meeke...@verizon.net> wrote:
> 
> 
> 
> On 3/21/2018 2:27 PM, Stathis Papaioannou wrote:
>> 
>> On Thu, 22 Mar 2018 at 5:45 am, Brent Meeker <meeke...@verizon.net> wrote:
>> 
>> 
>> On 3/20/2018 11:29 PM, Stathis Papaioannou wrote:
>>> On Wed, 21 Mar 2018 at 9:03 am, Brent Meeker <meeke...@verizon.net> wrote:
>>> 
>>> 
>>> On 3/20/2018 1:14 PM, Stathis Papaioannou wrote:
>>>> 
>>>> On Wed, 21 Mar 2018 at 6:34 am, Brent Meeker <meeke...@verizon.net> wrote:
>>>> 
>>>> 
>>>> On 3/20/2018 3:58 AM, Telmo Menezes wrote:
>>>>>> The interesting thing is that you can draw conclusions about 
>>>>>> consciousness
>>>>>> without being able to define it or detect it.
>>>>> I agree.
>>>>> 
>>>>>> The claim is that IF an entity
>>>>>> is conscious THEN its consciousness will be preserved if brain function 
>>>>>> is
>>>>>> preserved despite changing the brain substrate.
>>>>> Ok, this is computationalism. I also bet on computationalism, but I
>>>>> think we must proceed with caution and not forget that we are just
>>>>> assuming this to be true. Your thought experiment is convincing but is
>>>>> not a proof. You do expose something that I agree with: that
>>>>> non-computationalism sounds silly.
>>>> 
>>>> But does it sound so silly if we propose substituting a completely 
>>>> different kind of computer for the brain, e.g. a von Neumann architecture, 
>>>> or one that just records everything instead of an episodic associative 
>>>> memory? The Church-Turing conjecture says it can compute the same 
>>>> functions. But does it instantiate the same consciousness? My intuition 
>>>> is that it would be "conscious", but in some different way; for example, 
>>>> by having the kind of memory you would have if you could review a movie 
>>>> of any interval in your past.
>>>> 
>>>> I think it would be conscious in the same way if you replaced neural 
>>>> tissue with a black box that interacted with the surrounding tissue in the 
>>>> same way. It doesn’t matter what is in the black box; it could even work 
>>>> by magic.
>>> 
>>> Then why draw the line at "surrounding tissue"? Why not the external 
>>> environment? 
>>> 
>>> Keep expanding the part that is replaced and you replace the whole brain 
>>> and the whole organism.
>>> 
>>> Are you saying you can't imagine being "conscious" but in a different way?
>>> 
>>> I think it is possible, but I don’t think it could happen if my neurones 
>>> were replaced by functionally equivalent components. If they are functionally 
>>> equivalent, my behaviour would be unchanged,
>> 
>> I agree with that.  But you've already supposed that functional equivalence 
>> at the behavior level implies preservation of consciousness.  So what I'm 
>> considering is replacements in the brain far above the neuron level, say at 
>> the level of whole functional groups of the brain, e.g. the visual system, 
>> the auditory system, the memory,...  Would functional equivalence at the 
>> body/brain interface then still imply consciousness equivalence?
>> 
>> I think it would, because I don’t think there are isolated consciousness 
>> modules in the brain. A large enough change in visual experience will be 
>> noticed by the subject, who will report that things look different. This 
>> could only happen if there is a change in the input to the language system 
>> from the visual system; but we have assumed that the output from the visual 
>> system is the same, and only the consciousness has changed, leading to a 
>> contradiction.
> 
> But what about internal systems which are independent of perception...the 
> very reason Bruno wants to talk about dream states.  And I'm not necessarily 
> asking that behavior be identical...just that the body/brain interface be the 
> same.  The "brain" may be different in how it processes input from the 
> eyeballs and hence verbally report different perceptions.  In other words, 
> I'm wondering how much computationalism constrains consciousness.  My 
> intuition is that there could be a lot of difference in consciousness 
> depending on how different perceptual inputs are processed and/or merged and 
> how internal simulations are handled.  To take a crude example, would it 
> matter if the computer-brain was programmed in a functional language like 
> LISP, an object-oriented language like Ruby, or a neural network?  Of course 
> Church-Turing says they all compute the same set of functions, but they don't 
> do it the same way

They can do it in the same way. They will not do it in the same way with a 
compiler, but they will do it in the same way when you implement an interpreter 
in another interpreter. The extensional Church-Turing thesis (in terms of which 
functions are calculated) entails the intensional Church-Turing thesis (in 
terms of which computations can be processed). A Babbage machine could emulate 
a quantum brain. This involves a relative slow-down, but the subject would not 
notice it without external clues. In arithmetic, all possible sorts of 
computers are implemented infinitely often, with some special redundancy which 
plays a role in the statistics of computations, the first-person statistics, 
and the origin of the physical appearances.
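The compiler/interpreter distinction above can be sketched in code. This is a toy illustration of my own devising, not Bruno's formalism: every name in it (`run`, `FACT`, the instruction set of the register machine) is invented for the sketch. The same function computed natively and under emulation gives identical results (extensional equivalence), while the emulated route runs through many more primitive steps (the relative slow-down mentioned above).

```python
# Toy illustration (invented for this sketch, not Bruno's formalism): the same
# function computed natively and by an interpreter for a made-up register
# machine. Results agree (extensional equivalence); the emulated route takes
# more primitive steps (the relative slow-down of interpretation).

def factorial_native(n):
    acc = 1
    while n > 1:
        acc *= n
        n -= 1
    return acc

def run(program, regs):
    """Interpret a list of instructions; return final registers and step count."""
    pc = steps = 0
    while True:
        op = program[pc]
        steps += 1
        if op[0] == "halt":
            return regs, steps
        elif op[0] == "mul":      # regs[a] *= regs[b]
            regs[op[1]] *= regs[op[2]]
            pc += 1
        elif op[0] == "dec":      # regs[a] -= 1
            regs[op[1]] -= 1
            pc += 1
        elif op[0] == "jmp":      # unconditional jump
            pc = op[1]
        elif op[0] == "jle1":     # jump to target if regs[a] <= 1
            pc = op[2] if regs[op[1]] <= 1 else pc + 1

# The same factorial, expressed as a program for the toy machine.
FACT = [
    ("jle1", "n", 4),     # 0: loop exit test
    ("mul", "acc", "n"),  # 1: acc *= n
    ("dec", "n"),         # 2: n -= 1
    ("jmp", 0),           # 3: repeat
    ("halt",),            # 4
]

regs, steps = run(FACT, {"n": 5, "acc": 1})
assert regs["acc"] == factorial_native(5) == 120
```

Stacking a further interpreter on top (interpreting `run` itself) would multiply the step count again without ever changing the final value of `acc`: each layer preserves the emulated computation step by step, which is the sense in which interpretation, unlike compilation, does it "in the same way".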

Bruno




> and that might make a difference in consciousness (and at least verbal 
> behavior).
> 
> Brent
> 
> -- 
> You received this message because you are subscribed to the Google Groups 
> "Everything List" group.
> To unsubscribe from this group and stop receiving emails from it, send an 
> email to everything-list+unsubscr...@googlegroups.com.
> To post to this group, send email to everything-list@googlegroups.com.
> Visit this group at https://groups.google.com/group/everything-list.
> For more options, visit https://groups.google.com/d/optout.
