On Tuesday, September 1, 2015, Bruno Marchal <[email protected]> wrote:

>
> On 31 Aug 2015, at 14:56, Stathis Papaioannou wrote:
>
>
>
> On Monday, August 31, 2015, Bruno Marchal <[email protected]> wrote:
>
>>
>> On 31 Aug 2015, at 00:42, Russell Standish wrote:
>>
>> On Sun, Aug 30, 2015 at 12:34:18PM +0200, Bruno Marchal wrote:
>>>
>>>>
>>>> On 30 Aug 2015, at 03:08, Russell Standish wrote:
>>>>
>>>> Well, as people probably know, I don't believe C. elegans can be
>>>>> conscious in any sense of the word. Hell - I have strong doubts about
>>>>> ants, and they're massively more complex creatures.
>>>>>
>>>>
>>>> I personally think that C. elegans, and Planaria (!), even amoebae,
>>>> are conscious, although very plausibly not self-conscious.
>>>>
>>>> Since 2008 I have tended to think that even RA is already conscious,
>>>> even maximally so, and that PA is already as self-conscious as a
>>>> human (when in some dissociative state).
>>>>
>>>> But I don't know if PA is more or less conscious than RA. That
>>>> depends on whether the role of the higher part of the brain consists
>>>> in filtering consciousness or in enacting it.
>>>>
>>>>
>>>>> But it probably won't be long before we simulate a mouse brain in toto
>>>>> - about 2 decades is my guess, maybe even less given enough dollars -
>>>>> then we're definitely in grey philosophical territory :).
>>>>>
>>>>
>>>> I am slightly less optimistic than you. It will take one or two
>>>> decades before we simulate the hippocampus of a rat, and probably
>>>> more time will be needed for the rest of the brain. And the result
>>>> could be a conscious creature, with a consciousness quite different
>>>> from a rat's, as I find it plausible that pain is related to the
>>>> glial cells and their metabolism, which are not taken into account
>>>> by the current "copies".
>>>>
>>>
>>> What is blocking us is not the computing power - whole "rat brain"
>>> simulations have already been done at something like 1/10000 of real
>>> time - so all we need is about a decade of performance improvement
>>> through Moore's law.
>>>
>>> What needs development is a way of determining the neural circuitry.
>>> There have been leaps and bounds in the process of slicing frozen
>>> brains and imaging the slices with electron microscopes, but clearly
>>> it is still far too slow.
>>>
>>> As for the hypothesis that glial cells have something to do with it,
>>> well, that can be tested via the sort of whole rat brain simulation
>>> I've been talking about. Run the simulation in a robotic rat, and
>>> compare its behaviour with a real rat's. Basically what the OpenWorm
>>> guys are doing, but scaled up to a rat. If the simulation is way
>>> different from the real rat, then we know something else is required.
>>>
>>
>>
>> I can imagine that the rat will have a "normal behaviour", but as it
>> cannot talk to us, we might fail to appreciate some internal change or even
>> some anosognosia. The rat would not be a zombie rat, but could still be in
>> a quite different conscious state (perhaps a better one, as it seems the
>> glial cells might have some role in chronic pain).
>>
>
> In general, if there is a difference in consciousness then there should be
> a difference in behaviour. If the difference in consciousness is impossible
> to detect then arguably there is no difference.
>
>
>
> How would you detect that the rat has a slight headache?
>

It should be detectable under ideal circumstances, or it should be
detectable statistically by sampling a large number of rats.


> Some drugs change *only* the "volume" of consciousness (notably alcohol
> in high doses, though that one also changes behaviour). It is quite
> unpleasant, like listening to music with the sound turned up too high, but
> you can behave in your normal way, and unless somebody asks, there is no
> noticeable difference in behaviour.
>

The point is, it is detectable. If a subjective difference makes no
objective difference under any circumstances then arguably there is no
subjective difference.


> First-person experiences are usually wider than anything we can
> communicate in a third-person way, so it is natural that a difference in
> consciousness does not necessarily entail a difference in behaviour,
> especially over a finite time.
>
> The inverted-spectrum problem for colour qualia also illustrates that a
> difference in consciousness might not lead to a difference in behaviour.
>

If the colours I see change every five minutes but I don't notice, then I
would say there is no subjective change.


-- 
Stathis Papaioannou

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.
