On 17 February 2010 02:39, Brent Meeker <meeke...@dslextreme.com> wrote:

> My intuition is that once we have a really good 3-p theory, 1-p will seem
> like a kind of shorthand way of speaking about brain processes.  That
> doesn't mean your questions will be answered.  It will be like Bertrand
> Russell's neutral monism.  There are events and they can be arranged in 3-p
> relations or in 1-p relations.  Explanations will ultimately be circular -
> but not viciously so.

Yes, I've been sympathetic to this intuition myself.  It's just that
I've been troubled recently by the apparent impossibility of
reconciling the two accounts - i.e. what I'm calling (justifiably
AFAICS) the non-computability of 1-p from 3-p, leading directly to the
apparent causal irrelevance of (and mystery of our references to) 1-p
phenomena.  So I've started to wonder again if we've given up too soon
on the possibility of an "interactionist" approach - one that doesn't
fall back on "two substance" dualism with all its hopeless defects.
We certainly don't know that it's ruled out - i.e. that it is indeed
the case that all experiential phenomena map directly to neurological
phenomena in a straightforward 3-p way; this is currently merely an
assumption.  If it could be demonstrated robustly this would dispel
my doubts, though not my puzzlement.  But the intriguing empirical
possibility exists that, for example, "consciously seeing" (1-p) and
"visually detecting" (3-p) may act on the world by partially different
paths (i.e. that there is an additional possibility - beyond mechanism
- in the deep structure of things that, moreover, has not been missed
by evolution).

David

> David Nyman wrote:
>>
>> On 17 February 2010 00:16, Brent Meeker <meeke...@dslextreme.com> wrote:
>>
>>
>>>
>>> But suppose we had a really good theory and understanding of the brain so
>>> that we could watch yours in operation on some kind of scope (like an
>>> fMRI,
>>> except in great detail) and from our theory we could infer that "David's
>>> now
>>> thinking X.  And it's going to lead him to next think Y.  And then he'll
>>> remember Z and strengthen this synapse over here.  And..."   Then
>>> wouldn't
>>> you start to regard the 1-p account as just another level of description,
>>> as
>>> when you start your car on a cold day it "wants" a richer fuel mixture and
>>> the ECU "remembers" to keep the idle speed up until it's warm.
>>>
>>
>> In short, yes.  But that doesn't make the problem as I've defined it
>> go away.  At the level of reconciliation you want to invoke, you would
>> have to stop putting scare quotes round the experiential vocabulary,
>> unless your intention - like Dennett's AFAICS - is to deny the
>> existence, and causal relevance, of genuinely experiential qualities
>> (as opposed to "seemings", whatever they might be).  At bottom, 1-p is
>> not a "level of description" - i.e. something accessed *within*
>> consciousness - it *is* the very mode of access itself.
>
> I think "accessed" creates the wrong image - as though there is some "you"
> outside of this process that is "accessing" it.  But I'm not sure that
> vitiates your point.
>
>
>> The trouble
>> comes because in the version you cite the default assumption is that
>> the synapse-strengthening stuff - the 3-p narrative - is sufficient to
>> account for all the observed phenomena - including of course all the
>> 3-p references to experiential qualities and their consequences.
>>
>> But such qualities are entirely non-computable from the 3-p level,
>
> How can you know that?
>
>> so
>> how can such a narrative refer to them?  And indeed, looked at the
>> other way round, given the assumed causal closure of the 3-p level,
>> what further function would be served by such 1-p references?
>
> "Function" in the sense of purpose?  Why should it have one?
>>
>> Now, if
>> we indeed had the robust state of affairs that you describe above,
>> this would be a stunning puzzle, because 1-p and 3-p are manifestly
>> not "identical", nor are they equivalently "levels of description" in
>> any relevant sense. Consequently, we would be faced with a brute
>> reality without any adequate explanation.
>>
>> However, in practice, the theory and observations you characterise are
>> very far from the current state of the art. This leaves scope for some
>> actual future theory and observation to elucidate "interaction"
>> between 1-p and 3-p with real consequences that would be inexplicable
>> in terms of facile "identity" assertions.  For example, that I
>> withdraw my hand from the fire *because* I feel the pain, and this
>> turns out, both in theory and in observation, to be inexplicable in
>> terms of any purely 3-p level of description.  Prima facie, this might
>> seem to lead to an even more problematic interactive dualism, but my
>> suspicion is that there is scope for some genuinely revelatory
>> reconciliation at a more fundamental level - i.e. a truly explanatory
>> identity theory.  But we won't get to that by ignoring the problem.
>>
>
> My intuition is that once we have a really good 3-p theory, 1-p will seem
> like a kind of shorthand way of speaking about brain processes.  That
> doesn't mean your questions will be answered.  It will be like Bertrand
> Russell's neutral monism.  There are events and they can be arranged in 3-p
> relations or in 1-p relations.  Explanations will ultimately be circular -
> but not viciously so.
>
> Brent
>
>> David
>>
>>
>>>
>>> David Nyman wrote:
>>>
>>>>
>>>> On 16 February 2010 22:21, Stathis Papaioannou <stath...@gmail.com>
>>>> wrote:
>>>>
>>>>
>>>>
>>>>>
>>>>> Consciousness could be computable in the sense that if you are the
>>>>> computation, you have the experience.
>>>>>
>>>>>
>>>>
>>>> Yes, but that's precisely not the sense I was referring to.  Rather
>>>> the sense I'm picking out is that neither the existence, nor the
>>>> specifically experiential characteristics, of any 1-p component over
>>>> and above the 3-p level of description is accessible (computable) in
>>>> terms of any such 3-p narrative.  Consequently any reference to such a
>>>> component at the 3-p level seems inexplicable.  This leads some (e.g.
>>>> Dennett, if I've understood him) to try to finesse this by claiming
>>>> that 1-p experience only "seems" to exist - IOW that when 3-me refers
>>>> to 3-my "conscious experience" this is merely a 3-p reference to some
>>>> equivalent computational aspect which is fully sufficient to account
>>>> for all the resultant 3-p phenomena.  The 1-p "seeming" is then
>>>> supposed to be, in some under-defined sense, "identical" to this
>>>> computation.
>>>>
>>>> But for two manifestly distinct levels of description to have any
>> prospect of being seen as "identical", they must be capable of being
>>>> discarded individually, in order to be jointly reconciled in terms of
>>>> a single more fundamental level clearly compatible with both - this is
>>>> the only manoeuvre that could validate any non-question-begging
>>>> ascription of "identity".
>>>>
>>>
>>> But suppose we had a really good theory and understanding of the brain so
>>> that we could watch yours in operation on some kind of scope (like an
>>> fMRI,
>>> except in great detail) and from our theory we could infer that "David's
>>> now
>>> thinking X.  And it's going to lead him to next think Y.  And then he'll
>>> remember Z and strengthen this synapse over here.  And..."   Then
>>> wouldn't
>>> you start to regard the 1-p account as just another level of description,
>>> as
>>> when you start your car on a cold day it "wants" a richer fuel mixture and
>>> the ECU "remembers" to keep the idle speed up until it's warm.
>>>
>>> Brent
>>>
>>>
>>>>
>>>> ISTM that the Dennettian approach is merely
>>>> to *assert* - given the undeniable "seeming" of conscious experience -
>>>> that this *must* be the case, whilst offering no glimmer of what the
>>>> nature of such a transcendent level of reconciliation could possibly
>>>> be.
>>>>
>>>> David
>>>>
>>>>
>>>>
>>>>
>>>>>
>>>>> On 17 February 2010 05:07, David Nyman <david.ny...@gmail.com> wrote:
>>>>>
>>>>>
>>>>>>
>>>>>> This is old hat, but I've been thinking about it on awakening every
>>>>>> morning for the last week.  Is consciousness - i.e. the actual first-
>>>>>> person experience itself - literally uncomputable from any third-
>>>>>> person perspective?  The only rationale for adducing the additional
>>>>>> existence of any 1-p experience in a 3-p world is the raw fact that we
>>>>>> possess it (or "seem" to, according to some).  We can't "compute" the
>>>>>> existence of any 1-p experiential component of a 3-p process on purely
>>>>>> 3-p grounds.  Further, if we believe that 3-p process is a closed and
>>>>>> sufficient explanation for all events, this of course leads to the
>>>>>> uncomfortable conclusion (referred to, for example, by Chalmers in
>>>>>> TCM) that 1-p conscious phenomena (the "raw feels" of sight, sound,
>>>>>> pain, fear and all the rest) are totally irrelevant to what's
>>>>>> happening, including our every thought and action.
>>>>>>
>>>>>> But doesn't this lead to paradox?  For example, how are we able to
>>>>>> refer to these phenomena if they are causally disconnected from our
>>>>>> behaviour - i.e. they are uncomputable (i.e. inaccessible) from the 3-
>>>>>> p perspective?  Citing "identity" doesn't seem to help here - the
>>>>>> issue is how 1-p phenomena could ever emerge as features of our shared
>>>>>> behavioural world (including, of course, talking about them) if they
>>>>>> are forever inaccessible from a causally closed and sufficient 3-p
>>>>>> perspective.  Does this in fact lead to the conclusion that the 3-p
>>>>>> world can't be causally closed to 1-p experience, and that I really do
>>>>>> withdraw my finger from the fire because it hurts, and not just
>>>>>> because C-fibres are firing?  But how?
>>>>>>
>>>>>>
>>>>>
>>>>> Consciousness could be computable in the sense that if you are the
>>>>> computation, you have the experience.
>>>>>
>>>>>
>>>>> --
>>>>> Stathis Papaioannou
>>>>>
>>>>> --
>>>>> You received this message because you are subscribed to the Google
>>>>> Groups
>>>>> "Everything List" group.
>>>>> To post to this group, send email to everything-l...@googlegroups.com.
>>>>> To unsubscribe from this group, send email to
>>>>> everything-list+unsubscr...@googlegroups.com.
>>>>> For more options, visit this group at
>>>>> http://groups.google.com/group/everything-list?hl=en.
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>
>>>
>>>
>>>
>>
>>
>
>
>
