Stathis Papaioannou wrote:
> Brent Meeker writes:
>
>
>>Stathis Papaioannou wrote:
>>
>>>Peter Jones writes:
>>>
>>>
>>>
>>>>Stathis Papaioannou wrote:
>>>>
>>>>
>>>>
>>>>>Like Bruno, I am not claiming that this is definitely the case, just that
>>>>>it is the case if computationalism is true. Several philosophers (e.g.
>>>>>Searle) have used the self-evident absurdity of the idea as an argument
>>>>>demonstrating that computationalism is false - that there is something
>>>>>non-computational about brains and consciousness. I have not yet heard an
>>>>>argument that rejects this idea and saves computationalism.
>>>>
>>>>[ rolls up sleeves ]
>>>>
>>>>The idea is easily refuted if it can be shown that computation doesn't
>>>>require interpretation at all. It can also be refuted more circuitously by
>>>>showing that computation is not entirely a matter of interpretation. In
>>>>Everythingism, everything is equal. If some computations (the ones that
>>>>don't depend on interpretation) are "more equal than others", the way is
>>>>still open for the Somethingist to object that interpretation-independent
>>>>computations are really real, and the others are mere possibilities.
>>>>
>>>>The claim has been made that computation is "not much use" without an
>>>>interpretation. Well, if you define a computer as something that is used
>>>>by a human, that is true. It is also very problematic for the
>>>>computationalist claim that the human mind is a computer. Is the human
>>>>mind of use to a human? Well, yes, it helps us stay alive in various ways.
>>>>But that has more to do with reacting to a real-time environment than with
>>>>performing abstract symbolic manipulations or elaborate
>>>>re-interpretations. (Computationalists need to be careful about how they
>>>>define "computer". Under some perfectly reasonable definitions -- for
>>>>instance, defining a computer as a human invention -- computationalism is
>>>>trivially false).
>>>
>>>
>>>I don't mean anything controversial (I think) when I refer to
>>>interpretation of computation. Take a mercury thermometer: it would still
>>>do its thing if all sentient life in the universe died out, or even if
>>>there were no sentient life to build it in the first place and by amazing
>>>luck mercury and glass had come together in just the right configuration.
>>>But if there were someone around to observe it and understand it, or if it
>>>were attached to a thermostat and heater, the thermometer would have extra
>>>meaning - the same thermometer, doing the same thermometer stuff. Now, if
>>>thermometers were conscious, then part of their "thermometer stuff" might
>>>include "knowing" what the temperature was - all by themselves, without
>>>benefit of external observer.
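>>>
>>>To make the analogy concrete, here is a toy sketch (Python, with invented
>>>names - just an illustration, not anyone's actual design) of the very same
>>>reading being merely observed versus driving a thermostat:
>>>
>>>    # Toy sketch: all names are invented for illustration only.
>>>    # The same reading acquires different "meaning" depending on what it
>>>    # is hooked up to - an observer, or a thermostat loop.
>>>    def read_temperature_celsius():
>>>        return 18.5  # stand-in for the height of the mercury column
>>>
>>>    def observe():
>>>        # An external observer merely notes the reading.
>>>        print("Observed temperature:", read_temperature_celsius())
>>>
>>>    def thermostat_step(setpoint=20.0):
>>>        # Attached to a heater, the same reading now controls something.
>>>        return read_temperature_celsius() < setpoint  # True = heater on
>>>
>>>    observe()
>>>    print("Heater on?", thermostat_step())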
>>
>>We should ask ourselves: how do we know the thermometer isn't conscious of
>>the temperature? It seems that the answer has been that its state or
>>activity *could* be interpreted in many ways other than indicating the
>>temperature; therefore it must be said to be unconscious of the
>>temperature, or we must allow that it implements all conscious thought (or
>>at least all for which there is a possible interpretative mapping). But I
>>see its state and activity as relative to our shared environment, and this
>>greatly constrains what it can be said to "compute", e.g. the temperature,
>>the expansion coefficient of Hg... With this constraint, I think there is
>>no problem in saying the thermometer is conscious at the extremely low
>>level of being aware of the temperature or the expansion coefficient of Hg
>>or whatever else is within the constraint.
>
>
> I would basically agree with that. Consciousness would probably have to be
> a continuum if computationalism is true. Even if computationalism were
> false and only those machines specially blessed by God were conscious,
> there would have to be a continuum, across different species and within
> the lifespan of an individual from birth to death. The possibility that
> consciousness comes on like a light at some point in your life, or at some
> point in the evolution of a species, seems unlikely to me.
>
>
>>>Furthermore, if thermometers were conscious, they might be dreaming of
>>>temperatures, or contemplating the meaning of consciousness, again in the
>>>absence of external observers, and this time in the absence of interaction
>>>with the real world.
>>>
>>>This, then, is the difference between a computation and a conscious
>>>computation. If a computation is unconscious, it can only have
>>>meaning/use/interpretation in the eyes of a beholder or in its interaction
>>>with the environment.
>>
>>But this is a useless definition of the difference. To apply it we have to
>>know whether some putative conscious computation has meaning to itself,
>>which we can only know by knowing whether it is conscious or not. It makes
>>consciousness ineffable and so makes the question of whether
>>computationalism is true an insoluble mystery.
>
>
> That's what I have in mind.
>
>
>>Even worse, it makes it impossible for us to know whether we're talking
>>about the same thing when we use the word "consciousness".
>
>
> I know what I mean, and you probably know what I mean because, despite our
> disagreements, our minds are not all that different from each other. Of
> course, I can't be absolutely sure that you are conscious, but it's one of
> those things even philosophers only worry about when at work. On the other
> hand, there is serious debate about whether dogs are conscious, or whether
> fish feel pain.
>
>
>>>If a computation is conscious, it may have meaning/use/interpretation in
>>>interacting with its environment, including other conscious beings, and
>>>for obvious reasons all the conscious computations we encounter will fall
>>>into that category; but a conscious computation can also have meaning all
>>>by itself, to itself.
>>
>>I think this is implicitly circular. Consciousness supplies meaning through
>>interpretation. But meaning is defined only as what consciousness supplies.
>>
>>It is to break this circularity that I invoke the role of the environment.
>>Certainly for language, it is our shared environment that makes it possible
>>to assign meaning to words.
>
>
> I don't see a fundamental problem with using this sort of definition. A
> self-expanding archive is an archive which opens itself without need of
> external programs. "Expansion" and "awareness" are two possible functions
> that programs can have, and they can operate either on themselves or on
> aspects of their environment.
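>
> As a toy illustration of what I mean (Python, with an invented payload -
> not a real archive format), a program that carries its own data and
> "expands" it without help from any external program:
>
>     # Toy sketch: names and payload are invented for illustration.
>     import base64
>
>     # The "archive": data the program carries around with itself.
>     PAYLOAD = base64.b64encode(b"the archive's contents")
>
>     def expand():
>         # The program operates on itself: it decodes its own payload.
>         return base64.b64decode(PAYLOAD)
>
>     print(expand().decode())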
>
>
>>>You might argue, as Brent Meeker has, that a conscious being would quickly
>>>lose consciousness if environmental interaction were cut off, but I think
>>>that is just a contingent fact about brains, and in any case, as Bruno
>>>Marchal has pointed out, you only need a nanosecond of consciousness to
>>>prove the point.
>>
>>I don't think any human can experience consciousness for a nanosecond. I
>>see that as part of the lesson of Libet's and Grey Walter's experiments.
>>Consciousness has a time-scale on the order of tenths of a second. But
>>that's only a quibble.
>>
>>The real importance of the environment is not that it keeps our brains
>>from falling into "do loops", but that it makes interpretation or meaning
>>possible.
>
>
> Why can't you have meaning during a loop? We might be doomed to repeat
> this discussion forever as part of some Nietzschean eternal return, but it
> will be just as meaningful to us (or not) every time.
>
>
>>>>It is of course true that the output of a programme intended to do one
>>>>thing ("system S", say) could be re-interpreted as something else. But
>>>>what does it *mean*? If computationalism is true, whoever or whatever is
>>>>doing the interpreting is another computational process. So the ultimate
>>>>result is formed by system S in conjunction with another system. System S
>>>>is merely acting as a subroutine. The Everythingist's intended conclusion
>>>>is that every physical system implements every computation.
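>>>>
>>>>A toy sketch (Python, with invented names - just to pin the point down):
>>>>the re-interpreted result is computed by S together with the interpreter,
>>>>not by S alone.
>>>>
>>>>    def system_s(x):
>>>>        # Intended to compute one thing: the square of x.
>>>>        return x * x
>>>>
>>>>    def reinterpret(y):
>>>>        # Another computational process reads S's output as something
>>>>        # else, e.g. as a parity bit.
>>>>        return y % 2
>>>>
>>>>    # The "something else" is computed by S *in conjunction with* the
>>>>    # interpreter; S alone merely acts as a subroutine.
>>>>    print(reinterpret(system_s(3)))  # prints 1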
>>>
>>>
>>>That's what I'm saying, but I certainly don't think everyone on the list
>>>agrees with me, and I'm not completely decided as to which of the three is
>>>most absurd: every physical system implements every conscious computation,
>>>no physical system implements any conscious computation (they are all
>>>implemented non-physically in Platonia), or the idea that a computation
>>>can be conscious in the first place.
>>
>>Why not reject the first two and accept that computations, to the degree
>>they have certain structures, are conscious, i.e. self-interpreting
>>relative to their environment?
>>
>>
>>>>But the evidence -- the re-interpretation scenario -- only supports the
>>>>idea that any computational system could become part of a larger system
>>>>that is doing something else. System S cannot be said to be
>>>>simultaneously performing every possible computation *itself*. The
>>>>multiple-computation -- i.e. multiple-interpretation -- scenario is
>>>>dependent on an interpreter. Having made computation dependent on
>>>>interpretation, we cannot then regard the interpreter as redundant, so
>>>>that it is all being done by the system itself. (Of course, to fulfil the
>>>>"every" in "every possible interpretation" you need not just an
>>>>interpreter but every possible interpreter, but that is another problem
>>>>for another day.)
>>>
>>>
>>>Only the unconscious thermometers require external thermometer-readers.
>>
>>Require for what?
>
>
> They don't actually *require* them; they are what they are regardless of
> the outside world. However, something can only have meaning, or taste, or
> beauty in the eye of the beholder, and sometimes the beholder is the
> object as well as the subject.
>
> Stathis Papaioannou
Aye, there's the rub. What's a beholder? It doesn't behold itself beholding,
and it doesn't behold much of anything - but I might say it does "behold"
the temperature or the expansion of Hg or some such.
Brent Meeker