On 7/8/2014 1:36 PM, LizR wrote:
> On 9 July 2014 07:23, meekerdb <meeke...@verizon.net
> <mailto:meeke...@verizon.net>> wrote:
>> But would it still be consciousness if there were no world that provided
>> referents for the program?
> It's hard to imagine how you could do this /without/ a world to supply
> referents. Even if the world being simulated in the MGA was some abstract
> virtual reality based on, say, music a la Olaf Stapledon, the referents used
> to create it would still have to come from a world. In practice, I don't think
> anyone can imagine a world which doesn't relate in some way to the one they
> (apparently) exist in. Even Alan E Nourse's "Universe Between" or Andre
> Norton's geometric parallel world (I forget in which story) were of course
> based on the one they lived in.
> However, even if the MGA's apparent reality /could/ be entirely without
> referents to a world, the programme in the MGA would still be conscious, by
> hypothesis (assuming comp, of course, but the MGA assumes comp). It would be
> conscious of an entirely imaginary world. I suppose that is kind of similar to
> dreaming or tripping, although dreams and trips still refer to the outside
> world, at least in a surreal or oblique manner.
> The fact that the only way to do the experiment in practice is to use a world
> to supply referents doesn't mean the world involved has to be primary, of
> course. It could still emerge from infinite computations or whatever.

Sure, but would it supply a unique association between a computation and the emergent
referents?
> The argument about consciousness needing a world to relate to /at the time
> it's relating/ is interesting, but has always struck me as requiring some
> weird metaphysical extras.

Or metacomputational. If you suppose that no referents are needed, then it's hard to say
what the computation is *about*. If you suppose that they could be referents to a made-up
world, then it seems that there could be arbitrarily many different worlds providing
referents; like the paradox of the rock that computes everything. If you suppose that the
computation only has meaning by reference to the world in which it is implemented, then
you've denied that the computation simpliciter is implementing consciousness; it's only
conscious in virtue of being in the world of its referents - or, to resort to my example
of a Mars Rover, only by being able to act on and be acted upon by this world.
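The rock paradox can be made concrete. Given any sequence of distinguishable
physical states, one can always construct, after the fact, a mapping from those
states onto the trace of any computation whatsoever - so "implementing a
computation", taken without referents, appears vacuous. A minimal sketch (toy
Python, purely illustrative; the "rock microstates" and both computations are
invented for the example):

```python
# The "rock that computes everything" worry, made concrete: any sequence of
# distinct physical states can be mapped, post hoc, onto the state trace of
# ANY computation. The mapping is just a dictionary, so the rock "implements"
# counting under one mapping and parity under another.

def trace_of_counter(steps):
    """Trace of a trivial computation: a counter incrementing from 0."""
    return [("count", n) for n in range(steps)]

def trace_of_parity(bits):
    """Trace of a different computation: running parity of a bit string."""
    state, out = 0, []
    for b in bits:
        state ^= b
        out.append(("parity", state))
    return out

# "Physical" states of the rock: any distinguishable sequence will do.
rock_states = [f"rock-microstate-{i}" for i in range(4)]

# Post-hoc mappings from rock states onto two unrelated computations.
map_to_counter = dict(zip(rock_states, trace_of_counter(4)))
map_to_parity = dict(zip(rock_states, trace_of_parity([1, 0, 1, 1])))

# Under one mapping the rock "computes" counting; under the other, parity.
print([map_to_counter[s] for s in rock_states])
print([map_to_parity[s] for s in rock_states])
```

Nothing in the rock itself privileges one mapping over the other; only the
referents supplied from outside do.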
> I can't see how a conscious digital computer programme, started in the same
> state as the first time around and having the same inputs replayed to it as
> before, would not be just as conscious as it was the first time around. Either
> it's as conscious as it was before, OR it wasn't conscious the first time, OR
> there is some weird supernatural stuff going on that somehow makes a
> difference. (This isn't the point at which I have problems with the MGA.)

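The replay premise rests on an uncontroversial technical fact: a deterministic
program restarted in the same initial state and fed the same recorded inputs
retraces exactly the same sequence of states. A minimal sketch (a made-up toy
machine with an arbitrary update rule, not anyone's actual experiment):

```python
# A toy deterministic machine: the next state depends only on the current
# state and the next input. Replaying recorded inputs from the same initial
# state necessarily reproduces the identical state trace.

def run(initial_state, inputs):
    state, trace = initial_state, []
    for x in inputs:
        state = (state * 31 + x) % 1000  # any fixed update rule will do
        trace.append(state)
    return trace

recorded_inputs = [7, 2, 9, 4]
first_run = run(0, recorded_inputs)
replay = run(0, recorded_inputs)  # same initial state, same inputs
print(first_run == replay)  # prints True: the traces cannot differ
```

Whatever one says about consciousness, the two runs are computationally
indistinguishable; the disagreement can only be about what, beyond the trace,
consciousness requires.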
And you're irritated because I don't take a definite position. ;-)
Brent
--
You received this message because you are subscribed to the Google Groups
"Everything List" group.