On 4/1/2015 8:34 PM, Russell Standish wrote:
On Thu, Apr 02, 2015 at 02:48:47AM +0200, Platonist Guitar Cowboy wrote:
I still don't see what the MGA "pumps intuitively and incorrectly", as you
seem to assume that the MGA is a "bad" intuition pump rather than a "good"
one that facilitates seeing something tricky. You've not shown that
consciousness supervenes on broken gates, you don't treat movies like
conscious entities, and you haven't pointed towards a recording that is
obviously or demonstrably conscious.

It is one thing to argue intuitively that playing Casablanca does not
instantiate Humphrey Bogart's consciousness. That I would happily agree
with. It only involves a few hundred kilobytes per second. It is another
thing to argue that a precise recording of the firings of every neuron in
someone's brain similarly doesn't instantiate consciousness (at around
10^11 neurons per typical human brain, each sampled at sub-millisecond
resolution with a few bytes of state, this would be something of the
order of 10^16 bytes per second). This is the sort of recording being
used in Maudlin's thought experiment and the MGA. And obviously,
according to COMP, a huge lookup table encoding the machine's output for
every possible input, for a machine implementing a conscious moment, must
be conscious; yet such a table is just another type of recording, albeit
a very complex one that would exceed the Seth Lloyd bound on the
computational capacity of the universe. Note that this latter type of
device was used in Searle's Chinese Room argument, and I think it needs
to be answered the same way Dennett answers the Chinese Room argument.
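
To make concrete the sense in which such a lookup table is "just another
type of recording", here is a minimal, purely illustrative Python sketch.
The 8-bit "machine" and all names are hypothetical stand-ins; a
brain-scale version would of course need vastly more entries than Seth
Lloyd's bound permits.

    from itertools import product

    # Any machine with a finite input space can be replaced by a
    # precomputed table of its outputs over every possible input.

    def machine(bits):
        """Stand-in for the machine implementing a 'conscious moment'."""
        return sum(bits) % 2  # toy behaviour: parity of the input

    # Build the lookup table by enumerating every possible input once.
    table = {bits: machine(bits) for bits in product((0, 1), repeat=8)}

    def machine_as_recording(bits):
        """Identical input/output behaviour, but pure retrieval: none of
        the original computation is performed at lookup time."""
        return table[bits]

    assert all(machine(b) == machine_as_recording(b)
               for b in product((0, 1), repeat=8))

At lookup time the device is pure playback, behaviourally
indistinguishable from the machine it replaces, which is exactly why
intuitions about it pull in both directions.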

At some point on the complexity scale, recordings go from not being
conscious to being conscious. Where do you draw the line? I'm afraid
intuition does not help much in this matter, which is why I say it is
a weakness of the MGA.

There must be something more to it than just complexity or even Turing universality. Bruno says human-like consciousness requires Lobianity. But I think that's asking for more than just awareness; it's asking for self-awareness. If I were building a Mars rover and gave it the ability to learn from its experience by reviewing its memory of events and projecting hypothetical futures, I would be concerned that I had created a sentient being that would foresee its own end. So I would be sure to avoid putting its indefinite survival into its value system.
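
For what it's worth, that design precaution can be stated concretely.
Here is a minimal, purely hypothetical Python sketch (all names and the
scoring are invented for illustration): an agent that learns by replaying
its memory and projecting hypothetical futures, but whose value function
deliberately contains no term for its own survival.

    # Hypothetical sketch only: a rover that reviews its memory and
    # projects hypothetical futures, with no self-preservation term
    # anywhere in its value system.

    def value(state):
        """Score a (possibly hypothetical) future state: mission progress
        counts; the rover's continued existence deliberately does not."""
        return state["samples_analysed"] + 10 * state["sites_mapped"]
        # Note: no term such as state["rover_still_operating"].

    def choose_action(memory, candidate_actions, project):
        """Replay past experience and pick the action whose projected
        hypothetical future scores highest under value()."""
        return max(candidate_actions,
                   key=lambda action: value(project(memory, action)))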

Brent
