On 9/24/2013 8:44 PM, LizR wrote:
On 25 September 2013 15:41, meekerdb <[email protected]> wrote:
> On 9/24/2013 6:32 PM, LizR wrote:
>> On 25 September 2013 13:38, Russell Standish <[email protected]> wrote:
>>> This is also true of materialism. Whether you think this is a problem or not depends on whether you think the "hard problem" is a problem or not.
>>
>> Indeed. I was about to say something similar (to the effect that it's hard to imagine how "mere atoms" can have sights, sounds, smells etc. either).
>
> As a rule, if you want to explain X you need to start from something without X.

Absolutely. If you know of such an explanation, or even the outlines of one, I'd be interested to hear it. As Russell said, this is the so-called "hard problem", so any light (or sound, touch etc.) on it would be welcome.
My 'solution' to the hard problem is to prognosticate that when we have built intelligent robots we will have learned the significance of having an internal narrative memory. We will have learned what emotions and feelings are at the level of sensors, computation, and action. And when we have done that, 'the hard problem' will be seen to have been an idle question - just as "What is life?" proved to be in the 20th century.
Brent

--
You received this message because you are subscribed to the Google Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/groups/opt_out.

