On 25 Feb 2013, at 01:41, Craig Weinberg wrote:

You'll forgive me if I don't jump at the chance to shell out $51.96 for 300+ pages of the same warmed-over cog-sci behaviorism-cum-functionalism that I have been hearing from everyone.

By making explicit the level of digital substitution, functionalism is made less trivial than in many accounts you can find. And comp is a stronger hypothesis than behavioral mechanism (which stays agnostic on zombies).





The preview includes a couple of pages that tell me all that I need to know: (p.22)

'Building in self-representation and value, with the goal of constructing a system that could have feelings, will result in a robot that also has the capacity for emotions and complex social behavior.'

I can agree with you, in the sense that I don't believe we can be sure to emulate emotion. Well, we should see the algorithm to decide. If emotion comes from the use of diverse explorations made from the data, they might be correct, but loose in the way they present what they have done.





No, it won't. And a simulation of water won't make plants grow.

OK, but you might just mix levels, and so be trivially correct. A simulation of water, made at some level, can make a simulation of a plant grow, at some level of description. And that plant can be smelt by a person supported by a simulation, at some correct level. If that is not possible, it means that consciousness requires some infinite machinery, whose infinities are not recoverable by first-person indeterminacy (and which thus requires something different from a quantum computer, for example). That would make you and Penrose correct. But we are still waiting to hear which process you have in mind, as such infinite machinery, not quantum emulable, remains speculation. Penrose does speculate on a collapse of the wave function related to a quantum theory of gravitation. Well, you need such a speculation if you want to make comp false.

Bruno




Craig



On Sunday, February 24, 2013 1:17:53 AM UTC-5, Brent wrote:
Here's a book Craig should read

Jean-Marc Fellous and Michael A. Arbib (2005). Who Needs Emotions? The Brain Meets the Robot

Here's the table of contents.




Or at least he should write to the authors and tell them they are wasting their time and explain to them why robots cannot have emotions. They are apparently unaware of his definitive wisdom on the question.

Brent




http://iridia.ulb.ac.be/~marchal/




