On Monday, February 25, 2013 12:53:47 PM UTC-5, Bruno Marchal wrote:
>
>
> On 25 Feb 2013, at 01:41, Craig Weinberg wrote:
>
> You'll forgive me if I don't jump at the chance to shell out $51.96 for 
> 300+ pages of the same warmed-over cog-sci behaviorism-cum-functionalism 
> that I have been hearing from everyone.
>
>
> By making explicit the level of digital substitution, functionalism is 
> made less trivial than in many accounts that you can find. And comp is a 
> stronger hypothesis than behavioral mechanism (agnostic on zombies).
>

To me it's like arguing Episcopalian vs Presbyterian. Sure, there are 
differences, but the problem I have is that they all approach consciousness 
from the outside in while failing to recognize that the idea of there being 
an exterior to consciousness is only something which we have come to expect 
through consciousness itself.


>
> The preview includes a couple of pages that tell me all that I need to 
> know: (p.22) 
>
> 'Building in self-representation and value, with the goal of constructing 
> a system that could have feelings, will result in a robot that also has the 
> capacity for emotions and complex social behavior.'
>
>
> I can agree with you, in the sense that I don't believe we can emulate 
> emotion for sure. Well, we should see the algorithm to decide. If emotion 
> comes from the use of diverse explorations made from the data, they might 
> be correct, but loose in their way of presenting what they have done.
>

What prevents you from believing that we might not be able to emulate 
emotion? I have no sentimental reason for believing that, but am guided 
more by the observation that interaction with machines leaves me and most 
others with the distinct impression of dealing with an impersonal 
presentation. It would seem that if emotions were harder to produce than 
logic, the cortex would be our brain stem, and the limbic system would be 
something that only higher primates have.
 

>
> No, it won't. And a simulation of water won't make plants grow.
>
>
> OK, but you might just mix levels, and so be trivially correct. A 
> simulation of water, made at some level, can make a simulation of a growing 
> plant, at some level of description. 
>

But it isn't necessary to simulate water to make a simulated plant appear 
to grow. You can just simulate the growth directly, without cause, or with 
whatever cause you choose, even if it is invisible and arithmetic.
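
To make that concrete, here is a minimal sketch (in Python, with invented 
names; purely illustrative, not anyone's actual model) of a simulated plant 
whose growth is decreed directly, with no water model anywhere in the 
program:

    # Illustrative only: the "cause" of growth is whatever rule we choose
    # to write in; nothing resembling water appears in the simulation.
    class SimulatedPlant:
        def __init__(self):
            self.height_cm = 0.0

        def step(self):
            # Growth by fiat: no simulated water is ever consulted.
            self.height_cm += 1.5

    plant = SimulatedPlant()
    for _ in range(10):
        plant.step()
    print(plant.height_cm)  # 15.0 -- the plant "grew" without any water

The growth rule could just as easily be driven by an invisible counter or 
any arithmetical function; the simulation behaves identically either way.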

But let's talk about levels. What is it about the bottom level - the one 
which keeps simulated water from watering 'real' plants - that makes it 
different from the other levels, in which the same simulated water can be 
used? Why is it that no simulated presence can interface with our bodies 
unless it is passed through a physical mechanism, but no such mechanism is 
required in virtual environments?
 

> And that plant can be smelt by a person supported by a simulation, at some 
> correct level. 
> If that is not possible, it means that consciousness requires some 
> infinite machinery, whose infinities are not recoverable through first 
> person indeterminacy (and thus require something different from a quantum 
> computer, for example). That would make you and Penrose correct. But we 
> are still waiting to hear which process you have in mind, as such infinite 
> machinery, not quantum emulable, remains speculation. Penrose does 
> speculate on a collapse of the wave related to a quantum theory of 
> gravitation. Well, you need such speculation if you want to make comp 
> false. 
>

There doesn't need to be infinite machinery if we assume a sense totality 
from the start. Machines are only necessary to manipulate isolated forms in 
space and transitive functions through time, but if sense pre-figures 
spacetime, then the machine becomes a second-order construction. Machines 
are orthogonal to the absolute orientation of nature (experiences through 
time) in that they are derived from functional cliches which cut across 
nature horizontally. A machine simulates a snapshot of the tree at a 
certain moment in time, but it misses the longitudinal flow from acorn to 
forest. It's that superficiality that will always make comp fall apart. 
Contrary to our expectations, the more facades that are constructed, the 
more the rootless qualities are subtly exposed, and the more the AI becomes 
inconsistent - flashing apparent brilliance one moment, obvious 
cluelessness the next. The lack of personhood becomes more uncanny and 
difficult to put your finger on.

Craig


> Bruno
>
> Craig
>
> On Sunday, February 24, 2013 1:17:53 AM UTC-5, Brent wrote:
>>
>> Here's a book Craig should read: 
>>
>> Jean-Marc Fellous and Michael A. Arbib (2005). Who Needs Emotions? The 
>> Brain Meets the Robot 
>> <http://books.google.com/books?id=TvDi5V03b4IC&printsec=frontcover&client=safari&sig=ACfU3U1mSbp_Rp0SNLeLbdi3RHMox3Iagw>
>>
>> Here's the table of contents.
>>
>> Or at least he should write to the authors and tell them they are wasting 
>> their time and explain to them why robots cannot have emotions.  They are 
>> apparently unaware of his definitive wisdom on the question.
>>
>> Brent
>>  
>
> http://iridia.ulb.ac.be/~marchal/


