On Saturday, September 22, 2012 11:55:35 AM UTC-4, Bruno Marchal wrote:
>
>
> On 22 Sep 2012, at 17:08, Craig Weinberg wrote:
>
>
>
> On Saturday, September 22, 2012 9:10:30 AM UTC-4, Bruno Marchal wrote:
>>
>>
>> On 21 Sep 2012, at 22:48, Craig Weinberg wrote:
>>
>> Post from my blog:
>>
>> Simple as that, really. From psychological discoveries of the 
>> subconscious and unconscious, to cognitive bias and logical fallacies, to 
>> quasi-religious faith in artificial intelligence, we seem to have a mental 
>> blind spot for emotional realities.
>>
>> What could be more human than making emotional mistakes or having one’s 
>> judgment cloud over because of favoritism or prejudice? Yet when it comes 
>> to assessing the feasibility of a sentient being composed of programmed 
>> functions, we tend to miss entirely this little detail: Personal 
>> preference. Opinion. Bias. It doesn’t bother us that machines completely 
>> lack this dimension and in all cases exhibit nothing but impersonal 
>> computation. This tends to lead the feel-blind intellect to unknowingly 
>> bond to the computer. The consistency of an automaton’s function is 
>> comforting to our cognitive self, who longs to be free of emotional bias, 
>> so much so that it is able to hide that longing from itself and project the 
>> clean lines of perfect consequences outward onto a program.
>>
>> It’s not that machines aren’t biased too - of course they are incredibly 
>> biased toward the most literal interpretations possible, but they are all 
>> biased in the same exact way, so that it seems to us a decent tradeoff. The 
>> rootless consciousness of the prefrontal cortex thinks that is a small 
>> price to pay, and one which will inevitably be mitigated with improvements 
>> in technology. In its crossword puzzle universe of Boolean games, something 
>> like a lack of personhood or feeling is a minor glitch, an aesthetic ‘to be 
>> continued’ which need only be set aside for now while the more important 
>> problems of function can be solved.
>>
>> It seems that the ocean of feelings and dreams which was tapped into by 
>> Freud, Jung, and others in the 20th century has been entirely dismissed in 
>> favor of a more instrumental approach. Simulation of behaviors. Turing 
>> machine emulation. This approach has the fatal flaw of drawing the mind 
>> upside down, with intellect and logic at the base, building up to complex 
>> mimicry of mood and inflection. The mind has an ego and doesn't know it. 
>> Thinking has promoted itself to a cause of feeling and experience rather 
>> than a highly specialized and esoteric elaboration of personhood.
>>
>> We can see this of course in developmental psychology and anthropology. 
>> Babies don’t come out of the womb with a flashing cursor, ready to accept 
>> programming passively. Primitive societies don’t begin with impersonal 
>> state bureaucracies and progress to chiefdoms. We seem to have to learn 
>> this lesson again and again that our humanity is not a product of strategy 
>> and programming, but of authenticity and direct participation.
>>
>> When people talk about building advanced robots and computers which will 
>> be indistinguishable from or far surpass human beings, they always seem to 
>> project a human agenda on them. We define intelligence outside of ourselves 
>> as that which serves a function to us, not to the being itself. This again 
>> suggests to me the reflective quality of the mind, of being blinded by the 
>> reflection of our own eyes in our sunglasses. Thoughts have a hard time 
>> assessing the feeling behind themselves, and an even harder time admitting 
>> that it matters.
>>
>> I think we see this more and more in all areas of our lives - an 
>> overconfidence in theoretical approaches and a continuous disconnect 
>> from the results. We keep hoping that it will work this time, even though 
>> we probably know that it never will. It’s as if our collective psyche is 
>> waiting for our deluded minds to catch up. Waiting for us to figure out 
>> that in spite of the graphs and tests and retooling, the machine is really 
>> not working any better.
>>
>>
>> You are right. We have very often dismissed emotion, feelings, and 
>> consciousness in humans. 
>>
>> Unfortunately, dismissing emotion, feelings, and consciousness in 
>> machines will not help.
>>
>> Bruno
>>
>>
> You don't see a connection between the two? There is no chance of machine 
> feelings being a psychological projection?
>
>
> There is. But as far as we are concerned with the "emotion dismissing" 
> problem, projecting emotion onto them when they behave in some way 
> dismisses emotion less than attributing puppetness to them by decision.
>
>
Why would it be any less dismissive? You just have the opposite problem from 
Chalmers' paper: spontaneously present and advancing qualia. If someone 
writes a program that draws Bugs Bunny, then as that program is improved to 
respond to other drawings of Elmer Fudd and Daffy Duck, and to talk like 
Bugs Bunny, feelings and thoughts would have to begin to appear and 
gradually become more real. Bugs Bunny would have to feel himself and 
his world as the faintest hint of non-zombie, with sudden infusions of 
realism and phenomenology coinciding with each software upgrade.
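To make the reductio concrete, here is a deliberately toy sketch (Python; 
every name in it is hypothetical, not anyone's actual program) of what the 
"qualia arrive with upgrades" position would have to claim:

    # Toy model of the reductio above: if qualia track functional
    # upgrades, each release must carry more "realness" than the last,
    # even though only code has changed between versions.

    class CartoonProgram:
        def __init__(self, version, behaviors):
            self.version = version
            self.behaviors = behaviors  # purely functional capabilities

        def upgrade(self, new_behavior):
            # Each upgrade only appends another programmed behavior.
            return CartoonProgram(self.version + 1,
                                  self.behaviors + [new_behavior])

    v1 = CartoonProgram(1, ["draw Bugs Bunny"])
    v2 = v1.upgrade("respond to drawings of Elmer Fudd and Daffy Duck")
    v3 = v2.upgrade("talk like Bugs Bunny")

    # The view being criticized would need something like
    #   phenomenology(v3) > phenomenology(v2) > phenomenology(v1) > 0,
    # a dial of felt experience that turns up with every patch.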


I'm not opposed to the idea of computers having emotions in theory, but the 
evidence we've seen so far shows precisely the opposite. If inorganic 
machines could grow and change and learn by themselves, then we would 
likely have seen at least a single example of just that. What we see instead 
is that even many brilliant minds working hard with the finest technology 
face a perpetual uphill battle. In spite of Moore's Law and 30 years of 
commercial explosion, there is still no sign of any authentic feeling or 
intentional act by a program. 



> Because the shadows of those experiences, which exist (epistemologically) 
> in the comp theory, are still confined to complex mathematical theorems. 
> But PA thinks like you and me, I think. 
>

That's what I'm writing about...we *don't* think like we think we think. 
Our thoughts bubble up from emotion and sensation, desire and personhood. 
PA thinks like we think we think - skimmed off the top of the prefrontal 
cortex qualia of logical abstraction. It's the tip of the pyramid thinking 
that it's made up of smaller pyramidal peaks, but it isn't - it's made of 
sub-personal bricks of trans-rational, non-mereological qualia.
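(For reference, since "PA" is carrying a lot of weight here: first-order 
Peano Arithmetic is nothing but classical logic closed over a short list of 
axioms. The standard presentation is:

    \forall n\; S(n) \neq 0
    \forall m\,\forall n\; (S(m) = S(n) \rightarrow m = n)
    \forall n\; n + 0 = n, \qquad \forall m\,\forall n\; m + S(n) = S(m + n)
    \forall n\; n \cdot 0 = 0, \qquad \forall m\,\forall n\; m \cdot S(n) = m \cdot n + n
    (\varphi(0) \land \forall n\,(\varphi(n) \rightarrow \varphi(S(n)))) \rightarrow \forall n\,\varphi(n)
        \quad \text{(induction schema, one instance per formula } \varphi\text{)}

That is the entirety of what "thinks" in Bruno's sense - which is exactly 
the skimmed-off logical abstraction I mean.)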


> From my perspective you are just not listening to such machines, and for 
> what seems to me an arbitrary reason you have just decided that they are 
> zombies, when I think they just lack our kind of long, rich story; they 
> don't lack a soul. You could look at a baby and decide that it is 
> completely stupid, *at first sight*.
>
> Give them time. You can't compare machinery shaped by millions of years of 
> evolution with the hundred thousand years of machine evolution, or the one 
> century of the universal machine.

 
I understand why you think that, but my view isn't arbitrary at all. For 20 
years I had your view. I saw 'patterns' as being the universal primitive. 
Babies cry, but machines never do. Babies have personal needs, but machines 
are only there to service our needs impersonally. I agree that the 
difference is only one of the richness of history, but I think that you are 
arbitrarily assuming that that history doesn't extend to the material 
substrate itself. I am saying that matter *is* history, and that it is the 
only vehicle of history within spacetime. I have no prejudice against 
non-human intelligence or non-biological experience at all; I just think 
that you dishonor the last billion years of biology by trying to skip right 
from inorganic mechanism to anthropomorphic psychology. You wind up with a 
puppet (not a zombie, since I don't have any expectation of 
biological-level feeling and experience in the first place). This isn't a 
person with no soul, it is an assembly of molecular souls that has been 
configured to impersonate human behavior - just like a cartoon or a puppet.

What we see is exactly what I would expect from a fundamentally flawed 
assumption being dragged out - like Ptolemaic astronomy...it just isn't 
working out because we aren't approaching it the right way. We are trying 
to build a house on top of a floating roof.

Craig


