On Monday, October 15, 2018 at 10:49:44 PM UTC-5, Brent wrote:
>
> On 10/15/2018 5:06 PM, Philip Thrift wrote:
>
> On Monday, October 15, 2018 at 6:45:11 PM UTC-5, Brent wrote: 
>>
>> On 10/14/2018 11:13 PM, Philip Thrift wrote:
>>
>> On Sunday, October 14, 2018 at 9:53:07 PM UTC-5, Brent wrote: 
>>>
>>> On 10/14/2018 2:48 PM, John Clark wrote:
>>>
>>>> *And there are sound reasons for doubting the consciousness of
>>>> computers -*
>>>
>>> Name one of them that could not also be used to doubt the consciousness 
>>> of your fellow human beings.
>>>
>>> The reason for not doubting that other human beings are conscious is 
>>> that (1) I am conscious and (2) other human beings are made of the same 
>>> stuff in approximately the same way that I am and (3) they behave the same 
>>> way in relation to what I am conscious of, e.g. they jump at a sudden loud 
>>> sound.
>>>
>>> Brent
>>>
>>
>> The thought crossed my mind yesterday: I was helping a young man with 
>> his application to a Ph.D. program in chemical engineering, and we were 
>> talking about chemistry and consciousness*. I mentioned a type of 
>> zombie - a being that could converse (like an advanced Google Assistant 
>> or the Sophia robot) but not be conscious - and I thought it was 
>> *possible* he was a zombie.
>>
>> * cf. 
>> *Experience processing*
>> https://codicalist.wordpress.com/2018/10/14/experience-processing/
>>
>> I suppose you've read Scott Aaronson's takedown of Tononi's theory.  So 
>> I wonder why you would reference Tononi.
>>
>> One problem with the "experience is primary" theory is that there's no 
>> way for it to evolve.  If it's a property of matter, why are organized, 
>> information-processing lumps of matter more capable of experience than 
>> unorganized lumps of the same composition?  The obvious answer is that the 
>> former process information, and processing information is something natural 
>> selection can work on.  Smart animals reproduce better.  Animals with 
>> experiences...who cares?
>>
>> "Emotional-BDI agents are BDI agents whose behavior is guided not only by 
>> beliefs, desires and intentions, but also by the role of emotions in 
>> reasoning and decision-making."  This makes the false assumption that 
>> emotions are something independent of beliefs, desires, intentions, 
>> reasoning, and decision-making.  And it says nothing about the 
>> satisfaction and thwarting of desires and intentions.  Aren't those enough 
>> to explain emotions?  I agree that emotions are necessary for reasoning in 
>> the sense that emotions are the value-weights given to events, including 
>> those imagined by foresight, and that some values are primitive.  
>>
>> I think it is false that "Purely informational processing, which includes 
>> intentional agent programming (learning from experience, self-modeling), 
>> does not capture all true experiential processing (phenomenal 
>> consciousness)."  It is a cheat to put in "purely".  In fact all learning 
>> and intentional planning must include weighing alternatives and assigning 
>> value/emotion to them.  I don't see any need for a further primitive 
>> modality.  For example, a feeling of dizziness is a failure to maintain 
>> personal spatial orientation, which is a value held at a very low 
>> (subconscious) level.  Sure there are feelings and emotions...but I think 
>> they are all derivative from more primitive values that are derivative 
>> from evolution.
>>
>> I think the reason you are attracted to this idea is that it is closed 
>> within the computer/information/program frame.  And that is why I use the 
>> example of the AI Mars Rover.  Sure, emotion cannot be derived within a 
>> computer.  Emotion is something useful to a robot, an AI that works and 
>> strives within an external world which also acts on it.
>>
>> Brent
>>
>
> I'll include the reference to
>
>    *Why I Am Not An Integrated Information Theorist (or, The Unconscious 
> Expander)*
>    https://www.scottaaronson.com/blog/?p=1799
>
> Since I say that IIT is still in the *information-oriented paradigm* and 
> not in the *experience-oriented paradigm*, Aaronson's post helps my case.
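
For readers without the post at hand, a rough LaTeX schematic of the Phi 
that Aaronson attacks (hedged: the exact effective-information measure EI 
and the normalization N(P) vary across IIT versions):

% Rough IIT-2.0-style schematic (formulations differ across versions):
% \Phi is the effective information EI across the "minimum information
% bipartition" -- the cut that disconnects the system X most cheaply.
\Phi(X) = EI\bigl(X \to X^{\mathrm{MIB}}\bigr),
\qquad
X^{\mathrm{MIB}} = \arg\min_{P \in \mathrm{bipartitions}(X)} \frac{EI(X \to P)}{N(P)}

Every quantity on the right-hand side is information-theoretic, which is 
exactly why IIT stays inside the information-oriented paradigm.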
>
> I don't quite follow the rest. A person may feel pleasure (a modality) 
> without reasoning "I need to feel pleasure now."
>
> It's not a question of what it's possible to feel, but whether it can 
> be accounted for by information processing.  "I feel pleasure now" may 
> well be a function of specific perceptions, values, and reasoning about 
> them.  I don't see any attempt to prove this cannot be the case.  It seems 
> that helping yourself to a primitive "experience" built into matter is just 
> baseless speculation unless you have some project to measure or 
> characterize this experience and show how it interacts with 
> information...because we certainly know that information can give pain or 
> pleasure.
>
> Brent
>

Information can give pain or pleasure, like watching Fox News vs. a cuddly 
nature show.

But here is what I take to be the thesis of the experience-oriented (vs. 
information-oriented) paradigm*: *Experience cannot be represented: it does 
not exist outside of its material instantiation.*


* and why a biocomputer is a new kind of computer

- pt
