On Saturday, October 13, 2018 at 6:51:53 PM UTC-5, Brent wrote:
>
> On 10/13/2018 3:05 AM, Philip Thrift wrote:
>
> On Saturday, October 13, 2018 at 12:37:22 AM UTC-5, Brent wrote: 
>>
>> On 10/12/2018 9:06 PM, Philip Thrift wrote:
>>
>> On Friday, October 12, 2018 at 4:09:03 PM UTC-5, Brent wrote: 
>>>
>>> On 10/10/2018 4:12 PM, Pierz wrote:
>>>
>>> It's not intelligent behaviour. There are tons of things (human 
>>> artifacts that have been created to automate certain complex input-output 
>>> systems) that exhibit complex, intelligent-ish behaviour that I seriously 
>>> doubt have any more sentience than a rock, though I'm open to the 
>>> possibility of some sentience in rocks. My "method of determining if 
>>> something is conscious" is the same as most people who don't believe their 
>>> smart phones are having experiences. It's being a biological organism with 
>>> a nervous system, though again, I'm agnostic on organisms like trees. When 
>>> *you're* not being a philosopher I bet that's your real criterion 
>>> too! You're not worrying about killing your smartphone when you trash it 
>>> for the next model.
>>>
>>> Of course this is based on a guess, as yours is. My lack of a good 
>>> theory of the relationship between matter and mind does not force me into 
>>> solipsism because the absence of a test proves nothing about reality. 
>>> Things are as they are. All people are conscious, I assume. Probably all 
>>> animals. Possibly plants and rocks and stars and atoms, in some very 
>>> different way from us. Whatever way it is, it *is* that way regardless 
>>> of whether I can devise a test for it, even in principle. 
>>>
>>> I generally agree.  I like to resort to my intuition pump, the AI Mars 
>>> Rover, because I think the present Mars Rovers have a tiny bit of 
>>> intelligence and a corresponding tiny bit of consciousness.  Their 
>>> intelligence is in their navigation, deployment of instruments, 
>>> self-monitoring, and reporting to JPL.  They make some decisions about 
>>> these things, but they don't learn from experience, so they're probably 
>>> at the level of some insects or spiders, except that they have more 
>>> language with which they communicate with JPL.  But an AI Mars Rover that 
>>> was designed to learn from experience would, I think, be conscious to 
>>> some degree.  
>>> This is because it will need to remember experiences and recall relevant 
>>> ones when faced with unusual problems.  Solving the problem by using 
>>> experience means having a self-model in a simulation to try to foresee the 
>>> outcome of different choices.  I think that's the essence of basic 
>>> consciousness, learning from experience and self-modeling as part of 
>>> decisions.
>>>
>>> Brent
>>>
>>
>> Still, purely *informational* processing, which includes intentional 
>> agent programming (learning from experience, self-modeling), does not, I 
>> think, capture all true *experiential* processing (phenomenal 
>> consciousness).
>>
>> https://plato.stanford.edu/entries/consciousness-intentionality/
>>
>> *To say you are in a state that is (phenomenally) conscious is to say—on 
>> a certain understanding of these terms—that you have an experience, or a 
>> state there is something it’s like for you to be in. Feeling pain or 
>> dizziness, appearances of color or shape, and episodic thought are some 
>> widely accepted examples. Intentionality, on the other hand, has to do with 
>> the directedness, aboutness, or reference of mental states—the fact that, 
>> for example, you think of or about something. Intentionality includes, and 
>> is sometimes seen as equivalent to, what is called “mental representation”.*
>>
>>
>> If an AIMR runs a simulation in which it models itself in order to decide 
>> on a course of action isn't that "directedness", "intentionality", and 
>> "mental representation"?  
>>
>
> That is indeed intentional (or representational) processing.
>
> intentionalism (or representationalism): "Consciousness is entirely 
> intentional or representational.  Intentionalism implies that facts about 
> the representational content of an experience (together with facts about 
> the representational content of the subject’s other mental events or 
> states) fix or determine the facts about its phenomenal character. In other 
> words, intentionalism implies that phenomenal character supervenes on 
> representational content." 
> [ http://web.mit.edu/abyrne/www/what_phen_conc_is_like.html ]
>
> The experientialist (Strawson, etc.) rejects the representationalist 
> supervenience thesis. In other words:
>
> *experience processing > information processing.*
>
> You know if I wanted to read Strawson or the SEP I could easily do that.  
> When I post here I'd rather know what you think.
>
> This seems like it should be obvious:

"The [conventional] computer is indeed a great and powerful *information 
processing* machine, which the brain hardly is. The brain is an *experience 
processing* system, which creates information during its processing."


*Brain Experience: Neuroexperiential Perspectives of Brain-mind*
C.R. Mukundan
https://books.google.com/books?id=12_gDy18_YIC



 

>
>> When it stores incidents into memory to learn from, it must also 
>> associate some valuation with the outcome of the incident.  Isn't that a 
>> "feeling" 
>> about it?  If it lost a wheel wouldn't it feel something analogous to 
>> "pain"? Damasio says that human emotion is just perception of internal 
>> states, e.g. hormones, etc.
>>
>> Brent
>>
>
>  What I've read about Damasio, e.g.
>
> "He also demonstrated that while the insular cortex plays a major role in 
> feelings, it is not necessary for feelings to occur, suggesting that brain 
> stem structures play a basic role in the feeling process."
>
> suggests he is not decoupling biological material from emotions.
>
> Which would mean, again: substrate [material composition] matters.
>
> That's just sloganeering.  It doesn't say anything about what kind of 
> substrate matters, except that we only have the one example.
>
> Brent
>
 

The particular molecular substrate matters for experience (vs. information) 
processing.

- pt

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at https://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.

Reply via email to