I wrote a simple reinforcement learner which includes the line of code:

printf("Ouch!\n");

So I don't see communication of qualia as a major obstacle to AGI.

Or do you mean something else by qualia?


On Mon, Sep 24, 2018, 5:21 AM John Rose <johnr...@polyplexic.com> wrote:

> > -----Original Message-----
> > From: Matt Mahoney via AGI <agi@agi.topicbox.com>
> >
> > I was applying John's definition of qualia, not agreeing with it. My
> > definition is: qualia is what perception feels like. Perception and
> > feelings are both computable. But the feelings condition you to believe
> > there is something magical and mysterious about it.
> >
> 
> And what I'm saying is that the communication of qualia is important for
> general intelligence in a system of agents, as is how agents interpret
> the signals, process them, and recommunicate them.
> 
> But even without fully understanding qualia, since they're intimately
> intrinsic to agent experience, we can still explore their properties by
> answering questions such as: What is an expression of the information
> distance between the qualia of differing agents given the same stimuli?
> How do qualia map to the modeled environment? How do they change over
> time in a system of learning agents? What is the compressional loss
> incurred in communication? And how do multi-agent models change over
> time from communicated and decompressed qualia?
> 
> And how is the topology of qualia variance within an agent related to
> the complexity classes of environmental strategy?
> 
> And then we can move on to questions such as: can agent language be
> enhanced to accelerate learning in a simulated system? And can agent
> structure be enhanced as well?
> 
> John
> 

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T9c94dabb0436859d-M2f5ee95344c2383d5154fb35