Greetings Alan,
I'm thinking that we may be like the blind men feeling the elephant.
Not sure we are looking at the same components. My concept of
abstraction and promoters is only meant to be a mechanism to solve a
very small function of an AGI. The problem to be solved is how to
choose the action that will be taken in the next moment.
I realize there are other views. For example, Matt Mahoney believes that
we don't want machines that do what they want to do, but rather,
machines that do what we want them to do. The implication is that we
tell them what to do and they find the procedure that gets it done to
our satisfaction. I personally think it would be informative to know
what an advanced AGI would do when it wasn't being a slave for a human.
The question I have for you, Alan, is: what is the mechanism in your AGI
architecture that causes the AGI to select the best thing to do for the
moment? It sounds as if you believe that the AGI will have a model of
the world and play the model at an accelerated rate and the model
reveals the outcome of various possible actions. Reminds me of a "Person
of Interest" episode where the super computer plays through multiple
scenarios to see how they would play out - and then directs the
characters to take the best scenario.
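To make sure I'm describing the same mechanism you have in mind, here is a toy sketch of that kind of model-rollout action selection. Everything in it (the names choose_action, simulate, evaluate, and the one-dimensional toy world) is my own hypothetical illustration, not anything taken from your architecture:

```python
# Toy sketch: select an action by playing a world model forward and
# evaluating the simulated outcome of each candidate action.
# All names here are hypothetical illustrations.

def choose_action(state, actions, simulate, evaluate):
    """Simulate each candidate action and pick the best outcome."""
    return max(actions, key=lambda a: evaluate(simulate(state, a)))

# Demo: a one-dimensional world where the "best" state is position 5.
simulate = lambda s, a: s + a      # the model: an action shifts the state
evaluate = lambda s: -abs(s - 5)   # states closer to 5 score higher

print(choose_action(0, [+1, -1], simulate, evaluate))  # prints 1
```

The "accelerated rate" part would just be calling simulate many steps ahead before evaluating; the open question, as I see it, is where evaluate comes from in the first place.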
Another alternative to this method of finding the "best" thing to do is
to have the machine always involved in a goal that is given to it. Do
you see this as the way to go?
Stan
On 10/29/18 5:55 PM, Alan Grimes wrote:
WRONG WRONG WRONG, COMPLETELY AND UTTERLY WRONG, spectacularly wrong
even... Wrong enough even to be completely backwards!!!!!
First, the opposite of abstraction is usually "resolution".
Your stupid, miseducated, underperforming brain is actually
CONTINUOUSLY resolving abstract representations to produce your
conscious experience though it is obviously too oblivious to realize
that. =\
I'm not trying to be especially cruel to you individually, I'm mostly
expressing my frustration with this line of thought and how much time it
wastes. It is not an exaggeration to say that the ratio of resolution to
abstraction in human cognition in the adult brain is on the order of
several hundred to one, if not more... It is different in young children
for both physiological and developmental reasons.
------------------------------------------
Artificial General Intelligence List: AGI
Permalink:
https://agi.topicbox.com/groups/agi/T586df509299da774-Mc23197faa971c5dbe5b17f0e
Delivery options: https://agi.topicbox.com/groups/agi/subscription