The goal of chess is well defined: avoid being checkmated and try to
checkmate your opponent.

What checkmate means can be specified formally.
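To make that concrete, here is a minimal sketch of the formal definition in Python. The `in_check` and `legal_moves` inputs are hypothetical stand-ins for what a full rules engine would compute (a real implementation could use, e.g., the python-chess library's `Board.is_checkmate()`):

```python
# Formal definition: a side is checkmated if and only if it is in check
# and has no legal move available.
# `in_check` and `legal_moves` are hypothetical inputs assumed to be
# supplied by a separate move-generation component.

def is_checkmate(in_check, legal_moves):
    """Return True iff the side to move is in check and has no legal move."""
    return in_check and len(legal_moves) == 0
```

Note that being in check is part of the definition: no check and no legal moves is stalemate, a draw, not checkmate.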

Humans mainly learn chess by playing chess. Obviously, knowledge from
other domains is not sufficient for most beginners to be good chess
players right away. This can be shown empirically.

Thus an AGI would not learn chess in a way completely different from
everything we know. It would learn from experience, which is one of the
most common kinds of learning.

I am sure that anyone who learns chess by playing against chess computers,
and is capable of learning to play well (which is not certain, just as not
everyone can learn to be a good mathematician), will also be a good chess
player against humans.

My first posting in this thread shows the very weak point in the argument
of those people who say that social and other experiences are needed to
play chess.

You suppose that knowledge from another domain must be available to solve
problems in the domain of chess.
But everything about chess is on the chessboard itself. If you are not able
to solve chess problems from chess alone, then you are not able to solve
certain solvable problems. And thus you cannot call your AI an AGI.

If you give an AGI all the facts that are sufficient to solve a problem,
then your AGI must be able to solve the problem using nothing other than
these facts.

If you do not agree with this, then how is an AGI supposed to know which
experiences in which other domains are necessary to solve the problem?

The magic you use is the overestimation of real-world experiences. It sounds
as if the ability to solve arbitrary problems in arbitrary domains depends
essentially on your AGI playing in virtual gardens and speaking often with
other people. But this is complete nonsense. No one can play good chess from
those experiences alone, so such experiences are not sufficient. On the other
hand, there are programs that definitely do not have such experiences and
yet outperform humans at chess. Thus those experiences are neither sufficient
nor necessary for playing good chess, and emphasizing them mystifies AGI,
much as the doubters of AGI do when they always argue with Gödel or quantum
physics, which in fact have no relevance for practical AGI at all.

- Matthias





Trent Waddington [mailto:[EMAIL PROTECTED]] wrote:

Sent: Thursday, 23 October 2008 07:42
To: agi@v2.listbox.com
Subject: Re: [agi] If your AGI can't learn to play chess it is no AGI

On Thu, Oct 23, 2008 at 3:19 PM, Dr. Matthias Heger <[EMAIL PROTECTED]> wrote:
> I do not think that it is essential for the quality of my chess who had
> taught me to play chess.
> I could have learned the rules from a book alone.
> Of course these rules are written in a language. But this is not important
> for the quality of my chess.
>
> If a system is in state x, then it is not essential for the future how x
> was generated.
> Thus a programmer can hardcode the rules of chess in his AGI and then,
> concerning chess the AGI would be in the same state as if someone teaches
> the AGI the chess rules via language.
>
> The social aspect of learning chess is of no relevance.

Sigh.

Ok, let's say I grant you the stipulation that you can hard code the
rules of chess somehow.  My next question is, in a goal-based AGI
system, what goal are you going to set and how are you going to set
it?  You've ruled out language, so you're going to have to hard code
the goal too, so excuse my use of language:

"Play good chess"

Ohhhhh.. that sounds implementable.  Maybe you'll give it a copy of
GNUChess and let it go at it.. but I've known *humans* who learnt to
play chess that way and they get trounced by the first human they play
against.  How are you going to go about making an AGI that can learn
chess in a completely different way to all the known ways of learning
chess?

Or is the AGI supposed to figure that out?

I don't understand why so many of the people on this list seem to
think AGI = magic.

Trent


-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
Modify Your Subscription:
https://www.listbox.com/member/?&;
Powered by Listbox: http://www.listbox.com


