Abram Demski wrote:
On Wed, Jun 18, 2008 at 9:54 AM, Benjamin Johnston
<[EMAIL PROTECTED]> wrote:
[...]
In any case, this whole conversation bothers me. It seems like we're
focussing on the wrong problems; like using the Theory of Relativity to
decide on an appropriate speed limit for cars in school zones. If it could
take 1,000 years of thought and creativity to go from BB(n) to BB(n+1) for
some n, we're talking about problems of an incredible scale, far beyond what
most of us have in mind for our first prototypes. A challenge with the busy
beaver problem is that once n becomes big enough, you can encode long-standing
and very difficult mathematical conjectures as questions about whether
particular n-state machines halt.
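For concreteness, here is a brute-force check of the trivially small case n = 2 (a sketch added for illustration, not from the thread; the encoding conventions — 2 symbols, left/right moves, an explicit halt state — are assumptions, and Sigma(2) = 4 is the known value):

```python
from itertools import product

# Enumerate all 2-state, 2-symbol Turing machines and brute-force
# the busy beaver value Sigma(2): the most 1s any halting machine
# can leave on an initially blank tape.

STATES = [0, 1]      # states A and B
HALT = -1
STEP_LIMIT = 100     # safe margin: the 2-state champion halts in 6 steps

def run(machine):
    """Run from a blank tape; return the count of 1s if the machine
    halts within STEP_LIMIT steps, else None."""
    tape = {}        # sparse tape, default symbol 0
    pos, state = 0, 0
    for _ in range(STEP_LIMIT):
        if state == HALT:
            return sum(tape.values())
        write, move, nxt = machine[(state, tape.get(pos, 0))]
        tape[pos] = write
        pos += move
        state = nxt
    return sum(tape.values()) if state == HALT else None

# Each of the 4 (state, symbol) pairs maps to (write, move, next-state):
# 2 writes x 2 moves x 3 successors = 12 options, so 12^4 = 20736 machines.
options = list(product([0, 1], [-1, 1], STATES + [HALT]))
best = 0
for rules in product(options, repeat=4):
    machine = dict(zip(product(STATES, [0, 1]), rules))
    ones = run(machine)
    if ones is not None:
        best = max(best, ones)

print(best)  # prints 4, the known value of Sigma(2)
```

Already at n = 5 this kind of exhaustive search required decades of collective effort, which is some measure of how fast the problem outruns brute force.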
-Ben
My point is simply that an AGI should be able to think about such
concepts, like we do. It doesn't need to solve them. In this sense I
think it is a fundamental concern: how is it possible to have a form
of knowledge representation that can in principle capture all ideas a
human might express?
Intuition suggests that there should be a simple
sufficient representation, like 1st-order logic. But 1st-order logic
isn't enough, and neither are 2nd-order logics, 3rd order...
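One standard way to make the first-order limitation precise (a textbook compactness argument, added here for illustration): first-order logic cannot even pin down the natural numbers. Take any first-order theory T true of the naturals, add a fresh constant c, and extend T with infinitely many axioms:

```latex
T' \;=\; T \;\cup\; \{\, c > \underline{0},\;\; c > \underline{1},\;\; c > \underline{2},\;\; \dots \,\}
```

Every finite subset of T' is satisfiable in the standard naturals (interpret c as a large enough number), so by compactness T' has a model; that model contains an element above every standard numeral, i.e. a nonstandard number. So no first-order theory has the standard model as its only model, whatever axioms you pick.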
Well, what exactly are the constraints you wish to place on "capture"?
Clearly humans can express the ideas, so in some sense they are trivially
captured (say, as text and graphics). :-)
- samantha
-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/