On Wed, Dec 26, 2012 at 8:49 PM, Logan Streondj <[email protected]> wrote:
>
> Cyc hardly used an "obvious" approach; they used some kind of weird syntax to 
> attempt to categorize "common sense knowledge", a fallacy, something that 
> doesn't objectively exist.

Cyc uses augmented first-order logic. Most approaches to AI prior to
this used similar knowledge representation schemes because they are
computationally cheap.
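
For concreteness, here is a minimal sketch (in Python, with invented
predicates and names; this is not Cyc's actual vocabulary or inference
engine) of the fact-and-rule style of knowledge representation being
referred to, with a naive forward-chaining step over an "isa" hierarchy:

    # Toy knowledge base: facts are (predicate, arg1, arg2) triples.
    # Names are made up for illustration; Cyc's KB and reasoner are far richer.
    facts = {("isa", "Fido", "Dog"), ("isa", "Dog", "Mammal")}

    def forward_chain(kb):
        """Apply the rule isa(X, A) & isa(A, B) => isa(X, B) to a fixed point."""
        changed = True
        while changed:
            changed = False
            for (p1, x, a) in list(kb):
                for (p2, a2, b) in list(kb):
                    if p1 == "isa" and p2 == "isa" and a == a2 \
                            and ("isa", x, b) not in kb:
                        kb.add(("isa", x, b))
                        changed = True
        return kb

    print(forward_chain(set(facts)))  # infers ("isa", "Fido", "Mammal")

Inference of this kind is cheap because it is just pattern matching over
ground facts; the expensive part is acquiring and curating the millions
of assertions by hand.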

> TopCoder competitions are interesting, in that with inputs, outputs, and output 
> testing it may be possible to evolve functions.
> I'm hoping to use that later on to evolve drivers and things.

In my list of 20 or so requirements for AI, I included the ability to
write, test, and debug code.
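
To make the evolve-functions-from-tests idea above concrete, here is a
toy sketch in Python (pure random generate-and-test over a tiny
expression grammar, far simpler than real genetic programming; the
target function and all names are invented for illustration):

    import random

    TESTS = [(x, 2 * x + 3) for x in range(10)]   # target behaviour: f(x) = 2x + 3

    def random_expr(depth=0):
        """Build a random arithmetic expression over x and small constants."""
        if depth > 2 or random.random() < 0.4:
            return random.choice(["x", str(random.randint(0, 5))])
        op = random.choice(["+", "-", "*"])
        return "(" + random_expr(depth + 1) + " " + op + " " + random_expr(depth + 1) + ")"

    def error(expr):
        """Total absolute error of a candidate expression on the test cases."""
        try:
            return sum(abs(eval(expr, {"x": x}) - y) for x, y in TESTS)
        except Exception:
            return float("inf")

    best = random_expr()
    for _ in range(20000):
        cand = random_expr()
        if error(cand) < error(best):
            best = cand
    print(best, error(best))   # with luck, something equivalent to (x * 2) + 3

A real system for evolving drivers or other code would need crossover
and mutation over richer program representations, plus sandboxed test
execution, but the generate-score-keep loop is the same shape.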

> Who knows, maybe one day there will be evolutionary algorithmic drivers so 
> good they can beat human programmers. Anyway, if that happens it'll still be 
> a win for "narrow AI", similar to chess and Jeopardy, as it's a minor aspect 
> of what it means to be a general intelligence.

Evolution is much more computationally expensive than anything the
brain does. It took evolution 3 billion years to create human
intelligence on a planet-sized molecular computer.

> The best way of proving AGI in an undeniable fashion is to have "wild" AGI 
> robots running around self-replicating in the environment. Admittedly most 
> wild organisms have brains significantly larger than those of their domestic 
> counterparts, so we may develop domestic AGIs first.

No, that is the best way to wipe out humanity. And no, self-replication
plus control is harder than self-replication alone. It requires more
intelligence.


--
-- Matt Mahoney, [email protected]

