What is wrong with the Legg and Hutter definition of intelligence? I think
that is it.
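
For context, the Legg and Hutter definition referenced here is their "universal intelligence" measure. As a hedged sketch from memory (check the original paper for the exact formulation), it scores an agent pi by its expected reward across all computable environments, weighted by each environment's simplicity:

```latex
\Upsilon(\pi) = \sum_{\mu \in E} 2^{-K(\mu)} \, V^{\pi}_{\mu}
```

Here E is the set of computable environments, K(mu) is the Kolmogorov complexity of environment mu, and V^pi_mu is the expected total reward agent pi achieves in mu. Simpler environments dominate the sum, so the measure rewards agents that do well broadly rather than in one narrow domain.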

On Tue, Feb 17, 2015 at 1:48 PM, Telmo Menezes via AGI <[email protected]>
wrote:

>
>
> On Tue, Feb 17, 2015 at 10:24 AM, Piaget Modeler via AGI <[email protected]>
> wrote:
>
>> Classification: *given a set of inputs, return a distinct output that
>> compresses the information of the input into a smaller set of values.*
>>
>> Classification tasks can be done with neural networks, fuzzy logic, case
>> based reasoning, specialized compression, etc.
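
As a minimal sketch of the Classification definition above (my own illustration, not from the thread): a nearest-centroid classifier maps many distinct real-valued inputs onto a small, fixed set of labels, which is exactly the "compress into a smaller set of values" idea.

```python
# Minimal nearest-centroid classifier: many distinct inputs are
# compressed into a small, fixed set of output labels.

def train_centroids(samples):
    """samples: dict mapping label -> list of feature vectors."""
    centroids = {}
    for label, vectors in samples.items():
        n = len(vectors)
        dim = len(vectors[0])
        # Per-dimension mean of all vectors for this label.
        centroids[label] = [sum(v[i] for v in vectors) / n for i in range(dim)]
    return centroids

def classify(centroids, x):
    """Return the label whose centroid is closest to x (squared distance)."""
    def sqdist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: sqdist(centroids[label], x))

# Toy data: two clusters, two output values.
data = {"small": [[1.0, 1.0], [1.5, 0.5]], "large": [[9.0, 9.0], [8.0, 10.0]]}
c = train_centroids(data)
print(classify(c, [1.2, 0.9]))   # -> small
print(classify(c, [8.5, 9.5]))   # -> large
```

Any of the methods listed above (neural networks, fuzzy logic, case-based reasoning) fills the same role: a function from a large input space to a small label set.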
>>
>> Construction: *given an initial state, a set of operations, and a goal
>> state, return a sequence of operations that transforms the initial state
>> into the goal state.*
>>
>
> Right, I have no problem with the definitions of Classification and
> Construction. My problem is with the definitions of Intelligence, AI, and
> AGI. Can you define that? We need those definitions to be able to judge if
> Classification and Construction are necessary and sufficient for AGI.
>
>
>>
>> Construction tasks can be done with planning algorithms (state space
>> search, plan space search, hierarchical search, etc.).
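
As a minimal sketch of the Construction definition via state-space search (my own illustration, not from the thread): breadth-first search over named operations returns a shortest sequence of operations transforming the initial state into the goal state.

```python
from collections import deque

# Construction as state-space search: given an initial state, a set of
# named operations (state -> state), and a goal state, breadth-first
# search returns a shortest sequence of operations reaching the goal.

def construct(initial, operations, goal):
    """operations: dict mapping name -> function(state) -> new state."""
    frontier = deque([(initial, [])])  # (state, plan so far)
    seen = {initial}
    while frontier:
        state, plan = frontier.popleft()
        if state == goal:
            return plan
        for name, op in operations.items():
            nxt = op(state)
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, plan + [name]))
    return None  # goal unreachable from the initial state

# Toy example: reach 10 from 0 using "+3" and "+1" operations.
ops = {"add3": lambda s: s + 3, "add1": lambda s: s + 1}
print(construct(0, ops, 10))  # -> ['add3', 'add3', 'add3', 'add1']
```

Hierarchical and plan-space planners refine this same picture; BFS is just the simplest member of the family.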
>>
>
>>
>> Both approaches *are* used in complex AI applications.
>>
>
> Yes, and if you look at what we know about how the human brain works, you
> can easily argue that the brain does Classification and Construction. What
> we don't know is if this will turn out to be a useful distinction. For
> example, I can conceive of an ANN being trained to do classification and
> construction at the same time, and without any well-defined borders (as I
> suspect happens in the brain).
>
> Or you could argue that Construction is all that's happening and that
> classification is just a detail to help construction (along with learning,
> random exploration, whatever).
>
> Or...
>
> My point is, this is just a model. Models aren't really right/wrong as
> much as they are useful or not.
>
> Telmo.
>
>
>>
>> ~PM
>>
>>
>> ------------------------------
>> Date: Tue, 17 Feb 2015 09:03:40 +0100
>> Subject: Re: [agi] Couple thoughts
>>
>> From: [email protected]
>> To: [email protected]
>>
>>
>>
>> On Tue, Feb 17, 2015 at 8:42 AM, Piaget Modeler via AGI <[email protected]>
>> wrote:
>>
>> I was taught that in AI there are two primary tasks, Classification and
>> Construction.
>>
>> Please correct me where I'm wrong, anyone.  I like to learn.
>>
>>
>> There has always been a lot of debate about what AI is. We don't even
>> have anything close to a consensus on a good definition of "intelligence".
>> This leads me to suspect that the main problem with AI is that we don't
>> have a well-defined problem to tackle, but that's a broader issue.
>>
>> Sure, "Classification and Construction" is not so bad. It's not a matter
>> of being right or wrong. There are thousands of plausible alternatives to
>> this. You pick a model and run with it, but let's not pretend we are
>> dealing with some super-objective definition.
>>
>>
>>
>> Deep Learning and (many other methods) are good at classification tasks.
>>
>> We also need methods good at construction tasks (i.e. plan generation).
>>
>>
>> This "also need" mentality could be the problem. Maybe what we need is
>> something that can holistically perform both types of tasks.
>>
>> Suppose you take Deep Blue. It can play chess really well, a skill that
>> was up to then associated with humans. But then someone says: wait, humans
>> are also usually good at driving cars. Then you merge Google cars and Deep
>> Blue and claim to be closer to AGI? Does this make any sense? Do you see
>> the problem?
>>
>> Best,
>> Telmo.
>>
>>
>>
>> ~PM
>>
>> > Date: Mon, 16 Feb 2015 16:09:00 -0800
>> > Subject: [agi] Couple thoughts
>> > From: [email protected]
>> > To: [email protected]
>> >
>> > I had a couple of things running through my mind --
>> >
>> > 1) "Deep learning algorithms are very good at one thing today:
>> > learning input and mapping it to an output. X to Y. Learning concepts
>> > is going to be hard." Andrew Ng.
>> >
>> > I guess I take that to be an acid test of where the big guys are with
>> concepts.
>> >
>> > 2) "brain inspired", "physics inspired", "math inspired," X-inspired,
>> > etc-inspired, hybrid-inspired...
>> >
>> > It seems all AGI approaches take the "inspired by" approach. The only
>> > approach that is not deliberately inspired by some discipline, but
>> > instead aspires to the actual thing, is Colin Hayes' approach.
>> >
>> > There is nothing wrong with the "inspired by" approach, of course.
>> >
>> > Mike
>> >
>> >
>> > -------------------------------------------
>> > AGI
>> > Archives: https://www.listbox.com/member/archive/303/=now
>> > RSS Feed:
>> https://www.listbox.com/member/archive/rss/303/19999924-4a978ccc
>> > Modify Your Subscription: https://www.listbox.com/member/?&;
>> > Powered by Listbox: http://www.listbox.com
>>
>



