If semantic nets can't do it, images can't either, because I can represent
an image as a semantic net and vice versa. They're just data formats. Some
are more handy for some purposes, others more handy for others. Semantic
nets are easier to work with when moving back and forth between the
concrete and the abstract, while images are mainly just useful in one of
these realms. So why are images superior?
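To make the interconvertibility claim concrete, here is a minimal sketch (my own illustration, not from the thread): a tiny image encoded as a semantic-net-style graph — nodes carrying pixel values, edges encoding 4-neighbour adjacency — and decoded back losslessly. The function names and representation are assumptions for illustration only.

```python
def image_to_net(img):
    """Encode a 2-D grid of pixel values as (nodes, edges).

    nodes: dict mapping (row, col) -> pixel value
    edges: set of 4-neighbour adjacency pairs
    """
    h, w = len(img), len(img[0])
    nodes = {(r, c): img[r][c] for r in range(h) for c in range(w)}
    edges = set()
    for r in range(h):
        for c in range(w):
            if r + 1 < h:
                edges.add(((r, c), (r + 1, c)))  # vertical neighbour
            if c + 1 < w:
                edges.add(((r, c), (r, c + 1)))  # horizontal neighbour
    return nodes, edges

def net_to_image(nodes):
    """Decode the node set back into a row-major grid."""
    h = max(r for r, _ in nodes) + 1
    w = max(c for _, c in nodes) + 1
    return [[nodes[(r, c)] for c in range(w)] for r in range(h)]

img = [[0, 1], [1, 0]]
nodes, edges = image_to_net(img)
assert net_to_image(nodes) == img  # lossless round trip
```

The round trip is lossless, which is the point: neither format carries information the other cannot; they differ only in which operations are convenient.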

On Tue, Oct 23, 2012 at 9:09 AM, Mike Tintner <[email protected]> wrote:

>   CHAIR
>
> ...
>
> It should be able to handle any transformation of the concept, as in
>
> DRAW ME (or POINT TO/RECOGNIZE)  A CHAIR IN TWO PIECES –..
>
> ..SQUASHED
> ..IN PIECES
> ..HALF VISIBLE
> ..WITH AN ARM MISSING
> ..WITH NO SEAT
> ..IN POLKA DOTS
> ..WITH RED STRIPES
>
> Concepts are designed for a world of ever-changing, ever-evolving multiform
> objects (and actions). Semantic networks have zero creativity or
> adaptability – they are applicable only to a uniform set of objects (basically
> a database) – and also, crucially, have zero ability to physically
> recognize or interact with the relevant objects. I’ve been into it at
> length recently. You’re the one not paying attention.
>
> The suggestion that networks or similar can handle concepts is completely
> absurd.
>
> This is yet another form of the central problem of AGI, which you clearly
> do not understand – and I’m not trying to be abusive – I’ve been realising
> this again recently – people here are culturally punch-drunk with concepts
> like *concept* and *creativity*, and just don’t understand them in terms of
> AGI.
>
>  *From:* Jim Bromer <[email protected]>
> *Sent:* Tuesday, October 23, 2012 2:04 PM
> *To:* AGI <[email protected]>
>  *Subject:* Re: [agi] Re: Superficiality Produces Misunderstanding - Not
> Good Enough
>
>  Mike Tintner <[email protected]> wrote:
> AI doesn’t handle concepts.
>
>
> Give me one example to prove that AI doesn't handle concepts.
> Jim Bromer
>
>
>
> On Tue, Oct 23, 2012 at 4:24 AM, Mike Tintner <[email protected]> wrote:
>
>>   Jim: Mike refuses to try to understand what I am saying because he
>> would have to give up his sense of a superior point of view in order to
>> understand it
>>
>> Concepts have nothing to do with semantic networks.
>> AI doesn’t handle concepts.
>> That is the challenge for AGI.
>> The form of concepts is graphics.
>> The referents of concepts are infinite realms..
>>
>> What are you saying that is relevant to this, or that can challenge this
>> – from any evidence?
>>
>>    *AGI* | Archives <https://www.listbox.com/member/archive/303/=now>
>> <https://www.listbox.com/member/archive/rss/303/10561250-164650b2> |
>> Modify <https://www.listbox.com/member/?&;> Your Subscription
>> <http://www.listbox.com/>
>>
>
>


