In olden days there was a distinction between analogical reasoning and
reasoning by similarity. Objects usually seem, to people, to have strong
similarities and dissimilarities. But you could look at two dissimilar
'systems' and find systematic similarities between them; that would
constitute reasoning by analogy. However, a problem that seems to come up
in AI, especially something like text-based AI, is that you start
questioning which objects (of reference) can really stand as basics (or
elements) and which stand as relational or derived properties.

This leads me to believe that the only way AGI is going to work is by
basing it on relativist theories. The conclusion is that the fundamental
referent objects and the relations between them can change relative to a
point of view. We then need to rely on the strength of the structures
between objects of thought in order to see how reasonable our points of
view are. That doesn't mean there is no such thing as a real world, but
rather that our way of thinking about it has to hold up from different
(reasonable) vantages, even if that means the derived systems of thought
might stand (at least in some cases) as the fundamentals in our insight.

Another way of looking at this is that the elements of a thought can be
seen as complicated systems which may be composed of relations that go
beyond the similarities between the elements. So in order to learn more
about rocks, you have to start realizing that they are systems of minerals
and atoms, and that the interactions of the atoms (for example) may have a
lot in common with objects that are not rocks. This seems so mundane that
a lot of people don't understand what I am talking about. If you think
only in terms of categorization by similarity, you are probably not
getting this.
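To make the point concrete, here is a minimal sketch (my own toy
illustration, not any established AGI algorithm; the relation names and the
`relational_overlap` function are invented for the example). It treats
objects of thought as relational systems and scores an analogy by shared
relational structure rather than by similarity of the elements themselves:

```python
# Each "system" is a set of (relation, role1, role2) triples.
rock = {
    ("composed_of", "rock", "mineral_grains"),
    ("bonded_by", "mineral_grains", "atomic_forces"),
    ("arranged_in", "mineral_grains", "lattice"),
}

snowflake = {
    ("composed_of", "snowflake", "ice_crystals"),
    ("bonded_by", "ice_crystals", "atomic_forces"),
    ("arranged_in", "ice_crystals", "lattice"),
}

def relational_overlap(a, b):
    """Share of relation names (ignoring the elements filling the
    roles) that the two systems have in common (Jaccard overlap)."""
    rels_a = {rel for rel, _, _ in a}
    rels_b = {rel for rel, _, _ in b}
    return len(rels_a & rels_b) / len(rels_a | rels_b)

# The elements differ (a rock is not a snowflake), yet the relational
# structure matches completely -- the sense of "analogy" meant above.
print(relational_overlap(rock, snowflake))  # → 1.0
```

Element-level similarity between a rock and a snowflake is low, but once
each is seen as a system of parts and interactions, the structures line up.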

Jim Bromer

On Thu, Jan 8, 2015 at 10:59 AM, Jim Bromer <[email protected]> wrote:

> Piaget Modeler via AGI <[email protected]> wrote:
>
>> Bipin Indurkhya would take it one step further and say, we don't just
>> find relationships,
>> we create them.  He explains this in his book *Metaphor and Cognition*.
>>
>
>
> Well, I just took it a few steps further and said that we have to find
> -reasons- for a relation. This can be done with conjecture, for example
> (or from 'education'), but the reason has to fit in with the parts.
> Correlation or association might be a starting point but then there has to
> be some kind of 'story' which makes sense. Of course this process often
> does underlie metaphor and metaphor can be introduced as a method of
> explanation but it also can be based on substantive similarities. And I
> also believe that the imagination is an important part of understanding and
> that without it insight would be impossible. I feel that the emphasis on
> metaphor as if it were the only method to produce insight-like
> correlation is old-school.
> Jim Bromer
>
>
>
> On Thu, Jan 8, 2015 at 10:28 AM, Piaget Modeler via AGI <[email protected]>
> wrote:
>
>> Bipin Indurkhya would take it one step further and say, we don't just
>> find relationships,
>> we create them.  He explains this in his book *Metaphor and Cognition*.
>>
>> See:
>> http://www.amazon.com/Metaphor-Cognition-Interactionist-Approach-Cognitive/dp/0792316878
>>
>> Piaget would also agree that the relationships are constructed rather
>> than detected.
>>
>> ~PM
>>
>> > Date: Thu, 8 Jan 2015 09:42:17 -0500
>> > Subject: [agi] Coherent Knowledge and Reason Based Reasoning
>> > From: [email protected]
>> > To: [email protected]
>>
>> >
>> > I believe that a (moderate) coherentist approach makes sense. You can
>> > use logic, correlation, abstraction, synthesis, generalization,
>> > specification, probability and conjecture across the conceptual
>> > objects of the system. But when some objects of interest are found to
>> > be related, I think there should be an attempt to find out why or how
>> > they are related. I feel that mere association or correlation is not
>> > enough to act as a basis for AGI. The program has to search for
>> > reason-based reasoning as well. If a reason can't be found or the
>> > observations do not stand out then association or correlation is
>> > adequate, but the idea that association or correlation is substantial
>> > as a basis for knowledge just does not seem right to me.
>> > Jim Bromer
>> >
>> >
>> > -------------------------------------------
>> > AGI
>> > Archives: https://www.listbox.com/member/archive/303/=now
>> > RSS Feed:
>> https://www.listbox.com/member/archive/rss/303/19999924-4a978ccc
>> > Modify Your Subscription: https://www.listbox.com/member/?&;
>> > Powered by Listbox: http://www.listbox.com
>>
>
>


