Re: [agi] Context free text analysis is not a proper method of natural language understanding

2007-10-03 Thread Bob Mottram
On 03/10/2007, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:
 Given (1), no context-free analysis can understand natural language.
 Given (2), no adaptive agent can learn (proper) understanding of natural
 language given only texts.

 For human-like understanding, an AGI would need to participate in
 (human) social society.


This is the age-old problem for AI.  Either you have to build a
physical system (a robot) which can in some sense experience the
non-linguistic concepts upon which language is based, or you have to
directly teach the system (enter a lot of common sense knowledge - the
things we all know but which are rarely written down).



Re: [agi] Context free text analysis is not a proper method of natural language understanding

2007-10-03 Thread Vladimir Nesov
... or maybe they can be inferred from texts alone. It all depends on
learning ability of particular design, and we as yet have none. Cart
before the horse.

On 10/3/07, Bob Mottram [EMAIL PROTECTED] wrote:
 On 03/10/2007, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:
  Given (1), no context-free analysis can understand natural language.
  Given (2), no adaptive agent can learn (proper) understanding of natural
  language given only texts.

  For human-like understanding, an AGI would need to participate in
  (human) social society.


 This is the age-old problem for AI.  Either you have to build a
 physical system (a robot) which can in some sense experience the
 non-linguistic concepts upon which language is based, or you have to
 directly teach the system (enter a lot of common sense knowledge - the
 things we all know but which are rarely written down).




-- 
Vladimir Nesov  mailto:[EMAIL PROTECTED]



Re: [agi] Context free text analysis is not a proper method of natural language understanding

2007-10-03 Thread Matt Mahoney
--- [EMAIL PROTECTED] wrote:

 Relating to the idea that text compression (as demonstrated by general
 compression algorithms) is a measure of intelligence,
 Claims:
 (1) To understand natural language requires knowledge (CONTEXT) of the
 social world(s) it refers to.
 (2) Communication includes (at most) a shadow of the context necessary
 to understand it.
 
 Given (1), no context-free analysis can understand natural language.
 Given (2), no adaptive agent can learn (proper) understanding of natural
 language given only texts.
 
 For human-like understanding, an AGI would need to participate in
 (human) social society.

The ideal test set for text compression as a test for AI would be 1 GB of chat
sessions, such as the transcripts between judges and human confederates in the
Loebner contests.  Since I did not have this much data available I used
Wikipedia.  It lacks a discourse model, but the problem is otherwise similar in
that good compression requires vast, real-world knowledge.  For example,
compressing or predicting:

  Q. What color are roses?
  A. ___

is almost the same kind of problem as compressing or predicting:

  Roses are ___

Of course, the compressor would be learning an ungrounded language model. 
That should be sufficient for passing a Turing test.  A model need not have
actually seen a rose to know the answer to the question.  I don't think it is
possible to find any knowledge that could be tested through a text-only
channel that could not also be learned through a text-only channel.  Whether
sufficient testable knowledge is actually available in a training corpus is
another question.
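To make the compression/prediction link concrete, here is a minimal sketch
(illustrative only: the texts and alphabet are invented, and zlib stands in
for a weak statistical model rather than a real language model):

  import random
  import string
  import zlib

  def bits_per_char(text: str) -> float:
      """Compressed size in bits divided by number of characters."""
      data = text.encode("utf-8")
      return 8 * len(zlib.compress(data, 9)) / len(data)

  # Redundant, predictable text compresses to far fewer than 8 bits/char ...
  predictable = "Roses are red. Violets are blue. " * 100

  # ... while random text over a 27-symbol alphabet barely compresses at all.
  random.seed(0)
  alphabet = string.ascii_lowercase + " "
  unpredictable = "".join(random.choice(alphabet) for _ in range(3300))

  print(f"redundant text: {bits_per_char(predictable):.2f} bits/char")
  print(f"random text:    {bits_per_char(unpredictable):.2f} bits/char")

zlib only exploits surface redundancy; the point of a corpus like Wikipedia is
that pushing the bits per character lower still requires the kind of
real-world knowledge discussed above.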

I don't claim that lossless compression could be used to test for AGI, just
AI.  A lossless image compression test would be almost useless because the
small amount of perceptible information in video would be overwhelmed by
uncompressible pixel noise.  A lossy test would be appropriate, but would
require subjective human evaluation of the quality of the reproduced output. 
For text, a strictly objective lossless test is possible because the
perceptible content of text is a large fraction of the total content.


-- Matt Mahoney, [EMAIL PROTECTED]



Re: [agi] Context dependent words/concepts

2006-08-22 Thread James Ratcliff
This is one of the main concepts / problems of AI, is it not?  Removing the
ambiguity from our language in order to understand it.  So you could remove it
on the KR side, but you would still need to convert the regular language into
the KR language, unless you propose to have all inputs and outputs in the new
KR language alone.  That conversion process would still have to handle the
context and ambiguity of the system.  And this is something we haven't
accomplished to a high enough degree yet, but we have given it some attention,
and realize that that alone is not enough.  What kind of extra mapping would
the KR need to handle the relationships between these newly separated terms?

James

 The 'language' used in KR need not be context-dependent or ambiguous.  If
 the blackboard is recognized (by the sensory perception module) as 'black',
 that would be the best description in KR, because that's the limit of
 sensory perception.  Of course, normally the AGI will have more details of
 the board, such as better color discernment, and that the board has a frame,
 etc.

 I still think the KR language does not need context-dependency or ambiguity.
 Except ambiguities with respect to the external world, which always exist.

 YKY

Thank You,
James Ratcliff
http://falazar.com


Re: [agi] Context dependent words/concepts

2006-08-20 Thread Ben Goertzel

I continue to maintain that:

* syntactic ambiguity is unnecessary in a language of thought or communication

* some level of semantic ambiguity is unavoidable and in fact essential...

ben

On 8/20/06, YKY (Yan King Yin) [EMAIL PROTECTED] wrote:



On 8/19/06, Ben Goertzel [EMAIL PROTECTED] wrote:
  In blackboard the NL word maps to either a board that is black in color
  or a board for writing that is usually black/green/white.  The KR of those
  concepts are unambiguous; it's just that there are 2 alternatives.

 This is very naive...  a concept such as a board that is black in
 color is not unambiguous at all ...

 -- what % of the board needs to be black ?

 -- what kind of object really qualifies as a board?

 -- how dark does something have to be, to be  black?

 etc.

 the answers to these questions depend on context, so whether an object
 is classified as a board that is black in color depends on
 context... quite independently of any linguistic ambiguities
 associated...

The 'language' used in KR need not be context-dependent or ambiguous.  If
the blackboard is recognized (by the sensory perception module) as 'black',
that would be the best description in KR, because that's the limit of
sensory perception.  Of course, normally the AGI will have more details of
the board such as better color discernment, and that the board has a frame,
etc.

I still think the KR language does not need context-dependency or ambiguity.
 Except ambiguities with respect to the external world, which always exist.


YKY 




Re: [agi] Context dependent words/concepts

2006-08-19 Thread YKY (Yan King Yin)

On 8/19/06, Ben Goertzel [EMAIL PROTECTED] wrote:
  In blackboard the NL word maps to either a board that is black in color
  or a board for writing that is usually black/green/white.  The KR of those
  concepts are unambiguous; it's just that there are 2 alternatives.

 This is very naive...  a concept such as a board that is black in
 color is not unambiguous at all ...

 -- what % of the board needs to be black ?

 -- what kind of object really qualifies as a board?

 -- how dark does something have to be, to be black?

 etc.

 the answers to these questions depend on context, so whether an object
 is classified as a board that is black in color depends on
 context... quite independently of any linguistic ambiguities
 associated...

The 'language' used in KR need not be context-dependent or ambiguous.  If
the blackboard is recognized (by the sensory perception module) as 'black',
that would be the best description in KR, because that's the limit of
sensory perception.  Of course, normally the AGI will have more details of
the board, such as better color discernment, and that the board has a frame,
etc.

I still think the KR language does not need context-dependency or ambiguity.
Except ambiguities with respect to the external world, which always exist.

YKY



Re: [agi] Context dependent words/concepts

2006-08-18 Thread YKY (Yan King Yin)

On 8/19/06, Ben Goertzel [EMAIL PROTECTED] wrote:
 Well, but I can generate a hypothetical grounding for mushroom pie on the
 fly even though I haven't seen one ;-)

 And I can form concepts of mathematical structures that I have never
 experienced nor exemplified and may in fact be inconsistent and not even
 exist...  Not all concepts are formed in episodic memory, of course..


Well, once again the distinction between NL (natural language) and KR (knowledge representation) is important here.  Determining what is the meaning of "mushroom pie" is an NL -> KR problem.  In this case a heuristic rule like "try to construct the meaning of a compound word XY by looking for words like X'Y where X' is similar to X" may be applicable.
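As a purely illustrative sketch of that X'Y heuristic (the lexicon, similarity
scores, KR frames, and the guess_meaning helper below are all invented for the
example, not part of any existing system):

  from typing import Dict, Tuple

  # Toy lexicon of known compounds and their KR frames (invented).
  known_compounds: Dict[Tuple[str, str], Dict[str, str]] = {
      ("apple", "pie"):   {"isa": "pie", "filling": "apple"},
      ("pumpkin", "pie"): {"isa": "pie", "filling": "pumpkin"},
      ("mud", "pie"):     {"isa": "play_object", "material": "mud"},
  }

  # Hypothetical modifier-similarity scores (e.g. from a distributional model).
  similarity = {
      ("mushroom", "apple"): 0.55,
      ("mushroom", "pumpkin"): 0.60,
      ("mushroom", "mud"): 0.10,
  }

  def guess_meaning(modifier: str, head: str) -> Dict[str, str]:
      """Guess a KR frame for an unseen compound X Y by copying the frame of
      the known compound X' Y whose modifier X' is most similar to X."""
      candidates = [(m, h) for (m, h) in known_compounds if h == head]
      best_m, best_h = max(
          candidates, key=lambda mh: similarity.get((modifier, mh[0]), 0.0)
      )
      frame = dict(known_compounds[(best_m, best_h)])
      # Substitute the new modifier into whichever slot the analogue filled.
      return {k: (modifier if v == best_m else v) for k, v in frame.items()}

  print(guess_meaning("mushroom", "pie"))
  # {'isa': 'pie', 'filling': 'mushroom'}  -- by analogy with "pumpkin pie"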


So NL can be ambiguous, KR is not.

Within KR itself, we can form novel concepts out of existing ones, and this operation is not dependent on NL. For example I can form the concept unfair justice.


In blackboard the NL word maps to either a board that is black in color or a board for writing that is usually black/green/white.  The KR of those concepts are unambiguous; it's just that there are 2 alternatives.


YKY



Re: [agi] Context dependent words/concepts

2006-08-18 Thread Ben Goertzel

In blackboard the NL word maps to either a board that is black in color
or a board for writing that is usually black/green/white.  The KR of those
concepts are unambiguous; it's just that there are 2 alternatives.


This is very naive...  a concept such as a board that is black in
color is not unambiguous at all ...

-- what % of the board needs to be black ?

-- what kind of object really qualifies as a board?

-- how dark does something have to be, to be  black?

etc.

the answers to these questions depend on context, so whether an object
is classified as a board that is black in color depends on
context... quite independently of any linguistic ambiguities
associated...

ben



RE: [agi] Context

2003-02-10 Thread Ben Goertzel

Hi,

 I see that Novamente has Context and NumericalContext Links,  but
 I'm wondering if something more is needed to handle the various
 subtypes of context?

yeah, those link types just deal with certain special situations, they are
not the whole of Novamente's contextuality-handling mechanism, by any
means...

 In summary, it's clear that context is a vital part of memory
 processes used by NGI's, and I was wondering to what extent
 context is emphasized in the design of Novamente.

context is not emphasized in a unified way, but it comes up in a lot of
places.

For example, in the inference module there's a specific parameter called
context size that controls the implicit sample space of the probability
estimates used in inference..
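Illustrative only (this is not Novamente's actual mechanism; the observations
and names below are invented): restricting the sample space to a context
changes the resulting probability estimate.

  from typing import Optional

  # Invented observations: (context, word, event).
  observations = [
      ("office", "pen", "writes"), ("office", "pen", "writes"),
      ("office", "pen", "leaks"),
      ("farm",   "pen", "holds_animals"), ("farm", "pen", "holds_animals"),
  ]

  def p(event: str, word: str, context: Optional[str] = None) -> float:
      """Estimate P(event | word), over all data or only within a context."""
      sample = [o for o in observations
                if o[1] == word and (context is None or o[0] == context)]
      return sum(1 for o in sample if o[2] == event) / len(sample)

  print(f"{p('writes', 'pen'):.2f}")             # global estimate: 0.40
  print(f"{p('writes', 'pen', 'office'):.2f}")   # within 'office' context: 0.67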

Generally speaking, Novamente is intended to be able to deal with
contextuality in all the senses you describe it, but not by a unified
mechanism -- by a host of different mechanisms, some more useful for some
kinds of contextuality, some for others...

 It's difficult
 to get a feel for it from the available documentation.

Heh.   That is certainly true.

All will be clear in 2004 when the 1500-page beast (the Novamente-design
book) finally appears in print ;-p

Or at least, then the difficulty will shift to a difficulty with
*understanding* what we're talking about rather than *guessing* it ;-)

 I'd also
 like to explore the idea of creating some more concrete words for
 the various types of context that will be a necessary part of any
 AGI.  The word context is too generalized to perform the many
 functions required of it.  Agree/disagree?  Am I reinventing the wheel?

I don't think you're reinventing the wheel.  Similar things have been
discussed, e.g. the situation semantics of John Barwise and others, which
tries to take formal semantics and make all meanings within it
situation-dependent.  But situation semantics is tied too closely to rigid
logicist theories of semantics to really appeal to me.  I think an adequate,
general conceptual and mathematical model of contextuality has yet to be
formulated...  I am not sure such a model is needed for AGI, but it would
certainly be helpful.

ben  g
