Scratch my statement about it being useless :) It's useful, but nowhere
near sufficient for AGI-like understanding.

On Tue, Jun 29, 2010 at 4:58 PM, David Jones <davidher...@gmail.com> wrote:

> Notice how you said the *context* of the conversation. The context is the
> real world, and it is completely missing. You cannot "model" human
> communication using text alone. The responses you would get back would be
> exactly like Eliza's. Sure, it might be pleasing to someone who has never
> seen AI before, but it's certainly not answering any questions.
>
> This reminds me of the Bing search engine commercials where people ask a
> question and get responses that include the words they asked about, but in a
> completely wrong context.
>
> Predicting the next word and understanding the question are completely
> different problems and cannot be solved the same way. In fact, predicting
> the next word is altogether useless (at least by itself), in my opinion.
>
> Dave
>
>
> On Tue, Jun 29, 2010 at 4:50 PM, Matt Mahoney <matmaho...@yahoo.com> wrote:
>
>> Answering questions is the same problem as predicting the answers. If you
>> can compute p(A|Q) where Q is the question (and previous context of the
>> conversation) and A is the answer, then you can also choose an answer A from
>> the same distribution. If p() correctly models human communication, then the
>> response would be indistinguishable from a human in a Turing test.
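>>
>> A minimal sketch of that claim in code, assuming a hypothetical
>> hard-coded table in place of a learned model (the `p` dict and
>> `sample_answer` helper below are illustrative stand-ins, not any
>> particular system):
>>
>> import random
>>
>> # Toy conditional distribution p(A|Q): for each question (plus any
>> # prior context), candidate answers with their probabilities. A real
>> # system would learn p from data rather than hard-code it.
>> p = {
>>     "What color is the sky?": [("Blue.", 0.8), ("Gray today.", 0.2)],
>> }
>>
>> def sample_answer(question):
>>     """Choose an answer A by sampling from the same distribution p(A|Q)."""
>>     answers, weights = zip(*p[question])
>>     return random.choices(answers, weights=weights, k=1)[0]
>>
>> print(sample_answer("What color is the sky?"))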
>>
>>
>> -- Matt Mahoney, matmaho...@yahoo.com
>>
>>
>> ------------------------------
>> *From:* David Jones <davidher...@gmail.com>
>> *To:* agi <agi@v2.listbox.com>
>> *Sent:* Tue, June 29, 2010 3:43:53 PM
>>
>> *Subject:* Re: [agi] A Primary Distinction for an AGI
>>
>> The purpose of text is to convey something. It has to be interpreted. Who
>> cares about predicting the next word if you can't interpret a single bit
>> of it?
>>
>> On Tue, Jun 29, 2010 at 3:43 PM, David Jones <davidher...@gmail.com> wrote:
>>
>>> People do not predict the next words of text. We anticipate them, but
>>> when something different shows up, we accept it if it is *explanatory*.
>>> Compression-like algorithms, though, will never be able to do this type
>>> of explanatory reasoning, which is required to disambiguate text. It is
>>> certainly not sufficient for learning language, which is not at all about
>>> predicting text.
>>>
>>>
>>> On Tue, Jun 29, 2010 at 3:38 PM, Matt Mahoney <matmaho...@yahoo.com> wrote:
>>>
>>>> Experiments in text compression show that text alone is sufficient for
>>>> learning to predict text.
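>>>>
>>>> As a minimal sketch of what such a predictor looks like (a toy order-2
>>>> character model; real compressors build far more elaborate models, so
>>>> treat this as illustration only):
>>>>
>>>> from collections import Counter, defaultdict
>>>>
>>>> def train(text, order=2):
>>>>     """Count which character follows each length-`order` context."""
>>>>     model = defaultdict(Counter)
>>>>     for i in range(len(text) - order):
>>>>         model[text[i:i + order]][text[i + order]] += 1
>>>>     return model
>>>>
>>>> def predict(model, context, order=2):
>>>>     """Most likely next character after the recent context, if seen."""
>>>>     counts = model.get(context[-order:])
>>>>     return counts.most_common(1)[0][0] if counts else None
>>>>
>>>> m = train("the cat sat on the mat. the cat ran after the rat.")
>>>> print(predict(m, "th"))  # -> 'e'
>>>>
>>>> A compressor is scored by the probability it assigns to each next
>>>> character (fewer bits for better predictions), which is why compression
>>>> ratio directly measures how well text alone teaches text prediction.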
>>>>
>>>> I realize that for a machine to pass the Turing test, it needs a visual
>>>> model of the world. Otherwise it would have a hard time with questions
>>>> like "what word in this ernai1 did I spell wrong?" (The misspelling is
>>>> deliberate: "ernai1" looks like "email" on the page.) Obviously the
>>>> easiest way to build a visual model is with vision, but it is not the
>>>> only way.
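>>>>
>>>> One such non-vision route, as a crude sketch: a hand-made table of
>>>> visually confusable glyph sequences standing in for a real visual model
>>>> (the table below is an assumed, incomplete example):
>>>>
>>>> # Assumed table of glyph sequences that look alike in print.
>>>> CONFUSABLE = {"rn": "m", "1": "l", "0": "o", "vv": "w"}
>>>>
>>>> def as_read(word):
>>>>     """Rewrite a word as the eye might read it."""
>>>>     for glyphs, letter in CONFUSABLE.items():
>>>>         word = word.replace(glyphs, letter)
>>>>     return word
>>>>
>>>> print(as_read("ernai1"))  # -> 'email'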
>>>>
>>>>
>>>> -- Matt Mahoney, matmaho...@yahoo.com
>>>>
>>>>
>>>> ------------------------------
>>>> *From:* David Jones <davidher...@gmail.com>
>>>> *To:* agi <agi@v2.listbox.com>
>>>> *Sent:* Tue, June 29, 2010 3:22:33 PM
>>>>
>>>> *Subject:* Re: [agi] A Primary Distinction for an AGI
>>>>
>>>> I certainly agree that the techniques and explanation-generating
>>>> algorithms for learning language are hard-coded into our brain. But those
>>>> techniques alone are not sufficient to learn language in the absence of
>>>> sensory perception or some other way of getting the data required.
>>>>
>>>> Dave
>>>>
>>>> On Tue, Jun 29, 2010 at 3:19 PM, Matt Mahoney <matmaho...@yahoo.com> wrote:
>>>>
>>>>> David Jones wrote:
>>>>> >  The knowledge for interpreting language though should not be
>>>>> pre-programmed.
>>>>>
>>>>> I think that human brains are wired differently than other animals' to
>>>>> make language learning easier. We have not been successful in training
>>>>> other primates to speak, even though they have all the right anatomy,
>>>>> such as vocal cords, tongue, lips, etc. When primates have been taught
>>>>> sign language, they have not successfully mastered forming sentences.
>>>>>
>>>>>
>>>>> -- Matt Mahoney, matmaho...@yahoo.com
>>>>>
>>>>>
>>>>> ------------------------------
>>>>> *From:* David Jones <davidher...@gmail.com>
>>>>> *To:* agi <agi@v2.listbox.com>
>>>>> *Sent:* Tue, June 29, 2010 3:00:09 PM
>>>>>
>>>>> *Subject:* Re: [agi] A Primary Distinction for an AGI
>>>>>
>>>>> The point I was trying to make is that an approach that tries to
>>>>> interpret language using language alone, without sufficient information
>>>>> or any realistic means of acquiring it, *should* fail.
>>>>>
>>>>> On the other hand, an approach that tries to interpret vision with
>>>>> minimal upfront-knowledge needs *should* succeed, because the knowledge
>>>>> required to automatically learn to interpret images is amenable to
>>>>> preprogramming. In addition, such knowledge must be pre-programmed. The
>>>>> knowledge for interpreting language, though, should not be
>>>>> pre-programmed.
>>>>>
>>>>> Dave
>>>>>
>>>>> On Tue, Jun 29, 2010 at 2:51 PM, Matt Mahoney <matmaho...@yahoo.com> wrote:
>>>>>
>>>>>> David Jones wrote:
>>>>>> > I wish people understood this better.
>>>>>>
>>>>>> For example, animals can be intelligent even though they lack language,
>>>>>> because they can see. True, but an AGI with language skills is more
>>>>>> useful than one without.
>>>>>>
>>>>>> And yes, I realize that language, vision, motor skills, hearing, and
>>>>>> all the other senses and outputs are tied together. Skills in any area
>>>>>> make learning the others easier.
>>>>>>
>>>>>>
>>>>>> -- Matt Mahoney, matmaho...@yahoo.com
>>>>>>
>>>>>>
>>>>>>  ------------------------------
>>>>>> *From:* David Jones <davidher...@gmail.com>
>>>>>> *To:* agi <agi@v2.listbox.com>
>>>>>> *Sent:* Tue, June 29, 2010 1:42:51 PM
>>>>>>
>>>>>> *Subject:* Re: [agi] A Primary Distinction for an AGI
>>>>>>
>>>>>> Mike,
>>>>>>
>>>>>> THIS is the flawed reasoning that causes people to ignore vision as
>>>>>> the right way to create AGI. And I've finally come up with a great way to
>>>>>> show you how wrong this reasoning is.
>>>>>>
>>>>>> I'll give you an extremely obvious argument that proves that vision
>>>>>> requires much less knowledge to interpret than language does. Say you
>>>>>> have never been to Egypt and have never seen some particular movie
>>>>>> before. But if you see the movie, an alien landscape, an alien world, a
>>>>>> new place, or any such new visual experience, you can immediately
>>>>>> interpret it in terms of spatial, temporal, compositional, and other
>>>>>> relationships.
>>>>>>
>>>>>> Now, go to Egypt and listen to the people speak. Can you interpret it?
>>>>>> Nope. Why? Because you don't have enough information. The language
>>>>>> itself does not contain any information to help you interpret it. We do
>>>>>> not learn language simply by listening; we learn from evidence of how
>>>>>> the language is used and how it occurs in our daily lives. Without that
>>>>>> experience, you cannot interpret it.
>>>>>>
>>>>>> But with vision, you do not need extra knowledge to interpret a new
>>>>>> situation. You can recognize completely new objects without any
>>>>>> training beyond simply observing them in their natural state.
>>>>>>
>>>>>> I wish people understood this better.
>>>>>>
>>>>>> Dave
>>>>>>
>>>>>> On Tue, Jun 29, 2010 at 12:51 PM, Mike Tintner <tint...@blueyonder.co.uk> wrote:
>>>>>>
>>>>>>> Just off the cuff here - isn't the same true for vision? You can't
>>>>>>> learn vision from vision, just as all NLP has no connection with the
>>>>>>> real world and relies totally on the human programmer's knowledge of
>>>>>>> that world.
>>>>>>>
>>>>>>>
>>>>>>> Your visual program actually relies totally on your visual
>>>>>>> "vocabulary" - not its own. That is the inevitable penalty of
>>>>>>> processing unreal signals on a computer screen, which are no more
>>>>>>> connected to the real world than the verbal/letter signals involved in
>>>>>>> NLP are.
>>>>>>>
>>>>>>> What you need to do - what anyone in your situation with anything
>>>>>>> like your aspirations needs to do - is to hook up with a roboticist.
>>>>>>> Everyone here should be doing that.
>>>>>>>
>>>>>>>
>>>>>>>  *From:* David Jones <davidher...@gmail.com>
>>>>>>> *Sent:* Tuesday, June 29, 2010 5:27 PM
>>>>>>> *To:* agi <agi@v2.listbox.com>
>>>>>>> *Subject:* Re: [agi] A Primary Distinction for an AGI
>>>>>>>
>>>>>>> You can't learn language from language without embedding way more
>>>>>>> knowledge than is reasonable. Language does not contain the
>>>>>>> information required for its interpretation. There is no *reason* to
>>>>>>> prefer any of the infinite possible interpretations; there is nothing
>>>>>>> to explain it, yet explanatory reasoning is required to determine the
>>>>>>> correct real-world interpretation.
>>>>>>>
>>>>>>> On Jun 29, 2010 10:58 AM, "Matt Mahoney" <matmaho...@yahoo.com> wrote:
>>>>>>>
>>>>>>>  David Jones wrote:
>>>>>>> > Natural language requires more than the words on the page in the
>>>>>>> real world. Of...
>>>>>>> Any knowledge that can be demonstrated over a text-only channel (as
>>>>>>> in the Turing test) can also be learned over a text-only channel.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> > Cyc also is trying to store knowledge about a super complicated
>>>>>>> world in simplistic forms and al...
>>>>>>> Cyc failed because it lacks natural language. The vast knowledge
>>>>>>> store of the internet is unintelligible to Cyc. The average person
>>>>>>> can't use it because they don't speak CycL, and because they have
>>>>>>> neither the ability nor the patience to translate their implicit
>>>>>>> thoughts into augmented first-order logic. Cyc's approach was
>>>>>>> understandable when the project started in 1984, when it had neither
>>>>>>> the internet nor the vast computing power required to learn natural
>>>>>>> language from unlabeled examples the way children do.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> > Vision and other sensory interpretation, on the other hand, do not
>>>>>>> require more info because that...
>>>>>>> Without natural language, your system will fail too. You don't have
>>>>>>> enough computing power to learn language, much less the million times
>>>>>>> more computing power you need to learn to see.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> -- Matt Mahoney, matmaho...@yahoo.com
>>>>>>>
>>>>>>> ------------------------------
>>>>>>> *From:* David Jones <davidher...@gmail.com>
>>>>>>> *To:* agi <a...@v2.listbox.c...
>>>>>>> *Sent:* Mon, June 28, 2010 9:28:57 PM
>>>>>>>
>>>>>>> *Subject:* Re: [agi] A Primary Distinction for an AGI
>>>>>>>
>>>>>>>
>>>>>>> Natural language requires more than the words on the page in the real
>>>>>>> world. Of course that didn't ...
>>>>>>
>>>>>
>>>>
>>>
>>>
>
>


