On Thu, Oct 23, 2008 at 12:55 AM, Matt Mahoney wrote:
I suppose you are right. Instead of encoding mathematical rules as a grammar, with enough training data you can just code all possible instances that are likely to be encountered. For example, instead of a grammar rule to encode the [...]
: [agi] Language learning (was Re: Defining AGI)
--- On Wed, 10/22/08, Dr. Matthias Heger [EMAIL PROTECTED] wrote:
You make the implicit assumption that a natural language understanding system will pass the Turing test. Can you prove this?
Furthermore, it is just an assumption that the ability to have and to apply the rules is really necessary to pass the Turing test.
For these two reasons, you still [...]
--- On Wed, 10/22/08, Dr. Matthias Heger [EMAIL PROTECTED] wrote:
You make the implicit assumption that a natural language understanding system will pass the Turing test. Can you prove this?
If you accept that a language model is a probability distribution over text, then I have already [...]
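Matt's "probability distribution over text" framing can be made concrete with a toy model. The sketch below is illustrative only (not from the thread; the corpus, alphabet, and add-one smoothing are my own choices): a character-bigram model that assigns a probability to any string, so that strings resembling its training data score higher than strings that do not.

```python
from collections import Counter

# Tiny training corpus; the alphabet is just the characters it contains.
corpus = "the cat sat on the mat. the cat ran."
alphabet = sorted(set(corpus))

# Count adjacent character pairs and their left contexts.
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus[:-1])

def p_next(prev, ch):
    """P(ch | prev) with add-one smoothing over the alphabet."""
    return (bigrams[(prev, ch)] + 1) / (unigrams[prev] + len(alphabet))

def p_text(text):
    """Probability the model assigns to a whole string
    (first character taken as uniform over the alphabet)."""
    p = 1.0 / len(alphabet)
    for prev, ch in zip(text, text[1:]):
        p *= p_next(prev, ch)
    return p

# A string built from bigrams seen in training outranks a shuffled one.
print(p_text("the cat") > p_text("the tac"))  # prints True
```

Because the smoothed conditionals sum to one over the alphabet, this really is a probability distribution over strings of a given length, which is the sense in which Matt's argument treats language modeling as a well-defined mathematical object.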
Andi wrote:
This really seems more like arguing that there is no such thing as AI-complete at all. That is certainly a possibility. It could be that there are only different competences. This would also seem to mean that there isn't really anything that is truly general about intelligence [...]