On Sun, Jul 6, 2008 at 4:22 AM, Abram Demski <[EMAIL PROTECTED]> wrote:
...

> So the
> question is: is clustering in general powerful enough for AGI? Is it
> fundamental to how minds can and should work?
>

You seem to be referring to *k-means clustering*, which assumes a special
form of *mixture model*, a class of generative models. In any mixture
model, we have to make some strong assumptions, such as:
1) the particular types of distributions we're mixing;
2) the number of clusters, which is fixed in advance.

Because we make such assumptions, the problem can be solved very quickly with
the EM algorithm. Some of these assumptions can be lifted by adding parameter
optimization (local search), but that adds considerable computing time and
still doesn't guarantee optimal solutions.
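To make those assumptions concrete, here is a minimal EM sketch for a
two-component 1-D Gaussian mixture (NumPy only). The toy data, the choice
of Gaussian components, and K=2 are all illustrative assumptions, not
anything from the original question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: samples drawn from two 1-D Gaussians (the "true" clusters).
data = np.concatenate([rng.normal(-2.0, 0.5, 300),
                       rng.normal(3.0, 1.0, 200)])

# The strong assumptions, baked in up front:
# 1) every component is a Gaussian; 2) the number of components K is fixed.
K = 2
means = np.array([-1.0, 1.0])       # rough initial guesses
stds = np.array([1.0, 1.0])
weights = np.full(K, 1.0 / K)

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

for _ in range(50):
    # E-step: posterior responsibility of each component for each point.
    resp = weights * gaussian_pdf(data[:, None], means, stds)  # shape (N, K)
    resp /= resp.sum(axis=1, keepdims=True)

    # M-step: re-estimate the parameters from the responsibilities.
    nk = resp.sum(axis=0)
    weights = nk / len(data)
    means = (resp * data[:, None]).sum(axis=0) / nk
    stds = np.sqrt((resp * (data[:, None] - means) ** 2).sum(axis=0) / nk)

print(np.sort(means))   # should land near the true means, -2 and 3
```

Because the assumptions match the data here, fifty iterations are plenty;
when the true distribution isn't a K-Gaussian mixture, the same loop happily
converges to a wrong answer, which is the weakness discussed above.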

So that answers your question: mixture models place strong assumptions on the
distribution we're drawing samples from, so they're NOT very powerful in
isolation.

If you're searching for a generative model that is general and can
approximate any distribution, you should look, for example, at Restricted
Boltzmann Machines (Geoffrey Hinton):
http://www.scholarpedia.org/article/Boltzmann_machine#Restricted_Boltzmann_machines

Sadly, the best training methods are still computationally expensive, but
they're becoming practical.
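For a feel of what that training looks like, here is a toy RBM trained with
one-step contrastive divergence (CD-1) in plain NumPy. The network sizes,
data patterns, learning rate, and epoch count are all illustrative
assumptions chosen to keep the sketch small, not a recommended recipe:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden = 6, 3
W = rng.normal(0.0, 0.1, (n_visible, n_hidden))
b_v = np.zeros(n_visible)   # visible biases
b_h = np.zeros(n_hidden)    # hidden biases

# Toy binary data: two repeated 6-bit patterns.
data = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]] * 50, dtype=float)

lr = 0.1
for epoch in range(200):
    v0 = data
    # Positive phase: hidden activations given the data.
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # Negative phase: one Gibbs step back down and up again (CD-1).
    p_v1 = sigmoid(h0 @ W.T + b_v)
    v1 = (rng.random(p_v1.shape) < p_v1).astype(float)
    p_h1 = sigmoid(v1 @ W + b_h)
    # Update: difference between data-driven and model-driven correlations.
    W += lr * (v0.T @ p_h0 - v1.T @ p_h1) / len(data)
    b_v += lr * (v0 - v1).mean(axis=0)
    b_h += lr * (p_h0 - p_h1).mean(axis=0)

# Mean-field reconstruction of a training pattern.
p_h = sigmoid(data[:1] @ W + b_h)
recon = sigmoid(p_h @ W.T + b_v)
print(recon.round(2))   # should roughly resemble [1 1 1 0 0 0]
```

Note that CD-1 is the cheap approximation that made RBM training practical;
the expense mentioned above comes from needing many such sweeps (and, for
deep models, stacking several RBMs) on real-sized data.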

Regards,
Durk Kingma



-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
http://www.listbox.com/member/?member_id=8660244&id_secret=106510220-47b225
Powered by Listbox: http://www.listbox.com
