On Mon, Mar 3, 2008 at 6:33 AM, <[EMAIL PROTECTED]> wrote:

> Thanks for that.
>
> Don't you see the way to go on neural nets is hybrid with genetic
> algorithms in mass amounts?
>

No, I don't agree with your buzzword-laden statement :) I've experimented with
EAs + NNs, and it's still intractable when scaled up to nontrivial samples.

Luckily, there exist more efficient learning methods for NNs than search.
For multilayer perceptrons there's standard backpropagation (gradient
descent), conjugate gradient descent, Newton's method, etc. For RBMs,
there's contrastive divergence (CD) or wake-sleep using Gibbs sampling, etc.
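To make the backpropagation option concrete, here's a minimal sketch of my own (not from any particular paper): a one-hidden-layer sigmoid MLP trained on XOR by plain gradient descent. All names and hyperparameters are illustrative choices.

```python
import numpy as np

# Toy problem: XOR, which a single-layer perceptron cannot solve.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 sigmoid units; weights drawn from N(0, 1).
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: chain rule through squared error and sigmoids.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates.
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;  b1 -= lr * d_h.sum(0)

loss = float(((out - y) ** 2).mean())
```

Of course, this is exactly the kind of gradient information that a pure evolutionary search throws away, which is why backprop scales so much better on differentiable models.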

The great thing about RBMs is that, while learning is still slow (a complex
model can take a few days to converge), it's a very, very simple architecture
(just a few matrices) plus very simple learning methods (just a few matrix
multiplications) that SEEMS to be exceptionally good at building a *
generative* model from (labeled or unlabeled) complex data.
With RBMs you can do all kinds of interesting stuff like:
 - confabulate novel samples from the model;
 - compression (although inherently lossy);
 - visualisation in 2D (compress to 2 neurons).
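To show just how little machinery is involved, here's a hypothetical minimal RBM trained with one step of contrastive divergence (CD-1) on toy binary patterns; it's a sketch of the idea, not Hinton's actual code, and all sizes and rates are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

n_vis, n_hid = 6, 4
W = rng.normal(0, 0.1, (n_vis, n_hid))
a = np.zeros(n_vis)   # visible biases
b = np.zeros(n_hid)   # hidden biases

# Toy data: two repeated complementary binary patterns.
data = np.array([[1, 1, 1, 0, 0, 0], [0, 0, 0, 1, 1, 1]] * 10, dtype=float)

lr = 0.1
for epoch in range(500):
    v0 = data
    # Positive phase: hidden activations driven by the data.
    ph0 = sigmoid(v0 @ W + b)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step back down and up again.
    pv1 = sigmoid(h0 @ W.T + a)
    ph1 = sigmoid(pv1 @ W + b)
    # CD-1 update: data-driven minus model-driven statistics.
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
    a += lr * (v0 - pv1).mean(0)
    b += lr * (ph0 - ph1).mean(0)

recon_err = float(((data - pv1) ** 2).mean())
```

Running the Gibbs step from random hidden states instead of from data is how you confabulate novel samples, and reading out the hidden activations gives you the (lossy) compressed code.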

There's a nice Flash demonstration of digit generation/classification:
http://www.cs.toronto.edu/~hinton/adi/index.htm

Has anyone on this list done experiments with this kind of generative model?
I can't find much research on the subject outside of the Univ. of
Toronto's CS group, so the information reaching me might be positively
biased. (I have no affiliation with that group, in case anyone asks.)

Durk

-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
http://www.listbox.com/member/?member_id=8660244&id_secret=95818715-a78a9b
Powered by Listbox: http://www.listbox.com
