> support-vector machines, neural networks have been considered completely
> obsolete in the machine-learning community. From a marketing point of
> view, it is not a good idea to do research on neural networks nowadays.
> You must give your system another name.
That seems to be the case in the ac
[ Digression: Some ANN architectures are more biologically plausible than
others. "Neuromorphic Engineering" is a good search term to see what's going on
along those lines. (But for a beginner project, the standard multi-layer
perceptron with backpropagation would still be the natural choice.)
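For a beginner, that standard setup is small enough to write from scratch. Below is a minimal pure-Python sketch of a 2-3-1 multi-layer perceptron trained by backpropagation on XOR; the layer sizes, learning rate, and epoch count are arbitrary illustrative choices, not anything from this thread:

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class MLP:
    """A 2-input, 3-hidden, 1-output multi-layer perceptron."""
    def __init__(self):
        # Small random weights, plus one bias per unit.
        self.w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)]
        self.b1 = [random.uniform(-1, 1) for _ in range(3)]
        self.w2 = [random.uniform(-1, 1) for _ in range(3)]
        self.b2 = random.uniform(-1, 1)

    def forward(self, x):
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        self.o = sigmoid(sum(w * h for w, h in zip(self.w2, self.h)) + self.b2)
        return self.o

    def backward(self, x, target, lr=0.5):
        # Output delta for squared error through a sigmoid unit.
        d_o = (self.o - target) * self.o * (1 - self.o)
        # Hidden deltas: propagate d_o back through w2.
        d_h = [d_o * w * h * (1 - h) for w, h in zip(self.w2, self.h)]
        # Plain gradient-descent updates.
        for j in range(3):
            self.w2[j] -= lr * d_o * self.h[j]
            self.b1[j] -= lr * d_h[j]
            for i in range(2):
                self.w1[j][i] -= lr * d_h[j] * x[i]
        self.b2 -= lr * d_o

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
net = MLP()

def mse():
    return sum((net.forward(x) - t) ** 2 for x, t in data) / len(data)

before = mse()
for _ in range(5000):
    for x, t in data:
        net.forward(x)
        net.backward(x, t)
after = mse()
```

Scaling this toy up to a Go board mostly means widening the input layer to one or more features per point.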
On Sat, Oct 10, 2009 at 5:32 PM, Álvaro Begué wrote:
> Are you not going to tell us what this new job is about?
>
I almost forgot to answer this; I had no intention of sounding
mysterious. My job is to make autonomous avatars (also called NPCs or
'bots') for a new MMO platform called Blue Mars. The
Also not Remi, but...
Numenta is a startup funded by Palm founder Jeff Hawkins. He started
it as a follow-up to his book 'On Intelligence', which I think is a very
interesting read. I'd suggest it to anyone considering applying some
form of neural simulation to Go or any other problem.
David Fotland wrote:
Remi, what do you think of Numenta http://www.numenta.com/, a startup that
is using feedforward/feedback networks to model learning and pattern
recognition in the neocortex. Does this approach make sense or is it just
startup hype?
http://www.numenta.com/for-developers/edu
I'm not Remi, but I know a bit about Numenta. I gave a "lightning talk"
at their workshop about a year and a half ago. A few people at Numenta
are interested in using their software for Go, and I was working with
one of them before my heart problems stopped that work.
I do not think that th
Remi, what do you think of Numenta http://www.numenta.com/, a startup that
is using feedforward/feedback networks to model learning and pattern
recognition in the neocortex. Does this approach make sense or is it just
startup hype?
http://www.numenta.com/for-developers/education/biological-backg
Neural networks are not considered obsolete by the machine learning
community; in fact there is much active research on neural networks
and the term is understood to be quite general. SVMs are linear
classifiers for hand-engineered features. When a single layer of
template-matchers isn't enough,
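The contrast being drawn here (a linear classifier over hand-engineered features versus needing more than one layer) can be sketched with the classic XOR example. The perceptron below is purely illustrative, not any poster's actual code: on raw inputs no linear classifier can fit XOR, while one hand-engineered feature (the product x1*x2) makes the same data linearly separable.

```python
def perceptron(samples, epochs=200):
    """Classic perceptron learning rule; returns weights (last entry is the bias)."""
    n = len(samples[0][0])
    w = [0.0] * (n + 1)
    for _ in range(epochs):
        for x, label in samples:          # label is +1 or -1
            xb = list(x) + [1.0]          # append a constant bias input
            if label * sum(wi * xi for wi, xi in zip(w, xb)) <= 0:
                w = [wi + label * xi for wi, xi in zip(w, xb)]
    return w

def accuracy(w, samples):
    ok = 0
    for x, label in samples:
        xb = list(x) + [1.0]
        s = sum(wi * xi for wi, xi in zip(w, xb))
        ok += (s > 0) == (label > 0)
    return ok / len(samples)

xor = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), -1)]

# Raw inputs: XOR is not linearly separable, so a linear classifier must fail.
raw_acc = accuracy(perceptron(xor), xor)

# One hand-engineered feature (x1*x2) makes the same data linearly separable.
feat = [((x[0], x[1], x[0] * x[1]), y) for x, y in xor]
feat_acc = accuracy(perceptron(feat), feat)
```

An SVM replaces the hand-built product feature with a kernel, but the point stands: the decision surface is still linear in whatever feature space you chose.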
Hideki Kato: <4ad5e7f1.77%hideki_ka...@ybb.ne.jp>:
>Álvaro Begué: <7b0793ea0910140721l2819723bl12af6c1c3dd9...@mail.gmail.com>:
>>We should let go of this idea that artificial neural networks have
>>anything to do with the brain. ANNs are just a family of parametric
>>functions (often with too many parameters for their own good) and
>>associated tuning algorithms ("learning" is a bit pretentious).
On Wed, Oct 14, 2009 at 03:34:59PM +0300, Petri Pitkanen wrote:
> Neural networks tend to work well in those cases where the evaluation function is
> smooth, like backgammon. Even in backgammon, neural networks do not give good
> results if the situation has the possibility of sudden equity changes, like deep
> backgames and deep anchor games.
Álvaro Begué: <7b0793ea0910140721l2819723bl12af6c1c3dd9...@mail.gmail.com>:
>We should let go of this idea that artificial neural networks have
>anything to do with the brain. ANNs are just a family of parametric
>functions (often with too many parameters for their own good) and
>associated tuning algorithms ("learning" is a bit pretentious).
On Wed, Oct 14, 2009 at 10:21 AM, Álvaro Begué wrote:
> We should let go of this idea that artificial neural networks have
> anything to do with the brain. ANNs are just a family of parametric
> functions (often with too many parameters for their own good) and
> associated tuning algorithms ("learning" is a bit pretentious).
We should let go of this idea that artificial neural networks have
anything to do with the brain. ANNs are just a family of parametric
functions (often with too many parameters for their own good) and
associated tuning algorithms ("learning" is a bit pretentious).
Perhaps they took vague inspiration
IMHO, when applying artificial neural networks to an application, the
structure (as well as the learning algorithm) of the network is very
important. For Go, we haven't yet investigated the mechanism by which the
brain plays it. A backpropagation-style layered network is just a model of
the cerebellum
Petr Baudis wrote:
Hi!
Is there some "high-level reason" hypothesised about why there are
no successful programs using neural networks in Go?
I'd also like to ask if someone has a research tip for some
interesting Go sub-problem that could make for a nice beginner neural
networks project.
I guess neural networks are fine for learning pattern priorities, for
example. There are probably just simpler and faster methods for doing
that.
Anyway, a good project would be learning 3x3 patterns for MC heavy
playouts with a large number of extra features such as exact liberty
counts, di
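The 3x3-pattern idea can be sketched as follows; all names here (pattern_code, PatternPrior) are made up for illustration, and off-board points are crudely encoded as WHITE rather than with the separate edge colour a real program would use. A real playout policy would also fold in the extra features (exact liberty counts, etc.) mentioned above.

```python
EMPTY, BLACK, WHITE = 0, 1, 2

def pattern_code(board, x, y, size):
    """Encode the 3x3 neighbourhood around a candidate move as a base-3 integer."""
    code = 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dx == 0 and dy == 0:
                continue  # the centre is the move point itself, always empty
            nx, ny = x + dx, y + dy
            c = board[ny][nx] if 0 <= nx < size and 0 <= ny < size else WHITE
            code = code * 3 + c
    return code

class PatternPrior:
    """Count how often each 3x3 pattern surrounded an actual move in the
    training games, then use the counts as unnormalised move priorities."""
    def __init__(self):
        self.counts = {}

    def observe(self, board, x, y, size):
        code = pattern_code(board, x, y, size)
        self.counts[code] = self.counts.get(code, 0) + 1

    def priority(self, board, x, y, size):
        # +1 so unseen patterns keep a small non-zero weight.
        return self.counts.get(pattern_code(board, x, y, size), 0) + 1

# Toy usage: observe the same (all-empty neighbourhood) move twice.
size = 3
board = [[EMPTY] * size for _ in range(size)]
prior = PatternPrior()
prior.observe(board, 1, 1, size)
prior.observe(board, 1, 1, size)
```

A playout would then sample moves proportionally to priority; replacing the raw counts with a small network trained on the same features is where the neural-network angle comes in.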
On Wed, Oct 14, 2009 at 02:45:18PM +0200, Erik van der Werf wrote:
> In my opinion NeuroGo was quite successful with neural networks.
> Magog's main strength came from neural networks. Steenvreter uses
> 'neural networks' to set priors in the Monte Carlo Tree.
Ah, you are right, that sounds like fa
In my opinion NeuroGo was quite successful with neural networks.
Magog's main strength came from neural networks. Steenvreter uses
'neural networks' to set priors in the Monte Carlo Tree.
Erik
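None of these programs' internals are given in the thread, but the generic "priors in the tree" idea can be sketched as progressive bias: each child node gets a prior (e.g. a network's move prediction), which dominates selection while the child is unvisited and fades as real visit statistics accumulate. The names and the bias constant below are illustrative; a full implementation would add the usual UCT exploration term as well.

```python
class Node:
    def __init__(self, prior):
        self.prior = prior      # e.g. from a neural network's move prediction
        self.visits = 0
        self.value_sum = 0.0

    def mean(self):
        return self.value_sum / self.visits if self.visits else 0.0

def select(children, c_bias=1.0):
    """Pick the child maximising mean value plus a prior bonus that
    fades as the child accumulates visits (progressive bias)."""
    def score(node):
        return node.mean() + c_bias * node.prior / (node.visits + 1)
    return max(children, key=score)

# Before any visits, the prior decides: the 0.8-prior child is tried first.
fresh = [Node(0.1), Node(0.8)]
first_pick = select(fresh)

# Once statistics accumulate, observed value overrides a misleading prior.
good = Node(0.1); good.visits = 10;  good.value_sum = 6.0    # mean 0.6
hyped = Node(0.8); hyped.visits = 100; hyped.value_sum = 20.0  # mean 0.2
later_pick = select([good, hyped])
```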
On Wed, Oct 14, 2009 at 2:26 PM, Petr Baudis wrote:
> Hi!
>
> Is there some "high-level reason" hypothesised about why there are
> no successful programs using neural networks in Go?
Neural networks tend to work well in those cases where the evaluation function is
smooth, like backgammon. Even in backgammon, neural networks do not give good
results if the situation has the possibility of sudden equity changes, like deep
backgames and deep anchor games. Top backgammon programs run a 3-ply search on top
n
Hi!
Is there some "high-level reason" hypothesised about why there are
no successful programs using neural networks in Go?
I'd also like to ask if someone has a research tip for some
interesting Go sub-problem that could make for a nice beginner neural
networks project.
Thanks,
--