fmap maps a function over the values in a data structure, returning an
equivalent data structure containing the function's results. This is
different from map, which applies a function to a sequence and returns
another sequence.
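A minimal sketch of the distinction (this fmap is hand-rolled for maps only; clojure.contrib at the time, and later clojure.algo.generic.functor, provide a generic fmap):

```clojure
;; fmap for maps: apply f to each value, get a map back.
(defn fmap [f m]
  (into {} (for [[k v] m] [k (f v)])))

(fmap inc {:a 1 :b 2})   ;; => {:a 2, :b 3}  -- still a map

;; map always returns a sequence, whatever it is given.
(map inc [1 2 3])        ;; => (2 3 4)
```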
Very neat, Konrad! That sure saves some typing.
I always find
On 09.06.2009, at 20:07, alfred.morgan.al...@gmail.com wrote:
Thanks for the advice, but at present I'm simply aiming to get the
very basics of a neural net up and running without having to worry
about a training algorithm at all. Here's what I have so far (again,
very basic)
;; Net0
Hi! I would expect the implementation of any neural network to be
dictated by the particular mathematical/algorithmic description. I am
not at all sure what description might have given rise to your code.
Do you have any particular type of neural network in mind? Or any
particular task to
Thanks for all the tips, particularly on the subject of refs and atoms;
that makes things substantially simpler now. (Pointer math...
Grrr...)
I also wonder why you consider the construction of the network graph
(as above) to be an inherently stateful activity. And why you choose
to have those
Again, thanks for all the help. One last question, though: how would
I apply the 'map' function to an actual associative mapping? I mean:
(defn testFunc [x] (* x 2))
(println (map testFunc {:a 1 :b 2 :c 3}))
The items in an associative container come out as a pair, which you can
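For example, a sketch of both ways of handling the pairs (reusing the doubling function from the question above):

```clojure
(defn testFunc [x] (* x 2))

(def m {:a 1 :b 2 :c 3})

;; Each item of a map, viewed as a seq, is a [key value] pair,
;; so destructure it:
(map (fn [[k v]] [k (testFunc v)]) m)   ;; => ([:a 2] [:b 4] [:c 6])

;; To get a map back, pour the pairs into an empty map:
(into {} (map (fn [[k v]] [k (testFunc v)]) m))
;; => {:a 2, :b 4, :c 6}

;; Or, if only the values matter, map over (vals m):
(map testFunc (vals m))                 ;; => (2 4 6)
```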
I'm pretty well a complete beginner at clojure, but I was hoping I
could get some advice on how to do this sort of thing efficiently/
concisely, because as far as I can tell this involves handling an
awful lot of heavily mutable state, so right now I really feel like
I'm fighting the language.
On Jun 9, 2009, at 14:59, alfred.morgan.al...@gmail.com wrote:
I'm pretty well a complete beginner at clojure, but I was hoping I
could get some advice on how to do this sort of thing efficiently/
concisely, because as far as I can tell this involves handling an
awful lot of heavily mutable
Thanks for the advice, but at present I'm simply aiming to get the
very basics of a neural net up and running without having to worry
about a training algorithm at all. Here's what I have so far (again,
very basic)
;; Net0
(def nodes {})
(defn insertNode [node]
(do (def nodes (assoc nodes
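Following the refs-and-atoms advice given elsewhere in this thread, a hedged sketch of the same idea without re-def'ing a var (the insert-node name, id argument, and node shape are assumptions, not the original code):

```clojure
;; Hold the mutable network in an atom instead of redefining a var.
(def nodes (atom {}))

(defn insert-node [id node]
  ;; swap! applies assoc atomically and returns the new map.
  (swap! nodes assoc id node))

(insert-node :n1 {:inputs {} :outputs {}})
@nodes   ;; => {:n1 {:inputs {}, :outputs {}}}
```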
On Jun 9, 11:07 am, alfred.morgan.al...@gmail.com wrote:
Thanks for the advice, but at present I'm simply aiming to get the
very basics of a neural net up and running without having to worry
about a training algorithm at all. Here's what I have so far (again,
On Jun 10, 2:07 am, alfred.morgan.al...@gmail.com wrote:
Thanks for the advice, but at present I'm simply aiming to get the
very basics of a neural net up and running without having to worry
about a training algorithm at all. Here's what I have so far (again,
On Jun 10, 9:55 am, Asbjørn Bjørnstad asbj...@gmail.com wrote:
(defn connect-node [m a b weight]
  (update-in (update-in m [a :outputs] assoc b weight)
             [b :inputs] assoc a weight))
Ugh... Looks better this way:
(defn connect-node [m a b weight]
  (-> m
      (update-in [a :outputs] assoc b weight)
      (update-in [b :inputs] assoc a weight)))
I'm playing around with neural networks and went for a functional
approach. There's some code at
http://github.com/fffej/ClojureProjects/tree/master
in the neural-networks directory. See
http://www.fatvat.co.uk/2009/06/back-propagation-algorithm-in-clojure.html
for some explanation.
Lack of