Dear Sirs,
After having made a first contact with neural networks, I have come up with the attached summary.
The next step will be to get accustomed to JavaSNNS.
What I'd like to know is: what should the step after that be? Does anyone have an idea?
Regards,
Khairy
units, links, output unit, hidden unit, input units, unit activation, activation function, output function
sites : allow a grouping and different treatment of the input signals of a cell
The actual information processing within the units is modeled in the SNNS
simulator with the activation function and the output function.
The activation function
first computes the net input of the unit from the weighted output
values of prior units.
It then computes the new activation from this net input (and possibly
its previous activation).
The output function takes this result and generates the output of the unit.
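The two-stage processing described above (activation function over the net input, then output function) can be sketched as follows. The weights, threshold, and the choice of logistic activation with identity output are illustrative assumptions, not taken from the notes beyond the examples they name:

```python
import math

def net_input(weights, outputs):
    """Net input: weighted sum of the outputs of prior units."""
    return sum(w * o for w, o in zip(weights, outputs))

def f_act(net, threshold):
    """Activation function; the logistic function is one common choice."""
    return 1.0 / (1.0 + math.exp(-(net - threshold)))

def f_out(activation):
    """Output function; here simply the identity."""
    return activation

prior_outputs = [1.0, 0.0, 1.0]  # example outputs of prior units
weights = [0.5, -0.3, 0.8]       # example link weights

net = net_input(weights, prior_outputs)
out = f_out(f_act(net, threshold=0.2))
```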
24/6/06
unit attributes :
no
name
io-type : input, output, dual, hidden, special input, special output, special hidden
activation
initial activation
output
bias
activation function
activation formula
a_j(t+1) = f_act(net_j(t), a_j(t), threshold(j))
where
a_j (t+1) : activation of unit j in step t+1
a_j (t) : activation of unit j in step t
net_j (t) : net input of unit j in step t :
net_j (t) = sum_i(w_ij * o_i(t)) : weighted sum of the outputs of preceding units
threshold (j) : bias of unit j
f_act : example : logistic function :
a_j(t+1) = 1/(1 + e^(-(net_j(t) - threshold(j))))
Note : a_j(t) should belong to ]0,1[
output function or outFunc
o_j (t) = f_out (a_j(t))
example: f_out is the identity
f-type : used for grouping units into sets of units
position : coordinates in space
subnet no : number of the subnetwork to which the unit belongs
layers: allow an easy representation of units
frozen : when this flag is true, the unit's activation and output do not
change during propagation
Connections (links)
links are made from a source unit to a target unit
recursive links (a unit connected to itself) are possible
redundant links (more than one link between the same two units) are prohibited
weight < 0 : inhibitory connection
weight > 0 : excitatory connection
bottom-up architecture : the input links come only from preceding
layers => feed-forward network
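A minimal sketch of such a feed-forward pass, where each layer reads only the outputs of the preceding layer; the two-layer shape and all weights are made-up examples:

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(layer_input, layer_weights):
    """Propagate bottom-up; weights[l][j] holds the incoming weights of unit j
    in layer l, all coming from the preceding layer only."""
    outputs = layer_input
    for weights in layer_weights:
        outputs = [logistic(sum(w * o for w, o in zip(row, outputs)))
                   for row in weights]
    return outputs

hidden_w = [[0.5, -0.2], [0.3, 0.8]]  # 2 input units -> 2 hidden units
output_w = [[1.0, -1.0]]              # 2 hidden units -> 1 output unit
result = forward([1.0, 0.0], [hidden_w, output_w])
```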
Update modes
synchronous : the activation values of all units are calculated first (in
arbitrary order), then the output of each unit is calculated.
random permutation : each unit computes its activation and then its output.
Units are processed in random order, and every unit is processed exactly once.
random : like random permutation, but it is not guaranteed that all units
will be processed, and a unit may be updated more than once.
serial : units are processed in ascending order of unit id
topological : the processing order depends on the network topology
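The processing orders of the serial, random permutation, and random modes can be contrasted in a small sketch (hypothetical helper names, not SNNS's actual API):

```python
import random

def serial_order(unit_ids):
    """Serial mode: ascending unit id."""
    return sorted(unit_ids)

def random_permutation_order(unit_ids):
    """Random permutation: every unit exactly once, in random order."""
    order = list(unit_ids)
    random.shuffle(order)
    return order

def random_order(unit_ids):
    """Random mode: draws with replacement, so a unit may be skipped
    or updated more than once."""
    ids = list(unit_ids)
    return [random.choice(ids) for _ in ids]

ids = [3, 1, 2]
serial = serial_order(ids)
perm = random_permutation_order(ids)
rand = random_order(ids)
```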
Learning in Neural Nets
Forward propagation phase : an input pattern is presented to the network.
The input is propagated until it reaches the output layer.
Backward propagation phase : link weights are updated according to the
Hebbian rule.
online learning : weights are updated after each pattern
offline (batch) learning : weight changes are accumulated over all patterns
and applied in one step
Example of an online learning algorithm : the backpropagation weight update
rule.
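The online/offline difference can be seen in a toy sketch using a simple delta rule on one linear unit (an illustrative stand-in, not the backpropagation rule itself): with the same two patterns, the online version already uses the first weight update when it sees the second pattern, while the offline version accumulates both changes against the original weights.

```python
def online_update(w, patterns, lr=0.1):
    """Online: weights change after every pattern."""
    for x, target in patterns:
        out = sum(wi * xi for wi, xi in zip(w, x))
        err = target - out
        w = [wi + lr * err * xi for wi, xi in zip(w, x)]
    return w

def offline_update(w, patterns, lr=0.1):
    """Offline (batch): changes are summed over all patterns, applied once."""
    delta = [0.0] * len(w)
    for x, target in patterns:
        out = sum(wi * xi for wi, xi in zip(w, x))
        err = target - out
        delta = [d + lr * err * xi for d, xi in zip(delta, x)]
    return [wi + d for wi, d in zip(w, delta)]

patterns = [([1.0], 1.0), ([1.0], 1.0)]
w_online = online_update([0.0], patterns)    # second pattern sees the first update
w_offline = offline_update([0.0], patterns)  # both errors computed at w = [0.0]
```

With identical data the two modes already diverge after one pass, which is why the notes distinguish them.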
Generalization of Neural Networks
The training samples are divided into 3 sets:
Training set
Validation set
Test set
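A minimal sketch of such a three-way split; the 60/20/20 ratios and the shuffling are illustrative choices, not prescribed by the notes:

```python
import random

def split_samples(samples, train=0.6, valid=0.2, seed=0):
    """Shuffle the samples, then cut them into training, validation,
    and test sets; whatever remains after the first two cuts is the test set."""
    rng = random.Random(seed)
    samples = list(samples)
    rng.shuffle(samples)
    n_train = int(len(samples) * train)
    n_valid = int(len(samples) * valid)
    return (samples[:n_train],
            samples[n_train:n_train + n_valid],
            samples[n_train + n_valid:])

train_set, valid_set, test_set = split_samples(range(10))
```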
_______________________________________________
Developer mailing list
[email protected]
http://lists.arabeyes.org/mailman/listinfo/developer

