https://openreview.net/forum?id=PdauS7wZBfC

Abstract: The backpropagation of error (backprop) is a powerful algorithm 
for training machine learning architectures through end-to-end 
differentiation. Recently it has been shown that backprop in 
multilayer perceptrons (MLPs) can be approximated using predictive coding, 
a biologically plausible process theory of cortical computation which 
relies solely on local and Hebbian updates. The power of backprop, however, 
lies not in its instantiation in MLPs, but rather in the concept of 
automatic differentiation which allows for the optimisation of any 
differentiable program expressed as a computation graph. Here, we 
demonstrate that predictive coding converges asymptotically (and in 
practice rapidly) to exact backprop gradients on arbitrary computation 
graphs using only local learning rules. We apply this result to develop a 
straightforward strategy to translate core machine learning architectures 
into their predictive coding equivalents. We construct predictive coding 
CNNs, RNNs, and the more complex LSTMs, which include a non-layer-like 
branching internal graph structure and multiplicative interactions. Our 
models perform equivalently to backprop on challenging machine learning 
benchmarks, while utilising only local and (mostly) Hebbian plasticity. Our 
method raises the potential that standard machine learning algorithms could 
in principle be directly implemented in neural circuitry, and may also 
contribute to the development of completely distributed neuromorphic 
architectures.
One-sentence Summary: We show that predictive coding algorithms from 
neuroscience can be set up to approximate the backpropagation of error 
algorithm on any computation graph.
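The core claim can be illustrated numerically. Below is a minimal NumPy sketch, not the paper's implementation: a toy two-layer network under the fixed-prediction variant of predictive coding, where predictions are frozen at their feedforward values and only the value nodes relax. All variable names (W1, x0, eps1, etc.) are illustrative assumptions. At the fixed point of the purely local inference dynamics, the prediction errors equal the backprop deltas (up to sign), so the local Hebbian-style weight update recovers the backprop gradient.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network: y = W2 @ tanh(W1 @ x0); loss = 0.5 * ||y - t||^2.
n_in, n_hid, n_out = 4, 5, 3
W1 = rng.normal(size=(n_hid, n_in)) * 0.5
W2 = rng.normal(size=(n_out, n_hid)) * 0.5
x0 = rng.normal(size=(n_in, 1))   # input (clamped)
t = rng.normal(size=(n_out, 1))   # target

# --- Standard backprop, for comparison ---
a1 = W1 @ x0                       # hidden pre-activation
h = np.tanh(a1)
y = W2 @ h
delta2 = y - t                     # dL/dy
delta1 = (1 - np.tanh(a1) ** 2) * (W2.T @ delta2)  # dL/da1
dW1_bp = delta1 @ x0.T

# --- Predictive coding inference (fixed-prediction variant) ---
# Predictions mu are frozen at feedforward values; the output value
# node is clamped to the target, and the hidden value node x1 relaxes
# by gradient descent on the local energy 0.5*||eps1||^2 + 0.5*||eps2||^2.
mu1, mu2 = a1, y
x1 = mu1.copy()
x2 = t.copy()                      # output clamped to target
lr = 0.2
for _ in range(300):
    eps1 = x1 - mu1
    eps2 = x2 - mu2                # = t - y, constant under clamping
    # Local update: only x1's own error and its children's errors appear.
    x1 -= lr * (eps1 - (1 - np.tanh(mu1) ** 2) * (W2.T @ eps2))

eps1 = x1 - mu1
# Converged error equals the backprop delta up to sign: eps1 = -delta1,
# so the local outer-product update matches the backprop gradient.
dW1_pc = eps1 @ x0.T
print(np.max(np.abs(dW1_pc + dW1_bp)))  # ~0
```

The inference loop is a contraction toward eps1 = f'(mu1) * (W2^T eps2), which is exactly the backprop recursion for delta1 with the sign flipped (since eps2 = t - y = -delta2); this is the two-layer special case of the paper's claim of asymptotic convergence on arbitrary graphs.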

https://iclr.cc/
The Ninth International Conference on Learning Representations (Virtual 
Only, 2021)

@philipthrift
