Abstract
Feedforward inhibition and synaptic scaling are important adaptive processes that control the total input a neuron can receive from its afferents. While often studied in isolation, the two have been reported to co-occur in various brain regions. However, the functional implications of their interaction remain unclear. Based on a probabilistic modeling approach, we show here that fast feedforward inhibition and synaptic scaling interact synergistically during unsupervised learning. In technical terms, we model the input to a neural circuit using a normalized mixture model with Poisson noise. We demonstrate analytically and numerically that, in the presence of lateral inhibition introducing competition between different neurons, Hebbian plasticity and synaptic scaling approximate the optimal maximum likelihood solutions for this model. Our results suggest that, beyond its conventional use as a mechanism to remove undesired pattern variations, input normalization can make typical neural interaction and learning rules optimal on the stimulus subspace defined through feedforward inhibition. Furthermore, learning within this subspace is more efficient in practice, as it helps avoid locally optimal solutions. Overall, these results point to a close connection between feedforward inhibition and synaptic scaling, with potentially important functional implications for general cortical processing.
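The abstract's core idea can be illustrated with a minimal simulation sketch: inputs are divisively normalized to a fixed total (standing in for fast feedforward inhibition), neurons compete via a softmax posterior over causes of a Poisson mixture (standing in for lateral inhibition), and weights are updated by a Hebbian step followed by rescaling to a fixed afferent sum (synaptic scaling). This is an illustrative interpretation under stated assumptions, not the authors' exact model or parameter choices; all names and constants (`A`, `eps`, the data-generation scheme) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

D, C, A = 16, 4, 50.0  # input dimensions, hidden causes, normalization constant

# Synthetic data: C ground-truth patterns, Poisson noise, then divisive
# normalization of each input vector to a fixed total A (a stand-in for
# fast feedforward inhibition).
protos = rng.random((C, D)) + 0.1
protos = A * protos / protos.sum(axis=1, keepdims=True)
labels = rng.integers(0, C, size=500)
Y = rng.poisson(protos[labels]).astype(float)
Y = A * Y / np.maximum(Y.sum(axis=1, keepdims=True), 1e-9)

# Weights initialized randomly, then scaled so each neuron's afferent
# weights sum to A (a stand-in for synaptic scaling).
W = rng.random((C, D)) + 0.1
W = A * W / W.sum(axis=1, keepdims=True)

eps = 0.05  # learning rate (assumed value)
for _ in range(30):
    for y in Y:
        # Soft competition via lateral inhibition: posterior over causes
        # of a Poisson mixture with equal priors (terms constant across
        # causes drop out because every row of W sums to A).
        logp = y @ np.log(W).T
        s = np.exp(logp - logp.max())
        s /= s.sum()
        # Hebbian step (pre * post), then synaptic scaling back to sum A.
        W += eps * s[:, None] * y[None, :]
        W = A * W / W.sum(axis=1, keepdims=True)
```

Because the scaling step keeps every weight vector on the same simplex as the normalized inputs, the Hebbian update effectively moves each neuron's weights toward the mean of the inputs it wins, which is the flavor of approximate maximum-likelihood behavior the abstract describes.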
Original language | English (US) |
---|---|
Article number | e1002432 |
Journal | PLoS Computational Biology |
Volume | 8 |
Issue number | 3 |
DOIs | 10.1371/journal.pcbi.1002432 |
State | Published - Mar 2012 |
ASJC Scopus subject areas
- Ecology, Evolution, Behavior and Systematics
- Modeling and Simulation
- Ecology
- Molecular Biology
- Genetics
- Cellular and Molecular Neuroscience
- Computational Theory and Mathematics
Cite this
Feedforward inhibition and synaptic scaling - Two sides of the same coin? / Keck, Christian; Savin, Cristina; Lücke, Jörg.
In: PLoS Computational Biology, Vol. 8, No. 3, e1002432, 03.2012.
Research output: Contribution to journal › Article
TY - JOUR
T1 - Feedforward inhibition and synaptic scaling - Two sides of the same coin?
AU - Keck, Christian
AU - Savin, Cristina
AU - Lücke, Jörg
PY - 2012/3
Y1 - 2012/3
UR - http://www.scopus.com/inward/record.url?scp=84861110370&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84861110370&partnerID=8YFLogxK
U2 - 10.1371/journal.pcbi.1002432
DO - 10.1371/journal.pcbi.1002432
M3 - Article
C2 - 22457610
AN - SCOPUS:84861110370
VL - 8
JO - PLoS Computational Biology
JF - PLoS Computational Biology
SN - 1553-734X
IS - 3
M1 - e1002432
ER -