A population density approach that facilitates large-scale modeling of neural networks: Analysis and an application to orientation tuning

Duane Q. Nykamp, Daniel Tranchina

Research output: Contribution to journal › Article

Abstract

We explore a computationally efficient method of simulating realistic networks of neurons introduced by Knight, Manin, and Sirovich in which integrate-and-fire neurons are grouped into large populations of similar neurons. For each population, we form a probability density that represents the distribution of neurons over all possible states. The populations are coupled via stochastic synapses in which the conductance of a neuron is modulated according to the firing rates of its presynaptic populations. The evolution equation for each of these probability densities is a partial differential-integral equation, which we solve numerically. Results obtained for several example networks are tested against conventional computations for groups of individual neurons. We apply this approach to modeling orientation tuning in the visual cortex. Our population density model is based on the recurrent feedback model of a hypercolumn in cat visual cortex of Somers et al. We simulate the response to oriented flashed bars. As in the Somers model, a weak orientation bias provided by feed-forward lateral geniculate input is transformed by intracortical circuitry into sharper orientation tuning that is independent of stimulus contrast. The population density approach appears to be a viable method for simulating large neural networks. Its computational efficiency overcomes some of the restrictions imposed by computation time in individual neuron simulations, allowing one to build more complex networks and to explore parameter space more easily. The method produces smooth rate functions with one pass of the stimulus and does not require signal averaging. At the same time, this model captures the dynamics of single-neuron activity that are missed in simple firing-rate models.
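The density-evolution idea described above can be sketched in its simplest form: a population of perfect (non-leaky) integrate-and-fire neurons driven by Poisson input, where each synaptic event causes a fixed voltage jump. Instead of tracking individual neurons, we evolve the probability mass over a discretized voltage grid; the mass flux across threshold per time step gives the population firing rate. All function names and parameter values below are illustrative, not taken from the paper, and the sketch omits the leak term, stochastic conductances, and the random jump-size distribution of the full model.

```python
import numpy as np

def simulate_density(nu_in=500.0, T=0.5, dt=1e-4, n_v=100, k=5):
    """Evolve a probability density over membrane voltage for a
    population of perfect integrate-and-fire neurons.

    nu_in : Poisson input rate per neuron (events/s)
    n_v   : number of voltage bins on [0, threshold)
    k     : voltage jump per synaptic event, in bins
    Returns (population firing rate over time in Hz, final density).
    """
    p = np.zeros(n_v)
    p[0] = 1.0                      # all probability mass starts at reset
    rates = []
    for _ in range(int(T / dt)):
        jump_prob = nu_in * dt      # fraction of neurons receiving an event
        moved = jump_prob * p       # mass that jumps this step
        shifted = np.zeros(n_v)
        shifted[k:] = moved[:-k]    # mass landing k bins higher
        fired = moved[-k:].sum()    # mass carried past threshold
        p = p - moved + shifted
        p[0] += fired               # fired neurons reset to v = 0
        rates.append(fired / dt)    # population firing rate (Hz)
    return np.array(rates), p

rates, p = simulate_density()
```

With these illustrative numbers each neuron needs `n_v / k = 20` events to reach threshold, so the steady-state population rate approaches `nu_in / 20 = 25` Hz while total probability mass is conserved. The density yields a smooth rate curve from a single pass, with no trial averaging, which is the efficiency argument the abstract makes.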

Original language: English (US)
Pages (from-to): 19-50
Number of pages: 32
Journal: Journal of Computational Neuroscience
Volume: 8
Issue number: 1
DOI: 10.1023/A:1008912914816
State: Published - 2000


Keywords

  • Modeling
  • Neural networks
  • Orientation tuning
  • Population density
  • Visual cortex

ASJC Scopus subject areas

  • Neuroscience (all)

Cite this

@article{742f2ff6df574e3bbb79c908cdbffa00,
title = "A population density approach that facilitates large-scale modeling of neural networks: Analysis and an application to orientation tuning",
abstract = "We explore a computationally efficient method of simulating realistic networks of neurons introduced by Knight, Manin, and Sirovich in which integrate-and-fire neurons are grouped into large populations of similar neurons. For each population, we form a probability density that represents the distribution of neurons over all possible states. The populations are coupled via stochastic synapses in which the conductance of a neuron is modulated according to the firing rates of its presynaptic populations. The evolution equation for each of these probability densities is a partial differential-integral equation, which we solve numerically. Results obtained for several example networks are tested against conventional computations for groups of individual neurons. We apply this approach to modeling orientation tuning in the visual cortex. Our population density model is based on the recurrent feedback model of a hypercolumn in cat visual cortex of Somers et al. We simulate the response to oriented flashed bars. As in the Somers model, a weak orientation bias provided by feed-forward lateral geniculate input is transformed by intracortical circuitry into sharper orientation tuning that is independent of stimulus contrast. The population density approach appears to be a viable method for simulating large neural networks. Its computational efficiency overcomes some of the restrictions imposed by computation time in individual neuron simulations, allowing one to build more complex networks and to explore parameter space more easily. The method produces smooth rate functions with one pass of the stimulus and does not require signal averaging. At the same time, this model captures the dynamics of single-neuron activity that are missed in simple firing-rate models.",
keywords = "Modeling, Neural networks, Orientation tuning, Population density, Visual cortex",
author = "Nykamp, {Duane Q.} and Daniel Tranchina",
year = "2000",
doi = "10.1023/A:1008912914816",
language = "English (US)",
volume = "8",
pages = "19--50",
journal = "Journal of Computational Neuroscience",
issn = "0929-5313",
publisher = "Springer Netherlands",
number = "1",
}

TY - JOUR

T1 - A population density approach that facilitates large-scale modeling of neural networks

T2 - Analysis and an application to orientation tuning

AU - Nykamp, Duane Q.

AU - Tranchina, Daniel

PY - 2000

Y1 - 2000

N2 - We explore a computationally efficient method of simulating realistic networks of neurons introduced by Knight, Manin, and Sirovich in which integrate-and-fire neurons are grouped into large populations of similar neurons. For each population, we form a probability density that represents the distribution of neurons over all possible states. The populations are coupled via stochastic synapses in which the conductance of a neuron is modulated according to the firing rates of its presynaptic populations. The evolution equation for each of these probability densities is a partial differential-integral equation, which we solve numerically. Results obtained for several example networks are tested against conventional computations for groups of individual neurons. We apply this approach to modeling orientation tuning in the visual cortex. Our population density model is based on the recurrent feedback model of a hypercolumn in cat visual cortex of Somers et al. We simulate the response to oriented flashed bars. As in the Somers model, a weak orientation bias provided by feed-forward lateral geniculate input is transformed by intracortical circuitry into sharper orientation tuning that is independent of stimulus contrast. The population density approach appears to be a viable method for simulating large neural networks. Its computational efficiency overcomes some of the restrictions imposed by computation time in individual neuron simulations, allowing one to build more complex networks and to explore parameter space more easily. The method produces smooth rate functions with one pass of the stimulus and does not require signal averaging. At the same time, this model captures the dynamics of single-neuron activity that are missed in simple firing-rate models.

AB - We explore a computationally efficient method of simulating realistic networks of neurons introduced by Knight, Manin, and Sirovich in which integrate-and-fire neurons are grouped into large populations of similar neurons. For each population, we form a probability density that represents the distribution of neurons over all possible states. The populations are coupled via stochastic synapses in which the conductance of a neuron is modulated according to the firing rates of its presynaptic populations. The evolution equation for each of these probability densities is a partial differential-integral equation, which we solve numerically. Results obtained for several example networks are tested against conventional computations for groups of individual neurons. We apply this approach to modeling orientation tuning in the visual cortex. Our population density model is based on the recurrent feedback model of a hypercolumn in cat visual cortex of Somers et al. We simulate the response to oriented flashed bars. As in the Somers model, a weak orientation bias provided by feed-forward lateral geniculate input is transformed by intracortical circuitry into sharper orientation tuning that is independent of stimulus contrast. The population density approach appears to be a viable method for simulating large neural networks. Its computational efficiency overcomes some of the restrictions imposed by computation time in individual neuron simulations, allowing one to build more complex networks and to explore parameter space more easily. The method produces smooth rate functions with one pass of the stimulus and does not require signal averaging. At the same time, this model captures the dynamics of single-neuron activity that are missed in simple firing-rate models.

KW - Modeling

KW - Neural networks

KW - Orientation tuning

KW - Population density

KW - Visual cortex

UR - http://www.scopus.com/inward/record.url?scp=0034076273&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0034076273&partnerID=8YFLogxK

U2 - 10.1023/A:1008912914816

DO - 10.1023/A:1008912914816

M3 - Article

C2 - 10798498

AN - SCOPUS:0034076273

VL - 8

SP - 19

EP - 50

JO - Journal of Computational Neuroscience

JF - Journal of Computational Neuroscience

SN - 0929-5313

IS - 1

ER -