Optimal neural inference of stimulus intensities

Travis Monk, Cristina Savin, Jörg Lücke

Research output: Contribution to journal › Article

Abstract

In natural data, the class and intensity of stimuli are correlated. Current machine learning algorithms ignore this ubiquitous statistical property of stimuli, usually by requiring normalized inputs. From a biological perspective, it remains unclear how neural circuits may account for these dependencies in inference and learning. Here, we use a probabilistic framework to model class-specific intensity variations, and we derive approximate inference and online learning rules which reflect common hallmarks of neural computation. Concretely, we show that a neural circuit equipped with specific forms of synaptic and intrinsic plasticity (IP) can learn the class-specific features and intensities of stimuli simultaneously. Our model provides a normative interpretation of IP as a critical part of sensory learning and predicts that neurons can represent nontrivial input statistics in their excitabilities. Computationally, our approach yields improved statistical representations for realistic datasets in the visual and auditory domains. In particular, we demonstrate the utility of the model in estimating the contrastive stress of speech.

Original language: English (US)
Article number: 10038
Journal: Scientific Reports
Volume: 8
Issue number: 1
DOI: 10.1038/s41598-018-28184-5
State: Published - Dec 1 2018

Fingerprint

  • Plasticity
  • Networks (circuits)
  • Learning algorithms
  • Neurons
  • Learning systems
  • Statistics

ASJC Scopus subject areas

  • General

Cite this

Optimal neural inference of stimulus intensities. / Monk, Travis; Savin, Cristina; Lücke, Jörg.

In: Scientific Reports, Vol. 8, No. 1, 10038, 01.12.2018.

Research output: Contribution to journal › Article

Monk, Travis ; Savin, Cristina ; Lücke, Jörg. / Optimal neural inference of stimulus intensities. In: Scientific Reports. 2018 ; Vol. 8, No. 1.
@article{1d460d24d0de4144b16a1addc07d900c,
  title = "Optimal neural inference of stimulus intensities",
  abstract = "In natural data, the class and intensity of stimuli are correlated. Current machine learning algorithms ignore this ubiquitous statistical property of stimuli, usually by requiring normalized inputs. From a biological perspective, it remains unclear how neural circuits may account for these dependencies in inference and learning. Here, we use a probabilistic framework to model class-specific intensity variations, and we derive approximate inference and online learning rules which reflect common hallmarks of neural computation. Concretely, we show that a neural circuit equipped with specific forms of synaptic and intrinsic plasticity (IP) can learn the class-specific features and intensities of stimuli simultaneously. Our model provides a normative interpretation of IP as a critical part of sensory learning and predicts that neurons can represent nontrivial input statistics in their excitabilities. Computationally, our approach yields improved statistical representations for realistic datasets in the visual and auditory domains. In particular, we demonstrate the utility of the model in estimating the contrastive stress of speech.",
  author = "Travis Monk and Cristina Savin and J{\"o}rg L{\"u}cke",
  year = "2018",
  month = "12",
  day = "1",
  doi = "10.1038/s41598-018-28184-5",
  language = "English (US)",
  volume = "8",
  journal = "Scientific Reports",
  issn = "2045-2322",
  publisher = "Nature Publishing Group",
  number = "1",
}

TY - JOUR
T1 - Optimal neural inference of stimulus intensities
AU - Monk, Travis
AU - Savin, Cristina
AU - Lücke, Jörg
PY - 2018/12/1
Y1 - 2018/12/1
N2 - In natural data, the class and intensity of stimuli are correlated. Current machine learning algorithms ignore this ubiquitous statistical property of stimuli, usually by requiring normalized inputs. From a biological perspective, it remains unclear how neural circuits may account for these dependencies in inference and learning. Here, we use a probabilistic framework to model class-specific intensity variations, and we derive approximate inference and online learning rules which reflect common hallmarks of neural computation. Concretely, we show that a neural circuit equipped with specific forms of synaptic and intrinsic plasticity (IP) can learn the class-specific features and intensities of stimuli simultaneously. Our model provides a normative interpretation of IP as a critical part of sensory learning and predicts that neurons can represent nontrivial input statistics in their excitabilities. Computationally, our approach yields improved statistical representations for realistic datasets in the visual and auditory domains. In particular, we demonstrate the utility of the model in estimating the contrastive stress of speech.
AB - In natural data, the class and intensity of stimuli are correlated. Current machine learning algorithms ignore this ubiquitous statistical property of stimuli, usually by requiring normalized inputs. From a biological perspective, it remains unclear how neural circuits may account for these dependencies in inference and learning. Here, we use a probabilistic framework to model class-specific intensity variations, and we derive approximate inference and online learning rules which reflect common hallmarks of neural computation. Concretely, we show that a neural circuit equipped with specific forms of synaptic and intrinsic plasticity (IP) can learn the class-specific features and intensities of stimuli simultaneously. Our model provides a normative interpretation of IP as a critical part of sensory learning and predicts that neurons can represent nontrivial input statistics in their excitabilities. Computationally, our approach yields improved statistical representations for realistic datasets in the visual and auditory domains. In particular, we demonstrate the utility of the model in estimating the contrastive stress of speech.
UR - http://www.scopus.com/inward/record.url?scp=85049524492&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85049524492&partnerID=8YFLogxK
U2 - 10.1038/s41598-018-28184-5
DO - 10.1038/s41598-018-28184-5
M3 - Article
VL - 8
JO - Scientific Reports
JF - Scientific Reports
SN - 2045-2322
IS - 1
M1 - 10038
ER -