No Evidence for an Item Limit in Change Detection

Shaiyan Keshvari, Ronald van den Berg, Wei Ji Ma

Research output: Contribution to journal › Article

Abstract

Change detection is a classic paradigm that has been used for decades to argue that working memory can hold no more than a fixed number of items ("item-limit models"). Recent findings force us to consider the alternative view that working memory is limited by the precision in stimulus encoding, with mean precision decreasing with increasing set size ("continuous-resource models"). Most previous studies that used the change detection paradigm have ignored effects of limited encoding precision by using highly discriminable stimuli and only large changes. We conducted two change detection experiments (orientation and color) in which change magnitudes were drawn from a wide range, including small changes. In a rigorous comparison of five models, we found no evidence of an item limit. Instead, human change detection performance was best explained by a continuous-resource model in which encoding precision is variable across items and trials even at a given set size. This model accounts for comparison errors in a principled, probabilistic manner. Our findings sharply challenge the theoretical basis for most neural studies of working memory capacity.
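The continuous-resource account summarized above can be illustrated with a small simulation. The sketch below is not the authors' model or code; it assumes, for illustration only, that each item's encoding concentration is gamma-distributed with a mean that falls off with set size, that measurements carry von Mises noise, and that the observer applies a Bayesian decision rule with a uniform prior over change magnitude. All function names and parameter values (simulate_trials, kappa_bar1, power, shape) are hypothetical.

# Minimal, illustrative simulation of a variable-precision ("continuous-resource")
# observer in a change detection task. A sketch under assumed choices; not the
# authors' fitted model.

import numpy as np
from scipy.special import i0e  # exponentially scaled modified Bessel function I0

rng = np.random.default_rng(0)

def log_i0(kappa):
    """Numerically stable log of the modified Bessel function I0."""
    return np.log(i0e(kappa)) + kappa

def simulate_trials(set_size, n_trials=5000, kappa_bar1=15.0, power=1.0, shape=2.0):
    """Simulate 'change'/'no change' responses for one set size.

    Assumptions (hypothetical parameter names):
      - mean concentration decreases with set size: kappa_bar = kappa_bar1 * set_size**(-power)
      - per-item, per-trial concentration is gamma-distributed (variable precision)
      - on half of the trials, one item changes by a magnitude drawn uniformly from (-pi, pi)
    """
    kappa_bar = kappa_bar1 * set_size ** (-power)

    # True stimuli in the first and second displays (circular variable in [-pi, pi))
    theta1 = rng.uniform(-np.pi, np.pi, size=(n_trials, set_size))
    change_present = rng.random(n_trials) < 0.5
    delta = rng.uniform(-np.pi, np.pi, size=n_trials) * change_present
    theta2 = theta1.copy()
    change_idx = rng.integers(set_size, size=n_trials)
    theta2[np.arange(n_trials), change_idx] += delta

    # Variable precision: independent gamma-distributed concentrations per item and display
    kx = rng.gamma(shape, kappa_bar / shape, size=(n_trials, set_size))
    ky = rng.gamma(shape, kappa_bar / shape, size=(n_trials, set_size))

    # Noisy measurements of both displays (von Mises encoding noise)
    x = rng.vonmises(theta1, kx)
    y = rng.vonmises(theta2, ky)

    # Bayesian decision rule under the assumptions above: per-item log likelihood
    # ratio of "this item changed" vs "it did not"
    kappa_c = np.sqrt(kx**2 + ky**2 + 2 * kx * ky * np.cos(x - y))
    log_lr = log_i0(kx) + log_i0(ky) - log_i0(kappa_c)
    # Report "change" when the average likelihood ratio across items exceeds 1
    respond_change = np.mean(np.exp(log_lr), axis=1) > 1.0

    hit_rate = np.mean(respond_change[change_present])
    false_alarm_rate = np.mean(respond_change[~change_present])
    return hit_rate, false_alarm_rate

for n in (2, 4, 8):
    hr, fa = simulate_trials(set_size=n)
    print(f"set size {n}: hit rate {hr:.2f}, false-alarm rate {fa:.2f}")

In this sketch every item is always encoded with some (possibly low) precision, so misses on small changes arise from noisy comparison rather than from items being absent from memory, which is the qualitative signature separating precision limits from a fixed item limit.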

Original language: English (US)
Article number: e1002927
Journal: PLoS Computational Biology
ISSN: 1553-734X
Publisher: Public Library of Science
Volume: 9
Issue number: 2
DOI: 10.1371/journal.pcbi.1002927
State: Published - Feb 2013

ASJC Scopus subject areas

  • Cellular and Molecular Neuroscience
  • Ecology
  • Molecular Biology
  • Genetics
  • Ecology, Evolution, Behavior and Systematics
  • Modeling and Simulation
  • Computational Theory and Mathematics

Cite this

Keshvari, S., van den Berg, R., & Ma, W. J. (2013). No Evidence for an Item Limit in Change Detection. PLoS Computational Biology, 9(2), e1002927. https://doi.org/10.1371/journal.pcbi.1002927