Visual speech speeds up the neural processing of auditory speech

Virginie Van Wassenhove, Ken W. Grant, David Poeppel

Research output: Contribution to journal › Article

Abstract

Synchronous presentation of stimuli to the auditory and visual systems can modify the formation of a percept in either modality. For example, perception of auditory speech is improved when the speaker's facial articulatory movements are visible. Neural convergence onto multisensory sites exhibiting supra-additivity has been proposed as the principal mechanism for integration. Recent findings, however, have suggested that putative sensory-specific cortices are responsive to inputs presented through a different modality. Consequently, when and where audiovisual representations emerge remain unsettled. In combined psychophysical and electroencephalography experiments we show that visual speech speeds up the cortical processing of auditory signals early (within 100 ms of signal onset). The auditory-visual interaction is reflected as an articulator-specific temporal facilitation (as well as a non-specific amplitude reduction). The latency facilitation systematically depends on the degree to which the visual signal predicts possible auditory targets. The observed auditory-visual data support the view that there exist abstract internal representations that constrain the analysis of subsequent speech inputs. This is evidence for the existence of an "analysis-by-synthesis" mechanism in auditory-visual speech perception.
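The abstract's core quantitative claim — that the latency facilitation of the auditory evoked response scales with how well the visual signal predicts the upcoming auditory target — can be illustrated with a toy model. The sketch below is not the authors' analysis pipeline: the syllable predictability values, the 20 ms facilitation ceiling, the 100 ms audio-only baseline, and the linear mapping are all hypothetical assumptions chosen only to make the relationship concrete (easily speechread bilabials such as /pa/ should yield larger shifts than less visible velars such as /ka/).

    # Illustrative sketch only: all numbers and the linear relation are
    # hypothetical assumptions, not values from the paper.
    # Proxy for viseme predictability (e.g., visual-only identification
    # accuracy): bilabials are easier to speechread than velars.
    visual_predictability = {"pa": 0.9, "ta": 0.6, "ka": 0.3}

    MAX_SHIFT_MS = 20.0  # assumed ceiling on latency facilitation

    def av_n1_latency(audio_only_latency_ms, predictability):
        """Hypothetical audiovisual N1 latency: the more reliably the
        viseme predicts the auditory target, the earlier the peak."""
        return audio_only_latency_ms - MAX_SHIFT_MS * predictability

    for syllable, p in visual_predictability.items():
        shift = MAX_SHIFT_MS * p
        print(f"/{syllable}/: facilitation {shift:4.1f} ms -> "
              f"AV N1 at {av_n1_latency(100.0, p):5.1f} ms")

Under these assumptions, /pa/ peaks earliest and /ka/ latest, mirroring the articulator-specific facilitation the abstract describes; the amplitude reduction, being non-specific, would not vary by syllable in such a model.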

Original language: English (US)
Pages (from-to): 1181-1186
Number of pages: 6
Journal: Proceedings of the National Academy of Sciences of the United States of America
Volume: 102
Issue number: 4
DOIs: 10.1073/pnas.0408949102
State: Published - Jan 25 2005

Keywords

  • EEG
  • Multisensory
  • Predictive coding

ASJC Scopus subject areas

  • Genetics
  • General

Cite this

Visual speech speeds up the neural processing of auditory speech. / Van Wassenhove, Virginie; Grant, Ken W.; Poeppel, David.

In: Proceedings of the National Academy of Sciences of the United States of America, Vol. 102, No. 4, 25.01.2005, p. 1181-1186.

@article{97dc488190f8445e8ccfe741bc4e161e,
title = "Visual speech speeds up the neural processing of auditory speech",
abstract = "Synchronous presentation of stimuli to the auditory and visual systems can modify the formation of a percept in either modality. For example, perception of auditory speech is improved when the speaker's facial articulatory movements are visible. Neural convergence onto multisensory sites exhibiting supra-additivity has been proposed as the principal mechanism for integration. Recent findings, however, have suggested that putative sensory-specific cortices are responsive to inputs presented through a different modality. Consequently, when and where audiovisual representations emerge remain unsettled. In combined psychophysical and electroencephalography experiments we show that visual speech speeds up the cortical processing of auditory signals early (within 100 ms of signal onset). The auditory-visual interaction is reflected as an articulator-specific temporal facilitation (as well as a non-specific amplitude reduction). The latency facilitation systematically depends on the degree to which the visual signal predicts possible auditory targets. The observed auditory-visual data support the view that there exist abstract internal representations that constrain the analysis of subsequent speech inputs. This is evidence for the existence of an {"}analysis-by-synthesis{"} mechanism in auditory-visual speech perception.",
keywords = "EEG, Multisensory, Predictive coding",
author = "{Van Wassenhove}, Virginie and Grant, {Ken W.} and David Poeppel",
year = "2005",
month = "1",
day = "25",
doi = "10.1073/pnas.0408949102",
language = "English (US)",
volume = "102",
pages = "1181--1186",
journal = "Proceedings of the National Academy of Sciences of the United States of America",
issn = "0027-8424",
number = "4",

}

TY - JOUR

T1 - Visual speech speeds up the neural processing of auditory speech

AU - Van Wassenhove, Virginie

AU - Grant, Ken W.

AU - Poeppel, David

PY - 2005/1/25

Y1 - 2005/1/25

AB - Synchronous presentation of stimuli to the auditory and visual systems can modify the formation of a percept in either modality. For example, perception of auditory speech is improved when the speaker's facial articulatory movements are visible. Neural convergence onto multisensory sites exhibiting supra-additivity has been proposed as the principal mechanism for integration. Recent findings, however, have suggested that putative sensory-specific cortices are responsive to inputs presented through a different modality. Consequently, when and where audiovisual representations emerge remain unsettled. In combined psychophysical and electroencephalography experiments we show that visual speech speeds up the cortical processing of auditory signals early (within 100 ms of signal onset). The auditory-visual interaction is reflected as an articulator-specific temporal facilitation (as well as a non-specific amplitude reduction). The latency facilitation systematically depends on the degree to which the visual signal predicts possible auditory targets. The observed auditory-visual data support the view that there exist abstract internal representations that constrain the analysis of subsequent speech inputs. This is evidence for the existence of an "analysis-by-synthesis" mechanism in auditory-visual speech perception.

KW - EEG

KW - Multisensory

KW - Predictive coding

UR - http://www.scopus.com/inward/record.url?scp=12844278634&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=12844278634&partnerID=8YFLogxK

U2 - 10.1073/pnas.0408949102

DO - 10.1073/pnas.0408949102

M3 - Article

C2 - 15647358

AN - SCOPUS:12844278634

VL - 102

SP - 1181

EP - 1186

JO - Proceedings of the National Academy of Sciences of the United States of America

JF - Proceedings of the National Academy of Sciences of the United States of America

SN - 0027-8424

IS - 4

ER -