Estimating distance during self-motion: A role for visual-vestibular interactions

Kalpana Dokka, Paul R. MacNeilage, Gregory C. DeAngelis, Dora Angelaki

Research output: Contribution to journal › Article

Abstract

A fundamental challenge for the visual system is to extract the 3D spatial structure of the environment. When an observer translates without moving the eyes, the retinal speed of a stationary object is related to its distance by a scale factor that depends on the velocity of the observer's self-motion. Here, we aim to test whether the brain uses vestibular cues to self-motion to estimate distance to stationary surfaces in the environment. This relationship was systematically probed using a two-alternative forced-choice task in which distance perceived from monocular image motion during passive body translation was compared to distance perceived from binocular disparity while subjects were stationary. We show that perceived distance from motion depended on both observer velocity and retinal speed. For a given head speed, slower retinal speeds led to the perception of farther distances. Likewise, for a given retinal speed, slower head speeds led to the perception of nearer distances. However, these relationships were weak in some subjects and absent in others, and distance estimated from self-motion and retinal image motion was substantially compressed relative to distance estimated from binocular disparity. Overall, our findings suggest that the combination of retinal image motion and vestibular signals related to head velocity can provide a rudimentary capacity for distance estimation.
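The scale-factor relationship described in the abstract can be made concrete with a small numerical sketch (ours, not the authors' code or stimulus values): for lateral self-motion at head velocity v with gaze held straight ahead and no eye movement, a stationary object directly ahead at distance d produces a retinal angular speed of roughly v / d, so combining retinal speed with a vestibular estimate of head velocity yields a distance estimate. The function names and example numbers below are illustrative assumptions only.

```python
# Illustrative sketch of the scale-factor relationship in the abstract
# (assumed geometry: lateral translation, object straight ahead, no eye
# movement, small-angle approximation). Not the authors' implementation.

import math

def retinal_speed(head_velocity_m_s: float, distance_m: float) -> float:
    """Approximate retinal angular speed (rad/s) of a stationary object
    straight ahead during lateral self-motion with the eyes stationary."""
    return head_velocity_m_s / distance_m

def distance_from_motion(head_velocity_m_s: float, retinal_speed_rad_s: float) -> float:
    """Distance implied by scaling retinal speed with a vestibular
    estimate of head velocity (the 'scale factor' in the abstract)."""
    return head_velocity_m_s / retinal_speed_rad_s

if __name__ == "__main__":
    v = 0.20        # example head speed in m/s (illustrative value)
    d_true = 0.50   # example object distance in m (illustrative value)
    omega = retinal_speed(v, d_true)
    print(f"retinal speed: {omega:.3f} rad/s ({math.degrees(omega):.1f} deg/s)")
    # Slower retinal speed at the same head speed implies a farther object;
    # slower head speed at the same retinal speed implies a nearer object.
    print(f"recovered distance: {distance_from_motion(v, omega):.2f} m")
```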

Original language: English (US)
Article number: 2
Journal: Journal of Vision
Volume: 11
Issue number: 13
DOI: 10.1167/11.13.2
State: Published - Nov 24, 2011

Keywords

  • Absolute distance
  • Binocular disparity
  • Depth perception
  • Distance scaling
  • Motion parallax
  • Optic flow
  • Vestibular
  • Visual motion

ASJC Scopus subject areas

  • Ophthalmology
  • Sensory Systems

Cite this

Estimating distance during self-motion: A role for visual-vestibular interactions. / Dokka, Kalpana; MacNeilage, Paul R.; DeAngelis, Gregory C.; Angelaki, Dora.

In: Journal of Vision, Vol. 11, No. 13, 2, 24.11.2011.

Research output: Contribution to journal › Article

Dokka, Kalpana; MacNeilage, Paul R.; DeAngelis, Gregory C.; Angelaki, Dora. / Estimating distance during self-motion: A role for visual-vestibular interactions. In: Journal of Vision. 2011; Vol. 11, No. 13.
@article{e3ffbd492f194c8b9181bf6232c287dd,
title = "Estimating distance during self-motion: A role for visual-vestibular interactions",
abstract = "A fundamental challenge for the visual system is to extract the 3D spatial structure of the environment. When an observer translates without moving the eyes, the retinal speed of a stationary object is related to its distance by a scale factor that depends on the velocity of the observer's self-motion. Here, we aim to test whether the brain uses vestibular cues to self-motion to estimate distance to stationary surfaces in the environment. This relationship was systematically probed using a two-alternative forced-choice task in which distance perceived from monocular image motion during passive body translation was compared to distance perceived from binocular disparity while subjects were stationary. We show that perceived distance from motion depended on both observer velocity and retinal speed. For a given head speed, slower retinal speeds led to the perception of farther distances. Likewise, for a given retinal speed, slower head speeds led to the perception of nearer distances. However, these relationships were weak in some subjects and absent in others, and distance estimated from self-motion and retinal image motion was substantially compressed relative to distance estimated from binocular disparity. Overall, our findings suggest that the combination of retinal image motion and vestibular signals related to head velocity can provide a rudimentary capacity for distance estimation.",
keywords = "Absolute distance, Binocular disparity, Depth perception, Distance scaling, Motion parallax, Optic flow, Vestibular, Visual motion",
author = "Dokka, Kalpana and MacNeilage, {Paul R.} and DeAngelis, {Gregory C.} and Angelaki, Dora",
year = "2011",
month = "11",
day = "24",
doi = "10.1167/11.13.2",
language = "English (US)",
volume = "11",
journal = "Journal of Vision",
issn = "1534-7362",
publisher = "Association for Research in Vision and Ophthalmology Inc.",
number = "13",
}

TY  - JOUR
T1  - Estimating distance during self-motion
T2  - A role for visual-vestibular interactions
AU  - Dokka, Kalpana
AU  - MacNeilage, Paul R.
AU  - DeAngelis, Gregory C.
AU  - Angelaki, Dora
PY  - 2011/11/24
Y1  - 2011/11/24
N2  - A fundamental challenge for the visual system is to extract the 3D spatial structure of the environment. When an observer translates without moving the eyes, the retinal speed of a stationary object is related to its distance by a scale factor that depends on the velocity of the observer's self-motion. Here, we aim to test whether the brain uses vestibular cues to self-motion to estimate distance to stationary surfaces in the environment. This relationship was systematically probed using a two-alternative forced-choice task in which distance perceived from monocular image motion during passive body translation was compared to distance perceived from binocular disparity while subjects were stationary. We show that perceived distance from motion depended on both observer velocity and retinal speed. For a given head speed, slower retinal speeds led to the perception of farther distances. Likewise, for a given retinal speed, slower head speeds led to the perception of nearer distances. However, these relationships were weak in some subjects and absent in others, and distance estimated from self-motion and retinal image motion was substantially compressed relative to distance estimated from binocular disparity. Overall, our findings suggest that the combination of retinal image motion and vestibular signals related to head velocity can provide a rudimentary capacity for distance estimation.
AB  - A fundamental challenge for the visual system is to extract the 3D spatial structure of the environment. When an observer translates without moving the eyes, the retinal speed of a stationary object is related to its distance by a scale factor that depends on the velocity of the observer's self-motion. Here, we aim to test whether the brain uses vestibular cues to self-motion to estimate distance to stationary surfaces in the environment. This relationship was systematically probed using a two-alternative forced-choice task in which distance perceived from monocular image motion during passive body translation was compared to distance perceived from binocular disparity while subjects were stationary. We show that perceived distance from motion depended on both observer velocity and retinal speed. For a given head speed, slower retinal speeds led to the perception of farther distances. Likewise, for a given retinal speed, slower head speeds led to the perception of nearer distances. However, these relationships were weak in some subjects and absent in others, and distance estimated from self-motion and retinal image motion was substantially compressed relative to distance estimated from binocular disparity. Overall, our findings suggest that the combination of retinal image motion and vestibular signals related to head velocity can provide a rudimentary capacity for distance estimation.
KW  - Absolute distance
KW  - Binocular disparity
KW  - Depth perception
KW  - Distance scaling
KW  - Motion parallax
KW  - Optic flow
KW  - Vestibular
KW  - Visual motion
UR  - http://www.scopus.com/inward/record.url?scp=81555196384&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=81555196384&partnerID=8YFLogxK
U2  - 10.1167/11.13.2
DO  - 10.1167/11.13.2
M3  - Article
VL  - 11
JO  - Journal of Vision
JF  - Journal of Vision
SN  - 1534-7362
IS  - 13
M1  - 2
ER  -