A novel role for visual perspective cues in the neural computation of depth

Hyunggoo R. Kim, Dora Angelaki, Gregory C. DeAngelis

Research output: Contribution to journal › Review article

Abstract

As we explore a scene, our eye movements add global patterns of motion to the retinal image, complicating visual motion produced by self-motion or moving objects. Conventionally, it has been assumed that extraretinal signals, such as efference copy of smooth pursuit commands, are required to compensate for the visual consequences of eye rotations. We consider an alternative possibility: namely, that the visual system can infer eye rotations from global patterns of image motion. We visually simulated combinations of eye translation and rotation, including perspective distortions that change dynamically over time. We found that incorporating these 'dynamic perspective' cues allowed the visual system to generate selectivity for depth sign from motion parallax in macaque cortical area MT, a computation that was previously thought to require extraretinal signals regarding eye velocity. Our findings suggest neural mechanisms that analyze global patterns of visual motion to perform computations that require knowledge of eye rotations.
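As background for the computation described in the abstract (a standard first-order approximation from the motion-parallax literature, stated here only as context and not a result reported in this paper), the depth of a point relative to fixation can be related to retinal motion and eye rotation by the motion/pursuit relation:

\[ \frac{d}{f} \;\approx\; \frac{\dot{\theta}}{\dot{\alpha}} \]

where d is the distance of the point from the fixation plane, f is the viewing distance, \dot{\theta} is the point's retinal image velocity, and \dot{\alpha} is the eye's rotation velocity while the translating observer maintains fixation on a world point. The magnitude of \dot{\theta} alone does not specify whether the point lies nearer or farther than fixation; the sign of \dot{\theta} relative to \dot{\alpha} does. This is why recovering depth sign requires an estimate of eye rotation, whether from extraretinal signals or, as the abstract argues, from dynamic perspective cues in the image itself.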

Original language: English (US)
Pages (from-to): 129-137
Number of pages: 9
Journal: Nature Neuroscience
Volume: 18
Issue number: 1
DOIs: 10.1038/nn.3889
State: Published - Jan 1 2015

Fingerprint

  • Cues
  • Smooth Pursuit
  • Macaca
  • Eye Movements

ASJC Scopus subject areas

  • Neuroscience (all)

Cite this

A novel role for visual perspective cues in the neural computation of depth. / Kim, Hyunggoo R.; Angelaki, Dora; DeAngelis, Gregory C.

In: Nature Neuroscience, Vol. 18, No. 1, 01.01.2015, p. 129-137.

Research output: Contribution to journal › Review article

Kim, Hyunggoo R.; Angelaki, Dora; DeAngelis, Gregory C. / A novel role for visual perspective cues in the neural computation of depth. In: Nature Neuroscience. 2015 ; Vol. 18, No. 1. pp. 129-137.
@article{b1cc0af818ab4bb1ab7e4188d17b8a0d,
title = "A novel role for visual perspective cues in the neural computation of depth",
abstract = "As we explore a scene, our eye movements add global patterns of motion to the retinal image, complicating visual motion produced by self-motion or moving objects. Conventionally, it has been assumed that extraretinal signals, such as efference copy of smooth pursuit commands, are required to compensate for the visual consequences of eye rotations. We consider an alternative possibility: namely, that the visual system can infer eye rotations from global patterns of image motion. We visually simulated combinations of eye translation and rotation, including perspective distortions that change dynamically over time. We found that incorporating these 'dynamic perspective' cues allowed the visual system to generate selectivity for depth sign from motion parallax in macaque cortical area MT, a computation that was previously thought to require extraretinal signals regarding eye velocity. Our findings suggest neural mechanisms that analyze global patterns of visual motion to perform computations that require knowledge of eye rotations.",
author = "Kim, {Hyunggoo R.} and Angelaki, {Dora} and DeAngelis, {Gregory C.}",
year = "2015",
month = "1",
day = "1",
doi = "10.1038/nn.3889",
language = "English (US)",
volume = "18",
pages = "129--137",
journal = "Nature Neuroscience",
issn = "1097-6256",
publisher = "Nature Publishing Group",
number = "1",

}

TY - JOUR

T1 - A novel role for visual perspective cues in the neural computation of depth

AU - Kim, Hyunggoo R.

AU - Angelaki, Dora

AU - DeAngelis, Gregory C.

PY - 2015/1/1

Y1 - 2015/1/1

N2 - As we explore a scene, our eye movements add global patterns of motion to the retinal image, complicating visual motion produced by self-motion or moving objects. Conventionally, it has been assumed that extraretinal signals, such as efference copy of smooth pursuit commands, are required to compensate for the visual consequences of eye rotations. We consider an alternative possibility: namely, that the visual system can infer eye rotations from global patterns of image motion. We visually simulated combinations of eye translation and rotation, including perspective distortions that change dynamically over time. We found that incorporating these 'dynamic perspective' cues allowed the visual system to generate selectivity for depth sign from motion parallax in macaque cortical area MT, a computation that was previously thought to require extraretinal signals regarding eye velocity. Our findings suggest neural mechanisms that analyze global patterns of visual motion to perform computations that require knowledge of eye rotations.

AB - As we explore a scene, our eye movements add global patterns of motion to the retinal image, complicating visual motion produced by self-motion or moving objects. Conventionally, it has been assumed that extraretinal signals, such as efference copy of smooth pursuit commands, are required to compensate for the visual consequences of eye rotations. We consider an alternative possibility: namely, that the visual system can infer eye rotations from global patterns of image motion. We visually simulated combinations of eye translation and rotation, including perspective distortions that change dynamically over time. We found that incorporating these 'dynamic perspective' cues allowed the visual system to generate selectivity for depth sign from motion parallax in macaque cortical area MT, a computation that was previously thought to require extraretinal signals regarding eye velocity. Our findings suggest neural mechanisms that analyze global patterns of visual motion to perform computations that require knowledge of eye rotations.

UR - http://www.scopus.com/inward/record.url?scp=84961289546&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84961289546&partnerID=8YFLogxK

U2 - 10.1038/nn.3889

DO - 10.1038/nn.3889

M3 - Review article

C2 - 25436667

AN - SCOPUS:84961289546

VL - 18

SP - 129

EP - 137

JO - Nature Neuroscience

JF - Nature Neuroscience

SN - 1097-6256

IS - 1

ER -
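The RIS record above is a plain tag-value format: a two-letter tag, a hyphen separator, and a value, with ER marking the end of the record. As an illustrative sketch only (not part of the original record, and not relying on any citation-manager API), the following standard-library Python reads such a record into a dictionary of lists:

import re
from collections import defaultdict

def parse_ris(text):
    """Parse a single RIS record into a dict mapping tags to lists of values."""
    record = defaultdict(list)
    # RIS lines look like "TY  - JOUR"; tolerate one or more spaces around the hyphen.
    tag_line = re.compile(r"^([A-Z][A-Z0-9])\s+-\s?(.*)$")
    for line in text.splitlines():
        match = tag_line.match(line.strip())
        if not match:
            continue  # skip blank lines and anything that is not a tagged field
        tag, value = match.groups()
        if tag == "ER":  # end-of-record marker
            break
        record[tag].append(value.strip())
    return dict(record)

# Usage sketch, assuming the RIS text above is stored in a variable named ris_text:
#   fields = parse_ris(ris_text)
#   fields["T1"][0]  -> the article title
#   fields["AU"]     -> ["Kim, Hyunggoo R.", "Angelaki, Dora", "DeAngelis, Gregory C."]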