Gravity influences the visual representation of object tilt in parietal cortex

Ari Rosenberg, Dora Angelaki

Research output: Contribution to journal › Article

Abstract

Sensory systems encode the environment in egocentric (e.g., eye, head, or body) reference frames, creating inherently unstable representations that shift and rotate as we move. However, it is widely speculated that the brain transforms these signals into an allocentric, gravity-centered representation of the world that is stable and independent of the observer’s spatial pose. Where and how this representation may be achieved is currently unknown. Here we demonstrate that a subpopulation of neurons in the macaque caudal intraparietal area (CIP) visually encodes object tilt in nonegocentric coordinates defined relative to the gravitational vector. Neuronal responses to the tilt of a visually presented planar surface were measured with the monkey in different spatial orientations (upright and rolled left/right ear down) and then compared. This revealed a continuum of representations in which planar tilt was encoded in a gravity-centered reference frame in approximately one-tenth of the comparisons, intermediate reference frames ranging between gravity-centered and egocentric in approximately two-tenths of the comparisons, and in an egocentric reference frame in less than half of the comparisons. Altogether, almost half of the comparisons revealed a shift in the preferred tilt and/or a gain change consistent with encoding object orientation in nonegocentric coordinates. Through neural network modeling, we further show that a purely gravity-centered representation of object tilt can be achieved directly from the population activity of CIP-like units. These results suggest that area CIP may play a key role in creating a stable, allocentric representation of the environment defined relative to an “earth-vertical” direction.
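
The reference-frame comparison described in the abstract can be illustrated with a small, hypothetical sketch. The Python code below is not the authors' analysis pipeline; the tuning model, the 15-degree tilt sampling, the 30-degree roll, and the "displacement index" convention (the functions tuning_shift_deg and displacement_index) are assumptions made only to show the logic: a unit whose preferred tilt does not move when the body is rolled is egocentric, a unit whose preferred tilt shifts by the full roll angle is gravity-centered, and partial shifts correspond to intermediate reference frames.

import numpy as np

# Illustrative sketch only (not the authors' analysis code): quantify a
# simulated neuron's reference frame by how far its tilt-tuning curve
# shifts when the body is rolled.

def tuning_shift_deg(upright, rolled, step_deg):
    """Lag (in degrees of tilt) by which the upright tuning curve must be
    circularly shifted to best match the rolled-body tuning curve, found
    by maximizing the circular cross-correlation."""
    n = len(upright)
    corrs = [np.corrcoef(np.roll(upright, k), rolled)[0, 1] for k in range(n)]
    shift = int(np.argmax(corrs)) * step_deg
    return shift - 360 if shift > 180 else shift

def displacement_index(shift_deg, roll_deg):
    """~0: egocentric (preferred tilt does not move with body roll);
    ~1: gravity-centered (preferred tilt shifts by the full roll angle);
    values in between: intermediate reference frames."""
    return shift_deg / roll_deg

# Synthetic von Mises-like tilt tuning, sampled every 15 degrees, 30-degree roll.
step = 15
tilts = np.arange(0, 360, step)
pref = 90                                    # preferred tilt when upright
roll = 30                                    # body roll (e.g., right ear down)
upright = np.exp(np.cos(np.deg2rad(tilts - pref)))
gravity_unit = np.exp(np.cos(np.deg2rad(tilts - (pref + roll))))  # shifts with roll
ego_unit = upright.copy()                                         # ignores roll

for name, rolled in [("gravity-centered", gravity_unit), ("egocentric", ego_unit)]:
    di = displacement_index(tuning_shift_deg(upright, rolled, step), roll)
    print(f"{name}: displacement index ~ {di:.2f}")

# A separate amplitude ratio between the two curves would capture the gain
# changes the study also reports; this sketch quantifies only the tilt shift.

A population readout in this spirit is what the study's neural network model demonstrates: a purely gravity-centered tilt representation recovered directly from the activity of CIP-like units.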

Original language: English (US)
Pages (from-to): 14170-14180
Number of pages: 11
Journal: Journal of Neuroscience
Volume: 34
Issue number: 43
DOIs: 10.1523/JNEUROSCI.2030-14.2014
State: Published - Jan 1 2014

Fingerprint

  • Parietal Lobe
  • Gravitation
  • Macaca
  • Haplorhini
  • Ear
  • Head
  • Neurons
  • Brain
  • Population

Keywords

  • Allocentric
  • Gravity
  • Multisensory
  • Parietal cortex
  • Spatial pose
  • Visual orientation

ASJC Scopus subject areas

  • Neuroscience (all)
  • Medicine (all)

Cite this

Gravity influences the visual representation of object tilt in parietal cortex. / Rosenberg, Ari; Angelaki, Dora.

In: Journal of Neuroscience, Vol. 34, No. 43, 01.01.2014, p. 14170-14180.

Research output: Contribution to journal › Article

@article{87db1583207c42a392b265dd9729dcf0,
title = "Gravity influences the visual representation of object tilt in parietal cortex",
abstract = "Sensory systems encode the environment in egocentric (e.g., eye, head, or body) reference frames, creating inherently unstable representations that shift and rotate as we move. However, it is widely speculated that the brain transforms these signals into an allocentric, gravity-centered representation of the world that is stable and independent of the observer’s spatial pose. Where and how this representation may be achieved is currently unknown. Here we demonstrate that a subpopulation of neurons in the macaque caudal intraparietal area (CIP) visually encodes object tilt in nonegocentric coordinates defined relative to the gravitational vector. Neuronal responses to the tilt of a visually presented planar surface were measured with the monkey in different spatial orientations (upright and rolled left/right ear down) and then compared. This revealed a continuum of representations in which planar tilt was encoded in a gravity-centered reference frame in approximately one-tenth of the comparisons, intermediate reference frames ranging between gravity-centered and egocentric in approximately two-tenths of the comparisons, and in an egocentric reference frame in less than half of the comparisons. Altogether, almost half of the comparisons revealed a shift in the preferred tilt and/or a gain change consistent with encoding object orientation in nonegocentric coordinates. Through neural network modeling, we further show that a purely gravity-centered representation of object tilt can be achieved directly from the population activity of CIP-like units. These results suggest that area CIP may play a key role in creating a stable, allocentric representation of the environment defined relative to an “earth-vertical” direction.",
keywords = "Allocentric, Gravity, Multisensory, Parietal cortex, Spatial pose, Visual orientation",
author = "Ari Rosenberg and Dora Angelaki",
year = "2014",
month = "1",
day = "1",
doi = "10.1523/JNEUROSCI.2030-14.2014",
language = "English (US)",
volume = "34",
pages = "14170--14180",
journal = "Journal of Neuroscience",
issn = "0270-6474",
publisher = "Society for Neuroscience",
number = "43",

}

TY - JOUR
T1 - Gravity influences the visual representation of object tilt in parietal cortex
AU - Rosenberg, Ari
AU - Angelaki, Dora
PY - 2014/1/1
Y1 - 2014/1/1
AB - Sensory systems encode the environment in egocentric (e.g., eye, head, or body) reference frames, creating inherently unstable representations that shift and rotate as we move. However, it is widely speculated that the brain transforms these signals into an allocentric, gravity-centered representation of the world that is stable and independent of the observer’s spatial pose. Where and how this representation may be achieved is currently unknown. Here we demonstrate that a subpopulation of neurons in the macaque caudal intraparietal area (CIP) visually encodes object tilt in nonegocentric coordinates defined relative to the gravitational vector. Neuronal responses to the tilt of a visually presented planar surface were measured with the monkey in different spatial orientations (upright and rolled left/right ear down) and then compared. This revealed a continuum of representations in which planar tilt was encoded in a gravity-centered reference frame in approximately one-tenth of the comparisons, intermediate reference frames ranging between gravity-centered and egocentric in approximately two-tenths of the comparisons, and in an egocentric reference frame in less than half of the comparisons. Altogether, almost half of the comparisons revealed a shift in the preferred tilt and/or a gain change consistent with encoding object orientation in nonegocentric coordinates. Through neural network modeling, we further show that a purely gravity-centered representation of object tilt can be achieved directly from the population activity of CIP-like units. These results suggest that area CIP may play a key role in creating a stable, allocentric representation of the environment defined relative to an “earth-vertical” direction.
KW - Allocentric
KW - Gravity
KW - Multisensory
KW - Parietal cortex
KW - Spatial pose
KW - Visual orientation
UR - http://www.scopus.com/inward/record.url?scp=84908077703&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84908077703&partnerID=8YFLogxK
U2 - 10.1523/JNEUROSCI.2030-14.2014
DO - 10.1523/JNEUROSCI.2030-14.2014
M3 - Article
VL - 34
SP - 14170
EP - 14180
JO - Journal of Neuroscience
JF - Journal of Neuroscience
SN - 0270-6474
IS - 43
ER -