No integration without structured representations: Response to Pater

Iris Berent, Gary Marcus

Research output: Contribution to journal › Article

Abstract

Pater’s (2019) expansive review is a significant contribution toward bridging the disconnect between generative linguistics and connectionism, and as such, it is an important service to the field. But Pater’s efforts at inclusion and reconciliation obscure crucial substantive disagreements on foundational matters. Most connectionist models are antithetical to the algebraic hypothesis that has guided generative linguistics from its inception. They eschew the notions that mental representations have formal constituent structure and that mental operations are structure-sensitive. These representational commitments critically limit the scope of learning and productivity in connectionist models. Moving forward, we see only two options: either those connectionist models are right, and generative linguistics must be radically revised, or they must be replaced by alternatives that are compatible with the algebraic hypothesis. There can be no integration without structured representations.

Original language: English (US)
Pages (from-to): e75-e86
Journal: Language
Volume: 95
Issue number: 1
DOI: 10.1353/lan.2019.0011
State: Published - Mar 1 2019


Keywords

  • Algebraic rules
  • Associationism
  • Connectionism
  • Structured representation
  • The computational theory of mind

ASJC Scopus subject areas

  • Language and Linguistics
  • Linguistics and Language

Cite this

Berent, Iris; Marcus, Gary. No integration without structured representations: Response to Pater. In: Language, Vol. 95, No. 1, 01.03.2019, pp. e75-e86.
@article{96ccfd7ec1de4bce93b2a522866443f6,
title = "No integration without structured representations: Response to Pater",
abstract = "Pater’s (2019) expansive review is a significant contribution toward bridging the disconnect between generative linguistics and connectionism, and as such, it is an important service to the field. But Pater’s efforts at inclusion and reconciliation obscure crucial substantive disagreements on foundational matters. Most connectionist models are antithetical to the algebraic hypothesis that has guided generative linguistics from its inception. They eschew the notions that mental representations have formal constituent structure and that mental operations are structure-sensitive. These representational commitments critically limit the scope of learning and productivity in connectionist models. Moving forward, we see only two options: either those connectionist models are right, and generative linguistics must be radically revised, or they must be replaced by alternatives that are compatible with the algebraic hypothesis. There can be no integration without structured representations.",
keywords = "Algebraic rules, Associationism, Connectionism, Structured representation, The computational theory of mind",
author = "Iris Berent and Gary Marcus",
year = "2019",
month = mar,
day = "1",
doi = "10.1353/lan.2019.0011",
language = "English (US)",
volume = "95",
pages = "e75--e86",
journal = "Language",
issn = "0097-8507",
publisher = "Linguistic Society of America",
number = "1",

}
