The Chinese room argument reconsidered: Essentialism, indeterminacy, and strong AI

Jerome C. Wakefield

    Research output: Contribution to journal › Article

    Abstract

    I argue that John Searle's (1980) influential Chinese room argument (CRA) against computationalism and strong AI survives existing objections, including Block's (1998) internalized systems reply, Fodor's (1991b) deviant causal chain reply, and Hauser's (1997) unconscious content reply. However, a new "essentialist" reply I construct shows that the CRA as presented by Searle is an unsound argument that relies on a question-begging appeal to intuition. My diagnosis of the CRA relies on an interpretation of computationalism as a scientific theory about the essential nature of intentional content; such theories often yield non-intuitive results in non-standard cases, and so cannot be judged by such intuitions. However, I further argue that the CRA can be transformed into a potentially valid argument against computationalism simply by reinterpreting it as an indeterminacy argument that shows that computationalism cannot explain the ordinary distinction between semantic content and sheer syntactic manipulation, and thus cannot be an adequate account of content. This conclusion admittedly rests on the arguable but plausible assumption that thought content is interestingly determinate. I conclude that the viability of computationalism and strong AI depends on their addressing the indeterminacy objection, but that it is currently unclear how this objection can be successfully addressed.

    Original language: English (US)
    Pages (from-to): 285-319
    Number of pages: 35
    Journal: Minds and Machines
    Volume: 13
    Issue number: 2
    DOIs: 10.1023/A:1022947527614
    State: Published - May 1 2003

    Fingerprint

    • Syntactics
    • Semantics
    • Computationalism
    • Essentialism
    • Chinese Room Argument
    • Indeterminacy
    • Intuition

    Keywords

    • Artificial intelligence
    • Cognitive science
    • Computation
    • Essentialism
    • Functionalism
    • Indeterminacy
    • Philosophy of mind
    • Searle's Chinese room argument
    • Semantics

    ASJC Scopus subject areas

    • Philosophy
    • Artificial Intelligence

    Cite this

    The Chinese room argument reconsidered: Essentialism, indeterminacy, and strong AI. / Wakefield, Jerome C.

    In: Minds and Machines, Vol. 13, No. 2, 01.05.2003, p. 285-319.

    Research output: Contribution to journal › Article

    @article{978f27fa679f42fdbd9bac7ef01ea436,
    title = "The Chinese room argument reconsidered: Essentialism, indeterminacy, and strong AI",
    abstract = "I argue that John Searle's (1980) influential Chinese room argument (CRA) against computationalism and strong AI survives existing objections, including Block's (1998) internalized systems reply, Fodor's (1991b) deviant causal chain reply, and Hauser's (1997) unconscious content reply. However, a new ``essentialist'' reply I construct shows that the CRA as presented by Searle is an unsound argument that relies on a question-begging appeal to intuition. My diagnosis of the CRA relies on an interpretation of computationalism as a scientific theory about the essential nature of intentional content; such theories often yield non-intuitive results in non-standard cases, and so cannot be judged by such intuitions. However, I further argue that the CRA can be transformed into a potentially valid argument against computationalism simply by reinterpreting it as an indeterminacy argument that shows that computationalism cannot explain the ordinary distinction between semantic content and sheer syntactic manipulation, and thus cannot be an adequate account of content. This conclusion admittedly rests on the arguable but plausible assumption that thought content is interestingly determinate. I conclude that the viability of computationalism and strong AI depends on their addressing the indeterminacy objection, but that it is currently unclear how this objection can be successfully addressed.",
    keywords = "Artificial intelligence, Cognitive science, Computation, Essentialism, Functionalism, Indeterminacy, Philosophy of mind, Searle's Chinese room argument, Semantics",
    author = "Wakefield, {Jerome C.}",
    year = "2003",
    month = "5",
    day = "1",
    doi = "10.1023/A:1022947527614",
    language = "English (US)",
    volume = "13",
    pages = "285--319",
    journal = "Minds and Machines",
    issn = "0924-6495",
    publisher = "Springer Netherlands",
    number = "2",

    }

    TY - JOUR

    T1 - The Chinese room argument reconsidered

    T2 - Essentialism, indeterminacy, and strong AI

    AU - Wakefield, Jerome C.

    PY - 2003/5/1

    Y1 - 2003/5/1

    N2 - I argue that John Searle's (1980) influential Chinese room argument (CRA) against computationalism and strong AI survives existing objections, including Block's (1998) internalized systems reply, Fodor's (1991b) deviant causal chain reply, and Hauser's (1997) unconscious content reply. However, a new "essentialist" reply I construct shows that the CRA as presented by Searle is an unsound argument that relies on a question-begging appeal to intuition. My diagnosis of the CRA relies on an interpretation of computationalism as a scientific theory about the essential nature of intentional content; such theories often yield non-intuitive results in non-standard cases, and so cannot be judged by such intuitions. However, I further argue that the CRA can be transformed into a potentially valid argument against computationalism simply by reinterpreting it as an indeterminacy argument that shows that computationalism cannot explain the ordinary distinction between semantic content and sheer syntactic manipulation, and thus cannot be an adequate account of content. This conclusion admittedly rests on the arguable but plausible assumption that thought content is interestingly determinate. I conclude that the viability of computationalism and strong AI depends on their addressing the indeterminacy objection, but that it is currently unclear how this objection can be successfully addressed.

    AB - I argue that John Searle's (1980) influential Chinese room argument (CRA) against computationalism and strong AI survives existing objections, including Block's (1998) internalized systems reply, Fodor's (1991b) deviant causal chain reply, and Hauser's (1997) unconscious content reply. However, a new "essentialist" reply I construct shows that the CRA as presented by Searle is an unsound argument that relies on a question-begging appeal to intuition. My diagnosis of the CRA relies on an interpretation of computationalism as a scientific theory about the essential nature of intentional content; such theories often yield non-intuitive results in non-standard cases, and so cannot be judged by such intuitions. However, I further argue that the CRA can be transformed into a potentially valid argument against computationalism simply by reinterpreting it as an indeterminacy argument that shows that computationalism cannot explain the ordinary distinction between semantic content and sheer syntactic manipulation, and thus cannot be an adequate account of content. This conclusion admittedly rests on the arguable but plausible assumption that thought content is interestingly determinate. I conclude that the viability of computationalism and strong AI depends on their addressing the indeterminacy objection, but that it is currently unclear how this objection can be successfully addressed.

    KW - Artificial intelligence

    KW - Cognitive science

    KW - Computation

    KW - Essentialism

    KW - Functionalism

    KW - Indeterminacy

    KW - Philosophy of mind

    KW - Searle's Chinese room argument

    KW - Semantics

    UR - http://www.scopus.com/inward/record.url?scp=0037404304&partnerID=8YFLogxK

    UR - http://www.scopus.com/inward/citedby.url?scp=0037404304&partnerID=8YFLogxK

    U2 - 10.1023/A:1022947527614

    DO - 10.1023/A:1022947527614

    M3 - Article

    AN - SCOPUS:0037404304

    VL - 13

    SP - 285

    EP - 319

    JO - Minds and Machines

    JF - Minds and Machines

    SN - 0924-6495

    IS - 2

    ER -