Memory capacity in neural network models: Rigorous lower bounds

Research output: Contribution to journal › Article

Abstract

We consider certain simple neural network models of associative memory with N binary neurons and symmetric l-th order synaptic connections, in which m randomly chosen N-bit patterns are to be stored and retrieved with a small fraction δ of bit errors allowed. Rigorous proofs of the following are presented both for l = 2 and l ≥ 3: (1) m can grow as fast as αN^(l-1); (2) α can be as large as B_l/ln(1/δ) as δ → 0; (3) retrieved memories overlapping with several initial patterns persist for (very) small α. These phenomena were previously supported by numerical simulations or nonrigorous calculations. The quantity (l!) · α represents the number of stored bits per distinct synapse. The constant (l!) · B_l is determined explicitly; it decreases monotonically with l and tends to zero exponentially fast as l → ∞. We obtain rigorous lower bounds for the threshold value (l!) · α_c (the maximum possible value of (l!) · α with δ unconstrained): 0.11 for l = 2 (compared to the actual value between 0.28 and 0.30, as estimated by Hopfield and by Amit, Gutfreund, and Sompolinsky), 0.22 for l = 3, and 0.16 for l = 4; as l → ∞, the bound tends to zero as fast as (l!) · B_l.
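As a concrete illustration of the quantities in the abstract, here is a minimal simulation sketch (not from the paper) of the standard l = 2 case: Hebbian storage of m = αN random ±1 patterns in symmetric second-order couplings, followed by one synchronous update of each stored pattern to estimate the bit-error fraction δ. The values of N and α below are illustrative assumptions; α is chosen so that (l!) · α = 2α = 0.10, just inside the paper's rigorous bound of 0.11.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000            # number of binary (+1/-1) neurons (illustrative choice)
alpha = 0.05        # loading ratio m/N; 2*alpha = 0.10 < 0.11 (rigorous bound)
m = int(alpha * N)  # number of stored random patterns

# m randomly chosen N-bit patterns with entries +1/-1
patterns = rng.choice([-1, 1], size=(m, N))

# Symmetric l = 2 (Hebbian) couplings with zero self-coupling
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0)

# Present each stored pattern, apply one synchronous sign update,
# and measure the average fraction of flipped bits (delta)
fields = patterns @ J
retrieved = np.where(fields >= 0, 1, -1)
delta = np.mean(retrieved != patterns)
print(f"N={N}, m={m}, 2*alpha={2 * alpha:.2f}, delta={delta:.4f}")
```

At this loading the measured δ comes out very small, consistent with the regime α ≤ B_l/ln(1/δ); raising 2α past roughly 0.28-0.30 (the estimated threshold for l = 2) makes retrieval break down.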

Original language: English (US)
Pages (from-to): 223-238
Number of pages: 16
Journal: Neural Networks
Volume: 1
Issue number: 3
DOIs: 10.1016/0893-6080(88)90028-7
State: Published - 1988

Fingerprint

Neural Networks (Computer)
Neural networks
Data storage equipment
Synapses
Neurons
Computer simulation

ASJC Scopus subject areas

  • Artificial Intelligence
  • Neuroscience (all)

Cite this

Memory capacity in neural network models: Rigorous lower bounds. / Newman, Charles.

In: Neural Networks, Vol. 1, No. 3, 1988, p. 223-238.

Research output: Contribution to journal › Article

@article{cd14364a97634072a0cbe3683679ef39,
title = "Memory capacity in neural network models: Rigorous lower bounds",
abstract = "We consider certain simple neural network models of associative memory with N binary neurons and symmetric l-th order synaptic connections, in which m randomly chosen N-bit patterns are to be stored and retrieved with a small fraction δ of bit errors allowed. Rigorous proofs of the following are presented both for l = 2 and l ≥ 3: (1) m can grow as fast as αN^(l-1); (2) α can be as large as B_l/ln(1/δ) as δ → 0; (3) retrieved memories overlapping with several initial patterns persist for (very) small α. These phenomena were previously supported by numerical simulations or nonrigorous calculations. The quantity (l!) · α represents the number of stored bits per distinct synapse. The constant (l!) · B_l is determined explicitly; it decreases monotonically with l and tends to zero exponentially fast as l → ∞. We obtain rigorous lower bounds for the threshold value (l!) · α_c (the maximum possible value of (l!) · α with δ unconstrained): 0.11 for l = 2 (compared to the actual value between 0.28 and 0.30, as estimated by Hopfield and by Amit, Gutfreund, and Sompolinsky), 0.22 for l = 3, and 0.16 for l = 4; as l → ∞, the bound tends to zero as fast as (l!) · B_l.",
author = "Charles Newman",
year = "1988",
doi = "10.1016/0893-6080(88)90028-7",
language = "English (US)",
volume = "1",
pages = "223--238",
journal = "Neural Networks",
issn = "0893-6080",
publisher = "Elsevier Limited",
number = "3",

}

TY - JOUR

T1 - Memory capacity in neural network models

T2 - Rigorous lower bounds

AU - Newman, Charles

PY - 1988

Y1 - 1988

N2 - We consider certain simple neural network models of associative memory with N binary neurons and symmetric l-th order synaptic connections, in which m randomly chosen N-bit patterns are to be stored and retrieved with a small fraction δ of bit errors allowed. Rigorous proofs of the following are presented both for l = 2 and l ≥ 3: (1) m can grow as fast as αN^(l-1); (2) α can be as large as B_l/ln(1/δ) as δ → 0; (3) retrieved memories overlapping with several initial patterns persist for (very) small α. These phenomena were previously supported by numerical simulations or nonrigorous calculations. The quantity (l!) · α represents the number of stored bits per distinct synapse. The constant (l!) · B_l is determined explicitly; it decreases monotonically with l and tends to zero exponentially fast as l → ∞. We obtain rigorous lower bounds for the threshold value (l!) · α_c (the maximum possible value of (l!) · α with δ unconstrained): 0.11 for l = 2 (compared to the actual value between 0.28 and 0.30, as estimated by Hopfield and by Amit, Gutfreund, and Sompolinsky), 0.22 for l = 3, and 0.16 for l = 4; as l → ∞, the bound tends to zero as fast as (l!) · B_l.

AB - We consider certain simple neural network models of associative memory with N binary neurons and symmetric l-th order synaptic connections, in which m randomly chosen N-bit patterns are to be stored and retrieved with a small fraction δ of bit errors allowed. Rigorous proofs of the following are presented both for l = 2 and l ≥ 3: (1) m can grow as fast as αN^(l-1); (2) α can be as large as B_l/ln(1/δ) as δ → 0; (3) retrieved memories overlapping with several initial patterns persist for (very) small α. These phenomena were previously supported by numerical simulations or nonrigorous calculations. The quantity (l!) · α represents the number of stored bits per distinct synapse. The constant (l!) · B_l is determined explicitly; it decreases monotonically with l and tends to zero exponentially fast as l → ∞. We obtain rigorous lower bounds for the threshold value (l!) · α_c (the maximum possible value of (l!) · α with δ unconstrained): 0.11 for l = 2 (compared to the actual value between 0.28 and 0.30, as estimated by Hopfield and by Amit, Gutfreund, and Sompolinsky), 0.22 for l = 3, and 0.16 for l = 4; as l → ∞, the bound tends to zero as fast as (l!) · B_l.

UR - http://www.scopus.com/inward/record.url?scp=0023869164&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0023869164&partnerID=8YFLogxK

U2 - 10.1016/0893-6080(88)90028-7

DO - 10.1016/0893-6080(88)90028-7

M3 - Article

VL - 1

SP - 223

EP - 238

JO - Neural Networks

JF - Neural Networks

SN - 0893-6080

IS - 3

ER -