The Hebbian unlearning algorithm, an unsupervised, local procedure used to improve the retrieval properties of Hopfield-like neural networks, is numerically compared to a supervised algorithm for training a linear symmetric perceptron. We analyze the stability of the stored memories: the basins of attraction obtained with the Hebbian unlearning technique are found to be comparable in size to those obtained with the symmetric perceptron, and the two algorithms converge to the same region of Gardner's space of interactions, having followed similar learning paths. A geometric interpretation of Hebbian unlearning is proposed to explain its optimal performance. Because the Hopfield model is also a prototypical model of a disordered magnetic system, it might be possible to translate our results to other models of interest for memory storage in materials.
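For orientation, the following is a minimal sketch of the standard Hebbian unlearning procedure in the form introduced by Hopfield, Feinstein, and Palmer: couplings are initialized with the Hebb rule, the network is repeatedly relaxed from random configurations with zero-temperature dynamics, and the fixed point reached at each step is weakly "unlearned". The learning rate `epsilon`, the number of unlearning steps, and the helper names are illustrative assumptions, not parameters or code taken from this work.

```python
import numpy as np

def hebbian_couplings(patterns):
    """Hebb rule: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with zero diagonal."""
    _, n = patterns.shape
    j = patterns.T @ patterns / n
    np.fill_diagonal(j, 0.0)
    return j

def relax(j, sigma, max_sweeps=100):
    """Zero-temperature asynchronous dynamics run until a fixed point is reached."""
    n = sigma.size
    for _ in range(max_sweeps):
        changed = False
        for i in np.random.permutation(n):
            new = np.sign(j[i] @ sigma)
            if new != 0 and new != sigma[i]:
                sigma[i] = new
                changed = True
        if not changed:
            break
    return sigma

def hebbian_unlearning(j, epsilon=0.01, n_steps=1000, rng=None):
    """Repeatedly relax from random states and weaken the attractors that are reached."""
    rng = np.random.default_rng(rng)
    n = j.shape[0]
    for _ in range(n_steps):
        sigma = rng.choice([-1.0, 1.0], size=n)
        sigma = relax(j, sigma)
        # Unlearning step: subtract a small Hebbian contribution of the fixed point,
        # which penalizes spurious attractors while leaving the couplings symmetric.
        j -= (epsilon / n) * np.outer(sigma, sigma)
        np.fill_diagonal(j, 0.0)
    return j

# Example usage (illustrative sizes): store p random binary patterns in a network
# of N neurons, then apply the unlearning routine to the Hebbian couplings.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, p = 200, 30
    xi = rng.choice([-1.0, 1.0], size=(p, N))
    J = hebbian_unlearning(hebbian_couplings(xi), epsilon=0.01, n_steps=500, rng=rng)
```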
