I. INTRODUCTION
The recollection of events from our childhood is part of what makes each of us unique. The storage and manipulation of memories allows us to think and to reason. Repetitive action—practice—allows precision performance in music and in sports. When no reason is left as our faculties fail, we may still recall the names of our closest relatives; that memory seems to last when our brains can no longer retrieve newer information. Our muscular aches and pains remind us of our activities of recent days. Yes! Our experience of memories is an indelible imprint of being alive.
What has not always been so obvious is that non-biological materials can also store memories reminiscent of those in the biological realm and can mimic the ones mentioned above. Thus, there are materials and networks that perform desired functions only because of the way in which they were initially manipulated—a form of rote memory. There are other forms of physical matter that learn pathways between initial and final states as they are repetitively strained. There are physical systems of non-biological origin that can store many memories initially and then forget all but one, and, of course, there are materials that accumulate the dings and scratches of their previous everyday use.
This is only a small fraction of the types of memory that can be observed in both the biological and physical sciences, but these examples do show that there are many ways in which memories in these two seemingly distinct worlds may mirror one another. While many of the examples we mentioned involve storage in our brains, biology uses many different forms of memory that appear throughout an organism, for example in the exquisitely intricate operation of the immune system or in the storage of heritable characteristics in our DNA. We can now ask whether these forms of memory can also be encountered in the non-biological world.
The topic of memories has become increasingly important in materials science because, as a phenomenon, memories can occur only in systems that are neither fully ordered nor in thermal equilibrium. After all, systems that are perfectly ordered or in complete thermal equilibrium are not distinct from one another—while at the atomic scale the precise point in phase space may be unique, there is no practical way of reading that information out. Thus, in order to store and read out information, one needs to have different states that are distinguishable at a scale that can be readily manipulated. This introduces the need for some form of coarse-grained quantity that allows imprinting and easy recovery of information. That information will be lost as the system approaches equilibrium. Thus, memory formation and retrieval may provide a productive foothold for distinguishing different kinds of processes and states in systems that have not yet reached equilibrium.
Likewise, disorder plays an important role in training. There are so many ways that an amorphous system can be rearranged to give nearly the same behavior that disordered materials are a promising platform for creating functionality via training. Indeed, networks based on jammed configurations of particles have been trained to enable long-range allosteric interactions—a function often found in proteins but more unusual in physical spring networks. Glasses and other disordered materials can be manipulated so that they behave in specific, desired ways. The study of such glassy states raises many issues, such as what precise object stores the memory, how many memories can be stored, what the entropy of memories is, and how complex each individual behavior can be (that is, how many targets can be controlled by a single source).
From these observations, it becomes apparent that understanding the intricacies of memory formation in matter—how memories are encoded and how they are retrieved—can lead to a novel way of characterizing far-from-equilibrium behavior. Different systems can encode memories in different ways. Thus, memory formation becomes a probe for studying the ways in which different disordered and far-from-equilibrium systems are similar to, or different from, one another.
II. SUMMARY OF AREAS COVERED
This special issue of the Journal of Chemical Physics describes some of the most recent advances on this topic. We hope it will help develop unifying viewpoints for understanding memory effects in matter and that it will spur further thinking about how different physical and biological phenomena can be productively understood in terms of memory formation. In what follows, we classify the articles in this volume into four sections: (i) cyclically driven suspensions and solids, (ii) complex solids and polymers, (iii) systems of biological relevance, and (iv) platforms for storing memory. We give an overview of these articles in the remainder of this introduction.
III. OVERVIEW OF THE SPECIAL ISSUE
A. Cyclically driven suspensions and solids
Memory formation in systems subjected to periodic (or cyclic) deformation or forcing has received considerable attention in past work. Such systems include glasses and other amorphous solids, colloidal suspensions, grains, bubble rafts, and models thereof, which have been investigated computationally and experimentally. In models and experiments on athermal suspensions, cyclically deformed suspensions have been shown to retain memory of the deformation amplitude, although multiple memories are transient. In investigations of amorphous solids, the possibility of multiple memories has been demonstrated and analyzed in terms of transitions between microscopic states characterized by the occupancy of sets of idealized two-level systems, or hysterons, and the interactions among them. This line of investigation is well represented in this volume, with new results from an experimental investigation of memory effects in a dense suspension of particles with adhesive interactions (cornstarch), a computational investigation of particles cyclically driven through arrays of obstacles, investigations of coarse-grained (elastoplastic) models that aim to represent the response of amorphous solids to cyclic shear deformation, and a model that incorporates interactions between memory-carrying soft spots (or two-level systems) to explain the emergence of multi-periodicity in cyclically sheared amorphous solids.
Chattopadhyay and Majumdar1 investigate experimentally the formation and readout of the memory of the shear strain amplitude to which a dense suspension of cornstarch particles is subjected. This system is distinguished from the non-Brownian suspensions investigated in previous work by the adhesive interactions between the constituent particles, which can form extended connected structures. The authors also employ the differential shear modulus to characterize the memory, rather than the mean squared displacement of particles that has been used in previous computational and experimental investigations. The memory of the shear amplitude at which the system is trained is manifested as a peak at that amplitude in the differential modulus. The authors offer an intuitive explanation in terms of the stretching of connected clusters upon the application of strain. They also demonstrate the possibility of storing multiple memories. As in previous studies, the authors find that the application of a larger amplitude of deformation erases smaller-amplitude memories; thus, the order in which the deformations are applied is important in retaining multiple memories.
Reichhardt and Reichhardt2 consider the dynamics of particles that are cyclically driven in the presence of a fixed, periodic, square array of obstacles in two dimensions, a model previously employed to investigate locking and clogging effects. They investigate the transition between reversible and irreversible motion of the free particles as a function of the driving amplitude, the total density of the system, and the angle at which the driving is applied. Broadly, their study demonstrates in this system the phenomenology of the reversible–irreversible (R-IR) transition that has been investigated for colloidal suspensions and glasses. They also show that the R-IR transition has an interesting non-monotonic dependence on the angle of forcing, a feature not present in related systems. In previous studies, memory effects have been investigated with reference to the R-IR transition, typically below such a transition, and thus these results provide a characterization that will be useful in investigating memory effects.
In the work of Liu et al.,3 the authors investigate the response to cyclic shear in a coarse-grained, elastoplastic model of amorphous solids. While such models have been employed in studying yielding behavior under uniform shear, they have been used to study the effects of cyclic shear deformation only relatively recently (see also Ref. 4 in this volume, described below). The model incorporates the key features of (a) different annealing states of the amorphous solid, specified in the local description of the individual cells into which the entire system is decomposed, and (b) long-range interactions between cells as a result of plastic rearrangements within individual cells. The transitions to new states when a given cell becomes unstable are treated stochastically, subject to a set of rules. The work demonstrates the ability of the coarse-grained model to reproduce several key aspects of the phenomenology associated with yielding under cyclic shear, as seen in earlier atomistic simulations of model glasses. In particular, the authors discuss a yielding line in the shear amplitude–initial energy plane that separates a regime in which the system reaches a periodic final state retaining memory of the initial state from a regime in which the memory of the initial conditions is lost and the final state is independent of them.
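To make the ingredients of such mesoscopic descriptions concrete, the following minimal sketch (in Python, not the model of Ref. 3) evolves a set of cells with random yield thresholds under cyclic strain; a mean-field redistribution of the released stress stands in for the long-range Eshelby-like interactions, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

N = 1024                 # number of mesoscopic cells (illustrative)
mu = 1.0                 # elastic modulus (assumed)
frac = 0.5               # fraction of released stress redistributed (a crude
                         # stand-in for the long-range Eshelby-like propagator)

def fresh_thresholds(n):
    """Draw new local yield thresholds after a plastic rearrangement."""
    return rng.uniform(0.8, 1.2, n)

sigma = np.zeros(N)              # local stresses
sigma_y = fresh_thresholds(N)    # local yield thresholds

def shear_step(dgamma):
    """Apply a uniform strain increment, then relax all unstable cells."""
    global sigma, sigma_y
    sigma += mu * dgamma
    unstable = np.abs(sigma) > sigma_y
    while unstable.any():
        released = np.abs(sigma[unstable]).sum()
        sigma[unstable] = 0.0                    # plastic rearrangement
        sigma_y[unstable] = fresh_thresholds(unstable.sum())
        sigma += frac * released / N             # mean-field redistribution
        unstable = np.abs(sigma) > sigma_y

# Cyclic shear: strain amplitude gamma_max, discretized into small steps,
# following the cycle 0 -> +gamma_max -> 0 -> -gamma_max -> 0.
gamma_max, dgamma = 0.05, 1e-3
steps = round(gamma_max / dgamma)
for cycle in range(20):
    for sign in (+1, -1, -1, +1):
        for _ in range(steps):
            shear_step(sign * dgamma)
    print(cycle, sigma.mean())                   # track the stress per cycle
```

Full elastoplastic models replace this crude mean-field redistribution with a long-range quadrupolar elastic propagator and encode the degree of annealing in the distribution of local thresholds, which is what makes the yielding-line phenomenology discussed above accessible.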
In the work of Kumar et al.,4 the authors also investigate the response to cyclic shear in a coarse-grained elastoplastic model, constructed along lines similar to those in the work of Liu et al.3 However, the model developed in this case incorporates additional elements that are important for memory formation. The authors consider transition rules between the states of the individual cells (which correspond to elastic branches) such that the new state reached is determined by the initial state and the direction of stress change, rather than being chosen stochastically as in Ref. 3. Such an approach, as the authors argue, is necessary in order to properly describe the limit cycles that have been demonstrated to emerge under athermal dynamics in atomistic model glasses. In exploring aspects relevant to memory formation, the authors construct transition graphs that encapsulate the transitions occurring between states as a result of shear, both for the coarse-grained model and for an atomistic glass. They investigate the characteristics of the strongly connected components (SCCs) within which transitions must be confined in order to obtain a cyclic response, and they study how the SCCs depend on the aging of the glasses. This analysis offers a way of understanding the larger memory capacity of poorly aged glasses compared to well-aged glasses.
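The role played by strongly connected components can be illustrated with standard graph tools. The toy transition graph below is hypothetical (it is not taken from Ref. 4); its nodes stand for mesostates, its directed edges for transitions triggered by driving the strain up or down, and the SCCs extracted from it delimit the sets of states within which a limit cycle can live.

```python
import networkx as nx

# Hypothetical transition graph: nodes are mesostates of the driven solid and
# directed edges are transitions triggered by increasing ('U') or decreasing
# ('D') the strain.
transitions = [
    ("A", "B", "U"),                      # A is transient: it is never revisited
    ("B", "C", "U"), ("C", "B", "D"),
    ("C", "D", "U"), ("D", "C", "D"),
]

G = nx.DiGraph()
for src, dst, drive in transitions:
    G.add_edge(src, dst, drive=drive)

# Strongly connected components: sets of states that can all reach one another.
# A periodic (limit-cycle) response must be confined to a single SCC, so the
# number and size of the SCCs bound the cyclic responses the system can sustain.
for scc in nx.strongly_connected_components(G):
    print(sorted(scc))
```

In this toy example, the initial state A lies outside the nontrivial SCC {B, C, D}, so any memory tied to A is lost once the driven dynamics enters the component, loosely mirroring the dependence on glass preparation discussed above.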
In the work of Szulc et al.,5 the authors consider an intriguing aspect of cyclically sheared amorphous solids: the presence of limit cycles that are multi-periodic. Under certain conditions of cyclic driving, amorphous solids can exhibit a periodic response in which a return to the initial state occurs only after more than one cycle of driving. Within an idealized description of plasticity as arising from the presence of noninteracting two-level systems, such multi-periodic behavior cannot be explained. However, the authors show that including interactions among such two-level systems, or soft spots, modulates the switching thresholds of the soft spots, which causes them to be active during some cycles of driving but idle in others, leading to multi-periodic cycles. This behavior is investigated both in a model the authors construct and in simulations of model amorphous solids. The authors discuss how the presence of frustrated interactions leading to multi-periodic behavior is relevant to manipulating memory encoding and readout, which has indeed been investigated in crumpled and corrugated sheets.
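A minimal way to see how such threshold modulation can arise is to simulate a few hysterons whose local fields include contributions from the states of the others. The sketch below is an illustrative model in this spirit, with arbitrarily chosen thresholds and couplings rather than the specific construction of Ref. 5; it drives the hysterons quasi-statically through strain cycles and reports the period of the limit cycle that is eventually reached.

```python
import numpy as np

rng = np.random.default_rng(7)

n = 6                                   # number of soft spots (hysterons)
h_plus = rng.uniform(0.5, 1.5, n)       # bare "switch up" thresholds
h_minus = -rng.uniform(0.5, 1.5, n)     # bare "switch down" thresholds
J = 0.4 * rng.standard_normal((n, n))   # random couplings between soft spots
np.fill_diagonal(J, 0.0)
J = 0.5 * (J + J.T)                     # symmetric couplings for simplicity

s = -np.ones(n)                         # start with all hysterons "down"

def relax(H):
    """Flip unstable hysterons one at a time until all are stable at drive H."""
    while True:
        f = H + J @ s                   # local field seen by each hysteron
        up = (s < 0) & (f > h_plus)     # "down" hysterons that should flip up
        down = (s > 0) & (f < h_minus)  # "up" hysterons that should flip down
        unstable = np.where(up | down)[0]
        if unstable.size == 0:
            return
        s[unstable[0]] *= -1            # flip the first unstable hysteron

def one_cycle(amplitude, steps=200):
    """Quasi-statically drive H through 0 -> +A -> -A -> 0."""
    ramp = np.concatenate([
        np.linspace(0, amplitude, steps),
        np.linspace(amplitude, -amplitude, 2 * steps),
        np.linspace(-amplitude, 0, steps),
    ])
    for H in ramp:
        relax(H)

# Drive for many cycles and look for a repeating end-of-cycle state; because
# the cycle-to-cycle map is deterministic and the state space is finite, a
# repeat must occur, and its spacing is the period of the limit cycle.
seen = []
for cycle in range(100):
    one_cycle(amplitude=1.2)
    state = tuple(s)
    if state in seen:
        period = len(seen) - seen.index(state)
        print(f"limit cycle reached: period = {period} driving cycle(s)")
        break
    seen.append(state)
```

Whether the reported period exceeds one depends on the thresholds and couplings that are drawn; frustrated interactions of the kind analyzed in Ref. 5 are what allow periods longer than a single driving cycle.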
B. Complex solids and polymers
Memory effects are generally intrinsic to flexible macromolecules and their assemblies. Even a single macromolecule can exhibit memory effects owing to its capacity to adopt an enormous number of conformations and the associated conformational entropy. As an example, when a macromolecule is cooled in solution, it is forced to explore a free energy landscape consisting of a large number of metastable conformations before reaching its final structure. The pathway taken by the molecule in the free energy landscape determines the molecule’s final structure. When the molecule is heated back up, its kinetics remembers the original pathway taken during cooling. Another example is the response of a macromolecule to an externally imposed force: relaxations occurring at shorter length and time scales determine the relaxations at the length and time scales pertinent to the whole molecule. Such hierarchical cascades of relaxational modes and energy dissipation are commonly expressed in terms of conformational memory. These features are amplified in physical processes involving many macromolecules, such as polymer crystallization and the glass transition.
Considering a system of flexible ideal polymer chains without any nonbonded interactions, Müller6 has shown that the memory kernel for the time evolution of density fluctuations must be nonlocal in space and time in order to capture fluctuations occurring at all length scales within each chain. In demonstrating this result using analytical calculations and simulations, the author has shown that the commonly used dynamic self-consistent field theory is insufficient to describe the memory effect that couples the various internal modes of the polymer chains.
A fundamental understanding of the kinetics of microphase separation in diblock copolymer melts doped with salts is of considerable interest for the rational design of solid polymer electrolytes for energy storage devices. The memory effects caused by the conformational entropy of the constituent diblock macromolecules are known to result in nonlocal transport coefficients. So far, there are no microscopic polymer models that connect polymer chemistry, conformational memory, and transport coefficients in describing the ordering of microphases from thin films of disordered diblock copolymers in the presence of ionic currents generated by doped ions under applied electric fields. As a step toward a microscopic model for this situation, Zhang and Kumar7 have investigated a local, order-parameter-dependent transport coefficient that is self-consistent with linear irreversible thermodynamics, with promise for future microscopic models that address nonlocal transport coefficients and conformational memory in microphase-separating, ion-doped diblock copolymer systems.
As mentioned above, memory effects are common in the crystallization and melting of semicrystalline polymers. For example, a molten polymer can remember its previous semicrystalline state when it is cooled to undergo crystallization. Because of this “melt memory,” the crystallization temperature is higher than in the preceding cycle of crystallization. By tuning thermal cycles, the cost of processing industrial semicrystalline polymers can be substantially reduced. As a complementary approach, the paper of Luo et al.8 addresses the role of externally applied strains on semicrystalline polymers prepared at different temperatures. Using dynamic Monte Carlo simulations, the authors show that there are three regimes for the response of semicrystalline polymers to strain. At low temperatures, the semicrystalline state undergoes local melting followed by strain-induced annealing. At high temperatures, the semicrystalline state melts completely at weak strains, followed by strain-induced crystallization at stronger strains. At intermediate temperatures, the semicrystalline polymers melt only partially at lower strains, followed by memory-influenced crystallization at higher strains. At very high strains, the memory effect is lost and the final semicrystalline state is essentially the same at all temperatures. The authors suggest that, analogous to thermal treatment at sufficiently high temperatures, sufficiently large strains can be used to erase memory effects associated with thermal history.
In the paper of Sangroniz et al.,9 the role of intermolecular interactions in melt memory is investigated. A polymer melt obtained by heating from its semicrystalline state crystallizes upon cooling at a higher temperature than in the previous cycle of crystallization; the melt behaves as if it remembers its previous crystalline state. The authors have systematically explored the influence of the nonuniversal chemical details of the polymer on the universal features of melt memory. Furthermore, the effects of the self-seeding temperature, the holding time at this temperature, and the cooling rate on melt memory are reported in this article.
C. Systems of biological relevance
In this volume, two different classes of memory are discussed in the context of biological systems: (1) hysteretic behavior in blood flows and (2) transient memory formation due to mechanical feedback in growing tissue monolayers.
In the paper of Javadi and Jamali,10 thixotropy in blood flow—memory formation and the sensitivity of blood to its flow history—is studied. Blood, a suspension of blood cells in plasma, is constantly subjected to different deformation rates as it flows through the circulatory system. The concentration of red blood cells, as well as cell aggregation, deformability, and cell membrane rigidity, is known to directly influence the viscosity, non-Newtonian effects, and other properties of blood. However, the thixotropic (memory) and viscoelastic time scales in the fluid are not easily distinguishable, and it is a challenge to determine the biophysical underpinning of the thermokinematic memory formation. The authors address this challenge with detailed in silico experiments and find that the main drivers of the non-Newtonian behavior are the concentrations of red blood cells and of circulating protein complexes (fibrinogen). Moreover, the authors find that the microstructure and the aggregation of clusters of red blood cells show memory formation that manifests itself in the hysteretic behavior of the orientation angle of the individual cells. The authors show that the larger the thixotropic time scale, the longer it takes for the blood to forget its memory of previous flow. These results have an impact on hemorheology, which provides important diagnostic tools for many blood-altering diseases.
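A common way to encode this kind of flow-history dependence in a constitutive description is through a structure parameter that rebuilds at rest and is broken down by shear. The sketch below implements a generic structural-kinetics model of this type with illustrative parameter values; it is not the model used in Ref. 10, but it shows how the viscosity measured while ramping the shear rate down differs from that measured while ramping it up, the hallmark hysteresis of thixotropic memory.

```python
import numpy as np

# Generic structural-kinetics model of thixotropy (illustrative, not Ref. 10):
#   d(lambda)/dt = (1 - lambda)/t_thix - k * gamma_dot * lambda
#   eta(lambda)  = eta_inf + (eta_0 - eta_inf) * lambda
# where lambda in [0, 1] measures the degree of microstructure
# (e.g., aggregates of red blood cells).

t_thix = 5.0                   # thixotropic (structural rebuild) time scale [s]
k = 1.0                        # breakdown coefficient (assumed)
eta_0, eta_inf = 0.1, 0.004    # structured / fully broken-down viscosities [Pa s]

def ramp(gamma_dots, lam0, dt=0.01):
    """Integrate the structure parameter along a prescribed shear-rate ramp."""
    lam = lam0
    etas = []
    for gd in gamma_dots:
        dlam = (1.0 - lam) / t_thix - k * gd * lam
        lam = np.clip(lam + dlam * dt, 0.0, 1.0)
        etas.append(eta_inf + (eta_0 - eta_inf) * lam)
    return np.array(etas), lam

rates_up = np.linspace(0.1, 100.0, 2000)     # increasing shear-rate ramp [1/s]
eta_up, lam_end = ramp(rates_up, lam0=1.0)   # start fully structured
eta_down, _ = ramp(rates_up[::-1], lam0=lam_end)

# At the same shear rate, the up and down ramps give different viscosities:
# the fluid "remembers" its recent flow history on time scales of order t_thix.
print(eta_up[1000], eta_down[999])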
The paper of Sinha et al.11 looks carefully at the interplay between cell growth and division and a mechanical feedback mechanism that regulates different processes during tissue growth, using a minimal two-dimensional computational model. Starting from cells that do not grow or divide, where the dynamics is governed only by short-ranged cell–cell interactions, the authors systematically include different processes one by one in order to find the origin of the observed cell dynamics in this nonequilibrium system. When both cell growth and division are included, the dynamics is regulated by a mechanical feedback mediated by a stress threshold. The authors find that the force autocorrelation function increases with the stress threshold, implying non-Markovian dynamics and memory effects in growing tissues. Moreover, because they can follow the dynamics of individual cells, they find trajectories that are very different from simple random motion and whose persistence increases as the stress threshold is increased. The authors conclude that these features are at the origin of memory formation in tissue-like active systems.
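For readers who wish to probe such memory in their own simulation data, the short sketch below shows one standard way to estimate a normalized force autocorrelation function; the time series used here is synthetic (an Ornstein-Uhlenbeck-like signal standing in for a per-cell force component), since the actual trajectories of Ref. 11 are not reproduced here.

```python
import numpy as np

def autocorrelation(f):
    """Normalized autocorrelation C(t) of a scalar time series f(t)."""
    f = np.asarray(f, dtype=float)
    f = f - f.mean()
    n = len(f)
    c = np.correlate(f, f, mode="full")[n - 1:]   # lags 0 .. n-1
    c /= np.arange(n, 0, -1)                      # average over available pairs
    return c / c[0]                               # normalize so that C(0) = 1

# Synthetic stand-in for a per-cell force component from a tissue simulation:
# an Ornstein-Uhlenbeck-like series whose correlation time plays the role of
# the memory induced by the mechanical feedback.
rng = np.random.default_rng(3)
tau = 50.0
x = np.zeros(5000)
for t in range(1, len(x)):
    x[t] = x[t - 1] * (1 - 1 / tau) + rng.standard_normal() / np.sqrt(tau)

C = autocorrelation(x)
print("C at lags 0, 10, 100:", C[0], C[10], C[100])
```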
D. Platforms for storing memory
Different classes of memory require specific forms of interaction between the components in order to store information. For example, the memory encoded in a spin echo needs a multitude of oscillators whose phases can be manipulated by an external field; the pulse-duration memory in charge-density waves requires that the material approach a fixed point that is specified by an external drive and differs depending on the amplitude of that drive. There are many other examples12 showing that memories can take many forms, with different attributes, different interactions, different components, different decay times, different storage capacities, and different robustness against defects. Thus, it is important to understand not only what information needs to be remembered but also what platform and what types of training are required to best encode that information. This is an ongoing endeavor. One line of research, well represented in this volume, is to investigate theoretical concepts of memory and to implement them so that they become experimentally manifest.
In their paper, Ding and van Hecke13 invent a platform for studying memories in which the individual elements are bistable. These elementary units can each store one bit of memory, depending on which of their two stable configurations is occupied. The hysteresis encoded in each element becomes apparent when the external forcing is made large or small enough that the energy barrier between the two states disappears; on returning to an intermediate value, the system is caught in one of the two states, depending on the history of the forcing. These individual elements are sometimes referred to as hysterons. When there are many independent hysterons in a sample, the resulting memory can grow to be very complex even though each element is itself very simple. In a sample in which the two-state systems are microscopically small, the hysterons can be difficult to observe or control directly, and it becomes ambiguous how to model such a system. Ding and van Hecke explicitly introduce hysterons into a mechanical metamaterial in which slender struts buckle when forced by an external pusher. This brings the hysterons to the macroscopic scale, so that under cyclic compression the authors can tune the precise pathway and ordering of the buckling struts and observe how one strut couples to the others and to the externally imposed stresses.
In the paper of Wycoff et al.,14 the focus is not so much on the individual elements that form the memory but rather on the rules for updating the system as learning proceeds. Inspired by the behavior of biological systems, where each synapse updates its state using only local information and without reference to an external clock, the authors ask how an artificial network responds to different updating rules. The answer was not obvious: allowing a system to use all available (i.e., global) information is advantageous for finding particularly low-lying states; however, desynchronization, in which each element proceeds at its own rate without waiting for a timing signal that resets the entire system, removes the need for global communication across the network. The authors show that a desynchronous learning model can in many cases be as effective at performing a task as its synchronous counterpart. Moreover, they test these results in experiments on self-adjusting resistor networks and show that desynchronous updating can even improve performance by allowing a more efficient exploration of phase space. The conclusion is that by distributing the learning process, networks can not only perform better but also, crucially for future applications, be scalable so that they can be easily fabricated and deployed.
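The distinction between synchronous and desynchronous local updating can be made concrete with a toy problem. In the sketch below, which uses a simple quadratic network energy with hypothetical couplings rather than the authors' resistor-network learning rule, each node adjusts its own variable using only the states of its neighbors, either all at once on a shared clock or one at a time in random order; both schedules reach comparably low energies.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy network: n nodes, random symmetric couplings, and a local "pinning"
# term so that the quadratic energy has a unique minimum.
n = 20
A = (rng.random((n, n)) < 0.2).astype(float)
A = np.triu(A, 1)
A = A + A.T                                    # symmetric adjacency, no self-loops
b = rng.standard_normal(n)                     # local targets (assumed)

def energy(x):
    """E(x) = 1/4 sum_ij A_ij (x_i - x_j)^2 + 1/2 sum_i (x_i - b_i)^2."""
    return 0.25 * np.sum(A * (x[:, None] - x[None, :]) ** 2) \
        + 0.5 * np.sum((x - b) ** 2)

def local_optimum(x, i):
    """Best value of x_i given only its neighbors (local information)."""
    return (A[i] @ x + b[i]) / (A[i].sum() + 1.0)

def synchronous(x, sweeps=100):
    """All nodes update together on a shared clock (Jacobi-style)."""
    for _ in range(sweeps):
        x = np.array([local_optimum(x, i) for i in range(n)])
    return x

def desynchronous(x, sweeps=100):
    """Nodes update one at a time, in random order, with no global clock."""
    x = x.copy()
    for _ in range(sweeps):
        for i in rng.permutation(n):
            x[i] = local_optimum(x, i)
    return x

x0 = rng.standard_normal(n)
print("synchronous  :", energy(synchronous(x0)))
print("desynchronous:", energy(desynchronous(x0)))
```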
The paper of Benedetti et al.15 concerns the retrieval of memories from a neural network. In the Hopfield model of associative memory, a Hebbian algorithm is used to encode the memories by adjusting the interactions between the variables on each node; this is an unsupervised process with desynchronous dynamics. However, despite its success at encoding memories, the capacity of such a network is limited by the proliferation of spurious memories when multiple memories are encoded. There have been many efforts to increase the number of memories that can be retrieved. One approach involves smoothing the landscape (Hebbian unlearning) so that extraneous minima in the energy landscape do not detract from the true memories in which the information is stored. The paper compares this form of memory storage and retrieval with one based on the symmetric perceptron model, in which the learning is supervised and the updates are synchronous. The retrieval capacity obtained with Hebbian unlearning is comparable to that of the symmetric perceptron, which is explained by a geometric interpretation of the Hebbian unlearning procedure. It will be intriguing to understand the generality of these results and to see how they can be applied to memory storage in other physical systems.
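The Hebbian encoding and the desynchronous retrieval dynamics mentioned here are easy to state explicitly. The sketch below is the standard textbook Hopfield construction with random patterns and asynchronous zero-temperature updates, not the specific networks studied in Ref. 15; run well below the capacity limit, it recovers a stored pattern from a corrupted probe.

```python
import numpy as np

rng = np.random.default_rng(2)

N, P = 200, 10                              # neurons and stored patterns
patterns = rng.choice([-1, 1], size=(P, N)) # random binary memories

# Hebbian encoding: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, no self-coupling.
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def retrieve(state, sweeps=20):
    """Desynchronous (asynchronous) zero-temperature retrieval dynamics."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1   # align with the local field
    return s

def overlap(a, b):
    return a @ b / N

# Start from a corrupted version of pattern 0 and let the network relax.
probe = patterns[0] * np.where(rng.random(N) < 0.2, -1, 1)   # flip ~20% of spins
fixed_point = retrieve(probe)
print("overlap with stored pattern:", overlap(fixed_point, patterns[0]))
```

When the number of patterns is pushed toward the capacity limit, retrieval in this construction degrades because of the spurious minima discussed above, which is precisely what unlearning schemes aim to mitigate.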
IV. CONCLUDING REMARKS
The papers included in this volume show that the fledgling study of material memory provides an exciting opportunity to create a new discipline for organizing our thoughts about matter, especially about disordered matter that is far from equilibrium. It is an exciting time as we begin to wrestle intellectually with how to distinguish different categories of memory, how to classify the storage and entropy in each of these categories, and how to train matter effectively.