A neural network is a large, highly interconnected assembly of simple elements. The elements, called neurons, are usually two‐state devices that switch from one state to the other when their input exceeds a specific threshold value. In this respect the elements resemble biological neurons, which fire—that is, send a voltage pulse down their axons—when the sum of the inputs from their synapses exceeds a “firing” threshold. Neural networks therefore serve as models for studies of cooperative behavior and computational properties of the sort exhibited by the nervous system.
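The two-state threshold element described above can be sketched in a few lines of code. This is a minimal illustration of a McCulloch-Pitts-style unit, not the article's model: the ±1 state convention, the weights, and the zero threshold are all assumptions chosen for the example.

```python
def update(inputs, weights, threshold=0.0):
    """Return the new state (+1 or -1) of a threshold neuron.

    The neuron 'fires' (+1) when the weighted sum of its inputs
    exceeds the threshold, mimicking a biological neuron whose
    summed synaptic input crosses its firing threshold.
    """
    total = sum(w * s for w, s in zip(weights, inputs))
    return 1 if total > threshold else -1

# Illustrative example: three input neurons, two excitatory and one
# inhibitory synapse (weights are made up for this sketch).
inputs = [1, 1, 1]
weights = [0.5, 0.5, -0.2]
print(update(inputs, weights))  # summed input 0.8 > 0, so the unit fires: +1
```

A network of such elements is obtained by letting each neuron's output serve as an input to the others, which is the cooperative setting the article goes on to analyze.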
Statistical Mechanics of Neural Networks
Haim Sompolinsky; Statistical Mechanics of Neural Networks. Physics Today 1 December 1988; 41 (12): 70–80. https://doi.org/10.1063/1.881142