In the brain, coherent neuronal activities often appear simultaneously in multiple frequency bands, e.g., as combinations of alpha (8–12 Hz), beta (12.5–30 Hz), and gamma (30–120 Hz) oscillations, among others. These rhythms are believed to underlie information processing and cognitive functions and have been subjected to intense experimental and theoretical scrutiny. Computational modeling has provided a framework for the emergence of network-level oscillatory behavior from the interaction of spiking neurons. However, due to the strong nonlinear interactions between highly recurrent spiking populations, the interplay between cortical rhythms in multiple frequency bands has rarely been theoretically investigated. Many studies invoke multiple physiological timescales (e.g., various ion channels or multiple types of inhibitory neurons) or oscillatory inputs to produce multi-band rhythms. Here, we demonstrate the emergence of multi-band oscillations in a simple network consisting of one excitatory and one inhibitory neuronal population driven by constant input. First, we construct a data-driven Poincaré section theory for robust numerical observations of single-frequency oscillations bifurcating into multiple bands. Then, we develop model reductions of the stochastic, nonlinear, high-dimensional neuronal network to capture the appearance of multi-band dynamics and the underlying bifurcations theoretically. Furthermore, when viewed within the reduced state space, our analysis reveals conserved geometrical features of the bifurcations on low-dimensional dynamical manifolds. These results suggest a simple geometric mechanism behind the emergence of multi-band oscillations without appealing to oscillatory inputs or multiple synaptic or neuronal timescales. Thus, our work points to unexplored regimes of stochastic competition between excitation and inhibition behind the generation of dynamic, patterned neuronal activities.

Coherent neuronal activities are found in many brain regions, ranging from visual and auditory cortices,1–5 to frontal and parietal cortices,6–14 to the hippocampus,15–18 the amygdala,19 and the striatum,20 and are often manifested as neuronal oscillations. Dynamically, these rhythms are likely to be a reflection of the many possible interactions between excitation and inhibition, between disparate synaptic and neuronal timescales, and between local and long-range recurrent connectivities. It is also commonly believed that they form a neurophysiological basis of sensory processing and cognitive functions, such as attention,21,22 learning,23 and memory.13 Therefore, neuronal oscillations have been subjected to intense experimental and theoretical examinations, ranging from the specific functions of these brain rhythms to inferences of their roles in information processing and brain computations.

Much experimental evidence has correlated various population oscillations with enhanced cognition or behavioral performance. For instance, the enhancement of gamma oscillations has been shown to sharpen visual feature detection in the primary visual cortex2,24–26 and direction selectivity in MT.27,28 Attentional states are often accompanied by stronger gamma-band synchronization.21,29–31 Theta-band oscillations have been found to be correlated with visual spatial memory,32–35 while physiologically and pharmacologically enhanced theta rhythms improve learning and memory.33,36–38 Alpha oscillations have been linked to perceptual learning,31,39–42 and recent work has suggested a possible role in coordinating activity between brain regions.43–45 Cortical beta rhythms have been correlated with working memory, decision-making, and sensorimotor activities.46–49

Many studies, experimental, theoretical, and modeling, have focused on the emergence of collective rhythms of a single-frequency band. Numerical simulations of large-scale models of the visual cortex have shown dynamics consistent with the experimentally observed gamma rhythms.50 Studies demonstrated that coherent neuronal oscillations can be driven by external inputs or may emerge as internal states of the network.51,52 Theoretical investigations have pointed to the importance of recurrent couplings between interneurons and excitatory pyramidal cells and the role of synaptic timescales.50,53–58 Substantial theoretical progress has been made by mapping the so-called theta neurons to systems of phase oscillators, which, in the weakly coupled limit, are amenable to mean-field reductions demonstrating how collective dynamics can lead to phase synchronization.59,60

Compared to single-frequency band neuronal rhythms, there has been much less work on the mechanisms underlying multi-band oscillations. Theoretically, studies have built upon single-frequency synchronization by introducing multiple synaptic or neuronal timescales or through the interaction between multiple neuronal pools.57,61,62 Researchers have also pointed out that input forcing at different frequencies near the Hopf bifurcation of a Hodgkin–Huxley type neuron can result in nested oscillations, similar to the theta-gamma nested rhythms observed in the hippocampus and during rapid eye movement (REM) sleep.63,64

Here, we demonstrate that multi-band rhythms can emerge from a small neuronal network consisting of a single excitatory population recurrently coupled with a single inhibitory population, both driven by constant input. Recently, we investigated how stochastic gamma oscillations form a low-dimensional manifold in a dimension-reduced state space.65 Here, we show that a more detailed exploration of the dynamics reveals regions in parameter space where single-frequency, stochastic oscillations break up into different frequency bands, even in the presence of temporally stationary inputs.

In Sec. II, we introduce our integrate-and-fire (I&F) neuronal network model and explore the regime of repetitive, nearly periodic multiple-firing events (MFEs) as a function of model parameters. Focusing first on the repetition, we formulate and examine a data-driven Poincaré section theory. The Poincaré sections allow us to map the iterative, nearly synchronous coherent spiking events in a reduced, low-dimensional space. Our analysis reveals that the multi-band oscillations emerge from different modes of excitation-inhibition (E–I) competitions between neural populations, which are sufficiently captured by membrane potential distributions.

Experimentally, neuronal oscillations are typically studied by electrophysiological methods, and rhythmic oscillatory behavior is usually observed in recordings of local field potentials.4,66 Mathematically, neuronal oscillations are usually modeled as temporally recurring activity patterns of a neuronal network. Well-known theories include the interneuronal network gamma (ING),53,67,68 the pyramidal interneuronal network gamma (PING),54,55 and multiple-firing events (MFEs),50,56 among others. Although these models include different levels of biological detail and describe various types of oscillations in different brain areas, they generally share two common views: first, neuronal oscillations are reflected in recurring synchronized spiking patterns generated by neural ensembles; second, the recurring spiking patterns are induced by rapid, recurrent interactions between neurons. Therefore, the informative aspects of neuronal oscillations (including frequencies, amplitudes, and phases) can be represented by the statistics of the spiking patterns, and any study of the mechanism relies on an understanding of the neuronal interactions within the ensemble.

This paper investigates multi-band neuronal oscillations emerging from the dynamics of a simple spiking neuronal network containing only two neuronal populations: excitatory (E) and inhibitory (I), which are recurrently coupled. Oscillatory or any other temporally inhomogeneous inputs are absent. Following previous work,69 we focus on the so-called multiple-firing events (MFEs). We first present a few different temporally repetitive, synchronized spiking patterns found during our parameter space explorations. Appearing with different amplitudes and phases, these reoccurring MFEs form different beats robustly in many parts of parameter space. The firing patterns can exhibit multiple peaks in the frequency spectrum from the beta band (12.5–30 Hz) to the gamma band (30–90 Hz).

To understand the emergence of different beats, we focus on explaining how the tight competition between E/I populations shapes the repetitions and alternations of MFEs. Toward this end, we develop a novel data-driven Poincaré section theory for network dynamics. Most remarkably, our Poincaré section reveals that the E–I competition is essentially captured by the membrane potential distributions just before the occurrence of each MFE. Furthermore, by considering only the most informative features of the membrane potential distributions, a stochastic bifurcation analysis illustrates that

  1. different outcomes of the E–I competition give rise to different beats and

  2. the multi-band oscillations are prevalent in the high-dimensional parameter space.

In addition, we use a novel representation method to visualize the crucial bifurcations. Specifically, we relate the MFE beats to different geometrical structures on the principal manifolds of a reduced model proposed in our previous study.

The data-driven Poincaré section theory presents a top-down view of the emergence of multi-band oscillations. On the other hand, we also develop an autonomous ODE model from a bottom-up perspective, which qualitatively reproduces the MFE beats and bifurcations. The ODE model only preserves the most crucial features of the E–I competition (the interactions between the low-order statistics of the membrane potentials) and ignores many other biophysical details. We demonstrate that the model reduction and the ODE model meet halfway. They exhibit surprisingly similar geometrical features in the same reduced phase space.

Taken together, our results provide a theoretical framework to analyze the complicated oscillatory dynamics produced by the rapid competition between excitatory and inhibitory neuronal populations.

We numerically simulate a small, 400-neuron network to demonstrate multi-band oscillations. The network has been shown to produce stochastic, oscillatory dynamics in the gamma band.56,65,70,71 In this paper, we will vary the parameters based on our previous study,65 which aimed to model an input layer of the macaque primary visual cortex.

1. A simple, integrate-and-fire network

The network contains two populations of spiking neurons: 300 excitatory (E) cells and 100 inhibitory (I) cells. All neurons are driven by independent, identically distributed Poisson spike trains. The strength of the Poisson input maintains a physiologically realistic firing rate for each neuron.

The neuronal dynamics is described by an integrate-and-fire (I&F) model; i.e., the membrane potential $v_i$ of neuron $i$ is advanced by
$$\frac{dv_i}{dt} = I_i^E + I_i^I. \tag{1}$$
The membrane potential $v_i$ is a dimensionless variable running between the inhibitory reversal potential $V_I = -2/3$ and the spiking threshold $V_{th} = 1$. $v_i$ is driven toward $V_{th}$ by the excitatory current $I_i^E$ and away from it by the inhibitory current $I_i^I$. When the membrane potential $v_i$ reaches $V_{th}$, a spike is emitted by this neuron to all its postsynaptic cells, and $v_i$ is immediately reset to the rest potential $V_r = 0$. After that, $v_i$ is held at $V_r$ for an absolute refractory period $\tau_R$ (we first keep $\tau_R = 0$ for simplicity, leaving explorations of the refractory timescale to Sec. II B 2). The E and I conductances (and hence the currents) are computed from the spike trains received by neuron $i$. The key features of the I&F model are depicted in Fig. 1(a). See more details in Sec. IV.
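The dynamics of Eq. (1) can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration (forward-Euler stepping, conductance-based currents, and an excitatory reversal potential $V_E = 14/3$ above threshold, a common choice in this family of normalized models); it is not the paper's exact integration scheme, and all function names are ours.

```python
# Normalized potentials: rest 0, spiking threshold 1, inhibitory reversal -2/3.
# V_E > V_th is an assumption here (V_E = 14/3, a common normalization choice).
V_REST, V_TH, V_I, V_E = 0.0, 1.0, -2.0 / 3.0, 14.0 / 3.0

def step_neuron(v, g_E, g_I, dt=0.1):
    """One forward-Euler step of Eq. (1), dv/dt = I^E + I^I, with
    conductance-based currents: excitation drives v up toward V_E
    (past threshold), inhibition pulls v down toward V_I.
    Returns (new_v, spiked)."""
    I_E = g_E * (V_E - v)
    I_I = g_I * (V_I - v)
    v = v + dt * (I_E + I_I)
    if v >= V_TH:  # spike: reset to rest (refractory period tau_R = 0)
        return V_REST, True
    return v, False
```

With a steady excitatory conductance and no inhibition, the neuron integrates up and fires; adding inhibition pushes $v$ toward $V_I$ and delays or prevents the spike.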
FIG. 1.

A network model producing different oscillatory dynamics. (a) The I&F network structure. (b) Raster plots and spike count plots (in 5-ms windows) for E neurons exhibiting three canonical MFE rhythms: 1-, 2-, and 3-beat. E/I spikes are indicated by red/blue dots. (c) Averaged frequency spectra of the three MFE patterns in (b). The error bars indicate the standard error (see Sec. IV for more details).


Previous studies have revealed that the architecture of a neuronal network can significantly affect its dynamics.72–76 In this paper, we investigate the multi-band oscillations emerging from structurally homogeneous networks, leaving a detailed analysis of the impact of network architecture to future work. Specifically, we focus on two types of architecture: 1. the Erdős–Rényi (ER) model and 2. “annealed” network architectures.

When using the ER model, all possible synaptic couplings are independently selected with probability $P$, excluding self-couplings. Once generated, the network architecture is fixed during each simulation, and spikes of neuron $i$ are sent only to its postsynaptic cells. As for the annealed architecture, when neuron $i$ fires, every other cell independently receives the spike with the same probability $P$. In this case, the recipients of neuron $i$'s spikes vary from spike to spike; therefore, neurons of the same type with the same membrane potential are interchangeable.
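The two wiring rules can be contrasted in a short sketch (hypothetical variable names; the coupling probability $P = 0.8$ and network size follow the text):

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 400, 0.8  # network size and coupling probability from the text

# Erdos-Renyi ("quenched") graph: each directed coupling is drawn once,
# self-couplings excluded; the adjacency matrix is then fixed for the
# whole simulation.
adj = rng.random((N, N)) < P
np.fill_diagonal(adj, False)

def postsynaptic_quenched(i):
    """Recipients of neuron i's spike under the fixed ER graph:
    always the same set."""
    return np.flatnonzero(adj[i])

def postsynaptic_annealed(i, rng):
    """Annealed variant: recipients are redrawn independently with
    probability P at every spike, so they differ from spike to spike."""
    hit = rng.random(N) < P
    hit[i] = False  # still no self-coupling
    return np.flatnonzero(hit)
```

Under the annealed rule, two same-type neurons holding the same membrane potential see statistically identical futures, which is the interchangeability used by the coarse-graining later in the paper.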

In the rest of the paper, we mainly demonstrate the results using an ER network with $P = 0.8$. The network architecture is fixed for all network simulations, except when we test different synaptic coupling probabilities [Figs. 4(d) and 5(d); see more in Sec. II B 2]. All simulations and analyses are also carried out on the annealed architecture, with results qualitatively similar to those of the ER network. Later, in Sec. II D, we will argue that different network architectures generated from the ER model exhibit similar dynamical statistics, especially for larger $P$, and we will compare the statistics of the network dynamics of the ER networks to those of annealed architectures.

2. Different MFE beats

The network described above has been shown to produce stochastic gamma-band oscillations in many previous studies.65,70,71 Namely, spikes of E and I neurons cluster in time as MFEs, which recur at a frequency of roughly 40 Hz [see, e.g., Fig. 1(b), upper panel]. Despite the stochastic external drive, previous work demonstrated that the robust recurrence of MFEs comes from the dynamical competition between the E and I populations.50,58,65,70 Specifically, an MFE is triggered by the stochastic firing of a few E cells, leading to the rapid recruitment of both E and I neuron firings. However, the inhibitory synapses give the I-current a longer timescale than the E-current (see Sec. IV). Therefore, in a typical regime of gamma oscillations, both E and I populations initially undergo highly correlated firing, followed by a dominance of inhibition that terminates the MFE. After the inhibition wears off, the network again enters an excitable state, allowing the next MFE to occur. We refer the reader to Ref. 56 for more details on this recurrent excitation–inhibition mechanism of stochastic gamma oscillations.

However, the recurrence of MFEs need not return to similar spiking patterns. During our parameter space exploration, the amplitudes of MFEs (i.e., the numbers of neurons involved) can alternate in different fashions, forming different beats and rhythms. We describe the three most prevalent MFE rhythms below.

When consecutive MFE amplitudes do not vary significantly, the result is regular gamma oscillations [Fig. 1(b), upper]. This classical 1-beat MFE rhythm has been well investigated in previous studies.56,65,70,71 MFEs can also alternate between large and small amplitudes. For example, in a 2-beat rhythm, a strong MFE is usually followed by a weak one and vice versa [Fig. 1(b), middle]. In addition, a weak MFE may occur after every two consecutive strong MFEs [Fig. 1(b), bottom], forming a 3-beat rhythm. We remark that the three distinct MFE rhythms can take place when varying only one physiological parameter [in Fig. 1(b), the cortical I-to-E coupling weight $S^{EI}$: 1-beat, $S^{EI} = 2.45 \times 10^{-2}$; 3-beat, $S^{EI} = 2.55 \times 10^{-2}$; 2-beat, $S^{EI} = 2.61 \times 10^{-2}$].

These rhythms produce different peaks in the temporal frequency spectrum [Fig. 1(c)]. Compared to the single peak at ∼45 Hz for the 1-beat rhythm, the alternation of MFE amplitudes in the 2-beat rhythm introduces an additional peak at ∼25 Hz (high beta band), and the 3-beat rhythm adds one more at ∼15 Hz (low beta band). At the same time, the gamma-band peak shifts toward higher frequencies because the network recovers more rapidly from the weaker inhibition induced by weaker MFEs.
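The appearance of extra spectral peaks under amplitude alternation can be reproduced with a toy calculation: an MFE train with uniform amplitudes carries power only at the MFE rate and its harmonics, while alternating strong/weak amplitudes add a subharmonic near half that rate. The sketch below uses hypothetical amplitudes and rates, not the paper's data; note that the toy subharmonic sits at exactly half the MFE rate, whereas the simulated 2-beat peak lies near 25 Hz because the gamma peak itself also shifts.

```python
import numpy as np

def beat_spectrum(amplitudes, f_mfe=45.0, fs=1000.0, dur=2.0):
    """Power spectrum of a synthetic MFE spike-count signal: pulses at
    roughly f_mfe Hz whose heights cycle through `amplitudes`
    (e.g., [1] = 1-beat, [1, 0.4] = 2-beat)."""
    t = np.arange(0.0, dur, 1.0 / fs)
    x = np.zeros_like(t)
    period = int(fs / f_mfe)  # samples between consecutive MFE pulses
    for k, idx in enumerate(range(0, len(x), period)):
        x[idx] = amplitudes[k % len(amplitudes)]
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    power = np.abs(np.fft.rfft(x - x.mean())) ** 2
    return freqs, power

def peak_near(freqs, power, f, half_width=5.0):
    """Largest power within half_width Hz of frequency f."""
    band = (freqs > f - half_width) & (freqs < f + half_width)
    return power[band].max()
```

For the uniform train, the power near half the MFE rate is negligible; for the alternating train, a clear subharmonic peak appears there, mirroring the extra beta-band peak of the 2-beat rhythm.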

We comment that the above rhythms are only a small subset of all possible network dynamics. On the other hand, the patterns are the most robust and persistent ones during our parameter exploration (via one-dimensional sweeps of a few critical parameters; see Sec. II B 2). Thus, in the rest of this paper, we focus on these rhythms, leaving more thorough investigations of all possible dynamics for future work.

The regularity of MFE recurrence in different beats suggests the importance of the dynamical paths between MFEs. We propose to study this via iteration, since the next MFE is induced by the E–I competition during the current MFE. Specifically, if one could predict the next MFE from a mapping $F$, then $F^n$ would predict all subsequent MFEs; hence, the long-term network dynamics would be revealed.

To address the underlying mechanism, this section provides a data-driven Poincaré section theory to understand the iteration between MFEs. We will perform a top-down analysis: Starting from the high-dimensional network dynamics, we deduce and simplify the MFE mapping F by combining first-principles arguments and surrogate data collected from simulations.

1. Reduction of a high-dimensional MFE mapping

We start by dissecting the network dynamics into a sequence of time intervals $\{I_n = [t_n, t_{n+1})\}$, where $t_n$ is the starting time of the $n$th MFE (see the definition in Sec. IV). We first assume that there exists an MFE mapping $F$ predicting the dynamical path on $I_{n+1}$ given the network dynamics on the interval $I_n$. Despite its complexity, the dynamical path on $I_n$ is determined by two components:
  1. The network state on the time section $t_n$ (denoted by $S_n$), i.e., the initial condition of the dynamical path.

  2. The realization of all noise $\xi$ within the interval $I_n$, including noise in the external drives and internal randomness in the synaptic couplings.

Therefore, $F$ is equivalent to a stochastic mapping $F_1$ sending $S_n$ to $S_{n+1}$. $F_1$ can be viewed as a “pseudo”-Poincaré section mapping (pseudo in the sense that identifying $\{t_n\}$ relies on a short history),
$$S_{n+1} = F_1(S_n, \xi), \tag{2}$$
where $S_n$ includes the membrane potentials and E/I synaptic conductances $(v, g^E, g^I)$ of all neurons at time $t_n$.

Even for our small 400-neuron network, $S_n$ contains $3 \times 400 = 1200$ variables. Therefore, it would be hard to study $F_1$ without a drastic dimensional reduction. To make matters worse, MFEs are produced by rapid, transient competition between the E and I populations,56,65,70,71 and each MFE can be very sensitive to the randomness in a handful of spike times. To the best of our knowledge, an analytical description of the MFE mapping $F_1$ is infeasible.

Instead, we propose to investigate $F_1$ in a data-informed manner: we first perform a three-level reduction of $F_1$ and then compute the reduced mapping $f$ from surrogate data.

a. Step 1: Coarse-graining
We coarse-grain the spiking network by assuming that neurons $i$ and $j$ are interchangeable when $v_i(t) = v_j(t)$. Instead of tracing $(v, g^E, g^I)$ for every neuron, the coarse-grained network state is determined by the population distributions of the membrane potentials [$\rho_E(v)$, $\rho_I(v)$] and the total conductances $g^{EE}$, $g^{EI}$, $g^{IE}$, and $g^{II}$. Here, $g^{QQ'}$ denotes the sum of the $Q'$-conductances of all type-$Q$ neurons ($Q, Q' \in \{E, I\}$). Therefore, the mapping $F_1$ is replaced by
$$S_{n+1} = F_2(S_n, \xi), \tag{3}$$
where $S_n = \left(\rho_E(v), \rho_I(v), g^{EE}, g^{EI}, g^{IE}, g^{II}\right)\big|_{t=t_n}$. Figure 2(b) illustrates the schematics of the iteration in this coarse-grained state space. Figure 2(c) gives an example of $\rho_E(v)$ and $\rho_I(v)$ at the initiation of an MFE.
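A coarse-grained snapshot can be assembled directly from per-neuron arrays. Below is a minimal sketch of the Step-1 reduction from $(v_i, g_i^E, g_i^I)$ to $(\rho_E, \rho_I, g^{EE}, g^{EI}, g^{IE}, g^{II})$; the array layout and the histogram bin count are our own hypothetical choices.

```python
import numpy as np

def coarse_grain(v, gE, gI, is_exc, bins=40):
    """Step-1 reduction: replace per-neuron states (v_i, g^E_i, g^I_i) by
    the population potential histograms and the four summed conductances
    g^{QQ'}. `is_exc` is a boolean mask marking E neurons."""
    # Potentials live between the inhibitory reversal (-2/3) and threshold (1).
    rho_E, edges = np.histogram(v[is_exc], bins=bins, range=(-2/3, 1),
                                density=True)
    rho_I, _ = np.histogram(v[~is_exc], bins=edges, density=True)
    g = {
        "gEE": gE[is_exc].sum(),   # E-conductance summed over E neurons
        "gEI": gI[is_exc].sum(),   # I-conductance summed over E neurons
        "gIE": gE[~is_exc].sum(),  # E-conductance summed over I neurons
        "gII": gI[~is_exc].sum(),  # I-conductance summed over I neurons
    }
    return rho_E, rho_I, g
```

The histograms are normalized densities, so each integrates to one over the potential range, matching the probabilistic reading of $\rho_E(v)$ and $\rho_I(v)$.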
FIG. 2.

A theoretical framework to analyze the rhythms of MFEs. (a) The raster plots are divided into MFEs (the accumulated dots) and inter-MFE intervals (relatively quiescent periods) by an MFE-detection algorithm. The initiation/termination of an MFE is labeled by a solid/dashed green line. Two different rhythms are depicted in this panel to demonstrate the effectiveness of the algorithm. (b) The MFE-to-MFE iteration is equivalent to a stochastic, high-dimensional discrete dynamical system. (c) An example of the E/I population potential distributions before the initiation of an MFE. At least one neuron is fixed at the firing threshold ($V_{th} = 1$) so that the MFE can be triggered right away. The difference between the averages ($m$) is indicated.

b. Step 2: Simplifying $S_n$

The coarse-grained network state $S_n$ includes two distributions (of membrane potentials) and four conductance variables. To further reduce the complexity of $F_2$, we impose three simplifying assumptions on $S_n$ (detailed in this step and Step 3 below), starting with

  1. $g^{QQ'} \approx 0$ at $t_n$, the beginning of an MFE ($Q, Q' \in \{E, I\}$).

Assumption 1 comes from the following observation of network simulations: the number of spikes produced during inter-MFE intervals is much smaller than the number produced during MFEs [Fig. 2(a)]. Moreover, the lengths of the inter-MFE intervals are generally three to six times the longest synaptic timescale (see details in Sec. IV). Therefore, it is reasonable to assume $g^{QQ'} \approx 0$ at the initiation of the next MFE, after their exponential decay.
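As a quick sanity check on Assumption 1: under pure exponential decay, a gap of three time constants leaves about 5% of the peak conductance, and six leave about 0.25%. The numbers below are this arithmetic only, not simulation output.

```python
import math

def residual_fraction(gap_over_tau):
    """Fraction of a synaptic conductance surviving an inter-MFE gap that
    lasts `gap_over_tau` synaptic time constants (pure exponential decay)."""
    return math.exp(-gap_over_tau)

# Gaps of 3-6 tau leave ~5% down to ~0.25% of the peak conductance,
# supporting g^{QQ'} ~ 0 at the start of the next MFE.
```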

After that, we turn to the membrane potential distributions $\rho_E(v)$ and $\rho_I(v)$. Previous studies demonstrated that the dynamics of membrane potential distributions are well approximated by a reduced system involving the first few moments.77,78 Thus, we can replace Eq. (3) by
$$S_{n+1} = F_3(S_n, \xi). \tag{4}$$
Here, $S_n = (M_1^E, \ldots, M_k^E;\ M_1^I, \ldots, M_k^I)\big|_{t = t_n}$; i.e., it includes the first $k$ moments of $\rho_E(v, t_n)$ and $\rho_I(v, t_n)$. Generally, the more moments are considered, the more features of the distributions are preserved, and thus the closer $F_3$ is to $F_2$.

It is hard to deduce the explicit form of $F_3$, but it can be computed from short-term network simulations. The initial membrane potentials of all neurons can be sampled from $\rho_E(v, t_n)$ and $\rho_I(v, t_n)$, which are inferred from the moments in $S_n$. After the simulation, $S_{n+1}$ can be collected from the empirical distributions of the membrane potentials at the initiation time of the second MFE. See more details in Step 4 below.

c. Step 3: First-order moments

As a specific realization of $F_3$, we focus on the first-order moments, i.e., $M_1^E$ and $M_1^I$. This is, of course, a drastic simplification, leading to substantial information loss about $\rho_E(v)$ and $\rho_I(v)$. Nevertheless, our hope is that the first-order moments suffice to qualitatively capture the E–I competition during and after MFEs, thus explaining the different MFE beats and dynamical bifurcations observed in network simulations.

There are different ways to infer a probability distribution from a finite number of moments (e.g., maximum entropy closures79). In this paper, we use a simple truncated-Gaussian scheme by assuming

  2. The inferred membrane potential distributions $\rho_E(v, t_n)$ and $\rho_I(v, t_n)$ are truncated Gaussian distributions, whose variances are dominated by the noise of the external input.

The truncation takes place at $3\sigma$ from the mean, covering 99.7% of the probability mass. Therefore, we expect fewer than one neuron per population to be neglected by this truncation.

Here, we explain the motivation for the truncated-Gaussian scheme. First, absent the resetting process, $\rho_{E,I}(v, t_n)$ should be nearly Gaussian when driven only by the external Poisson inputs. This is especially the case after a strong beat at $t_{n-1}$: most neurons reset to $v = 0$ and are strongly suppressed by the inhibitory spikes in the MFE, after which they are driven primarily by the external inputs [see, e.g., Fig. 2(c)]. On the other hand, after a weak beat at $t_{n-1}$, $\rho_{E,I}(v, t_n)$ can split into multiple peaks since only a small fraction of the neurons fire in the MFE. In this case, the truncated-Gaussian scheme does not approximate $\rho_E(v, t_n)$ and $\rho_I(v, t_n)$ well. Nevertheless, as we observe in network simulations, a weak beat right before $t_n$ usually results in a large, positive difference $M_1^E(t_n) - M_1^I(t_n)$: only a small number of E neurons (and more I neurons) are involved in the weak MFE, so more E neurons remain excitable with high membrane potentials. Therefore, the MFE taking place at $t_n$ is very likely a strong beat. Our preliminary tests show that a strong beat is usually guaranteed by a positive $M_1^E(t_n) - M_1^I(t_n)$, and the precise shapes of $\rho_E(v, t_n)$ and $\rho_I(v, t_n)$ become less important.

We present three videos showing the temporal evolution of the empirical membrane potential distributions collected from network simulations (see the supplementary material). The multi-peak feature also guides us in developing an autonomous ODE model (see Sec. II C).

One last assumption is made so that we can focus on the difference $m_n \stackrel{\text{def}}{=} m(t_n) = M_1^E(t_n) - M_1^I(t_n)$ between the first-order moments rather than on the moments themselves:

  3. At $t_n$, there is at least one neuron whose membrane potential is near the firing threshold $V_{th}$.

Assumption 3 comes from the definition of $t_n$: if no neuron had a potential at $V_{th}$, it would take additional time after $t_n$ to trigger the first spike of the next MFE. Therefore, we can align the upper truncation bound of at least one of $\hat\rho_{E,I}(v)$ with $V_{th}$.
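Assumptions 2 and 3 translate into a concrete sampling rule: draw each population from a Gaussian truncated at $3\sigma$, place the higher-mean population's upper truncation bound at $V_{th}$, and pin the largest sample to $V_{th}$. The sketch below follows these assumptions; the noise scale `sigma` is a hypothetical stand-in for the input-noise-dominated variance.

```python
import numpy as np

V_TH = 1.0  # spiking threshold in normalized units

def sample_population(m, sigma=0.1, nE=300, nI=100, rng=None):
    """Draw membrane potentials under Assumptions 2-3: both populations are
    Gaussians truncated at 3*sigma, the E and I means differ by m, and the
    higher-mean population's upper truncation bound (mu + 3*sigma) sits at
    V_th. The largest sample is then pinned to V_th so that an MFE can
    trigger immediately, as in Step 4. Returns (vE, vI)."""
    if rng is None:
        rng = np.random.default_rng()
    mu_hi = V_TH - 3 * sigma  # mean whose 3-sigma bound touches threshold
    mu_E, mu_I = (mu_hi, mu_hi - m) if m >= 0 else (mu_hi + m, mu_hi)

    def draw(mu, n):
        # Oversample, reject the <0.3% outside +-3 sigma, keep n samples.
        x = rng.normal(mu, sigma, size=4 * n)
        return x[np.abs(x - mu) <= 3 * sigma][:n]

    v = np.concatenate([draw(mu_E, nE), draw(mu_I, nI)])
    v[np.argmax(v)] = V_TH  # pin one neuron at threshold
    return v[:nE], v[nE:]
```

Sampling with a positive $m$ leaves the E population near threshold and the I population below it, the configuration that, per the text, makes a strong beat likely.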

Finally, $F_3$ is reduced to a one-dimensional mapping $f$,
$$m_{n+1} = f(m_n, \xi). \tag{5}$$
d. Step 4: Compute the one-dimensional MFE mapping

We compute $f$ via Monte Carlo simulations. Specifically, for every selected $m_0 \in [-0.15, 0.3]$, we perform 50 Monte Carlo simulations of the short-term network dynamics. Each simulation provides an $\hat m_1$ (defined below); together, the 50 values of $\hat m_1$ visualize the possible outcomes of $f(m_0, \xi)$. The interval $[-0.15, 0.3]$ is chosen to cover all values of $m_1^E(t_n) - m_1^I(t_n)$ arising in the network simulations.

Each simulation evolves the network dynamics from $t_0$, the initiation time of the first MFE, until $t_1'$, the termination time of the second MFE; the random path $\xi$ of each simulation is independent. We first generate the initial condition of the network at $t_0$. From Assumption 1, we fix all conductances at 0 and only assign the membrane potentials. The potentials of the E/I neurons are randomly sampled from the inferred distributions $\rho_E(v, t_0)$ and $\rho_I(v, t_0)$, respectively. After the membrane potentials are assigned, we adjust the largest one to $V_{th}$ by hand to ensure that an MFE is triggered at $t_0$. After the second MFE terminates (at $t_1'$), we collect the empirical distributions of the E/I populations at $t_1$, the initiation of the second MFE [$\hat\rho_E(v, t_1)$ and $\hat\rho_I(v, t_1)$]. From the first-order moments, this Monte Carlo simulation gives
$$\hat m_1 = \hat M_1^E(t_1) - \hat M_1^I(t_1). \tag{6}$$
For each specific $m_0 \in [-0.15, 0.3]$, the 50 values of $\hat m_1$ are indicated by the blue dots in the left plots of Figs. 3(a)–3(c). For more details, see Sec. IV.
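The data-collection loop of Step 4 has a simple structure, sketched below with a cheap stand-in for the network simulation. The function `surrogate_next_m` is entirely hypothetical; in the paper, each evaluation is one short spiking-network run between two MFEs.

```python
import numpy as np

def surrogate_next_m(m0, rng):
    """Stand-in for one short-term network simulation (Step 4): returns a
    noisy m_hat_1 given m_0. A hypothetical decreasing map with additive
    noise, not the network's actual response."""
    return 0.55 - 1.1 * np.tanh(2.0 * m0) + 0.02 * rng.normal()

def estimate_mapping(m_grid, trials=50, seed=0):
    """Monte Carlo estimate of the one-dimensional MFE mapping f(m, xi):
    `trials` independent realizations per grid point, producing the scatter
    shown as blue dots in Fig. 3."""
    rng = np.random.default_rng(seed)
    return {m0: [surrogate_next_m(m0, rng) for _ in range(trials)]
            for m0 in m_grid}
```

The decreasing trend of the stand-in mirrors the mechanism described later: a larger $m_0$ produces a stronger MFE, whose inhibition depresses $\hat m_1$.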
FIG. 3.

Different rhythms are explained by the MFE mapping $\hat f(m_n)$. Panels (a)–(c) correspond to the three rhythms illustrated in Fig. 1(b), generated by the same choices of $S^{EI}$. (a) $S^{EI} = 2.45 \times 10^{-2}$. Left: the MFE mapping $\hat f(m_n)$ computed by Monte Carlo simulations (blue dots); the line $x = y$ (red diagonal) and the only subgroup of the invariant set $\chi_{\hat f}$ (pink circle) are also indicated. Top right: the raster plot. Bottom right: the time trace of $m(t) = m_1^E(t) - m_1^I(t)$; pink circles indicate $m(t_n)$ at the initiations of the first few MFEs. (b) $S^{EI} = 2.61 \times 10^{-2}$. Same as (a), but $\chi_{\hat f}$ includes two subgroups (green and pink circles). (c) $S^{EI} = 2.55 \times 10^{-2}$. $\chi_{\hat f}$ includes three points (green, cyan, and pink circles), indicating a 3-beat rhythm.


Here, we must point out that, without the reductions in Steps 2 and 3, an infeasibly large ensemble of surrogate data would be needed to produce an effective low-dimensional representation of the MFE mapping $F_2$, since we would need to cover the two potential distributions and four total conductances in $S_n$.

e. Step 5: The MFE mapping predicts MFE dynamics

We use the Monte Carlo simulation for the mapping $f$ (denoted $\hat f$) to explain the different MFE beats discussed above. We focus on (1) the invariant set of $\hat f$ (denoted $\chi_{\hat f}$) and (2) the geometrical features of $\hat f$.

The structure of $\chi_{\hat f}$ can be revealed by iterating $\hat f$ repeatedly, as follows. After selecting an $m_0$, we compute $\hat m_1 = \hat f(m_0)$ from one Monte Carlo simulation described in Step 4; $\hat m_1$ is then the input to the next iteration, from which we compute $\hat m_2 = \hat f(\hat m_1)$, and so on. After 300 iterations, we find that the sequence $(\hat m_n, \hat m_{n+1})$ clusters into one, two, or three subgroups (see Fig. S1 in the supplementary material); these features do not depend on the choice of $m_0$. The subgroups are indicated by the pink, green, and cyan circles in the left panels of Figs. 3(a)–3(c). Remarkably, the number of subgroups predicts the beat number of the MFE rhythms [Figs. 3(a)–3(c), upper right raster plots]. Moreover, the subgroups of $\chi_{\hat f}$ agree in detail with the spiking network simulations; i.e., they fit $m(t)$ at the initiation of each MFE [Figs. 3(a)–3(c), lower right panels; circles correspond to the same-color subgroups of $\chi_{\hat f}$].
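The iterate-and-cluster procedure can be sketched as follows, with a noisy toy map standing in for $\hat f$ (here a logistic map with a stable 2-cycle; in the paper, every evaluation of $\hat f$ is a network simulation). The recovered cluster count plays the role of the beat number.

```python
import numpy as np

def f_hat(m, rng):
    """Hypothetical stand-in for the Monte Carlo MFE mapping: a logistic
    map with a stable 2-cycle (r = 3.2) plus small noise."""
    return 3.2 * m * (1.0 - m) + 0.005 * rng.normal()

def invariant_set_clusters(f, m0=0.2, n_iter=300, n_transient=100,
                           tol=0.05, seed=0):
    """Iterate the mapping, discard a transient, and group the remaining
    iterates by single-linkage on the sorted values; the number of groups
    estimates the beat number. Returns the cluster means."""
    rng = np.random.default_rng(seed)
    m, orbit = m0, []
    for k in range(n_iter):
        m = f(m, rng)
        if k >= n_transient:
            orbit.append(m)
    clusters = []
    for x in sorted(orbit):
        if not clusters or x - clusters[-1][-1] > tol:
            clusters.append([x])  # gap larger than tol starts a new cluster
        else:
            clusters[-1].append(x)
    return [float(np.mean(c)) for c in clusters]
```

For this stand-in, the orbit settles onto two noisy clusters around the 2-cycle of the logistic map (near 0.513 and 0.799 for r = 3.2), the analog of a 2-beat rhythm.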

Furthermore, the different MFE beats are explained by the geometrical features of f̂. We note that the lower branch of f̂ (the one containing m_0 > 0) undergoes a downward translation when S^{EI} increases [Fig. 3, left panels; S^{EI} values: (a) < (c) < (b)]. This downward translation explains the different MFE rhythms: from (a) to (c), the only cluster (pink) loses stability and introduces two other clusters (cyan and green) into χ_{f̂} as its predecessor and successor in the MFE iteration, giving rise to a 3-beat rhythm; from (c) to (b), the further lowered branch no longer intersects the diagonal m̂_1 = m_0 (red line). Thus, only two clusters remain, forming the 2-beat oscillation.

The downward translation can be explained as follows: for a fixed m_0, a stronger S^{EI} coupling means that the inhibition produced by an MFE suppresses the E neurons more. Therefore, m̂_1^E(t_1) will be lower when the next MFE is initiated, leading to a smaller m̂_1.

These results confirm that, in the selected parameter regimes, the 1D mapping f successfully captures the E–I competition and the underlying mechanism of MFE beats by only considering the first-order moments of the potential distributions.

2. Multi-band oscillations caused by bifurcations

The relation between f and MFE beats suggests that multi-band oscillations come from critical bifurcations of the MFE mappings. Here, we examine this point by exploring f ^ in a few 1D sections of the parameter space (a thorough investigation of the high-dimensional parameter landscape is unfortunately computationally unfeasible). This subsection aims to demonstrate the prominence of multi-band oscillations in the parameter space of the simple network.

We first select a handful of parameters from the following groups.

  1. Cortical synaptic couplings. We choose S E I, the I-to-E synaptic strength.

  2. Timescales of recurrent neuronal interactions. We choose τ E, the E-current timescale.

  3. External drive. We choose S^{ext}, the strength of each external kick. The mean external input S^{ext} λ^{ext} is held fixed while the strength is varied.

  4. Network architecture. P, the synaptic coupling probability, controls the most important feature of the network architecture generated by an ER model. As in the other simulations (for P = 0.8), we generate and fix one network architecture for each P tested ("quenched disorder" throughout). In addition, P·S^{QQ′} is held fixed for Q, Q′ ∈ {E, I} to control the total postsynaptic current induced by each spike.

  5. Single-cell physiology. We choose τ R, the refractory period of a cell.
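
For item 4, a minimal sketch of generating one quenched ER architecture while holding P·S^{QQ′} fixed might look as follows. The 300 E / 100 I split of the 400-neuron network and the helper name are our assumptions; the baseline strengths are the reference values of Sec. IV, defined there for P = 0.8.

```python
import numpy as np

def make_er_network(n_e=300, n_i=100, p=0.8, seed=0):
    """Generate one fixed ("quenched") ER adjacency matrix and rescale
    the synaptic strengths so that P * S^{QQ'} stays constant, keeping
    the mean total postsynaptic current per spike fixed across P."""
    # Baseline strengths at the reference coupling probability p0 = 0.8.
    s_base = {('E', 'E'): 0.94e-2, ('I', 'E'): 1.25e-2,
              ('E', 'I'): 2.55e-2, ('I', 'I'): 2.45e-2}
    p0 = 0.8
    rng = np.random.default_rng(seed)
    n = n_e + n_i
    adj = rng.random((n, n)) < p          # quenched ER adjacency
    np.fill_diagonal(adj, False)          # no self-coupling
    s = {k: v * p0 / p for k, v in s_base.items()}   # P * S fixed
    return adj, s
```

Each P thus gets one fixed realization of the graph, reused for every simulation at that P.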

The five parameters listed above, {S^{EI}, τ_E, S^{ext}, P, τ_R}, are varied individually to investigate the one-dimensional bifurcations of f̂. For each parameter point involved, we compute f̂ and expect the invariant set χ_{f̂} to reflect the MFE beats. For a clearer presentation, we use a difference visualization of χ_{f̂}, plotting
Δm_n = m_{n+1} − m_n = (f̂ − 1)(m_n)
instead of m_n. Here, 1 stands for the identity function. The reason is that some subgroups of χ_{f̂} may occupy similar locations when displayed on the 1D space of m_n. In the difference visualization, we use χ_{f̂−1} to indicate the subgroup number of χ_{f̂}. Finally, the bifurcations of the MFE beats are depicted by heat maps of the empirical distributions of χ_{f̂−1} in Fig. 4.
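
As a toy illustration of the difference visualization, the sketch below histograms Δm_n along an orbit of a 1D map; stacking one histogram column per parameter value produces a heat map analogous to Fig. 4. The logistic-map family here is a hypothetical stand-in for f̂, not the paper's Monte Carlo mapping.

```python
import numpy as np

def delta_m_histogram(f_hat, m0, bins, n_iter=300, n_burn=50):
    """Empirical distribution of Delta m_n = m_{n+1} - m_n along an
    orbit of a 1D return map, after discarding a transient."""
    m = m0
    deltas = []
    for n in range(n_iter):
        m_next = f_hat(m)
        if n >= n_burn:
            deltas.append(m_next - m)
        m = m_next
    hist, _ = np.histogram(deltas, bins=bins, density=True)
    return hist

# One column per parameter value: a bifurcation heat map in the
# spirit of Fig. 4, for the logistic family r*m*(1-m).
bins = np.linspace(-1, 1, 81)
heat = np.column_stack([
    delta_m_histogram(lambda m, r=r: r * m * (1 - m), 0.2, bins)
    for r in np.linspace(2.8, 3.4, 61)
])   # heat[:, j] is the Delta m distribution at the j-th parameter
```

Before the period-doubling of this toy family, all mass sits in one cluster at Δm ≈ 0; past it, two symmetric branches appear, mimicking a 1-beat to 2-beat transition.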

Figure 4(a) shows the bifurcation diagram for S^{EI}, with all other parameters fixed (the same as in Fig. 3). It confirms our previous finding of the "1–3–2" beat bifurcation as S^{EI} increases: χ_{f̂−1} exhibits only one subgroup for S^{EI} < 2.50 × 10^{-2}, two subgroups for S^{EI} > 2.61 × 10^{-2}, and three subgroups in the transition between them. Notably, in the 3-beat rhythm, the probability mass is usually unevenly distributed among the three subgroups of χ_{f̂−1}, and the distribution is affected by S^{EI}. For example, when S^{EI} ≈ 2.53 × 10^{-2}, though two other subgroups have already emerged [cyan and green circles, Fig. 3(c)], the subgroup around (m_0, m̂_1) = (0, 0) still carries most of the mass (pink circle); i.e., Δm_n = m_{n+1} − m_n ≈ 0 for most pairs of consecutive MFEs. As a result, the network dynamics are dominated by 1-beat rhythms, with 3-beat rhythms occurring occasionally. On the other hand, when S^{EI} ≈ 2.60 × 10^{-2}, as indicated by the branches of Δm_n, the unstable subgroup around (0, 0) almost vanishes, and the network dynamics are dominated by 2-beat rhythm patterns instead. This is echoed by our observations from the spiking network simulations: instead of standing alone, the 3-beat rhythms are always mixed with 1-beat or 2-beat rhythms [see, e.g., Fig. 3(c) raster plot, 2650–2800 ms].

The bifurcation map of S^{EI} suggests that 1- and 2-beat rhythms are more robust than 3-beat rhythms. Therefore, we next focus on how other parameters affect the bifurcations of 3-beat rhythms. Four 1D parameter scans are carried out, for τ_E, S^{ext}, P, and τ_R. We fix S^{EI} = 2.55 × 10^{-2}, where 3-beat rhythm patterns are prominent in Fig. 4(a). The reference parameter point is indicated by red lines in Fig. 4 (details given in Sec. IV). The bifurcation diagrams for the other MFE beat rhythms, i.e., S^{EI} = 2.45 × 10^{-2} and 2.61 × 10^{-2}, are shown in Figs. S2 and S3 of the supplementary material, respectively.

Previous studies highlight the importance of the synaptic timescales, τ_E and τ_I, for the synchronicity of gamma oscillations.58,70 Hence, we speculate that synaptic timescales also impact MFE beats. Our intuition is as follows: reducing τ_E concentrates the excitation into a shorter time window. This would break the regular 2-/3-beat rhythms if strong MFEs replaced the weak MFEs. The same argument applies to increasing τ_I. On the other hand, when τ_E and τ_I are close to each other, the network dynamics should become more homogeneous, since comparable excitatory and inhibitory timescales induce weaker gamma oscillations; MFEs then become less distinguishable by their amplitudes. Our numerical experiments confirm this intuition when we multiply τ_E by an adjustment ratio [τ_E = 1.4 ms for the 3-beat rhythm; see Fig. 4(b)]. When τ_E decreases to 0.8× the original value, the 3-beat rhythm is replaced by a 1-beat rhythm; when τ_E grows to 1.8× the original value, the different branches of χ_{f̂−1} become less distinguishable; i.e., the sequence {m_n} no longer oscillates regularly, and the beat rhythm is disrupted.

We then examine the bifurcations induced by the external noise, whose size is represented by S^{ext}. Remarkably, the network dynamics are dominated by 1-beat rhythms when large external noises are present, whereas 3-beat rhythms are prevalent when the noises are small [Fig. 4(c)]. These bifurcations echo our intuition that the 3-beat rhythm is disrupted by irregularity in the dynamics.

The synaptic coupling probability P exhibits a bifurcation diagram similar to that of S^{ext} [Fig. 4(d), shown on an inverse scale]. This is because P is related to the internal variability of the network dynamics: when P is larger, more neurons receive the same recurrent stimulus when a neuron fires, leading to more synchronized network dynamics; a smaller P leads to larger differences between the potentials within the same E/I population, and the network dynamics become more irregular.

The bifurcations induced by τ_R are much more intriguing [Fig. 4(e)]. While the dynamics are dominated by 3-beat rhythm patterns when τ_R = 0, the cluster with Δm ≈ 0 carries most of the mass of χ_{f̂−1} for 0.2 < τ_R < 0.8 ms. Hence, MFEs recur in 1-beat rhythms, since m_{n+1} ≈ m_n. When τ_R becomes longer, the dynamics are dominated by 2-beat rhythms for 0.8 < τ_R < 2.5 ms. More interestingly, 3-beat rhythms return robustly when the refractory period becomes sufficiently long. A qualitative explanation goes as follows: the bifurcations of τ_R occur at the same magnitudes as the synaptic timescales used in this paper (1–10 ms; see also Ref. 80). Starting from the 3-beat dynamics at τ_R = 0 [red line, Fig. 4(e)], increasing τ_R to 0.2 and 0.8 ms means that a neuron will miss 15%–50% of the recurrent excitation produced by the MFE (since τ_E = 1.4 ms). Such drastic reductions of recurrent excitation can substantially change ρ_{E,I}(v) before the initiation of the next MFE. A similar argument applies to the remaining bifurcation point, τ_R = 2.5 ms, where up to 40% of the recurrent inhibition produced by an MFE can be missed. This intuition suggests that refractory periods play an essential role in the interplay of the E–I competition, as also hinted by other studies.81
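
The quoted percentages are consistent with approximating the missed fraction of an exponentially decaying synaptic conductance by 1 − e^{−τ_R/τ_syn}; this closed form is our reading of the argument, not a formula stated in the paper.

```python
import math

def missed_fraction(tau_r, tau_syn):
    """Fraction of an exponentially decaying synaptic conductance
    G(t) ~ exp(-t/tau_syn) elapsing during a refractory period tau_r
    that immediately follows the MFE."""
    return 1.0 - math.exp(-tau_r / tau_syn)

print(round(missed_fraction(0.2, 1.4), 2))  # -> 0.13 of recurrent excitation
print(round(missed_fraction(0.8, 1.4), 2))  # -> 0.44
print(round(missed_fraction(2.5, 4.5), 2))  # -> 0.43 of recurrent inhibition
```

These values roughly match the 15%–50% range quoted for excitation and the "up to 40%" quoted for inhibition.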

We remark that the bifurcations investigated in this paper may change when the network size varies. The spatial scale of gamma and other types of neural oscillations is an open problem, and the choice of a 400-neuron network follows previous studies of MFEs.50,56,70 Suppose the 400-neuron network is scaled up to larger sizes while roughly fixing the total recurrent neuronal interactions. We hypothesize that the network dynamics will become more regular because of the larger number of interchangeable neurons, which can be unfavorable for transient, weak MFEs. However, if the larger circuit has a specific architecture, such as the widely used ring model,82 one could expect many partially synchronized firing events emerging from localized neuronal interactions. The "beat" dynamics can thus be affected by network size in different ways, but a thorough investigation is beyond the scope of this paper.

3. Geometrical features of the bifurcations

The MFE beat bifurcation lies at the center of our theory for multi-band oscillations and merits closer investigation. Here, we present a novel geometrical representation of the bifurcations on low-dimensional manifolds.

In our previous study,65 we developed a first-principles-based, coarse-grained model consisting of only a few variables. Though drastically reduced from the high-dimensional spiking network model, this coarse-grained model successfully captured gamma oscillations. Notably, in the reduced state space, gamma oscillations were mostly restricted to a 2D manifold. We refer readers to Ref. 65 for more details.

Now, we project the multi-band oscillations discovered in our current study onto this reduced state space. We use the analogy of a "root–leaf" structure to describe the geometrical features of the long-term behavior. The "roots" represent the states of MFE initiations, which congregate and occupy a small area of the state space (cyan dots in Fig. 5). The structure of the root area is shown in detail in Fig. S4 of the supplementary material. On the other hand, the "leaves" represent the trajectories of MFEs and post-MFE dynamics, which appear as loops on a 2D manifold. All "leaves" are glued to the "root" area. Like the gamma oscillations in our previous study, the multi-band oscillations also exhibit low dimensionality despite the strong nonlinearity and stochasticity of the network dynamics.

Here, we demonstrate that the MFE beat bifurcations are captured by the “root–leaf” structure.

Most importantly, the MFE beats are in one-to-one correspondence with the different "leaf clusters." To visualize the leaf clusters, we dissect the trajectories of the network dynamics on I_n (as described in Sec. II B 1) and then color each trajectory section according to m_n (weak MFEs are colored dark blue). Trajectories with similar colors form clusters, and the number of leaf clusters equals the number of MFE beats (Fig. 5).

Also, the emergence of multiple leaf clusters is dictated by the MFE beat bifurcations. From Fig. 4, we select four points in each 1D parameter space and examine the network dynamics in the reduced state space. For example, recall the "1–3–2" bifurcation pattern for S^{EI}. For the 1-beat rhythm, the trajectories at S^{EI} = 2.44 × 10^{-2} form only one light-green leaf [Fig. 5(a), 1st plot]. Three-beat rhythms are represented by three separate leaves [dark blue, orange, and light green curves in the 2nd plot of Fig. 5(a)]. For S^{EI} = 2.52 × 10^{-2} and 2.68 × 10^{-2}, the 2-beat rhythm is echoed by the merging of the two larger leaves. Similar observations for varying τ_E, S^{ext}, and P are shown in Figs. 5(b)–5(d). Surprisingly, for different refractory periods τ_R, the "root–leaf" structure undergoes substantial changes [Fig. 5(e)], again suggesting the importance of refractory periods in multi-band oscillation dynamics.

The coordinates of the reduced model include the total E/I conductances (g^{E,I}) and the numbers of subthreshold neurons near the firing threshold V_{th} (E and I "gate" neurons). More details of the reduced model are given in Sec. IV.

We have been studying the MFE beats with a combination of first-principles and data-driven model reductions. The soundness of a data-driven model reduction is usually controlled by two factors: (1) the validity of the model reduction assumptions and (2) the amount and quality of the surrogate data. These two factors can put us in a dilemma: the more a model is simplified, the less accurately it can represent the full model; however, the closer it is to the full model, the larger the amount of high-quality surrogate data required to compute the reduced model, which can be computationally expensive. This dilemma is illustrated by the series of reduced models in Eqs. (2)–(5), i.e., F_2, F_3, F_4, and f; f is the compromise we focus on.

To overcome the weaknesses of the data-driven model reductions, we develop an autonomous, reduced ODE model for the MFE dynamics. The ODE model is governed by mean-field representations of the E–I competition combined with population fluctuations. In contrast to the previous data-driven model, this ODE model is based entirely on first principles and is independent of numerical simulations of the network dynamics.

The key idea of the ODE model is to approximate ρ_{E,I}(v, t) with multiple (instead of one) truncated Gaussian peaks, echoing the observation that the E/I populations are only partially involved in the weak MFEs. During the evolution of the model, the truncated Gaussian peaks undergo splitting and merging, which are caused by threshold crossings during MFEs. Specifically, the potential distribution of a neural population splits if only a fraction of its neurons fire during an MFE: these neurons reset and are modeled as an additional truncated Gaussian around V_r in the voltage distribution, while the other neurons remain subthreshold. On the other hand, we assume that two peaks merge into one when they become adjacent. Each peak is described by a truncated Gaussian for the same reason as before (both E and I neurons are primarily driven by external Poissonian inputs in the absence of recurrent stimuli). In all, the autonomous ODE model describes the E–I competition through the splitting and merging of Gaussian peaks. We refer readers to Sec. IV for more details.
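
As an illustration of the merging rule, a moment-matched merge of two weighted Gaussian peaks might look as follows. This is a simplified sketch: truncation at V_{th} is ignored, and the adjacency threshold `dist` is a hypothetical parameter, not the criterion used in the paper.

```python
import math

def merge_peaks(w1, mu1, s1, w2, mu2, s2):
    """Merge two weighted Gaussian peaks (w, mu, sigma) into one by
    matching the first two moments of the mixture."""
    w = w1 + w2
    mu = (w1 * mu1 + w2 * mu2) / w
    var = (w1 * (s1**2 + mu1**2) + w2 * (s2**2 + mu2**2)) / w - mu**2
    return w, mu, math.sqrt(var)

def maybe_merge(p1, p2, dist=0.1):
    """Merge the peaks only when they are adjacent, i.e., when their
    means are closer than `dist` (an assumed criterion)."""
    return [merge_peaks(*p1, *p2)] if abs(p1[1] - p2[1]) < dist else [p1, p2]
```

The splitting step works in reverse: the firing fraction is moved into a new peak around V_r, while the remaining mass stays subthreshold.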

Crude as it is, the autonomous ODE model provides a remarkably good approximation to the MFE beats, especially when the variance of each Gaussian peak is small. We find that the ODE model reproduces the crucial dynamical features of the MFE beats produced by the network (Fig. 6), including firing rates, firing synchronicity, and oscillation frequency. Most importantly, the ODE model also exhibits qualitatively similar dynamics of the potential distribution averages (represented by their difference, m). We demonstrate the performance of the ODE model for the three parameter sets displayed in Fig. 3. Furthermore, compared to Fig. 4(a), the ODE model generates a similar bifurcation map of MFE beats when we vary the cortical synaptic coupling weights [Fig. 7; since we use P = 1 for simplicity, the tested parameter S^{EI} is normalized to keep P·S^{EI} fixed].

FIG. 4.

MFE beat bifurcation maps indicated by the distribution of Δm. (a) Bifurcation map for S^{EI}. The S^{EI} values of the three patterns in Fig. 3 are indicated by arrows. (b)–(e) Bifurcation maps for τ_E, S^{ext}, P, and τ_R, where the reference parameter point is indicated by the red lines.

FIG. 5.

Multi-band oscillations produced by network simulations, presented in the reduced 3D space defined by the coarse-grained model in Ref. 65. Trajectories of different beats are colored according to m at the initiation of each MFE. Weak beats are specially labeled in dark blue. The states of MFE initiations are indicated by cyan dots. (a) Four points in the 1D parameter space of S^{EI} are selected. The corresponding network dynamics form a "root–leaf" structure, and the number of leaf clusters indicates the MFE beat number demonstrated by the bifurcation map in Fig. 4(a). (b)–(e) Similar to (a), but the varied parameters correspond to Figs. 4(b)–4(e).

FIG. 6.

Comparison between the dynamics generated by the spiking neuronal network and the reduced ODE model. Rows 1–3 display the MFE beats produced by both models, with the same parameter sets as in Fig. 3. Column 1: raster plots of the spiking neuronal network. Columns 2 and 3: firing rates of both models. Columns 4 and 5: differences between the first moments of the voltage distributions for both models.

FIG. 7.

The bifurcation map of MFE beats produced by the ODE model. Only the cortical synaptic coupling strength S^{EI} is varied; this map is comparable to Fig. 4(a).


Notably, the ODE model exhibits similar "root–leaf" structures during the MFE bifurcations when illustrated in the same reduced space as Fig. 5 (see Fig. 8). Since the ODE model evolves the distributions of the E/I potentials, one can deduce the numbers of "gate" neurons from the numbers and locations of the Gaussian peaks. Remarkably, not only does the number of "leaves" match the number of MFE beats, but the ODE model trajectories also share similar geometrical features with the representation of the network simulation (Fig. 5). This suggests that the complex network dynamics can be successfully captured by considering only the most important features of ρ_{E,I}, even though the splitting and merging of Gaussian peaks are rather rough descriptions.

FIG. 8.

Similar to Fig. 5(a), multi-band oscillations produced by the autonomous ODE model are presented in the same reduced 3D space. The S^{EI} values for the four panels are normalized from S^{EI} = 2.46, 2.50, 2.54, and 2.57 × 10^{-2}.


Taken together, the autonomous ODE model supports our main conclusions that

  1. Multi-band oscillations, i.e., the various MFE beats, emerge from the E–I competition playing out in different fashions.

  2. The E–I competition in the network dynamics can be captured by the E/I membrane potentials and is sufficiently represented by their most important features.

We finally address the issue of how different network architectures impact the network dynamics. We argue that, for larger P, (1) the network dynamics generated by different ER network architectures usually share similar statistics and (2) the statistics of the ER network dynamics are close to those of the annealed architecture. Both claims, however, weaken for smaller P.

1. Different ER networks

We generate five different ER networks for each of P = 0.3, 0.5, and 0.8. We simulate each spiking network for 10 s and compare the following statistics of the dynamics: firing rates, the amplitudes and durations of MFEs, and the lengths of the inter-MFE intervals [Figs. S5–S7(a) in the supplementary material]. Other parameters are fixed at the same values as in Fig. 4(d). For P = 0.8, the firing rates and the distributions of the other three statistics are alike among the five architectures; similar conclusions hold for P = 0.5. For P = 0.3, the firing rates and distributions become less similar.

2. ER networks and the annealed architecture

A similar trend is observed when comparing the ER network dynamics to the annealed architecture [Figs. S5–S7(b) in the supplementary material]. The distributions of the three statistics are alike for P = 0.8. For P = 0.5, though the shapes of the distributions are comparable, the locations of the peaks differ: for the annealed architecture, the MFE amplitudes of the strong beats are lower than for the ER networks, and most MFEs are shorter in time. Larger differences are observed for P = 0.3. We speculate that this is because the ER architecture usually gives rise to E neurons whose numbers of presynaptic E/I cells are far from the average; these E neurons are more active than the others and can help MFEs last longer and involve more neurons. By contrast, the annealed architecture does not share this advantage, since its neurons are interchangeable.

While a fixed network architecture better reflects biological realism, the annealed architecture is closer to a mean-field limit and also excludes possible biases introduced by specific synaptic connections. We have repeated our analysis using the annealed architecture, including the computations of the reduced MFE mappings f̂, the bifurcation diagrams, and the "root–leaf" structure in the reduced state space. Qualitatively similar results are found and are not presented in this paper.

Thus, we believe that the qualitative features we discussed, such as multi-band oscillations, bifurcations, and their mechanisms, should apply to a larger class of network architectures: the dynamics of the annealed architecture represent a mean-field limit of the ER networks, and the analysis of the former is qualitatively similar to that of the latter.

The emerging picture of brain cognition is one in which coherent neural activities form the physiological basis of many cognitive functions. Prominent among the patterned population activities are the various oscillatory rhythms that have been implicated in sensory processing, perception, attention, learning, and memory. While experimental investigations of coherent neural activities have informed the theoretical underpinnings of information coding in the brain,17,39,44,83–85 our understanding of how neural circuits generate patterned activity remains incomplete. Previous theoretical studies have largely focused on single-band oscillatory activities. In addition to the theories for gamma oscillations referred to earlier in this paper, there are many studies on the mechanisms behind theta,18,45,57,64,86,87 alpha,45,62 and beta48,88,89 oscillations, among others. By combining these ideas with oscillatory inputs, studies have shown that, in general, multi-band oscillations can emerge near Hopf bifurcations of the driven population.63,64

Recent work has shown that a wide variety of coherent activity can be generated by homogeneously structured, fluctuation-driven neuronal networks, even under temporally constant drive. In particular, experimental and simulational studies have demonstrated the ubiquity of self-organized, nearly synchronous population spiking activity of variable size and duration. Much work has examined the sparsely synchronized regime, where most neurons show irregular firing at a frequency lower than the population rhythm (see Refs. 30 and 90–92, for instance).

Here, we take a different tack and examine a stronger coupling regime, where a single excitatory spike can induce stochastic synchronized firing of a small cluster of neurons. Theoretically, the main obstacle to a full analysis of this regime is a clear delineation of the strong stochastic competition between the excitatory and inhibitory subpopulations. Already, the mathematical description of the so-called multiple-firing events required the development of a specific ensemble average, a "partitioned ensemble average," which treats MFEs differently from nearly homogeneous spiking patterns69,93 (the latter can be faithfully captured by classical coarse-graining techniques, e.g., Fokker–Planck approaches78). In recent work, we also uncovered a nearly two-dimensional manifold underlying stochastic gamma oscillations by projecting onto a suitable, low-dimensional state space.65

In particular, we focused on a dynamical regime between single-band, stochastic oscillatory rhythms and general heterogeneous patterned activities. During our exploration of the dynamical possibilities of homogeneous networks that generate MFEs, we found that many parameters led to the emergence of nearly periodic patterns consisting of more than one single frequency. Because of the prevalent periodicity, we found it natural to combine classical Poincaré sectional theories with modern data-driven techniques. After a few steps of model reductions, the complex MFE dynamics were captured by the neuronal potentials on the time sections of MFE initiations, which were further computed from a series of Monte Carlo simulations of transient network dynamics. The Poincaré sectional theory revealed the crucial role of the delicate E–I competition in MFE bifurcations, leading to multiple frequency bands in network oscillations.

While the data-driven Poincaré return map was instrumental in identifying the bifurcations behind the emergence of multi-band patterns, a more detailed understanding of the roles played by the various dynamical variables emerged from our dimensionally reduced modeling. Viewed in our framework, a simple geometry of "leaves and roots" describes the multi-band oscillatory regime well. When displayed in a suitably reduced state space, the stochastic neuronal oscillations coherently form a nearly 2D manifold, on which a collection of "leaves" (describing each MFE and the post-MFE dynamics) and "roots" (indicating the complex initial conditions triggering each MFE) delineates the manifold of multi-band oscillations. Furthermore, based on these observations and through well-motivated assumptions, we developed an ODE model that captures the essentials of the population membrane potential dynamics. The multi-band oscillations emerging from the spiking networks and from the ODE model exhibit similar geometrical representations of the MFE cycles, with the number of MFE beats corresponding to the number of "leaf" clusters in the state space.

The modern view of cognition as neuronal population dynamics has also considered these neural network computations as trajectories on manifolds.94–99 Recent dynamical theories of stochastic neuronal network dynamics have progressed beyond f–I curves, static input–output relations, thresholded-linear filters, and weakly nonlinear expansions.58,90,91,100 Among the various neuronal dynamical activities, periodic synchronization has received the most attention, both experimentally and theoretically. We view our work here as complementary to previous studies that focused on identifying mechanisms of synchronization in heterogeneous networks.59,60,101,102 By identifying the population membrane potential distribution as the major player in the generation of multi-frequency oscillations, we have uncovered a novel dynamical route to multi-band oscillations in a simple, homogeneous neuronal network, without appealing to oscillatory inputs or additional cell types and synaptic timescales. The central role of the membrane potential distribution points to the possibility of maintaining and manipulating these multi-frequency, synchronizing populations, which we hope to investigate more fully in ongoing and future work.

We consider an integrate-and-fire (I&F) neuronal network with N_E excitatory neurons and N_I inhibitory neurons. For neuron i, the membrane potential v_i is driven by excitatory and inhibitory (E/I) synaptic currents,
\[
\frac{dv_i}{dt} = \big(g_i^{\mathrm{ext}} + g_i^{E}\big)\big(V^{E} - v_i\big) + g_i^{I}\big(V^{I} - v_i\big),
\]
\[
g_i^{\mathrm{ext}} = S_i^{\mathrm{ext}} \sum_{\mu_i^{\mathrm{ext}}} G^{E}\big(t - t_{\mu_i^{\mathrm{ext}}}\big), \quad
g_i^{E} = \sum_{\substack{j \in E \\ j \neq i}} S_{ij}^{E} \sum_{\mu_j^{E}} G^{E}\big(t - t_{\mu_j^{E}}\big), \quad
g_i^{I} = \sum_{\substack{j \in I \\ j \neq i}} S_{ij}^{I} \sum_{\mu_j^{I}} G^{I}\big(t - t_{\mu_j^{I}}\big).
\]
(7)
In Eq. (7), g_i^{ext}, g_i^{E}, and g_i^{I} denote the external, recurrent excitatory, and recurrent inhibitory conductances of neuron i, which are induced by the external stimulus (spike series μ_i^{ext}) and the recurrent E/I spike series (μ_j^{E,I}), respectively. The strengths of the synaptic couplings are S_i^{ext} and S_{ij}^{E,I}. The Green's functions for the E/I conductances are
\[
G^{E}(t) = \frac{1}{\tau_E}\, e^{-t/\tau_E}\, h(t), \qquad G^{I}(t) = \frac{1}{\tau_I}\, e^{-t/\tau_I}\, h(t),
\]
(8)
where h(t) is the Heaviside function. The time constants τ_{E,I} model the synaptic timescales of the excitatory and inhibitory synapses (such as AMPA and GABA103), respectively.

Neuron i fires when its membrane potential v_i reaches the firing threshold V_{th}, after which it is immediately reset to the resting potential V_r and remains in a refractory state for a fixed time τ_R. It is conventional in many previous studies to nondimensionalize v80,100 by choosing V_{th} = 1 and V_r = 0. Accordingly, V^E = 14/3 and V^I = −2/3 are the excitatory and inhibitory reversal potentials.

While Eq. (7) can model a network with an arbitrary connectivity structure, the ER model is explained in the main text; here, we provide more information about the annealed architecture. In the annealed architecture, whether a neuron of type Q receives a given spike released by a neuron of type Q′ is determined by an independent coin flip with probability P^{QQ′}, where Q, Q′ ∈ {E, I}. For both the ER model and the annealed architecture, S_{ij}^{E,I} depends only on the cell types of i and j; i.e., S_{ij} ≡ S^{QQ′} if i is a type-Q neuron and j is a type-Q′ neuron.

For the “reference point” in Fig. 4 (3-beat), the network parameters are summarized below:

  1. Frequencies of external input: λ_E = λ_I = 21 000 Hz; strength of external stimuli: S_i^{ext} = 3.3 × 10^{-3}.

  2. Strengths of cortical synaptic couplings: S^{EE} = 0.94 × 10^{-2}, S^{II} = 2.45 × 10^{-2}, S^{IE} = 1.25 × 10^{-2}, and S^{EI} = 2.55 × 10^{-2}.

  3. Probability of spike projections in the annealed architecture: P^{EE} = P^{IE} = P^{EI} = P^{II} = 0.8; the coupling probability in the ER model: P = 0.8.

  4. Synaptic timescales: τ_I = 4.5 ms, τ^{EE} = 1.4 ms, and τ^{IE} = 1.2 ms, reflecting the fact that AMPA synapses are faster than GABA synapses.

  5. Refractory period: τ_R = 0.

In addition, for the 1-beat and 2-beat rhythms in Figs. 1 and 3, we use S^{EI} = 2.45 × 10^{-2} and 2.61 × 10^{-2}, respectively.
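
Using these reference parameters, a bare-bones Euler simulation of Eqs. (7) and (8) might look as follows; the exponential kernels are evolved as ODEs with a per-spike jump of S/τ. The 300 E / 100 I split of the 400-neuron network and the conductance-based driving terms (V^{E,I} − v) are our reading of Eq. (7), so treat this as a sketch rather than the authors' implementation.

```python
import numpy as np

def simulate(t_end=200.0, dt=0.01, seed=0):
    """Euler sketch of the I&F network at the Sec. IV reference point."""
    rng = np.random.default_rng(seed)
    n_e, n_i = 300, 100                        # assumed E/I split of 400
    n = n_e + n_i
    v_th, v_r = 1.0, 0.0
    v_rev_e, v_rev_i = 14 / 3, -2 / 3          # reversal potentials
    tau_ee, tau_ie, tau_i = 1.4, 1.2, 4.5      # ms
    is_e = np.arange(n) < n_e
    tau_e_vec = np.where(is_e, tau_ee, tau_ie)
    s_ext, lam = 3.3e-3, 21.0                  # kick size; 21 000 Hz = 21/ms
    s_ee, s_ie, s_ei, s_ii = 0.94e-2, 1.25e-2, 2.55e-2, 2.45e-2
    adj = rng.random((n, n)) < 0.8             # quenched ER graph, adj[post, pre]
    np.fill_diagonal(adj, False)
    v = rng.uniform(v_r, 0.5, n)
    g_ext, g_e, g_i = np.zeros(n), np.zeros(n), np.zeros(n)
    spikes = []
    for step in range(int(t_end / dt)):
        g_ext += s_ext * rng.poisson(lam * dt, n) / tau_e_vec
        v += dt * ((g_ext + g_e) * (v_rev_e - v) + g_i * (v_rev_i - v))
        g_ext -= dt * g_ext / tau_e_vec        # exponential decay, Eq. (8)
        g_e -= dt * g_e / tau_e_vec
        g_i -= dt * g_i / tau_i
        fired = np.nonzero(v >= v_th)[0]
        for j in fired:
            spikes.append((step * dt, j))
            targets = adj[:, j]
            if is_e[j]:                        # E spike: jump S/tau on targets
                g_e += targets * np.where(is_e, s_ee, s_ie) / tau_e_vec
            else:                              # I spike
                g_i += targets * np.where(is_e, s_ei, s_ii) / tau_i
        v[fired] = v_r                         # reset; tau_R = 0 at reference
    return spikes
```

The returned (time, neuron) pairs can be fed directly to the raster-plot, PSD, and MFE-detection procedures described below.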

The power spectral density (PSD) measures the variance of a signal as a function of frequency. In this study, the PSD is computed as follows:

A time interval $[0, T]$ is divided into time bins
\[
B_n = [(n-1)\Delta t,\, n\Delta t], \qquad n = 1, 2, \ldots;
\]
the spike density per neuron $\mu_n$ in $B_n$ is given by $\mu_n = \frac{m_n}{N \Delta t}$, where $m_n$ is the total number of spikes within $B_n$. The discrete Fourier transform of $\{\mu_n\}$ on $[0, T]$ is then
\[
\hat{\mu}(k) = \frac{1}{T} \sum_{n=1}^{T/\Delta t} \mu_n\, \Delta t\, e^{-2\pi i k (n \Delta t)}.
\]
(9)
The PSD at frequency $k$ is then the power concentrated there, $|\hat{\mu}(k)|^2$.
Here, we explain how we estimate the mean of $|\hat{\mu}(k)|^2$ and its standard error. We first partition the simulation into $s$ batches,
\[
[0, T],\ [T, 2T],\ \ldots,\ [(s-1)T, sT],
\]
where $sT$ is the total simulation time. For the $i$th interval, we compute $x_i^k \overset{\mathrm{def}}{=} |\hat{\mu}(k)|^2$. For frequency $k$, our estimates of the mean and standard error are
\[
\bar{x}^k = \frac{1}{s} \sum_{i=1}^{s} x_i^k, \qquad
\sigma^k = \sqrt{\frac{\sum_{i=1}^{s} \left( \bar{x}^k - x_i^k \right)^2}{s(s-1)}}.
\]
(10)
Finally, the confidence interval for frequency $k$ is $\left( \bar{x}^k - \sigma^k,\ \bar{x}^k + \sigma^k \right)$.
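A minimal sketch of this batched PSD estimate, assuming 1 ms bins and a synthetic 40 Hz-modulated Poisson spike train (the rates, neuron count, and batch count below are illustrative, not the paper's):

```python
import numpy as np

def psd_with_error(spike_counts, n_neurons, dt, n_batches):
    """Per-batch |mu_hat(k)|^2, then its mean and standard error across batches.

    spike_counts: total spikes per time bin (length divisible by n_batches).
    """
    mu = spike_counts / (n_neurons * dt)            # spike density per neuron
    batches = mu.reshape(n_batches, -1)
    n_bins = batches.shape[1]
    T = n_bins * dt
    # Discrete Fourier transform of each batch, scaled as in Eq. (9).
    mu_hat = np.fft.rfft(batches, axis=1) * dt / T
    power = np.abs(mu_hat) ** 2                     # x_i^k
    mean = power.mean(axis=0)
    sem = power.std(axis=0, ddof=1) / np.sqrt(n_batches)
    freqs = np.fft.rfftfreq(n_bins, d=dt)
    return freqs, mean, sem

# Synthetic example: a 40 Hz modulated spike density should peak near 40 Hz.
dt = 0.001                                          # 1 ms bins, in seconds
t = np.arange(0, 8.0, dt)
rate = 20 * (1 + 0.8 * np.sin(2 * np.pi * 40 * t))  # Hz per neuron
counts = np.random.default_rng(1).poisson(rate * dt * 300)  # 300 neurons
freqs, mean, sem = psd_with_error(counts, 300, dt, n_batches=8)
peak = freqs[1:][np.argmax(mean[1:])]               # ignore the DC component
```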

MFEs lack a rigorous definition in previous studies; instead, they are typically identified visually as highly correlated spiking patterns in raster plots. In this study, we design a generic detection algorithm based on the key observation that each MFE is triggered by recurrent interactions:

  1. When more than two spikes appear in a 2 ms time interval, the end of the interval is marked as the initiation of an MFE.

  2. After an MFE initiation, when fewer than two spikes are detected in a 2 ms time interval, the beginning of the interval is marked as the termination of the MFE.

The two steps above dissect the network dynamics in time into MFE and inter-MFE intervals. To finalize the detection, we merge two MFE intervals into one if the inter-MFE interval between them is shorter than 1 ms.
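The two detection steps plus the merging rule can be sketched as follows. This is a simplified version using non-overlapping 2 ms windows; the windowing scheme is not fully specified in the text, so that choice is an assumption:

```python
import numpy as np

def detect_mfes(spike_times, window=2.0, min_gap=1.0, threshold=2):
    """Detect MFE intervals from spike times (ms), using non-overlapping windows.

    Initiation: more than `threshold` spikes in a `window`; termination:
    fewer than `threshold` spikes; nearby intervals are then merged.
    """
    spike_times = np.sort(np.asarray(spike_times, dtype=float))
    events, start = [], None
    for t0 in np.arange(0.0, spike_times.max() + window, window):
        n = np.count_nonzero((spike_times >= t0) & (spike_times < t0 + window))
        if start is None and n > threshold:
            start = t0 + window           # end of the interval marks initiation
        elif start is not None and n < threshold:
            events.append((start, t0))    # beginning of the interval marks termination
            start = None
    if start is not None:                 # close an event still open at the end
        events.append((start, spike_times.max()))
    # Combine MFE intervals separated by less than min_gap (1 ms).
    merged = []
    for s, e in events:
        if merged and s - merged[-1][1] < min_gap:
            merged[-1] = (merged[-1][0], e)
        else:
            merged.append((s, e))
    return merged

# Two synthetic bursts separated by silence yield two detected MFEs.
spikes = np.concatenate([np.linspace(10, 13, 10), np.linspace(30, 33, 10)])
mfes = detect_mfes(spikes)
```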

The lengths of the MFE and inter-MFE intervals are known to depend on the external stimuli.65 On the other hand, for all parameter regimes we investigated, most inter-MFE intervals are longer than 10 ms. Although we do not claim full consistency with all previous detection criteria, we find that this simple algorithm captures the great majority of the MFEs in this study and remains reliable throughout our parameter scans.

Here, we briefly introduce the reduced coordinates in Fig. 5. The reduced Markovian model65 used six key statistics as variables: $(N_{GE}, N_{GI}, H_{EE}, H_{EI}, H_{IE}, H_{II})$. It was based on a Markovian integrate-and-fire model in which we discretized $v$ into 167 states. Specifically, with $V_{th} = 100$, $V_r = 0$, and $V_I = -66$, the membrane potential of a neuron took values in
\[
\{-66, -65, \ldots, -1, 0, 1, \ldots, 100\} \cup \{R\}.
\]
In the reduced Markovian model, $N_{GE}$ and $N_{GI}$ represented the numbers of "gate" neurons of the E and I populations: a neuron was deemed to be in the gate state if its membrane potential was close enough to the firing threshold, i.e., neuron $i$ was a gate neuron if $V_i > 60$. Correspondingly, in the I&F model in this paper, the gate neurons are all neurons with $v > 0.6$.

$H_{EE}$, $H_{EI}$, $H_{IE}$, and $H_{II}$ represented sums of "pending spikes," i.e., the remaining fuel available to drive the membrane potentials. In this paper, they correspond to the total conductances $g_{EE}$, $g_{EI}$, $g_{IE}$, and $g_{II}$.

Our ODE model is a coarse-grained reduction of the I&F network model. It is motivated by the fact that neurons $i$ and $j$ are interchangeable in the annealed architecture whenever $v_i = v_j$. In this case, the distributions $\rho_{E,I}(v, t)$ suffice to describe the evolution of the network dynamics, so there is no need to track every $v_i$.

Our primary idea is to approximate $\rho_{E,I}(v, t)$ by sums of truncated Gaussian peaks. The system dynamics are then governed by the splitting and merging of these peaks, resulting in an ODE model involving only the means and variances of all peaks. According to previous results,100 when an I&F network is away from firing events, $\rho_{E,I}(v, t)$ are well approximated by Gaussians whose variances are dictated by the noise introduced by the external and recurrent inputs. The ODE model aims to reproduce the MFE dynamics phenomenologically rather than to mimic the rigorous evolution of a Fokker–Planck-type equation for $\rho_{E,I}(v, t)$.

Precisely, we assume that
\[
\rho_Q(v, t) = \sum_{i=1}^{k_Q(t)} D_{Q,i}(v, t), \qquad Q \in \{E, I\}.
\]
(11)
That is, at time $t$, $\rho_Q(v, t)$ consists of $k_Q(t)$ truncated Gaussian peaks. The $i$th peak, $D_{Q,i}(v, t)$, carries probability mass $p_{Q,i}$ and
\[
D_{Q,i}(v, t) =
\begin{cases}
c_{Q,i}(t)\, e^{-\frac{1}{2} \left( \frac{v - m_{Q,i}(t)}{\sigma_{Q,i}(t)} \right)^2} & \text{for } v \in [l_{Q,i}(t), h_{Q,i}(t)], \\
0 & \text{otherwise.}
\end{cases}
\]
Here, $c_{Q,i}$ is the normalization factor,
\[
\int_{l_{Q,i}(t)}^{h_{Q,i}(t)} D_{Q,i}\, dv = p_{Q,i},
\]
(12)
and $(m, \sigma^2, l, h)_{Q,i}$ represent the mean, variance, and lower and upper truncation bounds, respectively. The truncated Gaussian peaks are sorted by $m_{Q,i}(t)$; i.e., $m_{Q,1}(t) > m_{Q,2}(t) > \cdots > m_{Q,k_Q}(t)$. Truncating at $3\sigma$, we have
\[
\begin{cases}
l_{Q,i}(t) = m_{Q,i}(t) - 3\sigma_{Q,i}(t) & \text{for all } i, \\
h_{Q,i}(t) = m_{Q,i}(t) + 3\sigma_{Q,i}(t) & \text{for } i > 1.
\end{cases}
\]
We will explain momentarily, in the splitting step, why $h_{Q,1}(t)$ (the upper truncation bound of the peak with the highest mean) is handled differently.

1. Splitting

We first assume that $\rho_{E,I}(v, 0)$ are both unimodal truncated Gaussians; i.e., $k_Q(0) = 1$. Driven by the external and recurrent inputs, $\rho_Q$ may cross $V_{th}$ and is then divided into two parts, $D_{Q,1}$ and $D_{Q,2}$. Specifically, the splitting time is taken to be the time at which the sign of $\frac{d m_{Q,1}(t)}{dt}$ changes from positive to negative. Suppose a probability mass $1 - p_{Q,1}$ has crossed $V_{th}$. This mass is released from $V_r$ as a new truncated Gaussian peak $D_{Q,2}$ after the refractory period. For simplicity, we assume that $D_{Q,1}$ remains a truncated Gaussian peak minus the right tail (i.e., minus the probability mass $1 - p_{Q,1}$ that has already crossed $V_{th}$) until it merges with another peak (explained below) or touches $V_{th}$ again in the next MFE; that is, for all $t' \in [t, t^*)$, $h_{Q,1}(t')$ is determined by
\[
h_{Q,1}(t') = m_{Q,1}(t') + \sigma_{Q,1}(t')\, \frac{h_{Q,1}(t) - m_{Q,1}(t)}{\sigma_{Q,1}(t)};
\]
here, $[t, t^*)$ is delimited by events involving the first peak: $t$ is the end of the last splitting, and $t^*$ is the time of the next merging or splitting event, whichever comes first. Due to the firing activity of the system, the splitting of $D_{Q,1}$ may occur several times before any two peaks merge, producing multiple truncated Gaussian peaks.

2. Merging

The $i$th and $(i+1)$th peaks are merged when they come close to each other at a time $t$, i.e., when $m_{Q,i}(t) - \sigma_{Q,i}(t) = m_{Q,i+1}(t) + \sigma_{Q,i+1}(t)$. $D_{Q,i+1}$ is instantly absorbed by $D_{Q,i}$, whose mean and variance are the moment-matched averages of the two peaks before merging,
\[
m_{Q,i}(t^+) = \frac{m_{Q,i}(t)\, p_{Q,i}(t) + m_{Q,i+1}(t)\, p_{Q,i+1}(t)}{p_{Q,i}(t) + p_{Q,i+1}(t)},
\]
\[
\sigma_{Q,i}^2(t^+) = \frac{\left[ \sigma_{Q,i}^2(t) + m_{Q,i}^2(t) \right] p_{Q,i}(t)}{p_{Q,i}(t) + p_{Q,i+1}(t)} + \frac{\left[ \sigma_{Q,i+1}^2(t) + m_{Q,i+1}^2(t) \right] p_{Q,i+1}(t)}{p_{Q,i}(t) + p_{Q,i+1}(t)} - m_{Q,i}^2(t^+).
\]
The probability mass of the new peak, $p_{Q,i}(t^+)$, is the sum over the two merged peaks, and the normalization factor $c_{Q,i}(t^+)$ is adjusted accordingly. The indices of all peaks beyond $i + 1$ decrease by one after merging. We assume that no more than two peaks merge at the same time.
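The moment-matched merge can be verified numerically: the merged mean and variance are exactly the mixture moments of the two peaks. A sketch (ignoring the truncation, which the model applies at 3σ):

```python
import numpy as np

def merge_peaks(m1, var1, p1, m2, var2, p2):
    """Moment-matched merge of two weighted Gaussian peaks (mixture moments)."""
    p = p1 + p2
    m = (m1 * p1 + m2 * p2) / p
    # Second moment of the mixture minus the new mean squared.
    var = ((var1 + m1**2) * p1 + (var2 + m2**2) * p2) / p - m**2
    return m, var, p

# Check against samples drawn from the mixture itself.
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0.2, 0.05, 30000),   # peak 1, mass 0.75
                    rng.normal(0.6, 0.08, 10000)])  # peak 2, mass 0.25
m, var, p = merge_peaks(0.2, 0.05**2, 0.75, 0.6, 0.08**2, 0.25)
```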

3. Evolution of ODEs

$\rho_Q(v, t)$ is now reduced to the means and variances of all truncated Gaussian peaks. In the absence of splitting and merging, each peak is driven independently by the external and recurrent inputs, as described by the following ODEs:
\[
\begin{cases}
\dfrac{d\sigma_{E,i}^2}{dt} = \lambda_E (S^{ext})^2 - \dfrac{2\sigma_{E,i}^2}{V_{th} - V_I}\, N_I\, g_{EI}, \\[2mm]
\dfrac{dm_{E,i}}{dt} = \lambda_E S^{ext} + N_E\, g_{EE} - \dfrac{m_{E,i} - V_I}{V_{th} - V_I}\, N_I\, g_{EI},
\end{cases}
\qquad
\begin{cases}
\dfrac{d\sigma_{I,i}^2}{dt} = \lambda_I (S^{ext})^2 - \dfrac{2\sigma_{I,i}^2}{V_{th} - V_I}\, N_I\, g_{II}, \\[2mm]
\dfrac{dm_{I,i}}{dt} = \lambda_I S^{ext} + N_E\, g_{IE} - \dfrac{m_{I,i} - V_I}{V_{th} - V_I}\, N_I\, g_{II}.
\end{cases}
\]
(13)
In Eq. (13), we assume that $\sigma_{Q,i}^2$ is dominated by the noise from the external input and the recurrent inhibition. The factor $\frac{m_{Q,i} - V_I}{V_{th} - V_I}$ normalizes the contribution of the I drives, owing to the conductance-based setup in Eq. (7). A similar normalization of the E drives is omitted since $\frac{m_{Q,i} - V_E}{V_{th} - V_E}$ stays close to 1.
All parameters of Eq. (13) can be found in Sec. IV A. Equation (13) is thus self-contained except for the conductance terms $g_{QQ'}$, which can be derived from the probability flux crossing $V_{th}$. Following previous studies of kinetic theories of I&F networks,100 we define the firing rates as
\[
f_Q(t) = J_V^Q(V_{th}, t),
\]
where the probability flux $J_V^Q(v, t)$ satisfies the continuity equation $\partial_t \rho_Q + \partial_v J_V^Q = 0$. The conductance terms are then defined as temporal convolutions,
\[
g_{QQ'}(t) = S^{QQ'} \left( f_{Q'} \ast G_{Q'} \right)(t),
\]
with the convolution kernels defined in Eq. (8). In practice, the probability fluxes at $V_{th}$ are computed from the displacement of $D_{Q,1}$, which obviates solving a partial differential equation.
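As an illustration of the conductance convolution, here is a discrete-time sketch with the exponential kernel of Eq. (8); the impulse input and the 50 ms truncation are illustrative choices, not the paper's procedure:

```python
import numpy as np

# Discrete-time version of g_{QQ'}(t) = S^{QQ'} (f_{Q'} * G_{Q'})(t) with the
# exponential kernel of Eq. (8).
dt = 0.1                                   # ms
t = np.arange(0.0, 50.0, dt)
tau_I = 4.5                                # GABA timescale (ms)
G = np.exp(-t / tau_I) / tau_I             # kernel sampled on [0, 50) ms
f = np.zeros_like(t)
f[100] = 1.0 / dt                          # a unit firing-rate impulse at 10 ms
S_EI = 2.55e-2
g_EI = S_EI * np.convolve(f, G)[: t.size] * dt   # conductance trace
```

The trace jumps to $S_{EI}/\tau_I$ at the impulse and decays exponentially; its time integral approximately recovers $S_{EI}$, i.e., one presynaptic spike delivers total conductance $S_{EI}$.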

We use the explicit Euler method to simulate the $v_i$ of the network model and the equations of the ODE model. The time step is 0.1 ms, and the simulation time is 30 s for all models.
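A single-peak forward-Euler integration of the mean/variance ODEs in Eq. (13) can be sketched as follows. The conductances and population sizes here are placeholders (not the paper's reference point), and the conductances are held fixed rather than computed from the flux:

```python
import numpy as np

# One forward-Euler integration of the mean/variance ODEs for a single
# excitatory peak that stays below threshold (no splitting occurs).
dt = 0.1e-3                      # 0.1 ms, in seconds
lam_E = 21000.0                  # external input rate (Hz)
S_ext = 3.3e-3                   # external input strength
N_E, N_I = 300, 100              # hypothetical population sizes
V_th, V_I = 1.0, -2.0 / 3.0

def euler_step(m, var, g_EE, g_EI):
    dvar = lam_E * S_ext**2 - 2.0 * var / (V_th - V_I) * N_I * g_EI
    dm = lam_E * S_ext + N_E * g_EE - (m - V_I) / (V_th - V_I) * N_I * g_EI
    return m + dt * dm, var + dt * dvar

m, var = 0.0, 1e-4
for _ in range(1000):            # 100 ms of sub-threshold drift
    m, var = euler_step(m, var, g_EE=0.0, g_EI=1.0)
```

With these placeholder values, the mean drifts toward a sub-threshold equilibrium where the external drive balances the normalized inhibition, and the variance settles where the diffusive input balances the inhibitory contraction.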

We present Figs. S1–S7 and three videos as supplementary material. Fig. S1 shows the invariant set of $\hat{f}$; Figs. S2 and S3 show bifurcation maps similar to Fig. 4 but with $S_{EI} = 2.45 \times 10^{-2}$ and $S_{EI} = 2.61 \times 10^{-2}$, respectively; Fig. S4 shows the MFE initiation states in the reduced state space; and Figs. S5–S7 compare the ER network dynamics with the annealed architecture. The three videos show the evolution of the voltage distributions for the 1-, 2-, and 3-beat rhythms.

This work was partially supported by the STI2030-Major Projects (No. 2022ZD0204600) (R.Z., Z.W., and L.T.) and the Natural Science Foundation of China through Grant Nos. 31771147 (T.W., R.Z., Z.W., and L.T.) and 91232715 (L.T.). Z.-C.X. was supported by the Courant Institute of Mathematical Sciences through a Courant Instructorship. We thank Lai-Sang Young and David McLaughlin (New York University) and Yao Li (University of Massachusetts Amherst) for their useful comments.

The authors have no conflicts to disclose.

Tianyi Wu and Yuhang Cai contributed equally to this paper.

Tianyi Wu: Data curation (lead); Formal analysis (equal); Investigation (equal); Methodology (lead); Software (lead); Validation (equal); Visualization (lead); Writing – original draft (equal). Yuhang Cai: Data curation (lead); Formal analysis (equal); Investigation (equal); Methodology (lead); Software (lead); Validation (equal); Visualization (lead); Writing – original draft (equal). Ruilin Zhang: Methodology (equal). Zhongyi Wang: Methodology (equal). Louis Tao: Conceptualization (equal); Formal analysis (equal); Funding acquisition (equal); Investigation (equal); Methodology (equal); Supervision (equal); Writing – original draft (equal); Writing – review & editing (equal). Zhuo-Cheng Xiao: Conceptualization (equal); Formal analysis (equal); Investigation (equal); Methodology (equal); Project administration (equal); Supervision (equal); Visualization (equal); Writing – original draft (equal); Writing – review & editing (equal).

The data and scripts to produce all the figures in the main text and the supplementary material can be found at https://github.com/faro1219/Neuron-Oscillation-II.git.

1. C. Gray, P. König, A. Engel, and W. Singer, "Oscillatory responses in cat visual cortex exhibit inter-columnar synchronization which reflects global stimulus properties," Nature 338, 334–337 (1989).
2. R. Azouz and C. Gray, "Dynamic spike threshold reveals a mechanism for synaptic coincidence detection in cortical neurons in vivo," Proc. Natl. Acad. Sci. U.S.A. 97, 8110–8115 (2000).
3. N. K. Logothetis, J. Pauls, M. Augath, T. Trinath, and A. Oeltermann, "Neurophysiological investigation of the basis of the fMRI signal," Nature 412, 150–157 (2001).
4. J. Henrie and R. Shapley, "LFP power spectra in V1 cortex: The graded effect of stimulus contrast," J. Neurophysiol. 94, 479–490 (2005).
5. M. Brosch, E. Budinger, and H. Scheich, "Stimulus-related gamma oscillations in primate auditory cortex," J. Neurophysiol. 87, 2715–2725 (2002).
6. T. Buschman and E. Miller, "Top-down versus bottom-up control of attention in the prefrontal and posterior parietal cortices," Science 315, 1860–1862 (2007).
7. G. Gregoriou, S. Gotts, H. Zhou, and R. Desimone, "High-frequency, long-range coupling between prefrontal and visual cortex during attention," Science 324, 1207–1210 (2009).
8. M. Siegel, M. Warden, and E. Miller, "Phase-dependent neuronal coding of objects in short-term memory," Proc. Natl. Acad. Sci. U.S.A. 106, 21341–21346 (2009).
9. V. Sohal, F. Zhang, O. Yizhar, and K. Deisseroth, "Parvalbumin neurons and gamma rhythms enhance cortical circuit performance," Nature 459, 698–702 (2009).
10. R. Canolty, K. Ganguly, S. Kennerley, C. Cadieu, K. Koepsell, J. Wallis, and J. Carmena, "Oscillatory phase coupling coordinates anatomically dispersed functional cell assemblies," Proc. Natl. Acad. Sci. U.S.A. 107, 17356–17361 (2010).
11. T. Sigurdsson, K. Stark, M. Karayiorgou, J. Gogos, and J. Gordon, "Impaired hippocampal-prefrontal synchrony in a genetic mouse model of schizophrenia," Nature 464, 763–767 (2010).
12. M. van Wingerden, M. Vinck, J. V. Lankelma, and C. M. Pennartz, "Learning-associated gamma-band phase-locking of action–outcome selective neurons in orbitofrontal cortex," J. Neurosci. 30, 10025–10038 (2010).
13. B. Pesaran, J. Pezaris, M. Sahani, P. Mitra, and R. Andersen, "Temporal structure in neuronal activity during working memory in macaque parietal cortex," Nat. Neurosci. 5, 805–811 (2002).
14. W. Medendorp, G. Kramer, O. Jensen, R. Oostenveld, J. Schoffelen, and P. Fries, "Oscillatory activity in human parietal and occipital cortex shows hemispheric lateralization and memory effects in a delayed double-step saccade task," Cereb. Cortex 17, 2364–2374 (2007).
15. A. Bragin, G. Jandó, Z. Nádasdy, J. Hetke, K. Wise, and G. Buzsáki, "Gamma (40–100 Hz) oscillation in the hippocampus of the behaving rat," J. Neurosci. 15, 47–60 (1995).
16. J. Csicsvari, B. Jamieson, K. Wise, and G. Buzsáki, "Mechanisms of gamma oscillations in the hippocampus of the behaving rat," Neuron 37, 311–322 (2003).
17. L. Colgin, T. Denninger, M. Fyhn, T. Hafting, T. Bonnevie, O. Jensen, M. Moser, and E. Moser, "Frequency of gamma oscillations routes flow of information in the hippocampus," Nature 462, 75–78 (2009).
18. L. L. Colgin, "Rhythms of the hippocampal network," Nat. Rev. Neurosci. 17, 239–249 (2016).
19. A. Popescu, D. Popa, and D. Paré, "Coherent gamma oscillations couple the amygdala and striatum during learning," Nat. Neurosci. 12, 801–807 (2009).
20. M. van der Meer and A. Redish, "Low and high gamma oscillations in rat ventral striatum have distinct relationships to behavior, reward, and spiking activity on a learned spatial decision task," Front. Integr. Neurosci. 3, 9 (2009).
21. P. Fries, J. Reynolds, A. Rorie, and R. Desimone, "Modulation of oscillatory neuronal synchronization by selective visual attention," Science 291, 1560–1563 (2001).
22. P. Fries, T. Womelsdorf, R. Oostenveld, and R. Desimone, "The effects of visual stimulation and selective visual attention on rhythmic neuronal synchronization in macaque area V4," J. Neurosci. 28, 4823–4835 (2008).
23. E. P. Bauer, R. Paz, and D. Paré, "Gamma oscillations coordinate amygdalo-rhinal interactions during learning," J. Neurosci. 27, 9369–9379 (2007).
24. R. Azouz and C. Gray, "Adaptive coincidence detection and dynamic gain control in visual cortical neurons in vivo," Neuron 37, 513–523 (2003).
25. A. Frien, R. Eckhorn, R. Bauer, T. Woelbern, and A. Gabriel, "Fast oscillations display sharper orientation tuning than slower components of the same recordings in striate cortex of the awake monkey," Eur. J. Neurosci. 12, 1453–1465 (2000).
26. T. Womelsdorf, J. Schoffelen, R. Oostenveld, W. Singer, R. Desimone, A. Engel, and P. Fries, "Orientation selectivity and noise correlation in awake monkey area V1 are modulated by the gamma cycle," Proc. Natl. Acad. Sci. U.S.A. 109, 4302–4307 (2012).
27. J. Liu and W. Newsome, "Local field potential in cortical area MT: Stimulus tuning and behavioral correlations," J. Neurosci. 26, 7779–7790 (2006).
28. P. S. Khayat, R. Niebergall, and J. C. Martinez-Trujillo, "Frequency-dependent attentional modulation of local field potential signals in macaque area MT," J. Neurosci. 30, 7037–7048 (2010).
29. A. D. Engell and G. McCarthy, "Selective attention modulates face-specific induced gamma oscillations recorded from ventral occipitotemporal cortex," J. Neurosci. 30, 8780–8786 (2010).
30. K. Benchenane, P. H. Tiesinga, and F. P. Battaglia, "Oscillations in the prefrontal cortex: A gateway to memory and attention," Curr. Opin. Neurobiol. 21, 475–485 (2011).
31. C. Poch, P. Campo, and G. R. Barnes, "Modulation of alpha and gamma oscillations related to retrospectively orienting attention within working memory," Eur. J. Neurosci. 40, 2399–2405 (2014).
32. R. Kaplan, D. Bush, M. Bonnefond, P. A. Bandettini, G. R. Barnes, C. F. Doeller, and N. Burgess, "Medial prefrontal theta phase coupling during spatial memory retrieval," Hippocampus 24, 656–665 (2014).
33. N. A. Herweg, E. A. Solomon, and M. J. Kahana, "Theta oscillations in human memory," Trends Cogn. Sci. 24, 208–227 (2020).
34. A. Goyal, J. Miller, S. E. Qasim, A. J. Watrous, H. Zhang, J. M. Stein, C. S. Inman, R. E. Gross, J. T. Willie, B. Lega et al., "Functionally distinct high and low theta oscillations in the human hippocampus," Nat. Commun. 11, 2469 (2020).
35. U. Vivekananda, D. Bush, J. A. Bisby, S. Baxendale, R. Rodionov, B. Diehl, F. A. Chowdhury, A. W. McEvoy, A. Miserocchi, M. C. Walker, and N. Burgess, "Theta power and theta-gamma coupling support long-term spatial memory retrieval," Hippocampus 31, 213–220 (2021).
36. J. A. Sweet, K. C. Eakin, C. N. Munyon, and J. P. Miller, "Improved learning and memory with theta-burst stimulation of the fornix in rat model of traumatic brain injury," Hippocampus 24, 1592–1600 (2014).
37. M. Stoiljkovic, C. Kelley, D. Nagy, and M. Hajós, "Modulation of hippocampal neuronal network oscillations by α7 nACh receptors," Biochem. Pharmacol. 97, 445–453 (2015).
38. B. M. Roberts, A. Clarke, R. J. Addante, and C. Ranganath, "Entrainment enhances theta oscillations and improves episodic memory," Cogn. Neurosci. 9, 181–193 (2018).
39. R. Sigala, S. Haufe, D. Roy, H. R. Dinse, and P. Ritter, "The role of alpha-rhythm states in perceptual learning: Insights from experiments and computational models," Front. Comput. Neurosci. 8, 36 (2014).
40. B. C. Bays, K. M. Visscher, C. C. Le Dantec, and A. R. Seitz, "Alpha-band EEG activity in perceptual learning," J. Vis. 15, 7 (2015).
41. M. Brickwedde, M. C. Krüger, and H. R. Dinse, "Somatosensory alpha oscillations gate perceptual learning efficiency," Nat. Commun. 10, 263 (2019).
42. Q. He, B. Gong, K. Bi, and F. Fang, "The causal role of transcranial alternating current stimulation at alpha frequency in boosting visual perceptual learning," bioRxiv (2021).
43. R. Hindriks, M. Woolrich, H. Luckhoo, M. Joensson, H. Mohseni, M. L. Kringelbach, and G. Deco, "Role of white-matter pathways in coordinating alpha oscillations in resting visual cortex," NeuroImage 106, 328–339 (2015).
44. N. A. Herweg, T. Apitz, G. Leicht, C. Mulert, L. Fuentemilla, and N. Bunzeck, "Theta-alpha oscillations bind the hippocampus, prefrontal cortex, and striatum during recollection: Evidence from simultaneous EEG–fMRI," J. Neurosci. 36, 3579–3587 (2016).
45. H. Zhang, A. J. Watrous, A. Patel, and J. Jacobs, "Theta and alpha oscillations are traveling waves in the human neocortex," Neuron 98, 1269–1281 (2018).
46. S. Haegens, V. Nácher, A. Hernández, R. Luna, O. Jensen, and R. Romo, "Beta oscillations in the monkey sensorimotor network reflect somatosensory decision making," Proc. Natl. Acad. Sci. U.S.A. 108, 10708–10713 (2011).
47. B. E. Kilavik, M. Zaepffel, A. Brovelli, W. A. MacKay, and A. Riehle, "The ups and downs of beta oscillations in sensorimotor cortex," Exp. Neurol. 245, 15–26 (2013).
48. H. Tan, C. Wade, and P. Brown, "Post-movement beta activity in sensorimotor cortex indexes confidence in the estimations from internal models," J. Neurosci. 36, 1516–1528 (2016).
49. R. Schmidt, M. H. Ruiz, B. E. Kilavik, M. Lundqvist, P. A. Starr, and A. R. Aron, "Beta oscillations in working memory, executive control of movement and thought, and sensorimotor function," J. Neurosci. 39, 8231–8238 (2019).
50. L. Chariker, R. Shapley, and L.-S. Young, "Rhythm and synchrony in a cortical network model," J. Neurosci. 38, 8621–8634 (2018).
51. J. Rubin and D. Terman, "High frequency stimulation of the subthalamic nucleus eliminates pathological thalamic rhythmicity in a computational model," J. Comput. Neurosci. 16, 211–235 (2004).
52. A. Kumar, S. Rotter, and A. Aertsen, "Conditions for propagating synchronous spiking and asynchronous firing rates in a cortical network model," J. Neurosci. 28, 5268–5280 (2008).
53. X.-J. Wang and G. Buzsáki, "Gamma oscillation by synaptic inhibition in a hippocampal interneuronal network model," J. Neurosci. 16, 6402–6413 (1996).
54. M. A. Whittington, R. Traub, N. Kopell, B. Ermentrout, and E. Buhl, "Inhibition-based rhythms: Experimental and mathematical observations on network dynamics," Int. J. Psychophysiol. 38, 315–336 (2000).
55. C. Borgers and N. Kopell, "Synchronization in networks of excitatory and inhibitory neurons with sparse, random connectivity," Neural Comput. 15, 509–538 (2003).
56. L. Chariker and L.-S. Young, "Emergent spike patterns in neuronal populations," J. Comput. Neurosci. 38, 203–220 (2015).
57. M. J. Bezaire, I. Raikov, K. Burk, D. Vyas, and I. Soltesz, "Interneuronal mechanisms of hippocampal theta oscillations in a full-scale model of the rodent CA1 circuit," eLife 5, e18566 (2016).
58. S. Keeley, Á. Byrne, A. Fenton, and J. Rinzel, "Firing rate models for gamma oscillations," J. Neurophysiol. 121, 2181–2190 (2019).
59. P. Ashwin, S. Coombes, and R. Nicks, "Mathematical frameworks for oscillatory network dynamics in neuroscience," J. Math. Neurosci. 6, 2 (2016).
60. C. Bick, M. Goodfellow, C. R. Laing, and E. A. Martens, "Understanding the dynamics of biological and neural oscillator networks through exact mean-field reductions: A review," J. Math. Neurosci. 10, 9 (2020).
61. M. Ter Wal and P. H. Tiesinga, "Comprehensive characterization of oscillatory signatures in a model circuit with PV- and SOM-expressing interneurons," Biol. Cybern. 115, 487–517 (2021).
62. L. Zonca and D. Holcman, "Emergence and fragmentation of the alpha-band driven by neuronal network dynamics," PLoS Comput. Biol. 17, e1009639 (2021).
63. O. Jensen and L. L. Colgin, "Cross-frequency coupling between neuronal oscillations," Trends Cogn. Sci. 11, 267–269 (2007).
64. M. Segneri, H. Bi, S. Olmi, and A. Torcini, "Theta-nested gamma oscillations in next generation neural mass models," Front. Comput. Neurosci. 14, 47 (2020).
65. Y. Cai, T. Wu, L. Tao, and Z. C. Xiao, "Model reduction captures stochastic gamma oscillations on low-dimensional manifolds," Front. Comput. Neurosci. 15, 678688 (2021).
66. S. Ray and J. H. Maunsell, "Do gamma oscillations play a role in cerebral cortex?," Trends Cogn. Sci. 19, 78–85 (2015).
67. M. Livingstone, "Oscillatory firing and interneuronal correlations in squirrel monkey striate cortex," J. Neurophysiol. 66, 2467–2485 (1996).
68. C. Geisler, N. Brunel, and X.-J. Wang, "Contributions of intrinsic membrane dynamics to fast network oscillations with irregular neuronal discharges," J. Neurophysiol. 94, 4344–4361 (2005).
69. J. Zhang, D. Zhou, D. Cai, and A. V. Rangan, "A coarse-grained framework for spiking neuronal networks: Between homogeneity and synchrony," J. Comput. Neurosci. 37, 81–104 (2014).
70. Y. Li, L. Chariker, and L.-S. Young, "How well do reduced models capture the dynamics in models of interacting neurons?," J. Math. Biol. 78, 83–115 (2019).
71. Y. Li and H. Xu, "Stochastic neural field model: Multiple firing events and correlations," J. Math. Biol. 79, 1169–1204 (2019).
72. J. Kadmon and H. Sompolinsky, "Transition to chaos in random neuronal networks," Phys. Rev. X 5, 041030 (2015).
73. I. D. Landau and H. Sompolinsky, "Coherent chaos in a recurrent neural network with structured connectivity," PLoS Comput. Biol. 14, e1006309 (2018).
74. F. Mastrogiuseppe and S. Ostojic, "Linking connectivity, dynamics, and computations in low-rank recurrent neural networks," Neuron 99, 609–623 (2018).
75. R. Engelken, F. Wolf, and L. F. Abbott, "Lyapunov spectra of chaotic recurrent neural networks," arXiv:2006.02427 (2020).
76. Y. Shao and S. Ostojic, "Relating local connectivity and global dynamics in recurrent excitatory-inhibitory networks," PLoS Comput. Biol. 19, e1010855 (2023).
77. J. Zhang, Y. Shao, A. V. Rangan, and L. Tao, "A coarse-graining framework for spiking neuronal networks: From strongly-coupled conductance-based integrate-and-fire neurons to augmented systems of ODEs," J. Comput. Neurosci. 46, 211–232 (2019).
78. Y. Shao, J. Zhang, and L. Tao, "Dimensional reduction of emergent spatiotemporal cortical dynamics via a maximum entropy moment closure," PLoS Comput. Biol. 16, e1007265 (2020).
79. T. A. Brunner and J. P. Holloway, "One-dimensional Riemann solvers and the maximum entropy closure," J. Quant. Spectrosc. Radiat. Transf. 69, 543–566 (2001).
80. C. Koch, Biophysics of Computation: Information Processing in Single Neurons (Oxford University Press, 1998).
81. S. Wieland, D. Bernardi, T. Schwalger, and B. Lindner, "Slow fluctuations in recurrent networks of spiking neurons," Phys. Rev. E 92, 040901 (2015).
82. R. Ben-Yishai, R. L. Bar-Or, and H. Sompolinsky, "Theory of orientation tuning in visual cortex," Proc. Natl. Acad. Sci. U.S.A. 92, 3844–3848 (1995).
83. E. Salinas and T. Sejnowski, "Correlated neuronal activity and the flow of neural information," Nat. Rev. Neurosci. 2, 539–550 (2001).
84. R. Q. Quiroga and S. Panzeri, "Extracting information from neuronal populations: Information theory and decoding approaches," Nat. Rev. Neurosci. 10, 173–185 (2009).
85. Principles of Neural Coding, edited by R. Quian Quiroga and S. Panzeri (CRC Press, London, 2013).
86. V. Varga, B. Hangya, K. Kránitz, A. Ludányi, R. Zemankovics, I. Katona, R. Shigemoto, T. F. Freund, and Z. Borhegyi, "The presence of pacemaker HCN channels identifies theta rhythmic GABAergic neurons in the medial septum," J. Physiol. 586, 3893–3915 (2008).
87. B. Hangya, Z. Borhegyi, N. Szilágyi, T. F. Freund, and V. Varga, "GABAergic neurons of the medial septum lead the hippocampal network during theta activity," J. Neurosci. 29, 8094–8102 (2009).
88. A. Pavlides, S. J. Hogan, and R. Bogacz, "Computational models describing possible mechanisms for generation of excessive beta oscillations in Parkinson's disease," PLoS Comput. Biol. 11, e1004609 (2015).
89. Y. Yu and Q. Wang, "Oscillation dynamics in an extended model of thalamic-basal ganglia," Nonlinear Dyn. 98, 1065–1080 (2019).
90. N. Brunel and V. Hakim, "Fast global oscillations in networks of integrate-and-fire neurons with low firing rates," Neural Comput. 11, 1621–1671 (1999).
91. N. Brunel, "Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons," J. Comput. Neurosci. 8, 183–208 (2000).
92. G. Buzsaki and X. J. Wang, "Mechanisms of gamma oscillations," Annu. Rev. Neurosci. 35, 203–225 (2012).
93. J. W. Zhang and A. V. Rangan, "A reduction for spiking integrate-and-fire network dynamics ranging from homogeneity to synchrony," J. Comput. Neurosci. 38, 355–404 (2015).
94. M. M. Churchland, M. Y. Byron, J. P. Cunningham, L. P. Sugrue, M. R. Cohen, G. S. Corrado, W. T. Newsome, A. M. Clark, P. Hosseini, B. B. Scott et al., "Stimulus onset quenches neural variability: A widespread cortical phenomenon," Nat. Neurosci. 13, 369–378 (2010).
95. V. Mante, D. Sussillo, K. V. Shenoy, and W. T. Newsome, "Context-dependent computation by recurrent dynamics in prefrontal cortex," Nature 503, 78–84 (2013).
96. P. H. Janak and K. M. Tye, "From circuits to behaviour in the amygdala," Nature 517, 284–292 (2015).
97. J. A. Gallego, M. G. Perich, L. E. Miller, and S. A. Solla, "Neural manifolds for the control of movement," Neuron 94, 978–984 (2017).
98. S. Saxena and J. P. Cunningham, "Towards the neural population doctrine," Curr. Opin. Neurobiol. 55, 103–111 (2019).
99. C. J. Cueva, A. Saez, E. Marcos, A. Genovesio, M. Jazayeri, R. Romo, C. D. Salzman, M. N. Shadlen, and S. Fusi, "Low-dimensional dynamics for working memory and time encoding," Proc. Natl. Acad. Sci. U.S.A. 117, 23021–23032 (2020).
100. D. Cai, L. Tao, A. Rangan, and D. McLaughlin, "Kinetic theory for neuronal network dynamics," Commun. Math. Sci. 4, 97–127 (2006).
101. J. A. Acebrón, L. L. Bonilla, C. J. Pérez Vicente, F. Ritort, and R. Spigler, "The Kuramoto model: A simple paradigm for synchronization phenomena," Rev. Mod. Phys. 77, 137–185 (2005).
102. E. Montbrio, D. Pazo, and A. Roxin, "Macroscopic description for networks of spiking neurons," Phys. Rev. X 5, 021028 (2015).
103. W. Gerstner, W. M. Kistler, R. Naud, and L. Paninski, Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition (Cambridge University Press, 2014).

Supplementary Material