In the brain, coherent neuronal activities often appear simultaneously in multiple frequency bands, e.g., as combinations of alpha (8–12 Hz), beta (12.5–30 Hz), and gamma (30–120 Hz) oscillations, among others. These rhythms are believed to underlie information processing and cognitive functions and have been subjected to intense experimental and theoretical scrutiny. Computational modeling has provided a framework for the emergence of network-level oscillatory behavior from the interaction of spiking neurons. However, due to the strong nonlinear interactions between highly recurrent spiking populations, the interplay between cortical rhythms in multiple frequency bands has rarely been theoretically investigated. Many studies invoke multiple physiological timescales (e.g., various ion channels or multiple types of inhibitory neurons) or oscillatory inputs to produce rhythms in multi-bands. Here, we demonstrate the emergence of multi-band oscillations in a simple network consisting of one excitatory and one inhibitory neuronal population driven by constant input. First, we construct a data-driven, Poincaré section theory for robust numerical observations of single-frequency oscillations bifurcating into multiple bands. Then, we develop model reductions of the stochastic, nonlinear, high-dimensional neuronal network to capture the appearance of multi-band dynamics and the underlying bifurcations theoretically. Furthermore, when viewed within the reduced state space, our analysis reveals conserved geometrical features of the bifurcations on low-dimensional dynamical manifolds. These results suggest a simple geometric mechanism behind the emergence of multi-band oscillations without appealing to oscillatory inputs or multiple synaptic or neuronal timescales. Thus, our work points to unexplored regimes of stochastic competition between excitation and inhibition behind the generation of dynamic, patterned neuronal activities.

## I. INTRODUCTION

Coherent neuronal activities are found in many brain regions, ranging from visual and auditory cortices,^{1–5} to frontal and parietal cortices,^{6–14} to the hippocampus,^{15–18} the amygdala,^{19} and the striatum,^{20} and are often manifested as neuronal oscillations. Dynamically, these rhythms are likely to be a reflection of the many possible interactions between excitation and inhibition, between disparate synaptic and neuronal timescales, and between local and long-range recurrent connectivities. It is also commonly believed that they form a neurophysiological basis of sensory processing and cognitive functions, such as attention,^{21,22} learning,^{23} and memory.^{13} Therefore, neuronal oscillations have been subjected to intense experimental and theoretical examinations, ranging from the specific functions of these brain rhythms to inferences of their roles in information processing and brain computations.

Much experimental evidence has correlated various population oscillations to enhanced cognition or behavioral tasks. For instance, the enhancement of gamma oscillations has been shown to sharpen visual feature detection in the primary visual cortex^{2,24–26} and direction selectivity in MT.^{27,28} Attentional states are often accompanied by more significant gamma-band synchronization.^{21,29–31} Theta-band oscillations have been found to be correlated with visual spatial memory,^{32–35} while physiologically and pharmacologically enhanced theta rhythms improve learning and memory.^{33,36–38} Alpha oscillations have been linked to perceptual learning,^{31,39–42} and recent work has suggested a possible role in coordinating activity between brain regions.^{43–45} Cortical beta rhythms have been correlated with working memory, decision-making, and sensorimotor activities.^{46–49}

Many studies, experimental, theoretical, and modeling, have focused on the emergence of collective rhythms of a single-frequency band. Numerical simulations of large-scale models of the visual cortex have shown dynamics consistent with the experimentally observed gamma rhythms.^{50} Studies demonstrated that coherent neuronal oscillations can be driven by external inputs or may emerge as internal states of the network.^{51,52} Theoretical investigations have pointed to the importance of recurrent couplings between interneurons and excitatory pyramidal cells and the role of synaptic timescales.^{50,53–58} Substantial theoretical progress has been made by mapping the so-called theta neurons to systems of phase oscillators, which, in the weakly coupled limit, are amenable to mean-field reductions demonstrating how collective dynamics can lead to phase synchronization.^{59,60}

Compared to single-frequency band neuronal rhythms, there has been much less work on the mechanisms underlying multi-band oscillations. Theoretically, studies have built upon single-frequency synchronization by introducing multiple synaptic or neuronal timescales or by the interaction between multiple neuronal pools.^{57,61,62} Researchers have also pointed out that input forcing at different frequencies near the Hopf bifurcation of a Hodgkin–Huxley-type neuron can result in nested oscillations, similar to the theta–gamma nested rhythms observed in the hippocampus and during rapid eye movement sleep.^{63,64}

Here, we demonstrate that multi-band rhythms can emerge from a small neuronal network consisting of a single excitatory population recurrently coupled with a single inhibitory population, both driven by constant input. Recently, we investigated how stochastic gamma oscillations form a low-dimensional manifold in a dimension-reduced state space.^{65} Here, we show that a more detailed exploration of the dynamics reveals regions in parameter space where single-frequency, stochastic oscillations break up into different frequency bands, even in the presence of temporally stationary inputs.

In Sec. II, we introduce our integrate-and-fire (I&F) neuronal network model and explore the regime of repetitive, nearly periodic multiple-firing events (MFEs) as a function of model parameters. Focusing first on the repetition, we formulate and examine a data-driven Poincaré section theory. The Poincaré sections allow us to map the iterative, nearly synchronous coherent spiking events in a reduced, low-dimensional space. Our analysis reveals that multi-band oscillations emerge from different modes of the excitation–inhibition (E–I) competition between neural populations, which is well captured by the membrane potential distributions.

## II. RESULTS

Experimentally, neuronal oscillations are typically studied by electrophysiological methods, and rhythmic oscillatory behavior is usually observed in the recordings of local field potentials.^{4,66} Mathematically, neuronal oscillations are usually modeled by temporally reoccurring activity patterns of a neuronal network. Well-known theories include the interneuronal network gamma (ING^{53,67,68}), the pyramidal interneuronal network gamma (PING^{54,55}), multiple-firing events (MFEs^{50,56}), among others. Although these models include different levels of biological details and describe various types of oscillations in different brain areas, they generally share two common views: First, neuronal oscillations are reflected by reoccurring synchronized spiking patterns generated by neural ensembles; second, the reoccurring spiking patterns are induced by the rapid, recurrent interactions between neurons. Therefore, the informative aspects of neuronal oscillations (including frequencies, amplitudes, and phases) can be represented by the statistics of the spiking patterns. Furthermore, any study of the mechanism relies on the understanding of the neuronal interactions within the ensemble.

This paper investigates multi-band neuronal oscillations emerging from the dynamics of a simple spiking neuronal network containing only two neuronal populations: excitatory (E) and inhibitory (I), which are recurrently coupled. Oscillatory or any other temporally inhomogeneous inputs are absent. Following previous work,^{69} we focus on the so-called multiple-firing events (MFEs). We first present a few different temporally repetitive, synchronized spiking patterns found during our parameter space explorations. Appearing with different amplitudes and phases, these reoccurring MFEs form different *beats* robustly in many parts of parameter space. The firing patterns can exhibit multiple peaks in the frequency spectrum from the beta band (12.5–30 Hz) to the gamma band (30–90 Hz).

To understand the emergence of different beats, we focus on explaining how the tight competition between E/I populations shapes the repetitions and alternations of MFEs. Toward this end, we develop a novel data-driven Poincaré section theory for network dynamics. Most remarkably, our Poincaré section reveals that the E–I competition is essentially captured by the membrane potential distributions just before the occurrence of each MFE. Furthermore, by considering only the most informative features of the membrane potential distributions, a stochastic bifurcation analysis illustrates that

1. different outcomes of the E–I competition give rise to different beats and

2. the multi-band oscillations are prevalent in the high-dimensional parameter space.

The data-driven Poincaré section theory presents a *top-down* view of the emergence of multi-band oscillations. On the other hand, we also develop an autonomous ODE model from a *bottom-up* perspective, which qualitatively reproduces the MFE beats and bifurcations. The ODE model only preserves the most crucial features of the E–I competition (the interactions between the low-order statistics of the membrane potentials) and ignores many other biophysical details. We demonstrate that the model reduction and the ODE model meet halfway. They exhibit surprisingly similar geometrical features in the same reduced phase space.

Taken together, our results provide a theoretical framework to analyze the complicated oscillatory dynamics produced by the rapid competition between excitatory and inhibitory neuronal populations.

### A. Synchronized firing in a small integrate-and-fire network

We numerically simulate a small, 400-neuron network to demonstrate multi-band oscillations. The network has been shown to produce stochastic, oscillatory dynamics in the gamma band.^{56,65,70,71} In this paper, we will vary the parameters based on our previous study,^{65} which aimed to model an input layer of the macaque primary visual cortex.

#### 1. A simple, integrate-and-fire network

The network contains two populations of spiking neurons: 300 excitatory (E) cells and 100 inhibitory (I) cells. All neurons are driven by independent, identically distributed Poisson spike trains. The strength of the Poisson input maintains a physiologically realistic firing rate for each neuron.

Previous studies have revealed that the architecture of a neuronal network can significantly affect the dynamics of the network.^{72–76} In this paper, we investigate the multi-band oscillations emerging from structurally *homogeneous* networks, leaving detailed analysis of the impact of network architecture to future work. Specifically, we focus on two types of architectures: 1. the Erdős–Rényi (ER) model and 2. “annealed” network architectures.

When using the ER model, all possible synaptic couplings are independently selected with probability $P$, excluding self-couplings. Once generated, the network architecture is fixed during each simulation, and spikes of neuron $i$ are only sent to its postsynaptic cells. As for the annealed architecture, when neuron $i$ fires, all other cells have the same probability $P$ to receive the spike independently. In this case, the recipients of neuron $i$’s spikes vary from spike to spike. Therefore, the same-type neurons with the same membrane potentials are interchangeable.
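The two sampling schemes can be sketched in a few lines of numpy (a minimal sketch; the sizes and the coupling probability are illustrative placeholders, not the calibrated parameters of Sec. IV):

```python
import numpy as np

rng = np.random.default_rng(0)

def er_adjacency(n, p):
    """Quenched Erdos-Renyi graph: sample each directed coupling once
    with probability p, exclude self-couplings, then keep it fixed."""
    a = rng.random((n, n)) < p
    np.fill_diagonal(a, False)
    return a

def annealed_targets(i, n, p):
    """Annealed scheme: every time neuron i fires, re-draw its
    postsynaptic recipients independently with probability p."""
    hit = rng.random(n) < p
    hit[i] = False                      # no self-coupling
    return np.flatnonzero(hit)

A = er_adjacency(400, 0.8)              # fixed for a whole simulation
fixed_targets = np.flatnonzero(A[0])    # recipients of neuron 0, always

t1 = annealed_targets(0, 400, 0.8)      # recipients of neuron 0's 1st spike
t2 = annealed_targets(0, 400, 0.8)      # ... and of its 2nd spike
```

In the quenched ER case, `fixed_targets` never changes during a simulation, whereas `annealed_targets` re-draws the recipients at every spike, which is what makes same-type neurons with the same membrane potentials interchangeable.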

In the rest of the paper, we mainly demonstrate our results using an ER network with $P=0.8$. The network architecture is fixed for all network simulations, except when we test different synaptic coupling probabilities [Figs. 4(d) and 5(d); see more in Sec. II B 2]. All simulations and analyses are carried out on the annealed architecture as well, and the results are qualitatively similar to those of the ER network. Later, in Sec. II D, we will argue that different network architectures generated from the ER model exhibit similar dynamical statistics, especially for larger $P$. We will also compare the statistics of the network dynamics of ER networks to those of annealed architectures.

#### 2. Different MFE beats

The network described above has been shown to produce stochastic gamma-band oscillations in many previous studies.^{65,70,71} Namely, spikes of E and I neurons cluster in time as MFEs, which reoccur at a frequency of roughly $\sim 40$ Hz [see, e.g., Fig. 1(b), upper panel]. Despite the stochastic external drive, previous work demonstrated that the robust recurrence of MFEs comes from the dynamical competition between the E and I populations.^{50,58,65,70} Specifically, an MFE is triggered by the stochastic firing of a few E cells, leading to the rapid recruitment of both E and I neuron firings. However, the inhibitory synapses give the I-current a longer timescale than the E-current (see Sec. IV). Therefore, in a typical regime of gamma oscillations, both E and I populations initially undergo highly correlated firing, followed by a dominance of inhibition that terminates the MFE. After the inhibition wears off, the network again enters an excitable state, allowing for the next occurrence of an MFE. We refer the reader to Ref. 56 for more details of this recurrent excitation–inhibition mechanism of stochastic gamma oscillations.
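The recurrent E–I mechanism can be caricatured in a few lines. The sketch below is a current-based, all-to-all ($P=1$) toy with invented constants, not the conductance-based model of Sec. IV; the one feature carried over is that the I-current decays more slowly than the E-current ($\tau^I > \tau^E$):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes and constants only; not the calibrated parameters.
NE, NI = 300, 100
n = NE + NI
dt, T = 0.1, 500.0                 # time step and duration (ms)
tau_m, tau_e, tau_i = 20.0, 1.4, 4.5
v_th, v_reset = 1.0, 0.0
s_ee, s_ie, s_ei, s_ii = 0.012, 0.015, 0.025, 0.02
ext_rate, s_ext = 7.0, 0.012       # external kicks: rate (per ms), size

is_e = np.arange(n) < NE
v = rng.random(n) * v_th
ge = np.zeros(n)                   # recurrent E-current on each neuron
gi = np.zeros(n)                   # recurrent I-current on each neuron
spikes = []                        # (time, neuron index) pairs

for step in range(int(T / dt)):
    t = step * dt
    kicks = rng.poisson(ext_rate * dt, n)          # external Poisson drive
    v += dt * (-v / tau_m + ge - gi) + s_ext * kicks
    ge -= dt * ge / tau_e                          # fast E-current decay
    gi -= dt * gi / tau_i                          # slower I-current decay
    fired = np.flatnonzero(v >= v_th)
    for i in fired:
        spikes.append((t, int(i)))
        # Every spike kicks both populations; I spikes build the slow
        # current whose dominance eventually terminates a firing event.
        if is_e[i]:
            ge[is_e] += s_ee
            ge[~is_e] += s_ie
        else:
            gi[is_e] += s_ei
            gi[~is_e] += s_ii
    v[fired] = v_reset
```

Plotting a raster of `spikes` should show temporally clustered firing; whether clean, regularly recurring MFEs emerge depends on the invented constants above.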

However, recurring MFEs need not return to similar spiking patterns. During our parameter space exploration, the amplitudes of MFEs (i.e., the number of neurons involved) can alternate in different fashions, forming different *beats and rhythms*. We describe the three most prevalent MFE rhythms below.

If consecutive MFE amplitudes do not vary significantly, the network exhibits regular gamma oscillations [Fig. 1(b), upper]. This classical 1-beat MFE rhythm has been well investigated by previous studies.^{56,65,70,71} In addition, MFEs can alternate between large and small amplitudes. For example, in a 2-beat rhythm, a strong MFE is usually followed by a weak one and vice versa [Fig. 1(b), middle]. Alternatively, a weak MFE may occur after every two consecutive strong MFEs [Fig. 1(b), bottom], forming a 3-beat rhythm. We remark that the three distinct MFE rhythms can take place when varying only one physiological parameter [in Fig. 1(b), the cortical I-to-E coupling weight $S^{EI}$: 1-beat, $S^{EI}=2.45\times 10^{-2}$; 3-beat, $S^{EI}=2.55\times 10^{-2}$; 2-beat, $S^{EI}=2.61\times 10^{-2}$].

These rhythms result in different peaks in the temporal frequency spectrum [Fig. 1(c)]. Compared to the single peak at $\sim 45$ Hz for the 1-beat rhythm, the alternation of MFE amplitudes in the 2-beat rhythm introduces an additional peak at $\sim 25$ Hz (high beta band), and the 3-beat rhythm adds one more at $\sim 15$ Hz (low beta band). At the same time, the gamma-band peak is shifted toward higher frequencies as the network recovers more rapidly from the weaker inhibition induced by weaker MFEs.
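The appearance of subharmonic peaks can be illustrated with a toy pulse train (purely illustrative numbers, not the network's spectrum): a 40 Hz train of events whose amplitudes alternate strong/weak, as in a 2-beat rhythm, acquires a spectral line at half the event frequency.

```python
import numpy as np

fs = 1000                          # sampling rate, Hz
n = 2 * fs                         # 2 s of signal
x = np.zeros(n)

# A 40 Hz train of MFE-like events whose amplitudes alternate
# strong/weak (a 2-beat caricature).
step = 25                          # 25 ms between events
for k in range(n // step):
    x[k * step] = 1.0 if k % 2 == 0 else 0.4

power = np.abs(np.fft.rfft(x)) ** 2
freqs = np.fft.rfftfreq(n, d=1.0 / fs)

p20 = power[np.argmin(np.abs(freqs - 20.0))]   # subharmonic (beta) line
p40 = power[np.argmin(np.abs(freqs - 40.0))]   # fundamental (gamma) line
```

For alternating amplitudes $a_k = A + B(-1)^k$, the 20 Hz line carries power $\propto B^2$ and the 40 Hz line $\propto A^2$, so the subharmonic vanishes exactly when the beats are equal ($B = 0$).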

We comment that the above rhythms are only a small subset of all possible network dynamics. On the other hand, the patterns are the most robust and persistent ones during our parameter exploration (via one-dimensional sweeps of a few critical parameters; see Sec. II B 2). Thus, in the rest of this paper, we focus on these rhythms, leaving more thorough investigations of all possible dynamics for future work.

### B. Multi-bands arise from iterations of MFEs

The regularity of MFE recurrence in different beats suggests the importance of the dynamical paths between MFEs. We propose to study this via iteration since the next MFE is induced by the E–I competition during the current MFE. Specifically, if one could predict the next MFE from a mapping $F$, then $F^n$ would predict all subsequent MFEs; hence, the long-term network dynamics would be revealed.

To address the underlying mechanism, this section provides *a data-driven Poincaré section theory* to understand the iteration between MFEs. We will perform a *top-down analysis*: Starting from the high-dimensional network dynamics, we deduce and simplify the MFE mapping $F$ by combining first-principles arguments and surrogate data collected from simulations.

#### 1. Reduction of a high-dimensional MFE mapping

The dynamical path of the network during the inter-MFE interval $I_n$ is determined by two ingredients:

- The network state on time section $t_n$ (denoted by $S_n$), i.e., the initial condition of the dynamical path.

- The realization of all noise $\xi$ within interval $I_n$, including noises in external drives and internal randomness in synaptic couplings.

The MFE mapping $F_1$ sends the current network state to the next one, $S_{n+1} = F_1(S_n, \xi)$. The time sections $\{t_n\}$ act as *pseudo*-Poincaré sections (*pseudo* in the sense that the recognition of $\{t_n\}$ relies on a short history).

Even for our small 400-neuron network, $S_n$ contains $3\times 400 = 1200$ variables. Therefore, it would be hard to study $F_1$ without a drastic reduction of dimensions. To make matters worse, MFEs are produced by rapid, transient competitions between the E/I populations.^{56,65,70,71} Each MFE can be very sensitive to the randomness in a couple of spike times. To the best of our knowledge, an analytical description of the MFE mapping $F_1$ is unfeasible.

Instead, we propose to investigate $F_1$ in a *data-informed* manner. That is, we first perform a three-level reduction of $F_1$ and then compute the reduced mapping $f$ from surrogate data.

##### a. **Step 1: Coarse-graining**

##### b. **Step 2: Simplifying $S_n$**

The coarse-grained network state $S_n$ includes two distributions (of membrane potentials) and four variables (conductances). To further reduce the complexity of $F_2$, we impose three simplifying assumptions on $S_n$ (detailed in this and Step 3 below), starting with

*$g^{Q'Q} \approx 0$ at $t_n$, the beginning of an MFE ($Q, Q' \in \{E, I\}$).*

Thus, we can replace Eq. (3) by a further reduced mapping $F_3$.^{77,78}

It is hard to deduce the explicit form of $F_3$, but it can be computed from short-term network simulations. The initial conditions of the membrane potentials of all neurons can be sampled from $\rho^E(v, t_n)$ and $\rho^I(v, t_n)$, which are inferred from the moments in $S_n$. After the simulation, $S_{n+1}$ can be collected from the empirical distributions of the membrane potentials at the initiation time of the second MFE. See more details in Step 4 below.

##### c. **Step 3: First-order moments**

As a specific realization of $F_3$, we focus on the first-order moments, i.e., $M_1^E$ and $M_1^I$. This is, of course, a drastic simplification, leading to substantial information loss about $\rho^E(v)$ and $\rho^I(v)$. Nevertheless, our hope is that the first-order moments are sufficient to *qualitatively* capture the E–I competition during and after MFEs, thus explaining the different MFE beats and dynamical bifurcations observed in network simulations.

There are different ways to infer the probability distributions from a finite number of moments (e.g., maximum entropy closures^{79}). In this paper, we use a simple, *truncated Gaussian scheme* by assuming

*The inferred membrane potential distributions $\rho^E(v, t_n)$ and $\rho^I(v, t_n)$ are truncated Gaussian distributions. The variances of the inferred distributions are dominated by the noise of the external input.*

Here, we explain the motivation for the truncated Gaussian scheme: First, without the resetting process, $\rho^{E,I}(v, t_n)$ should be nearly Gaussian when only driven by the external Poisson inputs. This is especially the case after a strong beat takes place at $t_{n-1}$ since most neurons reset to $v=0$ and are strongly suppressed by the inhibitory spikes in the MFE. After that, neurons are primarily driven by the external inputs [see, e.g., Fig. 2(c)]. On the other hand, after a weak beat at $t_{n-1}$, $\rho^{E,I}(v, t_n)$ can split into multiple peaks since only a small fraction of the neurons fire in the MFE. In this case, the truncated Gaussian scheme does not approximate $\rho^E(v, t_n)$ and $\rho^I(v, t_n)$ well. Nevertheless, as we observe from the network simulations, a weak beat right before $t_n$ usually results in a large, positive difference $M_1^E(t_n) - M_1^I(t_n)$. This is because only a small number of E neurons (and more I neurons) are involved in the weak MFE, and more E neurons remain excitable with high membrane potentials. Therefore, the MFE taking place at $t_n$ is very likely a strong beat. Our preliminary tests show that a strong beat is usually sufficiently guaranteed by a positive $M_1^E(t_n) - M_1^I(t_n)$, and the shapes of $\rho^E(v, t_n)$ and $\rho^I(v, t_n)$ become less important.
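The truncated Gaussian scheme can be sketched as follows (a minimal sketch; the threshold, interval, and noise scale are illustrative placeholders, with the mean as the only free parameter and the variance pinned by the input noise):

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_truncated_gaussian(mean, sigma, lo, hi, size):
    """Rejection-sample a Gaussian N(mean, sigma^2) truncated to [lo, hi].

    In the reduction, `mean` plays the role of the first moment M_1,
    while `sigma` is fixed by the external input noise.
    """
    out = np.empty(0)
    while out.size < size:
        draw = rng.normal(mean, sigma, 4 * size)
        out = np.concatenate([out, draw[(draw >= lo) & (draw <= hi)]])
    return out[:size]

# Infer rho^E(v, t_n) from its first moment alone: v restricted to
# [0, V_th] with V_th = 1 in illustrative units.
v_samples = sample_truncated_gaussian(mean=0.6, sigma=0.2, lo=0.0, hi=1.0,
                                      size=300)
```

These samples would then serve as initial membrane potentials for the short-term simulations of Step 4.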

We present three videos showing the temporal evolution of the empirical distributions of membrane potentials collected from network simulations (see the supplementary material). The multi-peak feature also guides us in developing an autonomous ODE model (see Sec. II C).

One last assumption is made so that we can focus on the difference $m_n \stackrel{\mathrm{def}}{=} m(t_n) = M_1^E(t_n) - M_1^I(t_n)$ between the first-order moments rather than on the moments themselves:

*At $t_n$, there is at least one neuron whose membrane potential is near the firing threshold $V_{th}$.*

##### d. **Step 4: Compute the one-dimensional MFE mapping**

We compute $f$ via Monte Carlo simulations. Specifically, for every selected $m_0 \in [-0.15, 0.3]$, we perform 50 Monte Carlo simulations of the short-term network dynamics. Each simulation provides an $\hat{m}_1$ (defined later). Together, the 50 values of $\hat{m}_1$ visualize the possible outcomes of $f(m_0, \xi)$. The interval $[-0.15, 0.3]$ is selected to cover all values of $M_1^E(t_n) - M_1^I(t_n)$ appearing in the network simulations.
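The Monte Carlo protocol can be sketched with a stand-in for the short-term network simulation. Here `surrogate_simulation` is a hypothetical noisy map invented for illustration; in the actual procedure, each sample comes from simulating the spiking network from initial conditions built out of $m_0$.

```python
import numpy as np

rng = np.random.default_rng(3)

def surrogate_simulation(m0):
    """Stand-in for one short-term network run: returns the moment
    difference at the next MFE initiation. Purely illustrative."""
    return 0.15 - 0.9 * m0 + 0.01 * rng.standard_normal()

m0_grid = np.linspace(-0.15, 0.3, 46)           # the interval used above
ensemble = np.array([[surrogate_simulation(m0) for _ in range(50)]
                     for m0 in m0_grid])        # 50 Monte Carlo runs per m0

f_hat_mean = ensemble.mean(axis=1)              # point estimate of f(m0)
f_hat_spread = ensemble.std(axis=1)             # run-to-run variability
```

Plotting `ensemble` against `m0_grid` gives the cloud of possible outcomes of $f(m_0, \xi)$, with the spread quantifying the stochasticity of the mapping.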

Here, we must point out that, without the reductions in *Steps 2* and *3*, an unfeasibly large ensemble of surrogate data would be necessary to produce an effective, low-dimensional representation of the MFE mapping $F_2$ since we would need to cover the two potential distributions and four total conductances in $S_n$.

##### e. **The MFE mapping predicts MFE dynamics**

We use the Monte Carlo approximation of the mapping $f$ (denoted $\hat{f}$) to explain the different MFE beats discussed above. We will focus on (1) the invariant set of $\hat{f}$ (denoted $\chi_{\hat{f}}$) and (2) the geometrical features of $\hat{f}$.

The structure of $\chi_{\hat{f}}$ can be revealed by iterating $\hat{f}$ repeatedly, which is carried out as follows. After selecting an $m_0$, we compute $\hat{m}_1 = \hat{f}(m_0)$ from one Monte Carlo simulation described in Step 4. $\hat{m}_1$ is also the input to the next iteration, from which we compute $\hat{m}_2 = \hat{f}(\hat{m}_1)$ and so on. After 300 iterations, we find that the sequence $(\hat{m}_n, \hat{m}_{n+1})$ can cluster into one, two, or three subgroups (see Fig. S1 in the supplementary material). These features do not rely on the selection of $m_0$. The subgroups are indicated by the pink, green, and cyan circles in the left panels of Figs. 3(a)–3(c). Remarkably, the number of subgroups predicts the beat number of the MFE rhythms [Figs. 3(a)–3(c), upper right raster plots]. Moreover, the subgroups of $\chi_{\hat{f}}$ agree in detail with the spiking network simulations; i.e., they fit $m(t)$ at the initiation of each MFE [Figs. 3(a)–3(c), lower right panels; circles correspond to the same-color subgroups of $\chi_{\hat{f}}$].
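The iteration-and-clustering procedure can be illustrated on a toy map with a stable period-2 orbit. The map below (a noisy logistic map) and its parameters are invented for illustration; the actual $\hat f$ is evaluated by short-term network simulations.

```python
import numpy as np

rng = np.random.default_rng(4)

def f_hat(m):
    """Toy noisy 1D map with a stable period-2 orbit (logistic, r = 3.2).
    Invented stand-in; not the network-derived mapping."""
    return 3.2 * m * (1.0 - m) + 0.003 * rng.standard_normal()

m = 0.3
orbit = []
for _ in range(300):                # iterate the map 300 times
    m = f_hat(m)
    orbit.append(m)
orbit = np.array(orbit[50:])        # drop the transient

# Cluster the visited states by gaps larger than the noise scale;
# the number of clusters plays the role of the "beat" number.
vals = np.sort(orbit)
n_clusters = 1 + int(np.sum(np.diff(vals) > 0.05))
```

With a stable 2-cycle, the pairs $(\hat m_n, \hat m_{n+1})$ hop between two well-separated clusters, in analogy with the 2-beat rhythm.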

Furthermore, the different MFE beats are explained by the geometrical features of $\hat{f}$. We note that the lower branch of $\hat{f}$ (the one containing $m_0 > 0$) undergoes a downward spatial translation when $S^{EI}$ increases (Fig. 3, left panels; $S^{EI}$ values: $A < C < B$). The downward translation of the branch explains the different MFE rhythms: From A to C, the only cluster (pink) loses stability and introduces two other clusters (cyan and green) into $\chi_{\hat{f}}$ as its predecessor and successor in the MFE iteration, giving rise to a 3-beat rhythm; from C to B, the further lowered branch no longer intersects the diagonal $\hat{m}_1 = m_0$ (red line). Thus, only two clusters remain, forming the 2-beat rhythm.

The downward translation can be explained as follows: For a fixed $m_0$, a stronger $S^{EI}$ coupling means that the inhibition produced by the MFEs suppresses E neurons more. Therefore, $\hat{M}_1^E(t_1)$ will be lower when the next MFE is initiated, leading to a smaller $\hat{m}_1$.

These results confirm that, in the selected parameter regimes, the 1D mapping $f$ successfully captures the E–I competition and the underlying mechanism of MFE beats by considering only the first-order moments of the potential distributions.

#### 2. Multi-band oscillations caused by bifurcations

The relation between $f$ and the MFE beats suggests that multi-band oscillations come from critical bifurcations of the MFE mappings. Here, we examine this point by exploring $\hat{f}$ in a few 1D sections of the parameter space (a thorough investigation of the high-dimensional parameter landscape is unfortunately computationally unfeasible). This subsection aims to demonstrate the prominence of multi-band oscillations in the parameter space of the simple network.

We first select a handful of parameters from the following groups.

- *Cortical synaptic couplings.* We choose $S^{EI}$, the I-to-E synaptic strength.

- *Timescales of recurrent neuronal interactions.* We choose $\tau^E$, the E-current timescale.

- *External drive.* We choose $S^{ext}$, the strength of each external kick. The mean of the external stimuli, $S^{ext} \cdot \lambda^{ext}$, is fixed during the variation of the strength.

- *Network architecture.* $P$, the synaptic coupling probability, controls the most important feature of the network architecture generated by an ER model. As with the other simulations for $P=0.8$, we generate and fix one network architecture for each $P$ tested (“quenched disorder” throughout). In addition, $P \cdot S^{Q'Q}$ is fixed for $Q, Q' \in \{E, I\}$ to control the total postsynaptic current induced by each spike.

- *Single-cell physiology.* We choose $\tau^R$, the refractory period of a cell.

To track how the invariant set changes along each parameter scan, we adopt a *difference visualization* of $\chi_{\hat{f}}$ (denoted $\chi_{\hat{f}}^{-1}$) by using the difference $\Delta m_n = m_{n+1} - m_n$ between consecutive MFEs.

Figure 4(a) investigates the bifurcation diagram with respect to $S^{EI}$, while all other parameters are fixed (the same as in Fig. 3). This confirms our previous finding of the “1–3–2” beat bifurcation when $S^{EI}$ increases: $\chi_{\hat{f}}^{-1}$ exhibits only one subgroup for $S^{EI} < 2.50\times 10^{-2}$, two subgroups for $S^{EI} > 2.61\times 10^{-2}$, and three subgroups in the transition period between them. Notably, in the 3-beat rhythm, the probability mass is usually unevenly distributed within the three subgroups of $\chi_{\hat{f}}^{-1}$, which is affected by $S^{EI}$. For example, when $S^{EI} \approx 2.53\times 10^{-2}$, though two other subgroups have already emerged [blue and green circles, Fig. 3(c)], the subgroup around $(m_0, \hat{m}_1)=(0,0)$ still carries most of the mass (pink circle); i.e., $\Delta m_n = m_{n+1} - m_n$ is mostly 0 for any two consecutive MFEs. As a result, the network dynamics are dominated by 1-beat rhythms, with 3-beat rhythms taking place occasionally. On the other hand, when $S^{EI} \approx 2.60\times 10^{-2}$, as indicated by the branches of $\Delta m_n$, the unstable subgroup around $(0,0)$ almost vanishes, and the network dynamics are dominated by 2-beat rhythm patterns instead. This is echoed by our observations from spiking network simulations that, instead of standing alone, the 3-beat rhythms are always mixed with 1-beat or 2-beat rhythms [see, e.g., Fig. 3(c) raster plot, 2650–2800 ms].
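The difference visualization can be sketched on a toy noisy logistic map, sweeping its growth rate through a period-doubling (an invented stand-in; the growth rate `r` plays the role of a physiological parameter such as $S^{EI}$):

```python
import numpy as np

rng = np.random.default_rng(5)

def beat_number(r, n_iter=400, noise=0.002):
    """Iterate a toy noisy logistic map and count clusters of
    Delta m_n = m_{n+1} - m_n. Invented stand-in for the network map."""
    m = 0.3
    orbit = []
    for _ in range(n_iter):
        m = r * m * (1.0 - m) + noise * rng.standard_normal()
        orbit.append(m)
    dm = np.diff(np.array(orbit[100:]))       # drop transient, difference
    vals = np.sort(dm)
    return 1 + int(np.sum(np.diff(vals) > 0.05))

# Sweeping r through the period-doubling: one Delta-m cluster (around 0)
# before the bifurcation, two clusters after it.
beats_before = beat_number(2.8)    # stable fixed point: 1-beat analog
beats_after = beat_number(3.2)     # stable 2-cycle: 2-beat analog
```

Stacking such $\Delta m$ clusters over a grid of `r` values produces a branch diagram analogous to the panels of Fig. 4.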

The bifurcation map of $S^{EI}$ suggests that 1- and 2-beat rhythms are more robust than 3-beat rhythms. Therefore, we next focus on how other parameters may affect the bifurcations of 3-beat rhythms. Four one-dimensional parameter scans are carried out for $\tau^E$, $S^{ext}$, $P$, and $\tau^R$. We fix $S^{EI}=2.55\times 10^{-2}$, where 3-beat rhythm patterns are prominent in Fig. 4(a). The reference parameter point is indicated by red lines in Fig. 4 (details given in Sec. IV). The bifurcation diagrams for the other MFE beat rhythms, i.e., $S^{EI}=2.45\times 10^{-2}$ and $2.61\times 10^{-2}$, are shown in Figs. S2 and S3 of the supplementary material, respectively.

Previous studies highlight the importance of the synaptic timescales $\tau^E$ and $\tau^I$ to the synchronicity of gamma oscillations.^{58,70} Hence, we speculate that synaptic timescales impact MFE beats. Our intuition goes as follows: Reducing $\tau^E$ results in more concentrated excitation in a shorter time. This would break the regular 2-/3-beat rhythms if strong MFEs replaced the weak MFEs. The same argument applies to increasing $\tau^I$ as well. On the other hand, when $\tau^E$ and $\tau^I$ are close to each other, the network dynamics should become more homogeneous, as comparable excitatory and inhibitory timescales induce weaker gamma oscillations. Therefore, MFEs become less distinguishable by their amplitudes. Our numerical experiments confirm this intuition by multiplying $\tau^E$ by an adjustment ratio [where $\tau^E=1.4$ ms for the 3-beat rhythm; see Fig. 4(b)]. When $\tau^E$ decreases to $\sim 0.8\times$ its original value, the 3-beat rhythm is replaced by 1-beat; when $\tau^E$ increases to $\sim 1.8\times$ its original value, the different branches of $\chi_{\hat{f}}^{-1}$ become less distinguishable; i.e., the sequence $\{m_n\}$ does not oscillate regularly, and the beat rhythm is disrupted.

We then examine the bifurcations induced by external noise, whose size is represented by $S^{ext}$. Remarkably, the network dynamics are dominated by 1-beat rhythms when large external noises are present, whereas 3-beat rhythms are prevalent when the noises are small [Fig. 4(c)]. These bifurcations echo our intuition that the 3-beat rhythm is disrupted by the irregularity of the dynamics.

The synaptic coupling probability $P$ exhibits a bifurcation diagram similar to that of $S^{ext}$ [Fig. 4(d), shown on an inverse scale]. This is because $P$ is related to the internal variability of the network dynamics: When $P$ is larger, more neurons receive the same recurrent stimulus when a neuron fires, leading to more synchronized network dynamics; a smaller $P$ leads to larger differences between the potentials within the same E/I population, and the network dynamics become more irregular.

The bifurcations induced by $\tau^R$ are much more intriguing [Fig. 4(e)]. While the dynamics are dominated by 3-beat rhythm patterns when $\tau^R \sim 0$, the cluster $\Delta m \approx 0$ carries most of the mass of $\chi_{\hat{f}}^{-1}$ for $0.2 < \tau^R < 0.8$ ms. Hence, MFEs reoccur in 1-beat rhythms due to $m_{n+1} \approx m_n$. When $\tau^R$ becomes longer, the dynamics are dominated by 2-beat rhythms for $0.8 < \tau^R < 2.5$ ms. More interestingly, 3-beat rhythms return robustly when the refractory period becomes sufficiently long. A qualitative explanation goes as follows: The bifurcations of $\tau^R$ occur at the same magnitudes as the synaptic timescales used in this paper (1–10 ms; see also Ref. 80). Starting from the 3-beat dynamics where $\tau^R=0$ [red line, Fig. 4(e)], increasing $\tau^R$ to 0.2 and 0.8 ms means that a neuron will miss $\sim 15\%$–$50\%$ of the recurrent excitation produced by the MFE (since $\tau^E=1.4$ ms). Such drastic reductions of recurrent excitation can substantially change $\rho^{E,I}(v)$ before the initiation of the next MFE. Similar arguments apply to the remaining bifurcation point $\tau^R=2.5$ ms, where up to $\sim 40\%$ of the recurrent inhibition produced by an MFE can be missed. This intuition suggests that refractory periods play an essential role in the interplay of the E–I competition, as also hinted by other studies.^{81}

We remark that the bifurcations investigated in this paper may change when the network size varies. The *spatial scale* of gamma and other types of neural oscillations is an open problem, and the choice of the 400-neuron network follows previous studies of MFEs.^{50,56,70} Suppose the 400-neuron network is scaled up to larger sizes while roughly fixing the total recurrent neuronal interaction. We hypothesize that the network dynamics will become more regular because of the larger number of interchangeable neurons, which can be unfavorable for transient, weak MFEs. However, if the larger circuit has a specific architecture, such as the widely used ring model,^{82} one could expect many partially synchronized firing events emerging from localized neuronal interactions. The “beat” dynamics can thus be affected by network size in different ways, but a thorough investigation is beyond the scope of this paper.

#### 3. Geometrical features of the bifurcations

The MFE beat bifurcation lies at the center of our theory of multi-band oscillations and merits closer investigation. Here, we present a novel geometrical representation of the bifurcations in terms of low-dimensional manifolds.

In our previous study,^{65} we developed a first-principles-based, coarse-grained model consisting of only a few variables. Though drastically reduced from the high-dimensional spiking network model, this coarse-grained model successfully captured gamma oscillations. Notably, in the reduced state space, gamma oscillations were mostly restricted to a 2D manifold. We refer readers to Ref. 65 for more details.

Now, we *project* the multi-band oscillations discovered in our current study onto this reduced state space. We use the analogy of a “root–leaf” structure to describe the geometrical features of the long-term behavior. The “roots” represent the states of MFE initiations, which congregate and occupy a small area of the state space (cyan dots in Fig. 5); the structure of the root area is shown in detail in Fig. S4 of the supplementary material. The “leaves,” on the other hand, represent the trajectories of MFEs and post-MFE dynamics, which appear as loops on a 2D manifold. All “leaves” are glued to the “root” area. Like the gamma oscillations in our previous study, the multi-band oscillations exhibit strikingly low dimensionality despite the strong nonlinearity and stochasticity of the network dynamics.

Here, we demonstrate that the MFE beat bifurcations are captured by the “root–leaf” structure.

Most importantly, the *MFE beats* are in one-to-one correspondence with the different *“leaf clusters.”* To visualize the leaf clusters, we dissect the trajectories of the network dynamics on $I_n$ (as shown in Sec. II B 1) and then color each trajectory section according to $m_n$ (weak MFEs are colored in dark blue). Trajectories with similar colors clearly form clusters, and the number of leaf clusters equals the number of MFE beats (Fig. 5).

Also, the emergence of multiple leaf clusters is dictated by the MFE beat bifurcations. From Fig. 4, we select four points in each 1D parameter space and examine the network dynamics in the reduced state space. For example, recall the “1–3–2” bifurcation pattern for $S_{EI}$. For the 1-beat rhythm, the trajectories at $S_{EI}=2.44\times 10^{-2}$ form only one light-green leaf [Fig. 5(a), 1st plot]. Three-beat rhythms are represented by three separate leaves [dark blue, orange, and light green curves in the 2nd plot of Fig. 5(a)]. For $S_{EI}=2.52\times 10^{-2}$ and $2.68\times 10^{-2}$, the 2-beat rhythm is echoed by the merging of the two larger leaves. Similar observations hold when we vary $\tau_E$, $S^{\mathrm{ext}}$, and $P$ [Figs. 5(b)–5(d)]. Surprisingly, for different refractory periods $\tau_R$, the “root–leaf” structure undergoes substantial changes [Fig. 5(e)], again suggesting the importance of refractory periods in multi-band oscillation dynamics.

The coordinates of the reduced model include the total E/I conductances ($g_{E,I}$) and the numbers of subthreshold neurons near the firing threshold $V_{th}$ (E and I “gate” neurons). More details of the reduced model are given in Sec. IV.

### C. Multi-band oscillations captured by reduced ODE models

We have been studying the MFE beats with a combination of first-principles and data-driven model reductions. The soundness of a data-driven model reduction is usually controlled by two factors: (1) the validity of the model reduction assumptions and (2) the amount and quality of the surrogate data. These two factors can put us in a dilemma: the more a model is simplified, the less accurately it can represent the full model; however, the closer it is to the full model, the larger the amount of high-quality surrogate data required to compute the reduced model, which can be computationally expensive. This dilemma is illustrated by the series of reduced models in Eqs. (2)–(5), i.e., $F_2$, $F_3$, $F_4$, and $f$; $f$ is the compromise we focus on.

To overcome the weaknesses of the data-driven model reductions, we develop an *autonomous*, reduced ODE model for MFE dynamics. The ODE model is governed by mean-field representations of the E–I competition combined with population fluctuations. As opposed to the previous data-driven model, this ODE model is based entirely on first principles and is independent of numerical simulations of the network dynamics.

The key idea of the ODE model is to approximate $\rho_{E,I}(v,t)$ with *multiple* (instead of one) truncated Gaussian peaks, echoing the observation that the E/I populations are only partially involved in weak MFEs. During the evolution of the model, the truncated Gaussian peaks undergo *splitting* and *merging*, which are caused by threshold crossings during MFEs. Specifically, the potential distribution of a neural population splits if only a fraction of the neurons fire during an MFE: these neurons reset and are modeled as an additional truncated Gaussian around $V_r$ in the voltage distribution, while the other neurons remain subthreshold. On the other hand, we assume that two peaks merge into one when they become adjacent. Each peak is described as a truncated Gaussian for the same reason as above (both E and I neurons are primarily driven by external Poissonian inputs in the absence of recurrent stimuli). In all, the autonomous ODE model describes the E–I competition through the splitting and merging of Gaussian peaks. We refer readers to Sec. IV for more details.
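The bookkeeping of splitting and merging can be sketched as follows. This is a minimal sketch with hypothetical names: each peak is reduced to a mass–mean–standard-deviation triple, the split moves the fired mass to a reset peak at $V_r$, and the merge uses simple moment matching; the paper's actual ODEs additionally evolve the peaks between MFEs.

```python
import math

V_TH, V_R = 1.0, 0.0   # firing threshold and reset (dimensionless units)

# Each peak: (mass p, mean mu, std sigma); rho(v) is the weighted sum.

def split(peak, frac_fired):
    """If a fraction of the peak crosses V_th during an MFE, move that
    mass into a new reset peak near V_r; the rest stays subthreshold."""
    p, mu, sigma = peak
    sub = (p * (1.0 - frac_fired), mu, sigma)   # remaining subthreshold mass
    reset = (p * frac_fired, V_R, sigma)        # fired-and-reset mass
    return [pk for pk in (sub, reset) if pk[0] > 0.0]

def merge(a, b):
    """Moment-matched merge of two adjacent peaks into one Gaussian."""
    pa, ma, sa = a
    pb, mb, sb = b
    p = pa + pb
    mu = (pa * ma + pb * mb) / p
    var = (pa * (sa**2 + (ma - mu)**2) + pb * (sb**2 + (mb - mu)**2)) / p
    return (p, mu, math.sqrt(var))

peaks = split((1.0, 0.9, 0.05), frac_fired=0.4)   # partial firing: 2 peaks
merged = merge(*peaks)                            # later, the peaks re-merge
```

In this toy run, a single peak splits into a subthreshold peak (mass 0.6) and a reset peak (mass 0.4), and the moment-matched merge conserves the total probability mass.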

Crude as it is, the autonomous ODE model provides a remarkably good approximation to the MFE beats, especially when the variance of each Gaussian peak is small. We find that the ODE model reproduces the crucial dynamical features of the MFE beats produced by the network (Fig. 6), including firing rates, firing synchrony, and oscillation frequency. Most importantly, the ODE model also exhibits qualitatively similar dynamics of the potential distribution averages (represented by their difference, $m$). We selectively demonstrate the performance of the ODE model for the three parameter sets displayed in Fig. 3. Furthermore, compared to Fig. 4(a), the ODE model generates a similar bifurcation map of MFE beats when we vary the cortical synaptic coupling weights (Fig. 7; since we use $P=1$ for simplicity, the tested parameter $S_{EI}$ is normalized to keep $P\cdot S_{EI}$ fixed).

Notably, the ODE model exhibits similar “root–leaf” structures during MFE bifurcations when illustrated in the same reduced space as Fig. 5 (see Fig. 8). Since the ODE model evolves the distributions of E/I potentials, one can deduce the numbers of “gate” neurons from the numbers and locations of the Gaussian peaks. Remarkably, not only does the number of “leaves” match the number of MFE beats, but the ODE model trajectories also share similar geometrical features with the representation of the network simulation (Fig. 5). This suggests that the complex network dynamics can be successfully captured by considering only the most important features of $\rho_{E,I}$, even though the splitting and merging of Gaussian peaks are very rough descriptions.

Taken together, the autonomous ODE model supports our main conclusions:

1. Multi-band oscillations, i.e., the various MFE beats, emerge from different fashions of the E–I competition.

2. The E–I competition in the network dynamics can be captured by the E/I membrane potential distributions and is sufficiently represented by their most important features.

### D. Different network architectures

We finally address how different network architectures impact the network dynamics. We argue that, for larger $P$, (1) the network dynamics generated by different ER network architectures usually share similar statistics and (2) the statistics of the ER network dynamics are close to those of the annealed architecture. However, both arguments weaken for smaller $P$.

#### 1. Different ER networks

We generate five different ER networks for each of $P=0.3$, 0.5, and 0.8. We simulate each spiking network for 10 s and compare the following statistics of the dynamics: firing rates, the amplitudes and durations of MFEs, and the lengths of inter-MFE intervals [Figs. S5–S7(a) in the supplementary material]. Other parameters are fixed as in Fig. 4(d). For $P=0.8$, the firing rates and the distributions of the other three statistics are alike among the five different architectures. Similar observations apply to $P=0.5$. For $P=0.3$, the firing rates and distributions become less similar.

#### 2. ER networks and the annealed architecture

A similar trend is observed when comparing the ER network dynamics to the annealed architecture [Figs. S5–S7(b) in the supplementary material]. The distributions of the three statistics are alike for $P=0.8$. As for $P=0.5$, though the shapes of the distributions are comparable, the locations of the peaks differ. Specifically, for the annealed architecture, the MFE amplitudes of the strong beats are lower than in the ER networks, and most MFEs are shorter in time. Larger differences are observed for $P=0.3$. We speculate that this is because the ER network architecture usually gives rise to E neurons whose numbers of presynaptic E/I cells deviate from the average. These E neurons are more active than the others and can help the MFEs last longer and involve more neurons. In contrast, the annealed architecture does not share the same advantage since its neurons are interchangeable.

While a fixed network architecture better reflects biological realism, the annealed architecture is closer to a mean-field limit and also excludes possible biases introduced by specific synaptic connections. We have repeated our analysis using the annealed architecture, including the computations of the reduced MFE mappings $\hat{f}$, the bifurcation diagrams, and the “root–leaf” structure in the reduced state space. Qualitatively similar results are found and are not presented in this paper.

Thus, we believe that the qualitative features we discussed, such as multi-band oscillations, bifurcations, and mechanisms, should apply to a broader category of network architectures: the dynamics of the annealed architecture represent a mean-field limit of the ER networks, and the analysis of the former is qualitatively similar to that of the latter.

## III. DISCUSSION

The emerging picture of brain cognition is one in which coherent neural activities form the physiological basis of many cognitive functions. Prominent among the patterned population activities are the various oscillatory rhythms that have been implicated in sensory processing, perception, attention, learning, and memory. While the experimental investigations of coherent neural activities have formed the theoretical underpinnings of information coding in the brain,^{17,39,44,83–85} the understanding of how neural circuits can generate patterned activity remains incomplete. Previous theoretical studies have largely focused on single-band oscillatory activities. In addition to the theories for gamma oscillations referred to earlier in this paper, there are many studies on the mechanisms behind theta,^{18,45,57,64,86,87} alpha,^{45,62} beta oscillations,^{48,88,89} and others. By combining these ideas with oscillatory inputs, studies have shown that, in general, multi-band oscillations can emerge near Hopf bifurcations of the driven population.^{63,64}

Recent work has shown that a wide variety of coherent activity can be generated by homogeneously structured, fluctuation-driven neuronal networks, even under temporally constant drive. In particular, experimental and simulational studies have demonstrated the ubiquity of self-organized, nearly synchronous population spiking activity of variable size and duration. Much work has examined the sparsely synchronized regime, where most neurons show irregular firing at a frequency lower than the population rhythm (see Refs. 30 and 90–92, for instance).

Here, we take a different tack and examine a stronger coupling regime where a single excitatory neuronal spike can induce stochastic synchronized firing of a small cluster of neurons. Theoretically, the main obstacle to a full analysis of this regime is a clear delineation of the strong stochastic competition between excitatory and inhibitory sub-populations. Already, the mathematical description of the so-called multiple-firing events required the development of a specific ensemble average, i.e., a “partitioned ensemble average” that treated MFEs differently from nearly homogeneous spiking patterns^{69,93} (which can be faithfully captured by classical coarse-graining techniques, e.g., Fokker–Planck approaches^{78}). In recent work, we also uncovered a nearly two-dimensional manifold underlying stochastic gamma oscillations by projecting onto a suitable, low-dimensional state space.^{65}

In particular, we focused on a dynamical regime between single-band, stochastic oscillatory rhythms and general heterogeneous patterned activities. During our exploration of the dynamical possibilities of homogeneous networks that generate MFEs, we found that many parameters led to the emergence of nearly periodic patterns consisting of more than one single frequency. Because of the prevalent periodicity, we found it natural to combine classical Poincaré sectional theories with modern data-driven techniques. After a few steps of model reductions, the complex MFE dynamics were captured by the neuronal potentials on the time sections of MFE initiations, which were further computed from a series of Monte Carlo simulations of transient network dynamics. The Poincaré sectional theory revealed the crucial role of the delicate E–I competition in MFE bifurcations, leading to multiple frequency bands in network oscillations.

While a data-driven Poincaré return map was instrumental in identifying the bifurcations behind the emergence of multi-band patterns, a more detailed understanding of the roles played by various dynamical variables emerged in our dimensionally reduced modeling. Viewed in our framework, a simple geometry of “leaves and roots” described the multi-band oscillatory regime well. When displayed in a suitably reduced state space, the stochastic neuronal oscillations coherently formed a nearly 2D manifold, on which a collection of “leaves” (describing each MFE and the post-MFE dynamics) and “roots” (indicating the complex initial conditions triggering each MFE) delineated the manifold of multi-band oscillations. Furthermore, based on these observations and through well-motivated assumptions, we managed to develop an ODE model description that captured the essentials of the population membrane potential dynamics. The multi-band oscillations emerging from the spiking networks and the ODE model both exhibit similar geometrical representations of MFE cycles, with the number of MFE beats corresponding to the clusters of “leaves” in the state space.

The modern view of cognition as neuronal population dynamics has also considered these neural network computations as trajectories on manifolds.^{94–99} Recent dynamical theories of stochastic neuronal network dynamics have progressed beyond f–I curves, static input–output relations, thresholded-linear filters, and weakly nonlinear expansions.^{58,90,91,100} Among the various neuronal dynamical activities, periodic synchronization has received the most attention, both experimentally and theoretically. We view our work here as complementary to previous studies that focused on identifying mechanisms of synchronization in heterogeneous networks.^{59,60,101,102} By identifying the population membrane potential distribution as the major player in the generation of multi-frequency oscillations, we have uncovered a novel dynamical route to multi-band oscillations in a simple, homogeneous neuronal network, without appealing to oscillatory inputs or additional cell types and synaptic timescales. The central role of the membrane potential distribution points to the possibility of maintaining and manipulating these multi-frequency, synchronizing populations, which we hope to investigate more fully in ongoing and future work.

## IV. METHODS

### A. Integrate-and-fire network


Neuron $i$ fires when its membrane potential $v_i$ reaches the firing threshold $V_{th}$, after which it is immediately reset to the resting potential ($V_r$) and remains in a refractory state for a fixed time $\tau_R$. It is conventional in many previous studies to nondimensionalize $v$^{80,100} by choosing $V_{th}=1$ and $V_r=0$. Accordingly, $V_E=\frac{14}{3}$ and $V_I=-\frac{2}{3}$ are the excitatory and inhibitory reversal potentials.

While Eq. (7) can model a network with an arbitrary connectivity structure, the ER model is explained in the main text; here, we provide more information about the annealed architecture. In the annealed architecture, whether a neuron of type $Q'$ receives a given spike released by a neuron of type $Q$ is determined by an independent coin flip with probability $P_{Q'Q}$, where $Q, Q' \in \{E, I\}$. For both the ER model and the annealed architecture, $S_{ij}$ only depends on the cell types of $i$ and $j$; i.e., $S_{ij} \equiv S_{Q'Q}$ if $i$ is a type-$Q'$ neuron and $j$ is a type-$Q$ neuron.
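The annealed spike delivery amounts to one independent coin flip per (spike, postsynaptic neuron) pair. A minimal sketch, with illustrative function names and a hypothetical 300 E / 100 I split of the 400-neuron population:

```python
import random

def deliver_spike(pre_type, neurons, P, rng=random):
    """Annealed architecture: each postsynaptic neuron receives the spike
    with probability P[(post_type, pre_type)], redrawn independently for
    every spike (no fixed adjacency matrix)."""
    recipients = []
    for idx, post_type in neurons:
        if rng.random() < P[(post_type, pre_type)]:
            recipients.append(idx)
    return recipients

# With all probabilities at 0.8 (as for the reference point), roughly 80%
# of neurons receive each spike on average.
P08 = {(a, b): 0.8 for a in "EI" for b in "EI"}
neurons = [(i, "E") for i in range(300)] + [(i, "I") for i in range(300, 400)]
received = deliver_spike("E", neurons, P08)
```

Because the coin flips are redrawn for every spike, all neurons of a given type are statistically interchangeable, which is what makes the distributions $\rho_{E,I}(v,t)$ a sufficient description (Sec. IV E).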

For the “reference point” in Fig. 4 (3-beat), the network parameters are summarized below:

Frequencies of external input: $\lambda^E = \lambda^I = 21\,000$ Hz; strength of external stimuli: $S_i^{\mathrm{ext}} = 3.3\times 10^{-3}$.

Strengths of cortical synaptic couplings: $S_{EE}=0.94\times 10^{-2}$, $S_{II}=2.45\times 10^{-2}$, $S_{IE}=1.25\times 10^{-2}$, and $S_{EI}=2.55\times 10^{-2}$.

Probability of spike projections in the annealed architecture: $P_{EE}=P_{IE}=P_{EI}=P_{II}=0.8$; coupling probability in the ER model: $P=0.8$.

Synaptic timescales: $\tau_I=4.5$ ms, $\tau_{EE}=1.4$ ms, and $\tau_{IE}=1.2$ ms, reflecting the fact that AMPA synapses are faster than GABA synapses.

Refractory period: $\tau_R=0$.

### B. Power spectrum of neuronal oscillations

The power spectral density (PSD) measures the variance of a signal as a function of frequency. In this study, the PSD is computed as follows:
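As a sketch of one standard estimator (not necessarily the exact estimator used in this study), the periodogram of a sampled signal such as a population firing-rate trace can be computed as:

```python
import numpy as np

def psd(signal, dt):
    """Periodogram: power per unit frequency of a real signal sampled
    every dt milliseconds; returns (frequencies in Hz, power)."""
    n = len(signal)
    x = np.asarray(signal) - np.mean(signal)   # remove the DC component
    power = np.abs(np.fft.rfft(x)) ** 2 * dt / n
    freqs = np.fft.rfftfreq(n, d=dt) * 1000.0  # cycles/ms -> Hz
    return freqs, power

# Sanity check: a 40 Hz (gamma-band) sinusoid sampled at dt = 0.1 ms
dt = 0.1
t = np.arange(0.0, 1000.0, dt)                 # 1 s of signal, in ms
f, p = psd(np.sin(2 * np.pi * 0.04 * t), dt)   # 0.04 cycles/ms = 40 Hz
peak_freq = f[np.argmax(p)]                    # peak sits at ~40 Hz
```

In practice, averaging periodograms over windows (Welch's method) reduces the variance of the estimate; multi-band rhythms then appear as several distinct peaks in the spectrum.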

### C. Detection of MFE

MFEs lack a rigorous definition in previous studies; rather, they are visually identified as highly correlated spiking patterns in raster plots. In this study, we design a generic algorithm based on the key idea that each MFE is triggered by recurrent interactions:

- When $>2$ spikes show up in a 2 ms time interval, the end of the interval is set as the *initiation* of an MFE.
- After an MFE initiation, when $<2$ spikes are detected in a 2 ms time interval, the beginning of the interval is set as the *termination* of the MFE.
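A minimal implementation of this two-rule detector might read as follows (a sketch: the binning of spike times into consecutive 2 ms windows and all variable names are our assumptions):

```python
def detect_mfes(spike_times, t_end, window=2.0):
    """Detect MFEs from a list of spike times (ms) on [0, t_end).

    Initiation: the end of a 2 ms window containing >2 spikes.
    Termination: the start of the next 2 ms window containing <2 spikes.
    Returns a list of (t_start, t_stop) pairs."""
    n_bins = int(t_end // window)
    counts = [0] * n_bins
    for t in spike_times:                        # bin the spikes
        b = int(t // window)
        if b < n_bins:
            counts[b] += 1
    events, inside, start = [], False, 0.0
    for b, c in enumerate(counts):
        if not inside and c > 2:
            inside, start = True, (b + 1) * window   # end of this window
        elif inside and c < 2:
            events.append((start, b * window))       # start of this window
            inside = False
    return events

# A burst of 4 spikes starts an MFE; 2 spikes keep it alive; an empty
# window terminates it; the isolated spike at 10 ms is ignored.
events = detect_mfes([1.0, 1.1, 1.2, 1.3, 2.5, 2.6, 10.0], t_end=20.0)
```

Here `events` is `[(2.0, 4.0)]`: the burst in the first window opens the MFE at 2 ms, and the empty window starting at 4 ms closes it.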

It is known that the durations of MFEs and inter-MFE intervals are related to the external stimuli.^{65} On the other hand, for all parameter regimes we investigated, most inter-MFE intervals are longer than 10 ms. Although we do not claim that it is necessarily consistent with all previous studies, we find that this simple algorithm captures the great majority of the MFEs in this study and remains reliable throughout our parameter scans.

### D. A reduced coordinate for multi-band oscillations

The reduced coordinate system of our previous study^{65} used six important statistics as variables: $(N_{GE}, N_{GI}, H_{EE}, H_{EI}, H_{IE}, H_{II})$. It was based on a Markovian integrate-and-fire model, in which we discretized $v$ into 167 states. Specifically, we used $V_{th}=100$, $V_r=0$, and $V_I=-66$; therefore, the membrane potential of a neuron took values in $\{-66, -65, \ldots, 100\}$.

$N_{GE}$ and $N_{GI}$ represented the numbers of *gate* neurons of the E/I populations. A neuron was deemed to be in the “gate” state if its membrane potential was close enough to the firing threshold; the “gate” neurons included neuron $i$ if $V_i>60$. Therefore, in the I&F model in this paper, the gate neurons include all neurons with $v>0.6$.

$H_{EE}$, $H_{EI}$, $H_{IE}$, and $H_{II}$ represented the sums of “pending spikes,” indicating the remaining fuel that can drive the membrane potentials. In this paper, they correspond to the total conductances $g_{EE}$, $g_{EI}$, $g_{IE}$, and $g_{II}$.

### E. An autonomous ODE model

Our ODE model is a coarse-grained reduction of the I&F network model. We are inspired by the interchangeability of neurons $i$ and $j$ when $v_i = v_j$ in the annealed architecture: in this case, the distributions $\rho_{E,I}(v,t)$ suffice to evolve the network dynamics, instead of tracking every $v_i$.

Our primary idea is to approximate $\rho_{E,I}(v,t)$ by sums of several *truncated Gaussian peaks*. The system dynamics are subject to the *splitting* and *merging* of the peaks, resulting in an ODE model concerning only the means and variances of all peaks. According to previous results,^{100} when an I&F network is away from firing events, $\rho_{E,I}(v,t)$ are well approximated by Gaussian functions, whose variances are dictated by the noise introduced by the external and recurrent inputs. The ODE model aims at reproducing the MFE dynamics in a phenomenological fashion rather than mimicking a rigorous evolution of a Fokker–Planck-type equation for $\rho_{E,I}(v,t)$.


#### 1. Splitting

*minus* the right tail (i.e., the probability mass $1-p_{Q,1}$ that has already crossed $V_{th}$) until it *merges* with another peak (explained later) or touches $V_{th}$ again in the next MFE; that is, for all $t \in [t', t'')$, $h_{Q,i}(t)$ is determined by

*merging* of any two peaks, producing multiple truncated Gaussian peaks.

#### 2. Merging

#### 3. Evolution of ODEs

the *probability flux* crossing $V_{th}$. According to previous studies of kinetic theories of I&F networks,^{100} we define the firing rates as

### F. Numerical simulations of ODEs

We use the explicit Euler method to simulate the $v_i$ equations of the network and the equations of the ODE model. The time step is $0.1$ ms, and the simulation time is $30$ s for all models.

## SUPPLEMENTARY MATERIAL

We present the following figures and three videos as supplementary material:

- Fig. S1: the invariant set of $\hat{f}$;
- Fig. S2: bifurcation maps similar to Fig. 4, but with $S_{EI} = 2.45\times 10^{-2}$;
- Fig. S3: bifurcation maps similar to Fig. 4, but with $S_{EI} = 2.61\times 10^{-2}$;
- Fig. S4: MFE initiation states presented in the reduced state space;
- Figs. S5–S7: comparisons of the ER network dynamics with the annealed architecture.

The three videos present the evolution of the voltage distributions for the 1-, 2-, and 3-beat rhythms.

## ACKNOWLEDGMENTS

This work was partially supported by the STI2030-Major Projects (No. 2022ZD0204600) (R.Z., Z.W., and L.T.) and the Natural Science Foundation of China through Grant Nos. 31771147 (T.W., R.Z., Z.W., and L.T.) and 91232715 (L.T.). Z.-C.X. was supported by the Courant Institute of Mathematical Sciences through Courant Instructorship. We thank Lai-Sang Young and David McLaughlin (New York University) and Yao Li (The University of Massachusetts Amherst) for their useful comments.

## AUTHOR DECLARATIONS

### Conflict of Interest

The authors have no conflicts to disclose.

### Author Contributions

Tianyi Wu and Yuhang Cai contributed equally to this paper.

**Tianyi Wu:** Data curation (lead); Formal analysis (equal); Investigation (equal); Methodology (lead); Software (lead); Validation (equal); Visualization (lead); Writing – original draft (equal). **Yuhang Cai:** Data curation (lead); Formal analysis (equal); Investigation (equal); Methodology (lead); Software (lead); Validation (equal); Visualization (lead); Writing – original draft (equal). **Ruilin Zhang:** Methodology (equal). **Zhongyi Wang:** Methodology (equal). **Louis Tao:** Conceptualization (equal); Formal analysis (equal); Funding acquisition (equal); Investigation (equal); Methodology (equal); Supervision (equal); Writing – original draft (equal); Writing – review & editing (equal). **Zhuo-Cheng Xiao:** Conceptualization (equal); Formal analysis (equal); Investigation (equal); Methodology (equal); Project administration (equal); Supervision (equal); Visualization (equal); Writing – original draft (equal); Writing – review & editing (equal).

## DATA AVAILABILITY

The data and scripts to produce all the figures in the main text and the supplementary material can be found at https://github.com/faro1219/Neuron-Oscillation-II.git.

## REFERENCES

*in vivo*

*et al.*, “

*Biophysics of Computation: Information Processing in Single Neurons*

*Principles of Neural Coding*, edited by R. Quian Quiroga and S. Panzeri (CRC Press, London, 2013).

*et al.*, “

*Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition*