Modern experimental techniques give scientists access to large amounts of data. To obtain reliable information from the complex systems that produce these data, appropriate analysis tools are needed. The Kalman filter is a frequently used technique to infer, assuming a model of the system, the parameters of the model from uncertain observations. A well-known implementation of the Kalman filter, the unscented Kalman filter (UKF), was recently shown to be able to infer the connectivity of a set of coupled chaotic oscillators. In this work, we test whether the UKF can also reconstruct the connectivity of small groups of coupled neurons when their links are either electrical or chemical synapses. In particular, we consider Izhikevich neurons and aim to infer which neurons influence each other, using simulated spike trains as the experimental observations fed to the UKF. First, we verify that the UKF can recover the parameters of a single neuron, even when the parameters vary in time. Second, we analyze small neural ensembles and demonstrate that the UKF allows inferring the connectivity between the neurons, even for heterogeneous, directed, and temporally evolving networks. Our results show that time-dependent parameter and coupling estimation is possible in this nonlinearly coupled system.

The Kalman filter is a popular technique that can be employed to infer the parameters of a model given uncertain observations, and it has found applications in diverse fields. In the field of neuroscience, it has been used, for example, to estimate the parameters of neural models and for real-time decoding of brain signals for brain–machine interfaces. However, the neural models that have been considered contain a large number of parameters, which makes a systematic exploration of the parameter space unfeasible. Here, we study a neural model, the Izhikevich model (IM), which realistically reproduces many neural states, even though it is computationally low-cost. Having a small number of parameters and, at the same time, showing very rich dynamical regimes, the Izhikevich model is an ideal candidate for a systematic exploration of the parameter space and the study of neurons coupled with different topologies. We analyze the suitability of the Kalman filter to estimate the model’s parameters and we discuss its main limitations.

## I. INTRODUCTION

One of the main challenges that neuroscience has faced for a long time is the determination of brain topology, which is morphologically diverse and complex. In addition, the elements that form the brain network, the neurons, are also diverse and complex. Neurons show reproducible nonlinear responses to stochastic stimuli.^{1} Hence, they can be modeled as stochastic nonlinear dynamical systems.^{2}

Although much progress has been made on the relationship between topology and dynamics in the brain, scientists are still far from a good understanding.^{3,4} Mathematical models of the whole brain, or of just a tiny fraction of its billions of neurons, as well as information-based data-analysis techniques, are powerful tools for shedding light on this relationship.^{5}

However, a realistic estimation of the models’ states and parameters is a very difficult challenge, and different approaches based on control theory have been developed.^{6} A well-known method is the Kalman filter.^{7–9}

The Kalman filter allows inferring optimal parameters of a model given uncertain observations, balancing the effects of measurement noise, disturbances, and model uncertainties and has found applications in many fields of science and technology.^{10} In neuroscience, the Kalman filter has been used, for example, for decoding brain signals for brain-machine interfaces.^{11–13} It has also been used to estimate the parameters of neural models.^{14–19} However, the models that have been considered, such as the Morris–Lecar or the Hodgkin–Huxley, contain a large number of parameters that make a systematic exploration of the parameter space unfeasible. Here, we study the Izhikevich model^{20} because it reproduces many important properties of biological neurons and, at the same time, has a small number of parameters and is computationally low-cost.^{21} Therefore, the Izhikevich model is an ideal candidate for a systematic exploration of the parameter space allowing a study of small ensembles of coupled neurons.

We analyze under which conditions a nonlinear version of the Kalman filter, the unscented Kalman filter (UKF),^{22,23} provides a good estimation of the IM parameters, and we discuss its main limitations. We show that the UKF is able to recover the parameters of an isolated neuron and the external current driving its activity, even when the input current is time-dependent. We then study small networks with different topologies, with both electrical and chemical couplings, and show that the UKF is able to recover the topology of the network from observations of the dynamic variables, assuming the coupling strengths (electrical and chemical) and all the internal parameters are known.

## II. METHODS

### A. Model

The Izhikevich model (IM) was introduced by Izhikevich^{20} as an alternative to more realistic but computationally expensive neuron models.^{24} Despite its simplicity, it can be used to model a broad variety of neuron types^{21} and dynamical regimes. Here, we will focus on single Izhikevich neurons in the chaotic regime—that is, neurons for which the spiking dynamics is irregular, aperiodic—and small networks of chaotic neurons linked by electrical or chemical couplings.

The state of an Izhikevich neuron $i$ is fully specified by two state variables: $x_i$ represents the neuron membrane potential, and $y_i$ represents the membrane recovery variable, accounting for the activation of the ionic currents. The dynamics are governed by

$$\dot{x}_i = 0.04\,x_i^2 + 5\,x_i + 140 - y_i + I + E_i + C_i + \sigma_Z\,\xi_i^x,$$
$$\dot{y}_i = a\,(b\,x_i - y_i) + \sigma_Z\,\xi_i^y, \tag{1}$$

with the resetting rule: if $x_i \geq 30$, then $x_i \to c$ and $y_i \to y_i + d$. Here, $E_i$ and $C_i$ denote the electrical and chemical coupling terms (both zero for an isolated neuron).

$a$ is a small parameter representing the slow time scale of $y_i$, $b$ is the coupling strength between the state variables, and the external currents are modeled by $I$. All parameters here, including $x$, $y$, and time, are dimensionless. The parameters $a$, $b$, $c$, and $d$ can be fitted to obtain a specific firing pattern of the neuron. The last term in Eq. (1) represents random fluctuations, which we refer to as dynamic or process noise: $\xi_i^x$ and $\xi_i^y$ are Gaussian white noises with zero mean and unit variance, and $\sigma_Z$ is the noise strength, which for simplicity is the same for $x$ and $y$. For a system of $N$ neurons, the dynamical noise can be thought of as a random $2N$-dimensional vector with zero mean and covariance matrix $\bar{Q}_Z = \sigma_Z^2\,\mathbb{I}$, where $\mathbb{I}$ is the identity matrix.
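To make the model concrete, here is a minimal simulation sketch assuming the canonical Izhikevich equations with the chaotic-regime values of Table I. Note two assumptions: the paper integrates the deterministic part with a fourth-order Runge–Kutta scheme, while this sketch uses a simpler Euler–Maruyama step, and `izh_step` is an illustrative helper, not the authors' code.

```python
import numpy as np

# Chaotic-regime parameters from Table I (single neuron, no coupling).
a, b, c, d, I = 0.2, 2.0, -56.0, -16.0, -99.0
sigma_Z, dt = 0.025, 0.01

def izh_step(x, y, rng):
    """One Euler-Maruyama step of the canonical Izhikevich equations,
    followed by the spike-resetting rule x -> c, y -> y + d."""
    dx = 0.04 * x**2 + 5.0 * x + 140.0 - y + I
    dy = a * (b * x - y)
    x = x + dt * dx + np.sqrt(dt) * sigma_Z * rng.standard_normal()
    y = y + dt * dy + np.sqrt(dt) * sigma_Z * rng.standard_normal()
    fired = x >= 30.0
    if fired:                       # spike: reset membrane potential
        x, y = c, y + d
    return x, y, fired

rng = np.random.default_rng(0)
x, y = -56.25, -112.5               # near the (unstable) fixed point
n_spikes = 0
for _ in range(200_000):            # 2000 dimensionless time units
    x, y, fired = izh_step(x, y, rng)
    n_spikes += fired
```

Because the fixed point is unstable in this regime, the trajectory spirals away and produces irregular spiking, which is the chaotic behavior exploited throughout the paper.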

Here, we consider the simplest coupling, namely, linear diffusive coupling,^{25,26} for which $E_i$ is given by

$$E_i = g_e \sum_{j=1}^{N} A^e_{ij}\,(x_j - x_i),$$

where $g_e$ is the electrical coupling strength and $A^e$ is the adjacency matrix of the electrical links. The chemical coupling term $C_i$ is mediated by a sigmoidal function of the presynaptic membrane potential, with parameters $\mu_s$, $\epsilon$, and $\theta$ (see Table I).^{27}
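For a network of $N$ neurons, the linear diffusive electrical coupling can be evaluated in vectorized form. A minimal sketch follows; `electrical_coupling` is a hypothetical helper, and the chain topology and state values are purely illustrative.

```python
import numpy as np

def electrical_coupling(x, A_e, g_e):
    """Linear diffusive coupling E_i = g_e * sum_j A_e[i, j] * (x_j - x_i),
    vectorized: A_e @ x collects the neighbors' potentials and
    A_e.sum(axis=1) * x subtracts each node's degree times its own potential."""
    return g_e * (A_e @ x - A_e.sum(axis=1) * x)

# Example: a 4-neuron chain with g_e = 0.05 (a value used in Sec. III B).
A_e = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
x = np.array([-56.0, -40.0, -60.0, -55.0])
E = electrical_coupling(x, A_e, 0.05)
```

For a symmetric adjacency matrix, the diffusive terms cancel pairwise, so the coupling injects no net current into the network; this is also why $E_i \to 0$ in the fully synchronized limit discussed later.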

**TABLE I.** Parameter values used to generate the synthetic data.

| $a$ | $b$ | $c$ | $d$ | $I$ ($I_s$) | $g_e$ | $g_c$ | $\mu_s$ | $\epsilon$ | $\theta$ | $\alpha$ | $\omega$ | $\sigma_Z$ | $\sigma_\nu$ |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.2 | 2 | −56 | −16 | −99 | (0, 0.1) | (0, 0.05) | 35 | 7 | 0 | 3 | 0.15 | 0.025 | 0.15 |


Throughout this study, we use symmetric $ A e$ matrices and asymmetric $ A c$ matrices, because the electrical coupling is symmetric, but the chemical coupling is directional. Here, we only focus on a proof-of-concept demonstration of the UKF’s ability to infer coupling topology; in the future, we plan to study the more challenging (and realistic) scenario of heterogeneous, excitatory, or inhibitory chemical synapses.

### B. The unscented Kalman filter

The Kalman filter makes a prediction for the future state of a system, obtained by evolving a dynamical model, and then corrects it using the information coming from experimental data. Although it was originally developed for linear systems, it was soon extended to handle nonlinearities, and several nonlinear extensions exist. The UKF is a nonlinear version of the filter that performs well in terms of computational effort.

Following the notation in Forero-Ortiz *et al.*,^{28} we consider the extended state $\bar{u}$ as the vector given by the state variables $x_i$ and $y_i$ of the $N$ neurons $(i = 1, \ldots, N)$ and all the parameters we want to retrieve. Our process model to be employed in the UKF will be $\bar{u}_{k+1} = \bar{a}(\bar{u}_k)$, where $k$ is the timestep index. $\bar{a}$ is given by the deterministic part of Eq. (1) for the state variables and is the identity operator for the parameters, as we assume they are constant. In the UKF algorithm, the estimation for $\bar{u}_{k+1}$ predicted using the stochastic dynamical model is corrected by an experimental piece of data. However, these experimental data necessarily carry some uncertainty resulting from the measurement process, represented by a measurement function. Our measurement function is a selection of the state variables from the extended vector $\bar{u}_k$, which are perturbed by the measurement noise with standard deviation $\sigma_\nu$: $x_i \to x_i + \sigma_\nu \chi_i^x$ and $y_i \to y_i + \sigma_\nu \chi_i^y$, where $\chi$ represents Gaussian white noise. Thus, the covariance matrix of the measurement noise will be $\bar{Q}_\nu = \sigma_\nu^2\,\mathbb{I}$. The covariance of the estimated state is $P = \sigma_P\,\mathbb{I}$, which is evolved by the UKF algorithm from the initial values given in Table II.
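The extended-state construction can be sketched for a single neuron whose parameters $a$, $b$, and $I$ are appended to the state: the process model integrates the deterministic part of the dynamics and propagates the parameters as constants, while the measurement function selects the observed variables. `fx`, `hx`, and the sigma-point settings mentioned in the comment are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

dt = 0.01

def fx(u, dt):
    """Process model a(u) for the extended state u = [x, y, a, b, I]:
    the state variables follow the deterministic part of Eq. (1)
    (one Euler step for brevity); the parameters are kept constant."""
    x, y, a, b, I = u
    dx = 0.04 * x**2 + 5.0 * x + 140.0 - y + I
    dy = a * (b * x - y)
    return np.array([x + dt * dx, y + dt * dy, a, b, I])

def hx(u):
    """Measurement function: select the observed state variables (x, y)."""
    return u[:2]

# With FilterPy (the implementation used in the paper), this pair would be
# passed to UnscentedKalmanFilter(dim_x=5, dim_z=2, dt=dt, fx=fx, hx=hx,
# points=MerweScaledSigmaPoints(n=5, alpha=1e-3, beta=2.0, kappa=0.0));
# the alpha/beta/kappa values here are common defaults, not the paper's.

u0 = np.array([-56.25, -112.5, 0.2, 2.0, -99.0])
u1 = fx(u0, dt)
```

The identity propagation of the parameter block is what lets the filter's update step, rather than the process model, move the parameter estimates toward values consistent with the observations.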

**TABLE II.** Ranges of the uniform distributions used to draw the initial guesses for each parameter, and the initial standard deviation $\sigma_P$ of the estimated-state covariance.

| $a$ | $b$ | $c$ | $d$ | $I$ ($I_s$) | $g_e$ | $g_c$ | $\alpha$ | $\omega$ | $\sigma_P$ |
|---|---|---|---|---|---|---|---|---|---|
| (0.01, 0.9) | (0.01, 5) | (−70, −40) | (−25, −5) | (−94, −104) | (0, 0.1) | (0, 0.05) | (1, 5) | (0.1, 1) | 0.01 |


### C. Implementation

To generate the synthetic data that we use as experimental observations, we numerically solve Eq. (1) with a fourth-order Runge–Kutta method, an integration step of $dt = 0.01$, and the parameters reported in Table I, keeping the measurements with a sampling rate equal to the integration step. With these parameters, single (uncoupled) neurons display chaotic dynamics,^{27} as depicted in Fig. 1(a). Initial conditions for the simulations were drawn from a normal distribution centered at a fixed point of Eq. (1) in the case of no coupling, $(-56.25, -112.5)$, with a standard deviation equal to $1$.

Throughout the study, we employ the UKF implemented in the Python package *FilterPy*.^{29} The confidence in the process model ($\bar{Q}_Z$) and in the measurements ($\bar{Q}_\nu$) is kept constant (see Table I). An initial transient of $50\,000$ timesteps was discarded in all runs.

The UKF requires an initial guess for the parameters that we want to estimate. To test the robustness of the UKF, we consider different initial guesses for each run, which are selected from a uniform distribution in the ranges given in Table II.

We quantify the performance using the full coupling matrix $G$, as we want to test not only whether the UKF is able to reconstruct the connectivity, but also whether it can recover the correct coupling strength without being informed that the strength is the same for all links. We chose the Euclidean distance because it is a simple, straightforward measure that compares two graphs of weighted links with a single figure.
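Assuming the standard Frobenius form of the Euclidean distance between weighted coupling matrices, the performance measure can be sketched as follows; `coupling_distance` is a hypothetical helper and the matrices are illustrative.

```python
import numpy as np

def coupling_distance(G_true, G_kf):
    """Euclidean (Frobenius) distance between the true and the
    UKF-estimated coupling matrices: sqrt(sum_ij (G_ij - G^KF_ij)^2)."""
    return np.sqrt(np.sum((G_true - G_kf) ** 2))

# Toy example: a 2-neuron electrical link of strength 0.05,
# compared against a slightly noisy estimate.
G_true = 0.05 * np.array([[0, 1], [1, 0]], dtype=float)
G_kf = np.array([[0.001, 0.049],
                 [0.051, 0.002]])
D = coupling_distance(G_true, G_kf)   # small D indicates a good reconstruction
```

A single scalar makes it easy to track convergence over time, which is how the $D \to 0$ curves of Figs. 5–7 are read.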

## III. RESULTS

### A. Estimation of the parameters of a single neuron

First, we illustrate the effectiveness of the UKF in estimating the parameters of a single Izhikevich neuron. The parameters that we attempt to estimate are $a, b, c, d,$ and $I$. Note that the parameters $c$ and $d$ only appear in the resetting dynamics; therefore, the UKF can only update them in the event of a spike. Moreover, the equation for $\dot{y}$ contains a product $ab$, which can increase the uncertainty of the estimation. For example, an underestimation of $a$ can compensate for an overestimation of $b$. To avoid such problems, we estimated $ab$ and $a$ independently.

To recover the unknown parameters, we consider $100$ simulated time series as input, each with a different initial parameter guess drawn uniformly from the intervals reported in Table II. These intervals have been chosen because in those ranges the spiking of the neuron will be chaotic, which is a piece of information we can infer from the spike sequence. Results are shown in Fig. 2. For all cases, the real value of the parameter is within the range of the standard deviation. As the estimation of $c$ and $d$ is only updated when the neuron spikes, the duration of the simulated time series required to obtain a reliable estimation is larger than for the other parameters.

We point out that using the UKF to estimate $c$ and $d$ is unnecessary because a direct estimation of these parameters can be done easily by checking the values of $x$ and $y$ after a spike.

Now, we test the estimation of time-varying parameters. Specifically, we consider a sinusoidal external current, $I(t) = I_s + 3\alpha \sin(\omega t)$, and estimate $a$, $b$, and $I(t)$ (wrongly assuming that the current is constant). The effect of such a current on the neuron's dynamics is shown in Fig. 1(b), where we see that bursts of spikes are followed by periods of subthreshold oscillations. Since at constant $I$ the Izhikevich model displays a great variety of dynamical behaviors including bursting,^{20} the inspection of the time series does not provide evidence of the presence of a sinusoidal input current.

The results of the parameter estimation are shown in Fig. 3. The recovered values of $a$ and $b$ are comparable to those obtained in the previous parameter estimation (see Fig. 2). The estimated value of $I$ oscillates with a frequency equal to $\omega $, suggesting that $I$ is not constant.

Next, we substitute the expression of $I(t)$ in the model, Eq. (1), and separately estimate $I_s$, $\alpha$, and $\omega$. In this case, we also need to include time as an additional dimension of the extended vector space, with dynamic equation $\dot{t} = 1$. The results of this approach are shown in Fig. 4. The UKF can estimate the correct parameters of the oscillation in the majority of cases. However, large departures from the correct values can be observed, which could be due to the fact that the model with constant $I$ can produce similar output dynamics.
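The augmented process model with time as an extra state dimension can be sketched as follows. This is an illustrative helper assuming a single Euler step and the Table I values for $a$ and $b$; it is not the paper's implementation.

```python
import numpy as np

def fx_sine(u, dt):
    """Process model for the extended state u = [x, y, I_s, alpha, omega, t]:
    the current I(t) = I_s + 3*alpha*sin(omega*t) is substituted into the
    model, and time is evolved with the dynamic equation t' = 1."""
    x, y, I_s, alpha, omega, t = u
    I_t = I_s + 3.0 * alpha * np.sin(omega * t)
    dx = 0.04 * x**2 + 5.0 * x + 140.0 - y + I_t
    dy = 0.2 * (2.0 * x - y)    # a = 0.2, b = 2 fixed at Table I values for brevity
    return np.array([x + dt * dx, y + dt * dy, I_s, alpha, omega, t + dt])

u0 = np.array([-56.25, -112.5, -99.0, 3.0, 0.15, 0.0])
u1 = fx_sine(u0, 0.01)
```

Treating $t$ as a state with unit drift keeps the process model autonomous, which is what the sigma-point propagation of the UKF requires.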

### B. Estimation of network connectivity

We now consider small networks of Izhikevich neurons and investigate the capacity of the UKF to recover the adjacency matrix $A$, assuming that the coupling strengths, $g_e$ and $g_c$, and all the internal parameters are known. Of course, this is not possible in experiments; we use these assumptions as a first step for testing the neural-network reconstruction problem with the UKF approach. If, under these assumptions, the network cannot be inferred, we can conclude that the UKF approach is not useful; if, on the other hand, we succeed in reconstructing the network, as a next step we will test the UKF approach with less information, for instance, assuming a different neuron model, unknown internal parameters, or unknown coupling strengths. We run the UKF algorithm considering each element of $A$ as an additional dimension of the extended vector $\bar{u}$.

We consider networks with $N = 4$ neurons, which can be seen as building blocks of bigger networks. However, we must keep in mind that complex systems usually display emergent collective behavior when the number of elements is large enough. Therefore, while the UKF algorithm may succeed in reconstructing the topology of a small network, the collective behavior that may emerge for a large enough number of neurons (and/or the large number of parameters to be inferred) will probably cause it to fail. The study of the role of the network size is thus important, and additional work is planned that will be reported elsewhere.

The network topologies are shown in the Appendix (see Figs. 8 and 9). The simulated time evolution of the membrane potentials of all nodes for each network topology is also shown in Figs. 8 and 9. The network dynamics differ in their level of synchronization. First, we consider electrical coupling only ($g_c = 0$), such that the adjacency matrix is symmetric and we only need to determine the upper triangular elements of $G_e = g_e A_e$, with $g_e = 0.05$.

The results for the different topologies are displayed in Fig. 5, where we see that the UKF gives an excellent estimation of $G_e$, as the Euclidean distance $D(G_e, G_e^{KF})$ approaches $0$.

To study the effect of chemical synaptic coupling on the UKF estimation, we add directed links between some nodes in the formerly symmetric networks, as shown in the Appendix (Fig. 9). We estimate two adjacency matrices, one encoding electrical coupling, $G_e = g_e A_e$ with $A_e = (A_e)^T$, and the other encoding chemical coupling, $G_c = g_c A_c$ with $A_c \neq (A_c)^T$. We chose $g_e = 0.1$ and $g_c = 0.05$. Note that we increased $g_e$ compared to the previous case to test the robustness of the UKF against synchronized states. Since we use inhibitory synapses, the firing rate decreases slightly. In the limit of total synchronization, the input through electrical coupling, $E_i$, goes to zero, while $C_i$ is exactly the same for each element of the network.
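The exact synaptic function of the paper is not reproduced in this excerpt. A common sigmoidal form, which we assume here purely for illustration (both the functional form and the sign convention are assumptions), combines the Table I parameters $\epsilon$, $\theta$, and $\mu_s$ as follows:

```python
import numpy as np

# Hypothetical sketch of a sigmoidal chemical synapse using the Table I
# values (epsilon = 7, theta = 0, mu_s = 35); the functional form is an
# assumed standard choice, not a quotation from the paper.
eps, theta, mu_s, g_c = 7.0, 0.0, 35.0, 0.05

def gamma(x):
    """Sigmoidal synaptic activation of the presynaptic potential:
    close to 0 below threshold theta, close to 1 above it."""
    return 1.0 / (1.0 + np.exp(-eps * (x - theta)))

def chemical_coupling(x, A_c):
    """C_i = g_c * (mu_s - x_i) * sum_j A_c[i, j] * gamma(x_j);
    A_c need not be symmetric, encoding the directionality of synapses."""
    return g_c * (mu_s - x) * (A_c @ gamma(x))

# Example: neuron 2 projects onto neuron 1 and is currently spiking.
A_c = np.array([[0, 1], [0, 0]], dtype=float)
x = np.array([-60.0, 10.0])
C = chemical_coupling(x, A_c)
```

Unlike the diffusive term, this coupling does not vanish under synchronization: when all $x_i$ are equal, every neuron receives the same nonzero synaptic input, consistent with the remark above.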

We see in Figs. 6(a) and 6(b) that even with two coupling schemes, the chemical one being nonlinear, the UKF can estimate the correct coupling matrices. All networks are heterogeneous in the sense that the number of connections is not the same for the different neurons. As pointed out by Forero-Ortiz *et al.*,^{28} the UKF is robust against synchronization, which is confirmed here.

Here, we presented reconstruction results in the case of inhibitory synapses; however, we checked that the UKF provides similar results also for excitatory synapses and a mix of excitatory and inhibitory synapses, provided it knows which synapses are excitatory and which are inhibitory.

We highlight that in all cases the Euclidean distance $D(G, G^{KF})$ saturates below $10^{-2}$. Furthermore, to verify that all the links were correctly estimated, we classified the performance of the UKF using the receiver operating characteristic (ROC) curve.^{30} If the UKF recovers the right connectivity, the area under the ROC curve (AUC)^{30} will be 1. For all cases studied, we obtained an AUC $> 0.99$, implying a near-perfect reconstruction of the underlying topologies; that is, the UKF predicts a link between two neurons $i$ and $j$ only if $A_{ij} = 1$. We believe that the UKF is robust against noise as long as the noise can be seen as a small perturbation to the system and the dynamics is not driven by it.
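The AUC can be computed directly from the estimated coupling weights (as scores) and the true adjacency entries (as labels) via the rank-sum identity; the `auc` helper below is a hypothetical sketch of this computation, not the paper's implementation.

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney identity:
    the probability that a randomly chosen true link receives a higher
    score than a randomly chosen non-link (ties count one half)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    wins = (pos[:, None] > neg[None, :]).sum() \
        + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

# Off-diagonal entries of an estimated coupling matrix as scores,
# the corresponding true adjacency entries as labels.
estimated = [0.049, 0.001, 0.051, 0.002]
true_links = [1, 0, 1, 0]
```

An AUC of 1 means every true link outscores every non-link, so a single threshold separates them, which is the sense in which the reconstruction above is called near-perfect.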

### C. Estimation of network connectivity in temporal networks

Finally, we consider temporal networks, in which $G_{e,ij} = g_e A_{ij}$ varies with time. This is the case in many applications of network theory,^{31} and in neuroscience, it is especially important since it can be linked to plasticity.^{32}

The results are presented in Fig. 7. Before the coupling is switched on, the UKF quickly infers the absence of coupling (as $D \to 0$). After the coupling is switched on, $D$ first increases sharply and then decreases steadily for all values of $g_e$. This means that the UKF can detect the emergence of coupling and estimate the $G_e$ matrix correctly. However, the estimation after the change in the network takes more time than the initial estimation of the null adjacency matrix. This is because the covariance of the matrix coefficients decreases before the coupling is switched on, indicating high confidence in the inferred matrix. When the matrix changes, the filter has to adjust to the new state, and the low covariance makes the convergence slow. Nevertheless, the filter is eventually able to recover the right network structure. Different initial model or state covariances are expected to impact the convergence time, both before and after turning on the coupling. Higher covariances result in higher variability of the predictions, which captures changes in the parameters more efficiently; conversely, when covariances are smaller, the predictions are less prone to change and, thus, slower to adapt to new values. The right choice of this hyperparameter yields a responsive filter with sufficiently stable inferred parameters.

## IV. DISCUSSION AND CONCLUSIONS

We studied the capability of the UKF for recovering the parameters of a single neuron and of small neural ensembles modeled with the Izhikevich model. We simulated the equations governing the system dynamics and used the simulated time series as experimental observations to feed the UKF algorithm, with confidence regulated by $\bar{Q}_\nu$. The IM was the process model, with confidence regulated by $\bar{Q}_Z$.

First, we showed that when the parameters of an isolated neuron are constant in time, the UKF is able to estimate all of them. Second, we studied an isolated neuron with a sinusoidal input current, which displayed bursting spike dynamics. Due to the rich variety of dynamical behaviors of the IM, it is not trivial to identify the cause of the bursting activity. Still, even when modeling the current as a constant, the UKF retrieved the neuron parameters and the average value of the current, and suggested an oscillating current. When including the oscillating current in the process model, the UKF was able to provide a reasonable estimate of the amplitude ($\alpha$), the mean ($I_s$), and the frequency ($\omega$) of the oscillation.

We have also estimated the connectivity of small networks of Izhikevich neurons with known internal parameters. First, we analyzed the five possible network topologies for four neurons with undirected electrical coupling. Then, we added directed chemical connections to the same networks. The UKF was able to recover the connectivity for all the networks regardless of the synchronization level.

Finally, we addressed the problem of temporal networks by analyzing a network of three electrically coupled neurons, in which the topology changed from no coupling to a chain topology. The UKF was able to identify the change in the network and estimate the connectivity correctly.

The results presented here were obtained considering measurements of both $x$ and $y$. Beyond that, we conducted a preliminary analysis of the applicability of the UKF when only measurements of the $x$ variable are available. Our results suggest that the UKF is still able to recover the parameters of a single neuron and the network connectivity. However, to obtain good estimates, the UKF hyperparameters had to be carefully tuned, in particular, the standard deviation $\sigma_\nu$ and the initial condition for $\sigma_P$.

Since experimental measurements provide only short time series with limited temporal resolution, further work is needed to clarify the impact of the duration of the time series and of the sampling time. While the results presented here were obtained using every simulated data point (i.e., using the integration step as the sampling time), preliminary studies suggest that the UKF is robust to downsampling up to 1:20 if the time series is long enough.

Future work should also address larger networks and different types of neurons. In fact, as discussed before, complex systems usually display emergent collective behavior when the number of elements is large enough. Therefore, the UKF algorithm may succeed in reconstructing the topology of a small network, but will probably fail for a large number of neurons, or when there is a large number of unknown parameters. Further work is thus planned to test the UKF algorithm when the networks are larger and when the internal and coupling parameters are unknown. Even if the UKF fails to reconstruct such networks exactly, it may yield information useful for inferring some properties of the real network (e.g., the average degree, the degree distribution, or the modularity).

Finally, it will be interesting to check if the UKF can differentiate between inhibitory and excitatory synapses.

## ACKNOWLEDGMENTS

R.P.A. acknowledges financial support from Coordenação de Aperfeiçoamento de Pessoal de Nível Superior–Brasil (CAPES), Finance Code 001. H.A.C. thanks ICTP-SAIFR and FAPESP grant 2021/14335-0. G.T. and C.M. acknowledge the support of the ICREA ACADEMIA program of Generalitat de Catalunya and Ministerio de Ciencia e Innovación, Spain (Project No. PID2021-123994NB-C21).

## AUTHOR DECLARATIONS

### Conflict of Interest

The authors have no conflicts to disclose.

### Author Contributions

**R. P. Aristides:** Conceptualization (equal); Data curation (equal); Formal analysis (equal); Software (equal); Writing – original draft (equal); Writing – review & editing (equal). **A. J. Pons:** Conceptualization (equal); Writing – original draft (equal); Writing – review & editing (equal). **H. A. Cerdeira:** Conceptualization (equal); Writing – original draft (equal); Writing – review & editing (equal). **C. Masoller:** Conceptualization (equal); Supervision (equal); Writing – original draft (equal); Writing – review & editing (equal). **G. Tirabassi:** Conceptualization (equal); Supervision (equal); Writing – original draft (equal); Writing – review & editing (equal).

## DATA AVAILABILITY

The data that support the findings of this study are available from the corresponding author upon reasonable request.

### APPENDIX: TOPOLOGIES AND SYNCHRONIZATION QUANTIFICATION

The network topologies considered when studying electrical coupling between nodes are presented in Fig. 8. The black links represent undirected connections between nodes, so the resulting adjacency matrices are symmetric ($A = A^T$). The simulated time evolution of the membrane potentials of all nodes for each network topology is also shown in Fig. 8 on the right side, together with two synchronization measures, the Kuramoto order parameter^{33} $R$ and the synchronization error $Err$.


The Kuramoto order parameter is given by

$$R = \left\langle \left| \frac{1}{N} \sum_{j=1}^{N} e^{i\phi_j(t)} \right| \right\rangle_t,$$

where the phase of neuron $j$ is obtained by linear interpolation between consecutive spike times,^{34} $\phi_j(t) = 2\pi\,(t - t_k)/(t_{k+1} - t_k)$ for $t_k \le t < t_{k+1}$.
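Numerically, the time-averaged order parameter can be computed from an array of instantaneous phases; a minimal sketch follows, where `kuramoto_R` is a hypothetical helper and the two test signals are illustrative.

```python
import numpy as np

def kuramoto_R(phases):
    """Time-averaged Kuramoto order parameter.
    phases: array of shape (T, N), the phase of each of N neurons at T times.
    At each time, average the unit phasors over neurons and take the modulus;
    then average the modulus over time."""
    return np.abs(np.exp(1j * phases).mean(axis=1)).mean()

T, N = 1000, 4
rng = np.random.default_rng(1)
# Fully synchronized: all neurons share the same phase at every instant.
R_sync = kuramoto_R(np.tile(rng.uniform(0, 2 * np.pi, (T, 1)), (1, N)))
# Incoherent: phases drawn independently and uniformly.
R_rand = kuramoto_R(rng.uniform(0, 2 * np.pi, (T, N)))
```

$R$ approaches 1 for full phase synchronization and stays well below 1 for incoherent dynamics, which is how the synchronization levels of the networks in Figs. 8 and 9 are compared.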

When both electrical and chemical coupling between nodes are considered, we use the topologies presented in Fig. 9. The adjacency matrices are not symmetric ( $A\u2260 A T$), and all the networks are heterogeneous, meaning that the nodes have a different number of connections. The simulated time evolution of the membrane potentials of all nodes for each network topology is also shown in Fig. 9 on the right side.

## REFERENCES

*Principles of Computational Modelling in Neuroscience*

*Control Theory for Physicists*

*Neural Networks and Learning Machines*

*Pattern Recognition and Machine Learning*, Information Science and Statistics (Springer New York, 2016).

*Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control*

*Neural Control Engineering: The Emerging Intersection Between Control Theory and Neuroscience*, Computational Neuroscience (MIT Press, Cambridge, MA, 2012), p. 361.

*Signal Processing, Sensor Fusion, and Target Recognition VI*, International Society for Optics and Photonics Vol. 3068, edited by I. Kadar (SPIE, 1997), pp. 182–193.

*Proceedings of the IEEE 2000 Adaptive Systems for Signal Processing, Communications, and Control Symposium (Cat. No. 00EX373)* (IEEE, 2000), p. 153.

*Chemical Oscillations, Waves and Turbulence*