
Noise suppression ability and its mechanism analysis of scale-free spiking neural network under white Gaussian noise

  • Lei Guo ,

    Roles Conceptualization, Data curation, Funding acquisition, Methodology, Project administration, Supervision

    guolei@hebut.edu.cn

    Affiliations State Key Laboratory of Reliability and Intelligence of Electrical Equipment, School of Electrical Engineering, Hebei University of Technology, Tianjin, China, Hebei Key Laboratory of Bioelectromagnetics and Neuroengineering, Hebei University of Technology, Tianjin, China

  • Enyu Kan,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Software, Writing – original draft

    Affiliations State Key Laboratory of Reliability and Intelligence of Electrical Equipment, School of Electrical Engineering, Hebei University of Technology, Tianjin, China, Hebei Key Laboratory of Bioelectromagnetics and Neuroengineering, Hebei University of Technology, Tianjin, China

  • Youxi Wu,

    Roles Funding acquisition, Project administration, Supervision

    Affiliation School of Artificial Intelligence, Hebei University of Technology, Tianjin, China

  • Huan Lv,

    Roles Data curation, Methodology, Software

    Affiliations State Key Laboratory of Reliability and Intelligence of Electrical Equipment, School of Electrical Engineering, Hebei University of Technology, Tianjin, China, Hebei Key Laboratory of Bioelectromagnetics and Neuroengineering, Hebei University of Technology, Tianjin, China

  • Guizhi Xu

    Roles Funding acquisition, Project administration, Supervision

    Affiliations State Key Laboratory of Reliability and Intelligence of Electrical Equipment, School of Electrical Engineering, Hebei University of Technology, Tianjin, China, Hebei Key Laboratory of Bioelectromagnetics and Neuroengineering, Hebei University of Technology, Tianjin, China

Abstract

With the continuous improvement of automation and informatization, the electromagnetic environment has become increasingly complex, and traditional protection methods for electronic systems face serious challenges. The biological nervous system has self-adaptive advantages under the regulation of the nervous system, so it is worthwhile to explore a new approach to electromagnetic protection that draws on this self-adaptive advantage. In this study, a scale-free spiking neural network (SFSNN) is constructed in which the Izhikevich neuron model serves as a node and a synaptic plasticity model including excitatory and inhibitory synapses serves as an edge. Under white Gaussian noise, the noise suppression abilities of SFSNNs with a high average clustering coefficient (ACC) and SFSNNs with a low ACC are studied comparatively, and the noise suppression mechanism of the SFSNN is explored. The experimental results demonstrate the following. (1) The SFSNN has a certain degree of noise suppression ability, and SFSNNs with a high ACC show higher noise suppression performance than those with a low ACC. (2) The neural information processing of the SFSNN is a linkage effect of dynamic changes in neuron firing, synaptic weights and topological characteristics. (3) Synaptic plasticity is the intrinsic factor of the noise suppression ability of the SFSNN.

Introduction

With the development of science and technology, the automation and informatization of human society have continuously improved, making the electromagnetic environment increasingly complex. Various kinds of electromagnetic interference can affect or even damage the operation of electronic systems [1]. The deficiencies of traditional protection methods for electronic systems, including shielding, filtering and grounding, become increasingly prominent in the complex electromagnetic environment, so electromagnetic protection faces increasingly serious challenges [2-4]. The biological nervous system has self-adaptive advantages under the regulation of the nervous system, such as self-learning, self-organization and self-repair [5]. It is therefore necessary to explore a new approach to electromagnetic protection that draws on the self-adaptive advantage of the biological nervous system [6]. The artificial neural network (ANN) is the theoretical and model basis of computational neuroscience, so it is significant to study the robustness of ANNs based on brain-like intelligence. The spiking neural network (SNN) is the most biologically interpretable ANN; it can simulate the information processing of the biological brain by modeling the nonlinear dynamics of neurons and the dynamic regulation of synaptic weights [7, 8]. Therefore, an SNN can process complex spatio-temporal information because of its powerful computing capacity [9-11]. SNNs are widely applied in robot control [12], brain-like research [13, 14], pattern recognition [15] and other fields.

In an SNN, the dynamic process of a neuron is described by a mathematical model in the form of spike firing. The early integrate-and-fire neuron model is too simple and differs considerably from the characteristics of real neurons. The Hodgkin-Huxley (H-H) model is a fourth-order nonlinear differential equation that closely represents biological characteristics [16]. The Izhikevich neuron model is a second-order nonlinear differential equation that not only represents real neurons relatively closely but also has high computing performance [17]. Therefore, most studies construct SNNs based on the Izhikevich neuron model. Nobukawa et al. introduced an Izhikevich neuron model to evaluate the signal responses of chaotic resonance in SNNs and confirmed that chaotic states can respond sensitively to weak signals in chaotic resonance [18].

Synaptic plasticity is the basis of information transmission between neurons [19]. The excitatory regulation of a synapse can enhance the efficiency of neural information transmission, and most studies construct SNNs based on excitatory spike-timing-dependent plasticity (STDP). A four-pathway excitatory STDP rule was proposed by Ebner et al., who applied the rule to the connections of a pyramidal neuron model and revealed the interaction between the local dynamics of dendritic voltage and plasticity mechanisms [20]. However, biological experiments show that the inhibitory regulation of synapses also plays a vital role in the nervous system [21]. Chen et al. [22] used fluorescently tagged gephyrin to track inhibitory synapses in the rat visual cortex and showed that visual experience-dependent plasticity is associated with clustered, location-specific changes of inhibitory synapses. Joana et al. [23] studied the modulation of coordinated activity across cortical layers by inhibitory synaptic plasticity and found that modulating inhibitory synaptic strength can effectively influence the participation of cortical neurons in cognition-relevant network activity in the rat barrel cortex. A synaptic plasticity model including both excitatory and inhibitory synapses regulates an SNN dynamically and is more biologically reasonable. The Matthew effect in a network with inhibitory synaptic plasticity showed that good burst synchronization with a higher bursting measure improves with long-term potentiation, whereas bad burst synchronization with a lower bursting measure becomes worse with long-term inhibition [24]. Research on SNNs including both excitatory and inhibitory synapses reflects the fact that, because of changes in synaptic strengths, the degree of higher synchronization decreases under noise of intermediate intensity, while the degree of lower synchronization increases under noise of large intensity [25]. Su et al. [26] studied the regulation process of an SNN under high-frequency current stimulation based on synaptic plasticity and found that the dynamic behavior of the network is realized through the dynamic weights of synapses, which indicates that synaptic plasticity is the key factor of neurodynamics.

Network topology reflects the connection form among neurons and affects network function. A study showed that the distribution of neural connections can affect the propagation of the firing rate (FR) and firing pattern in feed-forward networks [27]. Moreover, a large body of evidence from fMRI and EEG investigations suggests that the biological brain functional network has a scale-free property and/or a small-world property [28, 29]. In our previous work, Zhang et al. found that the rat brain network has the small-world property and that correct working memory storage can increase the connectivity of the network and the efficiency of information transmission [30]. Based on these research results on the biological brain network, ANNs with complex network topology have been studied. Investigations have shown that appropriately tuned time delays can induce multiple stochastic resonances in small-world SNNs based on the WS generation algorithm [31]. Additionally, Kim et al. studied the influence of STDP on burst synchronization in a scale-free spiking neural network (SFSNN) based on the Barrat-Barthelemy-Vespignani (BBV) generation algorithm [32].

External stimulation can affect the function of the brain network, and the brain network responds self-adaptively to external stimulation. Regarding the anti-injury robustness of human brain networks, Saeedeh et al. [33] found in biological experiments that human brain networks have a certain degree of anti-injury ability against targeted attacks on hub nodes. Regarding the anti-injury function of ANNs, Nie et al. [34] evaluated the robustness of complex networks through the variation of their characteristics (maximal degree, average degree and betweenness) and concluded that the SFN has a certain anti-injury function under node failures or attacks. However, this research on the anti-injury function was conducted in ANNs without neuroelectrophysiological characteristics; that is, the node is not a neuron model and the edge is not a synapse model. Such networks cannot receive external stimulation, so their response to external stimulation cannot be studied. The SNN is a network with electrophysiological characteristics, so researchers have investigated the impact of external stimulation on SNNs.

At present, most research on self-adaptive regulation concerns firing synchronization and neural coding in SNNs under external stimulation. Etémé et al. found that magnetic stimulation induces not only regular firing activity of the neuron with spiking and bursting regimes but also synchronous neuronal modes in a neural network [35]. In our previous work, the responses of the time coding and the rate coding of a small-world SNN both showed respective specificity under white Gaussian noise and impulse noise [36]. The study of the noise suppression ability of SNNs based on synaptic plasticity is still at an exploratory stage. Research on the robust function of the SNN is of great importance for brain science and for engineering applications requiring noise suppression based on brain-inspired intelligence. Therefore, we investigate the noise suppression ability of the SFSNN under white Gaussian noise and analyze its mechanism.

In this study, the SFSNN is constructed, and the noise suppression abilities of SFSNNs with a high average clustering coefficient (ACC) and SFSNNs with a low ACC are studied comparatively under white Gaussian noise. Additionally, the noise suppression mechanism of the SFSNN is explored. The main contributions of this study are as follows.

(1) SFSNNs with more biological rationality are constructed. The construction of more brain-like SNNs is an inevitable trend in the development of artificial intelligence.
(2) The noise suppression abilities of the SFSNNs are comparatively analyzed. The results show that SFSNNs with a high ACC have higher noise suppression performance than SFSNNs with a low ACC. This result provides a theoretical foundation for engineering applications based on the self-adaptive advantage of the biological nervous system.
(3) The dynamic evolution process of neuron firing, synaptic weights and topological characteristics is clarified. This result is helpful for understanding brain information processing.
(4) The relationship between the external noise suppression ability of the SFSNN and internal synaptic plasticity is established. Based on the Pearson correlation coefficient, the dynamic regulation of synaptic weights is significantly correlated with the noise suppression ability, which implies that synaptic plasticity is the intrinsic factor of the noise suppression ability of the SFSNN.

Methods

Construction of the SFSNN

The three basic elements of network construction are the node, the edge and the topology. In this study, to construct the SFSNN, the Izhikevich neuron model is used as a node, which directly affects information expression; the synaptic plasticity model including excitatory and inhibitory synapses is used as an edge, which is the basis of information transmission among neurons; and the BBV generation algorithm is used to generate SFNs with a high ACC, which determines the connection form among neurons.

Izhikevich neuron model.

The firing characteristics of the Izhikevich neuron model are close to those of real neurons, and the model is appropriate for the large-scale construction of networks [17]. It is described as:

(1)
    dv/dt = 0.04v^2 + 5v + 140 − u + I
    du/dt = a(bv − u)
    if v ≥ 30 mV, then v ← c and u ← u + d

where v represents the membrane potential of the neuron and u represents the recovery variable of the membrane voltage, which reflects the activation of the potassium channel current and the inactivation of the sodium channel current; u provides negative feedback for the membrane potential v. Excitatory and inhibitory neurons are generated by adjusting the dimensionless parameters a, b, c and d. I represents the sum of the external input current and the synaptic current. The regular spiking pattern is employed as the firing pattern of the excitatory neuron, and the low-threshold spiking pattern is employed as the firing pattern of the inhibitory neuron, as shown in Figs 1 and 2, respectively.
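As a concrete illustration, the Izhikevich model described above can be simulated with simple forward-Euler integration. This is a minimal sketch: the constant input current I, the time step and the initial conditions are our illustrative choices, not values taken from this paper.

```python
import numpy as np

def izhikevich(a, b, c, d, I=10.0, T=1000.0, dt=0.5):
    """Forward-Euler simulation of one Izhikevich neuron:
    v' = 0.04 v^2 + 5 v + 140 - u + I,  u' = a (b v - u),
    with the reset rule: if v >= 30 mV, then v <- c and u <- u + d."""
    steps = int(T / dt)
    v, u = float(c), b * float(c)          # start from the resting point
    trace = np.empty(steps)
    spike_times = []
    for n in range(steps):
        if v >= 30.0:                      # spike detected: apply the reset
            spike_times.append(n * dt)
            v, u = float(c), u + d
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        trace[n] = v
    return trace, spike_times

# Regular-spiking (excitatory) neuron, the parameters of Fig 1
trace, spikes = izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0)
```

Switching the parameters to a = 0.02, b = 0.25, c = −65 and d = 2 gives the low-threshold spiking (inhibitory) pattern of Fig 2.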

Fig 1. Firing patterns of the regular spiking mode of the Izhikevich neurons.

The parameters of the excitatory neuron are: a = 0.02, b = 0.2, c = −65, and d = 8.

https://doi.org/10.1371/journal.pone.0244683.g001

Fig 2. Firing patterns of the low-threshold spiking mode of the Izhikevich neurons.

The parameters of the inhibitory neuron are: a = 0.02, b = 0.25, c = −65, and d = 2.

https://doi.org/10.1371/journal.pone.0244683.g002

Synaptic plasticity model.

The synaptic plasticity model with excitatory and inhibitory synapses plays an important role in regulating the network dynamically. The synaptic output current and input voltage show an approximately linear relationship, which can be described as:

(2)
    Isyn = gsyn (E − Vj(t))

where Isyn is the synaptic current, gsyn is the synaptic conductance, Vj(t) is the membrane potential of the postsynaptic neuron, and E is the reversal potential. In this study, the excitatory reversal potential Eex and the inhibitory reversal potential Ein are 0 mV and −70 mV, respectively. Both excitatory and inhibitory synapses regulate the efficiency of information transmission among neurons through changes in synaptic conductance, and each synapse has two regulation rules.

(1) When postsynaptic neuron j does not receive an action potential from presynaptic neuron i, the excitatory and inhibitory synaptic weights decay exponentially. The excitatory and inhibitory synaptic weights are defined as gex and gin, respectively, and are described as:

(3)
    τex dgex/dt = −gex

(4)
    τin dgin/dt = −gin

where τex and τin are the attenuation constants of the excitatory and inhibitory conductance, respectively [37].

(2) When postsynaptic neuron j receives an action potential from presynaptic neuron i, the changes of the excitatory and inhibitory synaptic weights can be described by formulas (5) and (6), respectively:

(5)
    gex ← gex + Δgex

(6)
    gin ← gin + Δgin

where Δgex is the excitatory conductance increment caused by the action potential, which is regulated by the synaptic modification function wij, and Δgin is the inhibitory conductance increment caused by the action potential, which is regulated by the synaptic modification function mij. In this study, gmax is 0.015. When a synaptic weight falls below 0, it is set to 0; when it exceeds gmax, it is set to gmax. wij and mij are related to the spike firing of the presynaptic and postsynaptic neurons and can be described as:

(7)
    wij = A+ exp(−Δt/τ+),  Δt > 0
    wij = −A− exp(Δt/τ−),  Δt < 0

(8)
    mij = B+ exp(−Δt/τ+),  Δt > 0
    mij = −B− exp(Δt/τ−),  Δt < 0

where A+ and A− are the maximum modification values of strengthened and weakened synaptic conductance during the excitatory process, respectively; B+ and B− are the maximum modification values of strengthened and weakened synaptic conductance during the inhibitory process, respectively; Δt is the time interval between presynaptic and postsynaptic neuron firing; and τ+ and τ− are the time-interval ranges between presynaptic and postsynaptic neuron firing over which synapses are strengthened and weakened, respectively.

In this study, the ratio of excitatory synapses to inhibitory synapses in the synaptic plasticity model is 4:1, following the neuroanatomical experimental results of the mammalian cerebral cortex [38]. The parameters are as follows: τ+ = τ− = 20 ms, A+ = 0.1, A− = 0.105, B+ = 0.02, and B− = 0.03 [39].
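The two regulation rules can be sketched as follows. The gmax = 0.015 bound and the parameter values are from the text; the exponential STDP window shape is our assumption of the standard form, since the exact expressions for wij and mij are not restated here.

```python
import math

G_MAX = 0.015                    # upper bound on synaptic weight (from text)
TAU_EX = TAU_IN = 20.0           # decay constants, illustrative values in ms
TAU_PLUS = TAU_MINUS = 20.0      # STDP time constants (tau+ = tau- = 20 ms)
A_PLUS, A_MINUS = 0.1, 0.105     # excitatory modification maxima
B_PLUS, B_MINUS = 0.02, 0.03     # inhibitory modification maxima

def decay(g, dt, tau):
    """Rule (1): exponential decay of a synaptic weight between spikes."""
    return g * math.exp(-dt / tau)

def stdp(dt_spike, plus, minus):
    """Assumed standard STDP window: potentiate when the presynaptic spike
    precedes the postsynaptic spike (dt_spike = t_post - t_pre > 0)."""
    if dt_spike > 0:
        return plus * math.exp(-dt_spike / TAU_PLUS)
    return -minus * math.exp(dt_spike / TAU_MINUS)

def apply_spike(g, dt_spike, plus, minus):
    """Rule (2): modify the weight on spike arrival and clip to [0, gmax]."""
    g = g + G_MAX * stdp(dt_spike, plus, minus)
    return min(max(g, 0.0), G_MAX)

g = 0.008
g = decay(g, dt=5.0, tau=TAU_EX)                            # no spike for 5 ms
g = apply_spike(g, dt_spike=4.0, plus=A_PLUS, minus=A_MINUS)  # pre-before-post
```

The same `apply_spike` helper serves inhibitory synapses by passing B+ and B− in place of A+ and A−.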

Generation of the SFN with the high ACC.

The BBV generation algorithm is used to generate a weighted SFN in which the topology and the edge weights evolve with time. The algorithm steps are as follows [40]:

(1) Initial network: the network contains m0 nodes, and the weight w0 of each edge is 1, where m0 = 4.
(2) Add new nodes: a new node v is added with probability P, where P ∈ (0, 1]. The newly added node has m edges and is connected to the existing nodes by weight-based preferential selection, where m = 3. The probability that old node i is selected is:

(9)
    Π(i) = si / Σl sl,  with si = Σj wij

where j runs over the nodes connected to node i and si is the strength of node i.
(3) Add new edges: with probability 1 − P, only mt new edges are added to the network, where mt = 2. The two endpoints of a newly added edge are selected according to the triangle mechanism. First, an edge (i, j) of the network is selected randomly, and then another adjacent node k of node j is selected (excluding node i). The probability of selecting node k is:

(10)
    Π(k) = wjk / Σl wjl,  l ∈ Γ(j), l ≠ i

where Γ(j) is the set of neighbors of node j.

If there is no connection between nodes i and k, a new edge is established; otherwise, its weight is increased by σ. In both cases, the weights of edges wij and wjk are increased by σ, where σ = 1.

In this study, an SFN with a high ACC is generated according to the algorithm above. An SFN is a complex network whose degree distribution follows a power law: the probability that a node is connected to k other nodes is P(k) ∼ k^(−γ). Different SFNs can be obtained by adjusting the probability P. According to research results on the functional characteristics of the human brain [41], the power-law exponent γ is usually in the range of [2, 3], and the clustering coefficient of the SFN is relatively high. Therefore, the SFSNN with a high ACC is constructed, for which the probability P of the SFN is 0.3, the power-law exponent γ is 2.15, and the clustering coefficient C is 0.5. The clustering coefficient characterizes the aggregation degree of the network. The clustering coefficient of node i is defined as the probability that two neighbor nodes of node i are connected:

(11)
    Ci = 2ei / (ki(ki − 1))

where ki is the degree of node i, ki(ki − 1)/2 is the maximum possible number of edges among its neighbors, and ei is the number of edges connecting the neighbor nodes of node i. The average clustering coefficient over all nodes characterizes the clustering of the network:

(12)
    C = (1/N) Σi Ci

where N is the number of nodes.
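The BBV growth steps and the clustering coefficient of formulas (11)-(12) can be sketched in pure Python. This is our simplified illustration of the algorithm above; the network size and random seed are arbitrary.

```python
import random

def bbv_network(n=200, m0=4, m=3, mt=2, p=0.3, sigma=1.0, seed=1):
    """Sketch of the BBV growth rules: with probability p add a node with m
    strength-preferential edges; otherwise add mt edges by the triangle
    mechanism. Adjacency is a dict of dicts: w[i][j] = edge weight."""
    rng = random.Random(seed)
    w = {i: {} for i in range(m0)}
    for i in range(m0):                       # initial complete graph, w0 = 1
        for j in range(i + 1, m0):
            w[i][j] = w[j][i] = 1.0
    def pick_node():                          # selection prob = s_i / sum s_l
        nodes = list(w)
        return rng.choices(nodes, weights=[sum(w[v].values()) for v in nodes])[0]
    while len(w) < n:
        if rng.random() < p:                  # step (2): add a new node
            targets = set()
            while len(targets) < m:
                targets.add(pick_node())
            v = len(w)
            w[v] = {}
            for t in targets:
                w[v][t] = w[t][v] = 1.0
        else:                                 # step (3): triangle mechanism
            for _ in range(mt):
                edges = [(i, j) for i in w for j in w[i] if i < j]
                i, j = rng.choice(edges)
                if rng.random() < 0.5:        # edge ends play symmetric roles
                    i, j = j, i
                nbrs = [k for k in w[j] if k != i]
                if not nbrs:
                    continue
                k = rng.choices(nbrs, weights=[w[j][x] for x in nbrs])[0]
                if k in w[i]:
                    w[i][k] += sigma
                    w[k][i] = w[i][k]
                else:
                    w[i][k] = w[k][i] = 1.0
                w[i][j] += sigma; w[j][i] = w[i][j]
                w[j][k] += sigma; w[k][j] = w[j][k]
    return w

def average_clustering(w):
    """ACC via formulas (11)-(12): C_i = 2 e_i / (k_i (k_i - 1))."""
    cs = []
    for i, nbrs in w.items():
        k = len(nbrs)
        if k < 2:
            cs.append(0.0)
            continue
        e = sum(1 for a in nbrs for b in nbrs if a < b and b in w[a])
        cs.append(2.0 * e / (k * (k - 1)))
    return sum(cs) / len(cs)

net = bbv_network()
acc = average_clustering(net)   # triangle mechanism keeps clustering high
```

The triangle mechanism is what drives the clustering up: each edge-addition step closes or reinforces a triangle around an existing edge.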

A topological diagram of the SFN is illustrated in Fig 3. The probability distribution of node degree for the SFN when P = 0.3 is illustrated in Fig 4.

Fig 3. Topological diagram of the SFN.

The red dots on the boundary of the ellipse represent 500 nodes, and the internal black lines represent the connections between nodes.

https://doi.org/10.1371/journal.pone.0244683.g003

Fig 4. Degree distribution of the SFN.

The abscissa represents the node degree value, and the ordinate represents the frequency of the corresponding degree value in the network. The degree distribution of the SFN follows the power-law distribution.

https://doi.org/10.1371/journal.pone.0244683.g004

In this study, to construct the SFSNN, the Izhikevich neuron model is used as a node, the synaptic plasticity model is used as an edge, and the BBV generation algorithm is used to generate the SFN. Based on the experimental results, the size of the SFSNN is set to 500 neurons for the following reasons: when the number of neurons in an SFSNN exceeds 1000, a computer cluster rather than a single computer is needed for the computation, and we found that the noise suppression ability of an SFSNN with 500-1500 neurons shows no obvious difference. The construction and the noise suppression ability analysis of the SFSNN are implemented on a PC with a 2.60 GHz CPU and 4 GB RAM.

Indexes of noise suppression ability

The degree of change of an index before and after noise stimulation is used to evaluate the noise suppression ability of the network: the closer the index after stimulation is to its value before stimulation, the better the noise suppression ability. In this study, we use two indexes to evaluate the noise suppression ability of the SFSNNs under white Gaussian noise from different angles. One is the relative change rate of the firing rate, δ, which reflects the degree of variation in the FR before and after noise stimulation. The other is the correlation coefficient between membrane potentials, ρ, which reflects the similarity between the membrane potentials of neurons before and after noise stimulation.

Relative change rate of the firing rate.

The interspike interval (ISI) is the difference between two adjacent firing moments of a neuron, which can be calculated as:

(13)
    ISIn = tn − tn−1

where ISIn is the difference between the nth and (n−1)th firing moments of the neuron, and tn is the nth firing moment. In this study, n is 500.

In this study, the FR of a neuron is estimated by dividing the simulation duration (1000 ms) by the average ISI value. The average FR of all neurons represents the FR of the SFSNN. The δ quantifies the degree of variation in the FR before and after white Gaussian noise stimulation and is described as:

(14)
    δ = |fj − fi| / fi × 100%

where fi is the FR before stimulation and fj is the FR after stimulation. Under noise stimulation, the smaller the δ, the smaller the change in the FR and the stronger the noise suppression ability of the SFSNN.
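Formulas (13) and (14) can be computed directly from a list of spike times; the example spike trains below are made up purely for illustration.

```python
import numpy as np

def firing_rate(spike_times, duration=1000.0):
    """FR estimated as the simulation duration divided by the mean ISI,
    with ISI_n = t_n - t_(n-1) as in formula (13)."""
    isi = np.diff(np.asarray(spike_times, dtype=float))
    return duration / isi.mean()

def delta(f_before, f_after):
    """Relative change rate of the firing rate (formula 14), in percent."""
    return abs(f_after - f_before) / f_before * 100.0

f_pre = firing_rate([12.0, 37.0, 61.0, 88.0, 110.0])   # before stimulation
f_post = firing_rate([10.0, 30.0, 55.0, 75.0, 96.0])   # after stimulation
d = delta(f_pre, f_post)
```

Averaging `firing_rate` over all 500 neurons gives the network-level FR used in Fig 6.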

Correlation coefficient between membrane potentials.

The ρ of the SFSNN is the average correlation coefficient between the membrane potentials of all neurons in the network. The ρ measures the degree of similarity between membrane potentials before and after noise stimulation and is described as:

(15)
    ρij(τ) = ∫[t1,t2] xi(t) xj(t + τ) dt / sqrt( ∫[t1,t2] xi^2(t) dt · ∫[t1,t2] xj^2(t) dt )

where ρij(τ) is the correlation coefficient between the membrane potential of a neuron before and after noise stimulation, xi is the membrane potential before noise stimulation, xj is the membrane potential after noise stimulation, and [t1, t2] is the simulation duration. In this study, the simulation duration is 1000 ms. Under noise stimulation, the higher the ρ, the smaller the change in the membrane potential and the stronger the noise suppression ability of the SFSNN.
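A discrete-time version of formula (15), averaged over neurons, might look like the following; taking only the zero-lag value (τ = 0) is our simplification.

```python
import numpy as np

def correlation(x_before, x_after):
    """Zero-lag normalized correlation between one neuron's membrane
    potential traces before and after stimulation (formula 15, tau = 0)."""
    x = np.asarray(x_before, dtype=float)
    y = np.asarray(x_after, dtype=float)
    return float(np.dot(x, y) / np.sqrt(np.dot(x, x) * np.dot(y, y)))

def network_rho(traces_before, traces_after):
    """rho of the SFSNN: average correlation over all neurons' traces."""
    return float(np.mean([correlation(b, a)
                          for b, a in zip(traces_before, traces_after)]))
```

`traces_before` and `traces_after` would each hold 500 sampled membrane-potential arrays covering the 1000 ms simulation.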

The Pearson correlation coefficient

The Pearson correlation coefficient r is used to calculate the correlation between variables X and Y and can be described as:

(16)
    r = Σi (Xi − X̄)(Yi − Ȳ) / sqrt( Σi (Xi − X̄)^2 · Σi (Yi − Ȳ)^2 )

To determine whether the sample r comes from a population in which X and Y are correlated, it needs to be tested for significance. In this study, we employ the t-test:

(17)
    t = r sqrt(n − 2) / sqrt(1 − r^2)

where n is the sample size. When the correlation coefficient is nonzero at the significance level of 0.05, it is marked as significant with "*" at the top right; when it is nonzero at the significance level of 0.01, it is marked as very significant with "**" at the top right. The P value is considered statistically significant as follows: "*" means P < 0.05 and "**" means P < 0.01.
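Formulas (16) and (17) in code; the example data are arbitrary.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient (formula 16)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def t_statistic(r, n):
    """t-test statistic for H0: no correlation (formula 17)."""
    return r * math.sqrt(n - 2) / math.sqrt(1.0 - r * r)

r = pearson_r([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.9])
t = t_statistic(r, n=4)   # compare |t| with the critical value at n-2 dof
```

The resulting |t| is compared with the Student-t critical value at n − 2 degrees of freedom for the 0.05 or 0.01 level to assign "*" or "**".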

Results and discussion

Noise suppression ability

White Gaussian noise is the main source of noise in many practical systems, such as radar and communication systems. Therefore, it is important to investigate the noise suppression ability of the SFSNN under white Gaussian noise. White Gaussian noise is noise whose amplitude follows a Gaussian distribution and whose power spectral density follows a uniform distribution. The Gaussian distribution function is described as:

(18)
    f(x) = (1 / (σ sqrt(2π))) exp(−(x − μ)^2 / (2σ^2))

The probability density function of the uniform distribution is described as:

(19)
    f(x) = 1 / (b − a) for a ≤ x ≤ b, and f(x) = 0 otherwise

In this study, white Gaussian noise is applied as a current stimulation. Adding this current to the neuron model in formula (1) yields the neuron model under noise stimulation and thus the SFSNN stimulated by white Gaussian noise. The change of the current stimulation amplitude of white Gaussian noise with time is shown in Fig 5.
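A noise current of a given intensity can be generated as below. We assume the noise power P (in dBW) sets the sample variance via sigma^2 = 10^(P/10); how the paper maps dBW to current amplitude is not restated here, so this scaling is an assumption.

```python
import numpy as np

def white_gaussian_noise(power_dbw, duration_ms=1000, dt_ms=1.0, seed=0):
    """Zero-mean white Gaussian noise samples whose variance equals the
    requested power: sigma = sqrt(10^(P_dBW / 10))  (assumed scaling)."""
    rng = np.random.default_rng(seed)
    sigma = (10.0 ** (power_dbw / 10.0)) ** 0.5
    n = int(duration_ms / dt_ms)
    return rng.normal(0.0, sigma, n)

noise = white_gaussian_noise(10.0)   # 10 dBW -> sigma = sqrt(10)
```

Each sample would then be added to the input current I of formula (1) at the corresponding time step.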

In this section, δ and ρ are used as two indexes to evaluate the noise suppression ability of the SFSNNs under white Gaussian noise from different angles, and the noise suppression abilities of the SFSNNs with a high ACC and those with a low ACC are analyzed comparatively and discussed.

Noise suppression ability by relative change rate of the firing rate.

To investigate the noise suppression ability of the SFSNN under white Gaussian noise from the angle of δ, the change of the FR before and after noise stimulation is studied according to formula (13). The change of the FR with time under noise intensities of 0, 5, 10, 15, 20 and 25 dBW is illustrated in Fig 6.

Fig 6. The change in the FR before and after noise stimulation.

https://doi.org/10.1371/journal.pone.0244683.g006

From Fig 6, the FR gradually decreases and stabilizes with time under different noise intensities. The FR under noise intensities of 5, 10, 15, 20 and 25 dBW changes little compared with that before noise stimulation (noise intensity of 0 dBW), and the degree of change of the FR gradually increases with increasing noise intensity. The experimental results show that the SFSNN has a certain degree of noise suppression ability.

To measure the degree of variation in the FR before and after noise stimulation, the change of δ with time under noise intensities of 5, 10, 15, 20 and 25 dBW is illustrated in Fig 7.

Fig 7. The change in the δ with different noise intensities.

https://doi.org/10.1371/journal.pone.0244683.g007

From Fig 7, with time, δ rises sharply in the first 200 ms, rises slowly from 200 ms to 700 ms, and stabilizes after 700 ms under noise intensities of 5, 10, 15, 20 and 25 dBW. Once stable, δ increases with increasing noise intensity, reaching about 7%, 15%, 37%, 84% and 180% under noise intensities of 5, 10, 15, 20 and 25 dBW, respectively. The experimental results show that the SFSNN has a certain degree of noise suppression ability and that, in terms of δ, this ability gradually weakens with increasing noise intensity.

Noise suppression ability by correlation coefficient between membrane potentials.

To investigate the noise suppression ability of the SFSNN under white Gaussian noise from the angle of ρ, the change of the membrane potentials before and after noise stimulation is studied. The change of the membrane potentials with time under noise intensities of 0, 5, 10, 15, 20 and 25 dBW is illustrated in Fig 8.

Fig 8. The change in the membrane potentials before and after noise stimulation.

https://doi.org/10.1371/journal.pone.0244683.g008

From Fig 8, the membrane potentials gradually decrease and stabilize with time under different noise intensities. The membrane potentials under different noise intensities change little compared with those before noise stimulation (noise intensity of 0 dBW), and the degree of change of the membrane potentials gradually increases with increasing noise intensity. The experimental results show that the SFSNN has a certain degree of noise suppression ability.

To quantitatively measure the degree of similarity between membrane potentials before and after noise stimulation, the change of ρ with time under noise intensities of 5, 10, 15, 20 and 25 dBW is illustrated in Fig 9.

Fig 9. The change in the ρ with different noise intensities.

https://doi.org/10.1371/journal.pone.0244683.g009

From Fig 9, with time, ρ rises sharply in the first 200 ms, rises slowly from 200 ms to 700 ms, and stabilizes after 700 ms under noise intensities of 5, 10, 15, 20 and 25 dBW. Once stable, ρ decreases with increasing noise intensity, reaching about 0.98, 0.97, 0.96, 0.94 and 0.93 under noise intensities of 5, 10, 15, 20 and 25 dBW, respectively. The experimental results show that the SFSNN has a certain degree of noise suppression ability and that, in terms of ρ, this ability gradually weakens with increasing noise intensity.

In this study, δ and ρ are measured to evaluate the noise suppression ability of the SFSNN under white Gaussian noise. δ measures the degree of variation in the FR, and ρ measures the similarity between the membrane potentials of neurons, before and after noise stimulation. Thus, we evaluate the noise suppression ability of the SFSNN from different angles and obtain the consistent experimental result that the SFSNN has a certain degree of noise suppression ability under white Gaussian noise.

Comparison of the SFSNNs with the high ACC and the SFSNNs with the low ACC

To investigate the noise suppression abilities of the SFSNNs with a high ACC and those with a low ACC, we use the BBV algorithm to construct the former: when the power-law exponent is in the range of [2, 3], the ACCs of SFNs generated by the BBV algorithm are relatively high and differ only slightly, so the BBV algorithm yields SFSNNs with a high ACC. The SFNs with a low ACC are constructed based on the Barabási-Albert (BA) generation algorithm [42]: when the power-law exponent is in the range of [2, 3], the ACCs of SFNs generated by the BA algorithm are relatively low and differ only slightly, so the BA algorithm yields SFSNNs with a low ACC.

To comparatively analyze the noise suppression abilities of the two kinds of SFSNNs more statistically, we construct three SFSNNs with randomly generated high-ACC topologies based on the BBV algorithm, whose clustering coefficients are 0.50, 0.53 and 0.56 and whose corresponding power-law exponents are 2.15, 2.11 and 2.06, respectively. We also construct three SFSNNs with randomly generated low-ACC topologies based on the BA algorithm, whose clustering coefficients are 0.01, 0.02 and 0.03 and whose corresponding power-law exponents are 2.24, 2.17 and 2.06, respectively. The SFSNNs with a low ACC are stimulated over the same noise intensity range of white Gaussian noise, and the noise suppression abilities of the two kinds of SFSNNs are compared by δ and ρ, as illustrated in Figs 10 and 11, respectively.

Fig 10. The change of the δ of the three SFSNNs with the high ACC and the three SFSNNs with the low ACC.

https://doi.org/10.1371/journal.pone.0244683.g010

Fig 11. The change of the ρ of the three SFSNNs with the high ACC and the three SFSNNs with the low ACC.

https://doi.org/10.1371/journal.pone.0244683.g011

It can be seen from Figs 10 and 11 that, with the increase of noise intensity, all of the δ show an increasing trend and all of the ρ show a decreasing trend in both kinds of SFSNNs. The δ of the three SFSNNs with the randomly generated high-ACC topologies are lower than those of the three SFSNNs with the randomly generated low-ACC topologies, and the ρ of the three SFSNNs with the high-ACC topologies are higher than those of the three SFSNNs with the low-ACC topologies. Therefore, from the two evaluation indexes δ and ρ, we obtain the consistent experimental result that the SFSNNs with the high ACC have higher noise suppression performance than the SFSNNs with the low ACC on the whole.

To quantitatively analyze the difference in the noise suppression abilities of the SFSNNs with the high ACC and those with the low ACC, the Euclidean distance is used to calculate the difference in the δ between the SFSNN with the high ACC of 0.56 and the SFSNN with the low ACC of 0.03; the result is 87.25, as illustrated in Fig 12. The Euclidean distance of the ρ between these two SFSNNs is 0.39, as illustrated in Fig 13.

Fig 12. The change of the δ of a SFSNN with the high ACC and a SFSNN with the low ACC.

https://doi.org/10.1371/journal.pone.0244683.g012

Fig 13. The change of the ρ of a SFSNN with the high ACC and a SFSNN with the low ACC.

https://doi.org/10.1371/journal.pone.0244683.g013

To eliminate the dimensional difference, the Euclidean distances of the δ and the ρ between the above two SFSNNs are recalculated after data normalization and are 0.03 and 0.32, respectively. The experimental results show that the SFSNNs with the high ACC have higher noise suppression performance than the SFSNNs with the low ACC.
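The distance computation described above can be sketched as follows; the δ curves are hypothetical placeholders, not the paper's data. Min-max normalization rescales each curve to [0, 1] before the second distance is taken, removing the dimensional difference between δ and ρ.

```python
import numpy as np

def euclidean(a, b):
    """Euclidean distance between two index curves."""
    return float(np.linalg.norm(np.asarray(a, float) - np.asarray(b, float)))

def minmax(x):
    """Scale a curve to [0, 1] to eliminate dimensional differences."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

# Hypothetical δ curves over increasing noise intensity (the high-ACC network
# suppresses noise better, so its δ stays lower).
delta_high = np.array([1.0, 2.0, 4.0, 7.0])
delta_low = np.array([2.0, 5.0, 9.0, 15.0])

print(round(euclidean(delta_high, delta_low), 3))
print(round(euclidean(minmax(delta_high), minmax(delta_low)), 3))
```

After normalization the raw gap shrinks sharply, which mirrors how the paper's δ distance drops from 87.25 to 0.03.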

Noise suppression mechanism analysis of the SFSNN

To explore the noise suppression mechanism of the SFSNN, we analyze the intrinsic factors of the noise suppression ability from two aspects: (1) the dynamic evolution of the information processing of the SFSNN under white Gaussian noise, and (2) the relationship between the external noise suppression ability of the SFSNN and internal synaptic plasticity.

Neural information processing of the SFSNN.

To explore the neural information processing of the SFSNN, the dynamic evolution processes of the firing rate, synaptic weight and clustering coefficient under white Gaussian noise of 10 dBW are investigated.

(1) Firing Rate

External white Gaussian noise can change the firing sequence of neurons in the network. The average FR is used to describe the FR of all neurons of the SFSNN, and its dynamic evolution is illustrated in Fig 14.

From Fig 14, under white Gaussian noise of 10 dBW the average FR drops sharply in the first 150 ms, drops slowly from 150 ms to 700 ms and tends to be stable after 700 ms. The firing moments of neurons are an important factor in the change of synaptic weight.

(2) Synaptic Weight

According to formulas (3), (4), (5), (6), (7) and (8), noise stimulation affects the interval Δt between the firing moments of presynaptic and postsynaptic neurons, and the weights of excitatory synapses gex(t) and of inhibitory synapses gin(t) are both affected by Δt. Therefore, a change of the FR leads to a change of synaptic weight. The average synaptic weight is used to describe the synaptic weight of all edges in the SFSNN; its dynamic evolution with time under the noise is illustrated in Fig 15.
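The dependence of the weight change on Δt can be illustrated with a generic pair-based STDP rule. The paper's formulas (3)–(8) are not reproduced in this excerpt, so the exponential form and the constants A_PLUS, A_MINUS and TAU below are assumptions, not the model's actual parameters.

```python
import math

# Assumed illustrative STDP constants (learning rates and time constant in ms).
A_PLUS, A_MINUS, TAU = 0.01, 0.012, 20.0

def stdp_dw(dt):
    """Weight change for spike-time difference dt = t_post - t_pre (ms)."""
    if dt > 0:    # presynaptic spike precedes postsynaptic -> potentiation
        return A_PLUS * math.exp(-dt / TAU)
    elif dt < 0:  # postsynaptic spike precedes presynaptic -> depression
        return -A_MINUS * math.exp(dt / TAU)
    return 0.0

print(round(stdp_dw(10.0), 5), round(stdp_dw(-10.0), 5))
```

The key point for the mechanism analysis is only the causal chain: noise changes firing moments, firing moments set Δt, and Δt sets the sign and size of the weight update.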

Fig 15. The change in the average synaptic weight with time.

https://doi.org/10.1371/journal.pone.0244683.g015

From Fig 15, the average synaptic weight drops obviously in the first 150 ms, decreases slowly from 150 ms to 700 ms and tends to be stable after 700 ms.

According to formula (2), a change of the synaptic weight gsyn changes the synaptic current Isyn. The synaptic currents include excitatory and inhibitory components. In this study, the excitatory synaptic current is the mean excitatory current received by all postsynaptic neurons, and the inhibitory synaptic current is the mean inhibitory current received by all postsynaptic neurons. Their dynamic evolution with time under white Gaussian noise is illustrated in Fig 16.

Fig 16. The change in the synaptic current with time.

(A) The excitatory current. (B) The inhibitory current.

https://doi.org/10.1371/journal.pone.0244683.g016

From Fig 16, the excitatory current drops sharply in the first 200 ms, drops slowly from 200 ms to 700 ms and tends to be stable after 700 ms; the inhibitory current drops sharply in the first 100 ms, drops slowly from 100 ms to 200 ms and tends to be stable after 200 ms. Because synaptic dynamic regulation affects the synaptic currents, the excitatory and inhibitory currents also gradually decrease and stabilize during the regulation process. These experimental results show that synaptic plasticity regulates the SFSNN under white Gaussian noise: during regulation, the synaptic weight gradually decreases and tends to be stable.

(3) Clustering Coefficient

As an important index of the topological characteristics of an SFSNN, the clustering coefficient reflects the local information transmission efficiency of the network. For a weighted network, the clustering coefficient of node i is described as [43]:

C_i = \frac{1}{s_i (k_i - 1)} \sum_{j,k} \frac{g_{ij} + g_{ik}}{2}\, a_{ij} a_{ik} a_{jk} (20)

where gij and gik are the synaptic weights of the weighted network W, aij equals 1 if an edge connects nodes i and j and 0 otherwise, and ki and si are the degree and strength of node i, respectively. In this study, the SFSNN is a network with dynamic regulation of synaptic weight, and according to formula (20), a change of the synaptic weight g leads to a change of the clustering coefficient Ci. Therefore, the dynamic regulation of the synaptic weight not only forms synaptic plasticity but also changes the topological structure of the SFSNN. The ACC describes the clustering coefficient of all neurons in the SFSNN, and its dynamic evolution with time under white Gaussian noise is illustrated in Fig 17.
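Formula (20), the weighted clustering coefficient of Barrat et al. [43], can be sketched directly from the weight matrix; the 4-node example network and its weights below are hypothetical.

```python
import numpy as np

def weighted_clustering(W):
    """Weighted clustering coefficient (Barrat et al.) per node.
    W: symmetric weight matrix with zero diagonal; a_ij = 1 where w_ij > 0."""
    A = (W > 0).astype(float)
    np.fill_diagonal(A, 0.0)
    k = A.sum(axis=1)          # degree of each node
    s = (W * A).sum(axis=1)    # strength of each node
    n = len(W)
    C = np.zeros(n)
    for i in range(n):
        acc = 0.0
        for j in range(n):     # sum over ordered neighbor pairs (j, h)
            for h in range(n):
                if A[i, j] and A[i, h] and A[j, h]:
                    acc += (W[i, j] + W[i, h]) / 2.0
        if k[i] > 1:
            C[i] = acc / (s[i] * (k[i] - 1))
    return C

# A weighted triangle (nodes 0-1-2) plus a pendant node 3 (hypothetical weights).
W = np.array([[0, 2, 1, 1],
              [2, 0, 3, 0],
              [1, 3, 0, 0],
              [1, 0, 0, 0]], dtype=float)
print(weighted_clustering(W))
```

Because the weights g enter the numerator and the strength s enters the denominator, any STDP-driven weight change immediately shifts Ci, which is exactly why synaptic plasticity reshapes the topology here.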

From Fig 17, the ACC decreases gradually within 450 ms, increases gradually from 450 ms to 700 ms and becomes stable from 700 ms to 1000 ms.

From the theory and experimental results, white Gaussian noise changes the firing sequence of neurons in the network, and synaptic plasticity regulates synaptic weight through the firing moments of presynaptic and postsynaptic neurons; thus, a change of the FR leads to a change of the synaptic weight. Because the SFSNN is a network with dynamic regulation of synaptic weight, this regulation also changes the topological structure of the SFSNN. The experimental results show that the dynamic evolution of neural information processing moves from an intense phase to a gradually stable one. Furthermore, the neural information processing of the SFSNN is the linkage effect of the dynamic evolution of neuron firing, synaptic weight and topological structure.

Correlation analysis based on the Pearson correlation coefficient.

To further explore the noise suppression mechanism of the SFSNN, the relationship between the external noise suppression ability of the SFSNN and internal synaptic plasticity is established. The correlation between synaptic plasticity and the noise suppression ability of the SFSNN is analyzed based on the Pearson correlation coefficient. To analyze the correlation between the dynamic regulation of synaptic weight and the δ, and between the dynamic regulation of synaptic weight and the ρ, the Pearson correlation coefficient is calculated according to formulas (16) and (17). In this study, X represents the average synaptic weight in every 50 ms; Y represents the average FR or the average ρ in every 50 ms; and n represents the number of samples of X and Y. The simulation time is 1000 ms and the time interval is 50 ms, so n is 20.

The correlation coefficient between the average synaptic weight and the δ is -0.961** (P < 0.01), and the correlation coefficient between the average synaptic weight and the ρ is -0.995** (P < 0.01), which shows that the dynamic regulation of synaptic weight is significantly correlated with the noise suppression ability of the SFSNN at the 0.01 significance level (two-sided t-tests). These results imply that synaptic plasticity is the intrinsic factor of the noise suppression ability of the SFSNN.
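The correlation analysis reduces to a standard Pearson coefficient over the 20 binned samples. A minimal sketch with synthetic series (the decaying weight curve and the anti-correlated index below are illustrative, not the paper's data):

```python
import numpy as np

# Hypothetical 50 ms-binned series over a 1000 ms simulation (n = 20 samples):
# x: average synaptic weight per bin; y: a noise-suppression index per bin.
rng = np.random.default_rng(0)
x = np.linspace(1.0, 0.5, 20)                 # weight decaying toward a stable value
y = 2.0 - 1.5 * x + rng.normal(0, 0.01, 20)   # index strongly anti-correlated with x

r = np.corrcoef(x, y)[0, 1]   # Pearson correlation coefficient
print(round(r, 3))
```

A coefficient near -1, as in the paper's -0.961 and -0.995, indicates that the index tracks the weight regulation almost linearly with opposite sign.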

Conclusion

In this study, the SFSNN with more biological rationality is constructed to study the noise suppression ability under white Gaussian noise. Furthermore, the neural information processing of the SFSNN is investigated, and the noise suppression mechanism of the SFSNN is explored. The experimental results indicate the following. (1) Evaluating the noise suppression ability of the SFSNN from different angles yields the consistent result that the SFSNN has a certain degree of noise suppression ability under white Gaussian noise. (2) The δ of the SFSNNs with the high ACC are lower than those of the SFSNNs with the low ACC, whereas the ρ of the SFSNNs with the high ACC are higher than those of the SFSNNs with the low ACC; that is, the SFSNNs with the high ACC have higher noise suppression performance than the SFSNNs with the low ACC on the whole. (3) The neural information processing of the SFSNN is the linkage effect of dynamic changes in neuron firing, synaptic weight and topological structure. (4) The dynamic regulation of synaptic weight is significantly correlated with the noise suppression ability, which shows that synaptic plasticity is the intrinsic factor of the noise suppression ability of the SFSNN. This study is helpful for understanding brain information processing and provides a theoretical foundation for engineering applications of robustness that draw on the self-adaptive advantage of the biological nervous system.

ANNs without nerve electrophysiological characteristics cannot receive external stimulation, because their nodes are not neuron models and their edges are not synapse models; therefore, the response of such networks to external stimulation cannot be studied. For SNNs, most research on self-adaptive regulation concerns firing synchronization and neural coding under external stimulation, and the study of the noise suppression ability of the SNN based on synaptic plasticity is still at an exploratory stage. This study is helpful for understanding brain information processing under external stimulation and provides a theoretical foundation for engineering applications of robustness that draw on the self-adaptive advantage of the biological nervous system.

Supporting information

S1 File. Data required for all study findings reported in the article.

https://doi.org/10.1371/journal.pone.0244683.s001

(ZIP)

References

  1. Ribatti V, Santini L, Forleo GB, Della RD, Panattoni G, Scali M, et al. Electromagnetic interference in the current era of cardiac implantable electronic devices designed for magnetic resonance environment. G Ital Cardiol. 2017; 18(4): 295–304. pmid:28492569
  2. Liu CT, Wu RJ, He ZX, Zhao XF, Li HC, Wang PZ. Modeling and analyzing interference signal in a complex electromagnetic environment. EURASIP J. Wirel. Commun. Netw. 2016; 2016(1): 1–9.
  3. Jia LC, Yan DX, Liu X, Ma R, Wu HY, Li ZM. Highly efficient and reliable transparent electromagnetic interference shielding film. ACS Appl. Mater. Interfaces. 2018; 10(14): 11941–11949. pmid:29557166
  4. Panagopoulos DJ, Chrousos GP. Shielding methods and products against man-made electromagnetic fields: protection versus risk. Sci. Total Environ. 2019; 667: 255–262. pmid:30831365
  5. Huitzil CT, Girau B. Fault and error tolerance in neural networks: A review. IEEE Access. 2017; 5: 17322–17341.
  6. Liu SH, Yuan L, Chu J. Electromagnetic bionics: a new study field of electromagnetic protection. Chin. J. Nature. 2009; 31(1): 1–7.
  7. Furber S. Large-scale neuromorphic computing systems. J. Neural Eng. 2016; 13(5): 051001.
  8. Taherkhani A, Belatreche A, Li Y, Cosma G, Maguire LP, McGinnity TM. A review of learning in biologically plausible spiking neural networks. Neural Netw. 2020; 122: 253–272.
  9. Tavanaei A, Maida AS. BP-STDP: Approximating backpropagation using spike timing dependent plasticity. Neurocomputing. 2018; 330: 39–47.
  10. Behrenbeck J, Tayeb Z, Bhiri C, Richter C, Rhodes O, Kasabov N, et al. Classification and regression of spatio-temporal signals using NeuCube and its realization on SpiNNaker neuromorphic hardware. J. Neural Eng. 2019; 16(2): 026014. pmid:30577030
  11. Tavanaei A, Ghodrati M, Kheradpisheh SR, Masquelier T, Maida A. Deep learning in spiking neural networks. Neural Netw. 2018; 111: 47–63.
  12. Mahadevuni A, Peng L. Navigating mobile robots to target in near shortest time using reinforcement learning with spiking neural networks. International Joint Conference on Neural Networks. 2017; 2017: 2243–2250.
  13. Kasap B, Van Opstal AJ. A spiking neural network model of the midbrain superior colliculus that generates saccadic motor commands. Biological Cybernetics. 2017; 111(3): 249–268. pmid:28528360
  14. Chen XL, Chen J, Cheng G, Gong T. Topics and trends in artificial intelligence assisted human brain research. PLoS One. 2020; 15(4): e0231192. pmid:32251489
  15. Kheradpisheh SR, Ganjtabesh M, Thorpe SJ, Masquelier T. STDP-based spiking deep convolutional neural networks for object recognition. Neural Netw. 2018; 99: 56–67. pmid:29328958
  16. Hodgkin AL, Huxley AF. A quantitative description of membrane current and its application to conduction and excitation in nerve. J. Physiol. 1952; 117(4): 500–544. pmid:12991237
  17. Izhikevich EM. Simple model of spiking neurons. IEEE Trans Neural Netw. 2003; 14(6): 1569–1571. pmid:18244602
  18. Nobukawa S, Nishimura H, Yamanishi T, Liu JQ. Analysis of chaotic resonance in Izhikevich neuron model. PLoS One. 2015; 10(9): e0138919. pmid:26422140
  19. Li XM, Luo SY, Xue FY. Effects of synaptic integration on the dynamics and computational performance of spiking neural network. Cogn Neurodyn. 2020; 14(104): 347–357. pmid:32399076
  20. Ebner C, Clopath C, Jedlicka P, Cuntz H. Unifying long-term plasticity rules for excitatory synapses by modeling dendrites of cortical pyramidal neurons. Cell Rep. 2019; 29(13): 4295–4307. pmid:31875541
  21. Yang S, Govindaiah G, Lee SH, Yang S, Cox CL. Distinct kinetics of inhibitory currents in thalamocortical neurons that arise from dendritic or axonal origin. PLoS One. 2017; 12(12): e0189690. pmid:29252999
  22. Chen JL, Villa KL, Cha JW, So PTC, Kubota Y, Nedivi E. Clustered dynamics of inhibitory synapses and dendritic spines in the adult neocortex. Neuron. 2012; 74(2): 361–373. pmid:22542188
  23. Joana L, Angela MDS, Charlotte D, Mathilde B, Antonio P, Andrea A, et al. Modulation of coordinated activity across cortical layers by plasticity of inhibitory synapses. Cell Rep. 2020; 30(3): 630–641.e5.
  24. Kim SY, Lim W. Burst synchronization in a scale-free neuronal network with inhibitory spike-timing-dependent plasticity. Cogn Neurodyn. 2019; 13(1): 53–73. pmid:30728871
  25. Kim SY, Lim W. Effect of interpopulation spike-timing-dependent plasticity on synchronized rhythms in neuronal networks with inhibitory and excitatory populations. Cogn Neurodyn. 2020; 14(1): 535–567. pmid:32655716
  26. Su F, Wang J, Li HY, Wei XL, Yu HT, Deng B. Synaptic dynamics regulation in response to high frequency stimulation in neuronal networks. Commun Nonlinear Sci Numer Simul. 2018; 55: 29–41.
  27. Zhao J, Qin YM, Che YQ. Effects of topologies on signal propagation in feedforward networks. Chaos. 2018; 28(1): 013117. pmid:29390642
  28. Vargas ER, Mitchell DGV, Greening SG, Wahl LM. Topology of whole-brain functional MRI networks: improving the truncated scale-free model. Physica A. 2014; 405: 151–158.
  29. Liu XC, Li T, Tang C, Xu T, Chen P, Bezerianos A. Emotion recognition and dynamic functional connectivity analysis based on EEG. IEEE Access. 2019; 7: 143293–143302.
  30. Zhang W, Guo L, Liu DZ, Xu GZ. The dynamic properties of a brain network during working memory based on the algorithm of cross-frequency coupling. Cogn Neurodyn. 2020; 14: 215–228. pmid:32226563
  31. Yu HT, Wang J, Du JW, Deng B, Wei XL, Liu C. Effects of time delay on the stochastic resonance in small-world neuronal networks. Chaos. 2013; 23: 013128. pmid:23556965
  32. Kim SY, Lim W. Effect of spike-timing-dependent plasticity on stochastic burst synchronization in a scale-free neuronal network. Cogn Neurodyn. 2018; 12(3): 315–342. pmid:29765480
  33. Saeedeh A, Mahdi J. Directed functional networks in Alzheimer’s disease: disruption of global and local connectivity measures. IEEE J. Biomed. Health Inform. 2017; 21(4): 949–955.
  34. Nie TY, Guo Z, Zhao K, Lu ZM. The dynamic correlation between degree and betweenness of complex network under attack. Physica A. 2016; 457: 129–137.
  35. Etémé AS, Tabi CB, Mohamadou A. Firing and synchronization modes in neural network under magnetic stimulation. Commun. Nonlinear Sci. Numer. Simul. 2019; 72: 432–440.
  36. Guo L, Zhang W, Zhang JL. Neural information coding on small-world spiking neuronal networks modulated by spike-timing dependent plasticity under external noise stimulation. Cluster Comput. 2019; 22(3): 5217–5231.
  37. Qiao GC, Hu SG, Wang JJ, Zhang CM, Chen TP, Ning N, et al. A neuromorphic-hardware oriented bio-plausible online-learning spiking neural network model. IEEE Access. 2019; 7: 71730–71740.
  38. Braitenberg V, Schuz A. Anatomy of the cortex: statistics and geometry. J Anat. 1991; 179: 203–204.
  39. Kleberg FI, Fukai T, Gilson M. Excitatory and inhibitory STDP jointly tune feedforward neural circuits to selectively propagate correlated spiking activity. Front. Comput. Neurosci. 2014; 8: 53. pmid:24847242
  40. Wang D, Jin XZ. On weighted scale-free network model with tunable clustering and congestion. Acta Physica Sinica. 2012; 61(22): 514–518.
  41. Eguíluz VM, Chialvo DR, Cecchi GA, Baliki M, Apkarian AV. Scale-free brain functional networks. Phys. Rev. Lett. 2005; 94(1): 018102. pmid:15698136
  42. Barabási AL, Albert R. Emergence of scaling in random networks. Science. 1999; 286: 509–512. pmid:10521342
  43. Barrat A, Barthélemy M, Pastor-Satorras R, Vespignani A. The architecture of complex weighted networks. Proc Natl Acad Sci U S A. 2004; 101(11): 3747–3752. pmid:15007165