Chapter 1 Overview & Foundations of Spiking Neural Networks

1 Overview

Artificial neural networks have undoubtedly achieved great success in many applications. Compared with brains, however, they are far less efficient, especially in information-processing speed and energy consumption. For example, the carbon emission of training BERT on GPUs is roughly equivalent to that of a trans-American flight [1].

In general, artificial neural networks have two inherent defects: 1) their highly abstracted neurons hinder interpretation, and 2) time is overlooked.

Since we want to build neural networks that are as efficient as brains, it is natural to simulate the brain's mechanisms. Spikes and time therefore formally come to the stage.

Spiking neural networks, the so-called third generation of artificial neural networks, were thus proposed. They are expected to process information faster through temporal coding.

Maass [2] proved that spiking neural networks can simulate any feedforward sigmoidal neural network and can approximate arbitrary continuous functions.

2 Foundations of Neurons

2.1 Terminology

Before diving into a field, it is always better to get familiar with its glossary. Table 1 lists some important and frequently used terms in spiking neural networks.

Terminology | Chinese
----------- | -------
neuron | 神经元
soma | 细胞体
dendrite | 树突
axon | 轴突
synapse | 突触
potential | 电位
threshold | 阈值
spike | 脉冲
membrane | 生物膜
action potential | 动作电位
membrane potential | 膜电位
resting potential | 静息电位
presynaptic neuron | 突触前神经元
postsynaptic neuron | 突触后神经元
postsynaptic potential | 突触后电位
excitatory synapse | 兴奋性突触
inhibitory synapse | 抑制性突触

TABLE 1: Glossary. A list of frequently used terms in spiking neural networks. The table is not yet complete and will be extended in the future.

2.2 Neurons

Neurons are responsible for receiving, integrating, and sending messages. Figure 1 depicts the simplified structure of a neuron, which has three main parts: 1) dendrites, 2) the soma, and 3) the axon. Dendrites receive messages from other neurons and pass them to the soma. The soma integrates the incoming messages and decides whether to produce a message of its own; if so, that message travels along the axon and arrives at the axon terminal.

Synapses are the connections between two neurons. When messages reach the axon terminal, they are passed on through synapses; once a message has crossed a synapse, the next neuron receives it through its dendrites. Synapses are thus how messages travel between neurons.

Fig. 1: Neuron structure. Neurons send messages through synapses: a presynaptic neuron connects to a synapse via its axon, and the postsynaptic neuron receives messages from the synapse via its dendrites.

2.3 Spikes

2.3.1 Spike Generation

The previous section described how information is transmitted within and between neurons. This section focuses on the form that information takes inside a neuron.

Signals in neurons are represented by changes of the membrane potential. By convention, the potential outside the membrane is taken as 0 mV, while the inside rests at about −70 mV to −80 mV. When a stimulus arrives, the membrane potential changes quickly and the change propagates along the membrane.

If the membrane potential exceeds a threshold, the neuron produces a spike, after which the membrane potential returns to the resting potential. This procedure includes a depolarization phase, a repolarization phase, and a hyperpolarization phase. Figure 2 depicts the generation of a spike and the accompanying changes of membrane potential.
Fig. 2: Generation of a spike. When a stimulus arrives, the membrane potential starts rising; once it exceeds the threshold, the neuron produces a spike. This rising phase is depolarization. The membrane potential then drops quickly to slightly below the resting potential; this phase is repolarization. During depolarization and repolarization the neuron cannot produce another spike, so this interval is called the absolute refractory period. Finally, the membrane potential gradually returns to the resting potential from below; this phase is hyperpolarization. In this stage the neuron can generate another spike only under a stronger stimulus, so it is called the relative refractory period.
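The spike-generation cycle above can be sketched with a minimal leaky integrate-and-fire (LIF) model. This is an illustrative sketch, not the text's model: the resting potential, threshold, reset value, time constant, and input current below are all assumed constants chosen only to make the depolarize/fire/reset behavior visible.

```python
# Minimal leaky integrate-and-fire (LIF) sketch of spike generation.
# All constants are illustrative assumptions, not measured values.

V_REST = -70.0      # resting potential (mV)
V_THRESH = -55.0    # firing threshold (mV)
V_RESET = -75.0     # after-spike reset, slightly below rest (hyperpolarization)
TAU = 10.0          # membrane time constant (ms)
DT = 1.0            # simulation time step (ms)

def simulate(input_current, steps=50):
    """Return the membrane-potential trace and the spike times."""
    v = V_REST
    trace, spikes = [], []
    for t in range(steps):
        # Leaky integration: decay toward rest plus the injected stimulus.
        v += DT * (-(v - V_REST) + input_current[t]) / TAU
        if v >= V_THRESH:      # depolarization crossed the threshold
            spikes.append(t)   # the neuron fires a spike
            v = V_RESET        # repolarization, then hyperpolarization
        trace.append(v)
    return trace, spikes

# A constant stimulus strong enough to drive the neuron past threshold
# produces a regular spike train.
trace, spikes = simulate([20.0] * 50)
print(spikes)  # → [13, 29, 45]
```

Note that after each spike the potential restarts from below rest, so the neuron needs slightly longer to reach threshold again, a crude analogue of the relative refractory period described in the caption.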

2.3.2 Spike Integration

When a neuron generates a series of spikes, they are called a spike train. Note that every spike has almost the same shape, so neurons cannot communicate through spike shapes; instead, information is carried by spike timing and firing rate.

Neurons have another property that contributes to their efficiency. Instead of the multiplications used in artificial neural networks, spiking neurons mainly perform integration: they add up the effects of incoming spikes and then decide their response. Figure 3 describes such integration.

Fig. 3: Spike integration. (a) shows the connections between four neurons, with spike trains and their timing marked; (b) shows how the membrane potentials of the four neurons change over time.

2.3.3 Excitatory & Inhibitory Synapses

After generation, spikes propagate to synapses and cause changes of the postsynaptic potential. Note that spikes do not always increase the postsynaptic potential; the effect depends on the type of synapse.

Synapses include excitatory synapses and inhibitory synapses. When spikes pass through an excitatory synapse, the postsynaptic potential increases (unless the neuron is refractory); such a change is called an excitatory postsynaptic potential (EPSP). On the contrary, an inhibitory synapse decreases the postsynaptic potential, producing an inhibitory postsynaptic potential (IPSP). Figure 4 shows the effects of both types.
Fig. 4: Excitatory and inhibitory synapses. (a) shows a presynaptic neuron firing and increasing the postsynaptic potential through an excitatory synapse; (b) shows a presynaptic neuron firing and decreasing the postsynaptic potential through an inhibitory synapse.
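One common way to model the two synapse types is as the sign of a synaptic weight: positive weights produce EPSPs, negative weights produce IPSPs. A minimal sketch under that assumption, with all numbers invented for illustration:

```python
# EPSP vs. IPSP modeled as signed synaptic weights (an assumption of this
# sketch): excitatory synapses raise the postsynaptic potential, inhibitory
# synapses lower it. All values are illustrative.

V_REST = -70.0
V_THRESH = -55.0

def respond(weights):
    """Sum the postsynaptic potentials of simultaneous spikes; report firing."""
    v = V_REST + sum(weights)
    return v, v >= V_THRESH

# Three excitatory inputs (EPSPs) are enough to reach threshold...
v, fired = respond([+6.0, +6.0, +6.0])
print(v, fired)   # → -52.0 True

# ...but adding one inhibitory input (IPSP) pulls the potential back below it.
v, fired = respond([+6.0, +6.0, +6.0, -8.0])
print(v, fired)   # → -60.0 False
```

The second call shows why inhibition matters: a single inhibitory synapse can veto firing that the excitatory inputs alone would have caused.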

3 References


  1. Strubell, Emma, Ganesh, Ananya & McCallum, Andrew (2019). Energy and Policy Considerations for Deep Learning in NLP. Proceedings of ACL 2019, 3645–3650. doi:10.18653/v1/P19-1355. ↩︎

  2. Maass, Wolfgang (1997). Networks of Spiking Neurons: The Third Generation of Neural Network Models. Neural Networks, 10(9), 1659–1671. ↩︎
