
Paper reading: Deep Residual Learning in Spiking Neural Networks

2022-07-19 07:08:00 · Author: When to order

Previous Spiking ResNets imitate the standard residual block of ANNs by simply replacing the ReLU activation layers with spiking neurons. They suffer from a degradation problem and can hardly realize residual learning. This article mainly introduces the spike-element-wise (SEW) ResNet proposed in this paper to realize residual learning in deep SNNs.

1. Introduction

Spiking ResNet, the spiking version of ResNet, imitates the residual blocks of ANNs by replacing the ReLU activation layers with spiking neurons. Spiking ResNets converted from ANNs achieve state-of-the-art accuracy on almost all datasets, but directly trained Spiking ResNets have not been verified to solve the degradation problem.

This paper shows that Spiking ResNet cannot realize identity mapping for all neuron models: the block output SN(F(S[t]) + S[t]) equals the input only if the last spiking neuron reproduces its input spikes exactly when F(S[t]) = 0, which holds only for certain neuron models (e.g., IF neurons with an appropriate threshold). Moreover, even when the identity-mapping condition is satisfied, Spiking ResNet still faces the vanishing/exploding gradient problem.

3. Method

The residual block is the key component of ResNet. Figure 1(a) shows the basic block of ResNet. The basic block of Spiking ResNet simply imitates the ANN block, replacing the ReLU activation layers with spiking neurons (SN), as shown in Figure 1(b). The proposed spike-element-wise (SEW) block, shown in Figure 1(c), can easily realize identity mapping while overcoming the vanishing/exploding gradient problem.
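To make the block structure concrete, here is a minimal PyTorch sketch of a SEW basic block. The names IFNeuron and SEWBlock and all implementation details are assumptions of this sketch, not the authors' code; the IF neuron below is a toy single-step model for illustration (the paper's experiments use multi-step neurons, e.g., from SpikingJelly).

```python
import torch
import torch.nn as nn


class IFNeuron(nn.Module):
    """Toy integrate-and-fire neuron: fires a binary spike when the
    membrane potential reaches v_threshold, then hard-resets to zero."""
    def __init__(self, v_threshold: float = 1.0):
        super().__init__()
        self.v_threshold = v_threshold
        self.v = 0.0  # membrane potential, broadcast over the input shape

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        self.v = self.v + x
        spike = (self.v >= self.v_threshold).float()
        self.v = self.v * (1.0 - spike)  # reset the neurons that fired
        return spike


class SEWBlock(nn.Module):
    """Spike-element-wise block: O[t] = g(SN(F(S[t])), S[t])."""
    def __init__(self, channels: int, g: str = "ADD"):
        super().__init__()
        self.g = g
        self.body = nn.Sequential(  # F: conv-BN-SN-conv-BN
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            IFNeuron(),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.sn = IFNeuron()  # last spiking neuron of the residual branch

    def forward(self, s: torch.Tensor) -> torch.Tensor:
        a = self.sn(self.body(s))   # A[t]: spikes from the residual branch
        if self.g == "ADD":
            return a + s            # identity mapping when A[t] ≡ 0
        if self.g == "AND":
            return a * s            # identity mapping when A[t] ≡ 1
        if self.g == "IAND":
            return (1.0 - a) * s    # identity mapping when A[t] ≡ 0
        raise ValueError(f"unknown element-wise function: {self.g}")
```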

The SEW residual block can be expressed as

O^l[t] = g(A^l[t], S^l[t]),

where S^l[t] is the input spike tensor of the l-th block at time step t, A^l[t] = SN(F^l(S^l[t])) is the spike output of the residual branch F^l, and g is an element-wise function on two spike tensors.

SEW ResNet can realize identity mapping easily. Exploiting the binary nature of spikes, several element-wise functions g satisfy identity mapping, as summarized in Table 1:

Table 1. Element-wise functions g and their identity-mapping conditions:
ADD: g(A, S) = A + S, identity mapping when A[t] ≡ 0
AND: g(A, S) = A ∧ S = A · S, identity mapping when A[t] ≡ 1
IAND: g(A, S) = (¬A) ∧ S = (1 − A) · S, identity mapping when A[t] ≡ 0
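As a quick sanity check of Table 1, the snippet below verifies on an arbitrary binary tensor that each g reduces to the identity under the stated condition; this is plain tensor arithmetic, assuming nothing beyond 0/1 spikes.

```python
import torch

s = torch.randint(0, 2, (8,)).float()  # an arbitrary binary spike tensor S[t]
zeros, ones = torch.zeros_like(s), torch.ones_like(s)

assert torch.equal(zeros + s, s)        # ADD  with A[t] ≡ 0
assert torch.equal(ones * s, s)         # AND  with A[t] ≡ 1
assert torch.equal((1 - zeros) * s, s)  # IAND with A[t] ≡ 0
```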

When ADD or IAND is chosen as the element-wise function g, identity mapping is realized by setting A^l[t] ≡ 0, which can be achieved simply by setting the weights and biases of the last batch normalization (BN) layer in F^l to zero. This works for all neuron models. When AND is used as the element-wise function g, we set A^l[t] ≡ 1 to obtain identity mapping: set the weights of the last BN layer to 0 and its bias to a constant large enough to always cause spikes; for example, when the last SN is an IF neuron, set the bias to Vth.
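A sketch of this initialization trick, reusing the hypothetical SEWBlock and IFNeuron defined above: zeroing the last BN makes the residual branch output 0 everywhere, so the IF neuron never fires and A^l[t] ≡ 0; setting the bias to the threshold instead makes it fire at every position, giving A^l[t] ≡ 1.

```python
import torch
import torch.nn as nn

# ADD (and likewise IAND): zero the last BN so the branch outputs 0,
# the IF neuron never reaches its threshold, and A[t] ≡ 0.
block = SEWBlock(channels=4, g="ADD")
last_bn = block.body[-1]
nn.init.zeros_(last_bn.weight)
nn.init.zeros_(last_bn.bias)

s = torch.randint(0, 2, (1, 4, 8, 8)).float()
assert torch.equal(block(s), s)  # the block reduces to identity mapping

# AND: zero the BN weight but set the bias to Vth, so the input to the
# IF neuron always reaches the threshold and A[t] ≡ 1.
block_and = SEWBlock(channels=4, g="AND")
last_bn = block_and.body[-1]
nn.init.zeros_(last_bn.weight)
nn.init.constant_(last_bn.bias, block_and.sn.v_threshold)
assert torch.equal(block_and(s), s)
```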


Copyright notice: this article was written by [When to order]. Please include a link to the original when reposting: https://yzsam.com/2022/200/202207170520306653.html