Machine learning BP (back propagation) neural network
2022-07-18 21:16:00 【InfoQ】
Preface
1. What is a neural network?
- A biological neural network generally refers to the network of neurons, cells, and synapses in a biological brain; it produces consciousness and helps the organism think and act.
- An artificial neural network (Artificial Neural Networks, abbreviated ANN), often simply called a neural network, is a mathematical model for distributed, parallel information processing that imitates the behavioral characteristics of animal neural networks. Depending on the complexity of the system, such a network processes information by adjusting the interconnections among a large number of internal nodes.
2. Theoretical basis of BP neural networks
2.1 The perceptron (Perceptron) network
- A perceptron consists of inputs, weights, a bias, an activation function, and an output.
- Input nodes: receive the feature values of a sample.
- Weights: scale each input's contribution to the weighted sum.
- Bias: shifts the weighted sum before activation.
- Activation function: maps the weighted sum to the node's output.
- Output node: produces the perceptron's result.
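The components listed above can be sketched in a few lines of Python. This is an illustrative example, not from the article; the step activation, the `perceptron` function name, and the hand-picked AND weights are my own choices:

```python
from numpy import array, dot

def step(x):
    # Step activation: output 1 if the weighted sum clears the threshold, else 0.
    return 1 if x >= 0 else 0

def perceptron(inputs, weights, bias):
    # Weighted sum of the inputs plus the bias, passed through the activation.
    return step(dot(inputs, weights) + bias)

# Example: weights and bias chosen by hand so the perceptron computes logical AND.
weights = array([1.0, 1.0])
bias = -1.5
print(perceptron(array([1, 1]), weights, bias))  # 1
print(perceptron(array([0, 1]), weights, bias))  # 0
```

A single perceptron like this can only separate linearly separable inputs; BP networks stack such units into layers to overcome that limit.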

2.2 Structure and propagation rules of BP neural networks


- Forward propagation: data (information, signals) enters at the input end and flows in the direction of the network. At each node, the inputs are multiplied by their corresponding weights and summed, the sum is fed through the activation function, and the result is passed as input to the next node. Computation proceeds layer by layer through the perceptrons of each layer until the final output is obtained. This process is forward propagation.

- Back propagation: the output is compared with the expected output, and the resulting error is propagated backwards through the network; in essence this is a "negative feedback" process. Over many iterations the weights of the nodes are continually adjusted (updated), and the adjustment (update) is performed with the gradient descent method.
- The sigmoid function, σ(x) = 1 / (1 + e^(−x)), is the activation function used here; it squashes any real input into the interval (0, 1).
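The sigmoid and its derivative can be written directly from the formula above. The convenient identity used in the training code later in the article is that the derivative can be expressed in terms of the sigmoid's own output:

```python
from numpy import exp

def sigmoid(x):
    # Squashes any real input into (0, 1).
    return 1 / (1 + exp(-x))

def sigmoid_derivative(y):
    # For y = sigmoid(x), d(sigmoid)/dx = y * (1 - y).
    # This y * (1 - y) factor appears in the weight-update line of the
    # nine-line network below.
    return y * (1 - y)

print(sigmoid(0))                      # 0.5
print(sigmoid_derivative(sigmoid(0)))  # 0.25
```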

2.3 Gradient descent learning
- The minimum of the error function is found by an iterative method: in each iteration the weights are adjusted and the output is compared with the expected result, repeating until the result is satisfactory.
- Each update consists of an adjustment amount, which moves the weights in proportion to the negative gradient scaled by the learning rate, and optionally a smoothing term that damps the change between iterations.
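The iterative idea can be shown on a one-dimensional toy function. This is a sketch of plain gradient descent (function, learning rate, and step count are my own illustrative choices, not from the article):

```python
def gradient_descent(grad, w0, lr=0.1, steps=100):
    # Repeatedly step against the gradient; the adjustment amount is
    # learning rate * gradient.
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# Minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3); the minimum is at w = 3.
w_min = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
print(round(w_min, 4))  # 3.0
```

In a BP network, `grad` is the partial derivative of the error with respect to each weight, computed by back propagation, and `w` is the whole weight matrix rather than a scalar.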


2.4 Improvements to the learning algorithm
- Introducing a momentum term: a fraction of the previous weight update, scaled by the momentum factor, is added to the current update, smoothing the steps taken at a given learning rate.
- Variable step size: a poorly chosen learning rate causes trouble. If it is too small, convergence is too slow; if it is too large, the algorithm may overcorrect, causing oscillation or even divergence. A variable step size method can therefore be used as an improvement.
- The momentum term and the variable step size method can also be combined.
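The momentum term can be sketched by extending the plain gradient descent update: part of the previous update is carried into the current one. The names `alpha` (momentum factor) and `lr`, and the toy objective, are my own illustrative choices:

```python
def momentum_descent(grad, w0, lr=0.1, alpha=0.9, steps=500):
    # velocity accumulates past updates; alpha is the momentum factor that
    # controls how much of the previous update carries over.
    w, velocity = w0, 0.0
    for _ in range(steps):
        velocity = alpha * velocity - lr * grad(w)
        w += velocity
    return w

# Same toy objective as before: f(w) = (w - 3)^2 with gradient 2 * (w - 3).
w_min = momentum_descent(lambda w: 2 * (w - 3), w0=0.0)
print(round(w_min, 4))  # converges close to 3.0
```

The momentum term damps zig-zagging across narrow valleys of the error surface; a variable step size method would additionally grow or shrink `lr` depending on whether the error is falling or rising.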
2.5 Applications of BP neural networks
- Fields of application: BP neural networks are widely used in many research fields, such as the environment, biology, medicine, meteorology, astronomy, geography, economics, management, industry, statistics, computing, mobile communication, aerospace, information technology, automation, energy, new materials, oceanography, and more.
- Main application areas:
  - Forecasting: power load, hemoglobin concentration, housing prices, the stock market, water demand, wind speed, geological disasters, etc.
  - Evaluation: urban safety, river health, teaching quality, network security, the water environment, ecological risk, disaster losses, etc.
  - Image processing: image denoising, classification, restoration, compression, correction, segmentation, encryption, etc.
  - Simulation: mobile robot obstacle avoidance, logistics traffic, UAV attitude adaptation, industrial control.
3. A BP neural network in nine lines of code
from numpy import exp, array, random, dot
# Import from the numpy library: exp (the exponential function), array,
# random (random number functions), and dot (matrix multiplication).

training_set_inputs = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
# Inputs for the BP network's training phase: four samples, three features each.

training_set_outputs = array([[0, 1, 1, 0]]).T
# Expected outputs for the training phase; .T is the matrix transpose.

random.seed(1)
# Seeding the generator guarantees the same random numbers on every run.

synaptic_weights = 2 * random.random((3, 1)) - 1
# A random array with 3 rows and 1 column holding the initial weights, in (-1, 1).

for iteration in range(10000):
    output = 1 / (1 + exp(-(dot(training_set_inputs, synaptic_weights))))
    # Loop 10000 times: multiply the training inputs by the weights with dot,
    # feed the product through the sigmoid function, and assign the result to output.
    synaptic_weights += dot(training_set_inputs.T, (training_set_outputs - output) * output * (1 - output))
    # Adjust the weights using the "error weighted derivative" formula.

print(1 / (1 + exp(-(dot(array([1, 0, 0]), synaptic_weights)))))
# synaptic_weights is now the trained weight matrix; multiply the new input
# [1, 0, 0] by it with dot, pass the product through sigmoid, and print the prediction.
4. Summary
- The propagation process of a BP neural network consists of forward propagation and back propagation; back propagation is essentially "negative feedback". This resembles a closed-loop control system: through feedback, the deviation is used to correct the deviation, until a satisfactory output is reached.
- Errors are handled with gradient descent plus many iterations to find the minimum error. In each iteration, the weights between nodes in different layers are updated once; because the weights update dynamically, the error obtained by each forward pass changes as well, until the desired output is obtained.
- Mastering the principle is the foundation; implementing the code is the key. The nine-line implementation at the end of the article reflects the principle of a BP network, but it is a single-layer network (simple-neural-network); implementing a multi-layer network is considerably more complex.