4.3.3 Combining Back-Propagation and Genetic Algorithms
In my project, I mainly used a genetic algorithm to optimize the initial weights of the back-propagation (BP) neural network, thereby optimizing the network itself, and then used the optimized network to recognize the same characters.
Most work that uses genetic algorithms to train neural networks fixes the network topology in advance and then uses the genetic algorithm to optimize the network's connection weights. This evolutionary training method has two main steps: (1) determine a coding scheme for the network connection weights; (2) evolve the weights with the genetic algorithm. For a fixed neural network structure, evolving the connection weights typically proceeds as follows:
(1) Determine the weight coding scheme and generate the initial population;
(2) Decode each individual in the population and construct the corresponding neural network;
(3) Calculate each network's fitness according to a chosen performance criterion (such as mean square error or learning speed);
(4) Determine the probability of each individual producing offspring according to its fitness, completing selection;
(5) Apply genetic operators (such as crossover and mutation) to the selected population with given probabilities to obtain the next generation;
(6) Return to (2) until the performance requirements are met.
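The steps above can be sketched as a short evolutionary loop. This is a minimal illustration, not the project's actual code: the 2-3-1 topology, the XOR-style toy data, the population size, and the mutation rate are all my own assumptions, and mean square error is used as the step (3) criterion.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy topology: 2 inputs -> 3 hidden (tanh) -> 1 output.
N_IN, N_HID, N_OUT = 2, 3, 1
N_W = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # weights + biases

def decode(chrom):
    """Step (2): slice the real-coded string back into weight matrices."""
    i = 0
    W1 = chrom[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = chrom[i:i + N_HID]; i += N_HID
    W2 = chrom[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = chrom[i:]
    return W1, b1, W2, b2

def forward(chrom, X):
    W1, b1, W2, b2 = decode(chrom)
    return np.tanh(X @ W1 + b1) @ W2 + b2

# Toy XOR-like training set (an assumption for illustration only).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

def fitness(chrom):
    """Step (3): mean square error of the decoded network (smaller = better)."""
    return np.mean((forward(chrom, X) - Y) ** 2)

# Step (1): initial population of real-coded weight strings.
pop = rng.normal(0, 1, size=(30, N_W))
init_best = min(fitness(c) for c in pop)
for gen in range(50):
    F = np.array([fitness(c) for c in pop])
    f = 1.0 / (F + 1e-12)                     # invert: smaller error = fitter
    parents = pop[rng.choice(len(pop), size=len(pop), p=f / f.sum())]  # step (4)
    b = rng.random((len(pop) // 2, 1))
    kids = parents.copy()                     # step (5): arithmetic crossover
    kids[0::2], kids[1::2] = (parents[0::2] * (1 - b) + parents[1::2] * b,
                              parents[1::2] * (1 - b) + parents[0::2] * b)
    kids += rng.normal(0, 0.1, kids.shape) * (rng.random(kids.shape) < 0.1)  # mutation
    kids[0] = pop[np.argmin(F)]               # elitism: keep the best individual
    pop = kids                                # step (6): repeat from step (2)

best = min(pop, key=fitness)
print("best MSE after evolution:", round(fitness(best), 4))
```

With elitism, the best error can never increase from one generation to the next, which is why the loop reliably improves on the initial random population.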
Genetic algorithm optimization of a neural network means using the genetic algorithm to optimize the initial weights and thresholds of the BP neural network, so that the optimized BP network predicts the output better. The implementation steps are as follows:
- Population initialization
Neural network weight learning is a complex continuous-parameter optimization problem. If binary coding were used, the encoding string would be too long and would have to be decoded into real numbers, making the weight changes step-like and hurting learning accuracy. Therefore, real-number coding is used here: all weights of the neural network are concatenated into one long string in a fixed order, with each position on the string corresponding to one network weight, arranged from the input layer to the output layer.
- Fitness function
The connection weights encoded in an individual are assigned, position by position, to the given network structure. The network is run on the training-set samples, and the sum of the absolute errors between the predicted and expected outputs is used as the individual fitness F, calculated as:

F = k * sum_{i=1}^{n} |y_i - o_i|

where n is the number of network output nodes, y_i is the expected output of node i, o_i is its predicted output, and k is a coefficient.
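The fitness calculation above is a one-liner in practice. A minimal sketch (the function name `fitness_F` and the sample values are my own illustration):

```python
import numpy as np

def fitness_F(predicted, expected, k=1.0):
    """Individual fitness F = k * sum(|y_i - o_i|): the k-scaled sum of
    absolute errors between expected outputs y and predictions o.
    A smaller F means a better individual."""
    predicted = np.asarray(predicted, dtype=float)
    expected = np.asarray(expected, dtype=float)
    return k * np.sum(np.abs(expected - predicted))

targets = np.array([1.0, 0.0, 1.0])
print(fitness_F([0.9, 0.1, 0.8], targets))  # close predictions, small F
print(fitness_F([0.5, 0.5, 0.5], targets))  # poor predictions, larger F
```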
- Selection operation
There are many selection methods for genetic algorithms, such as roulette-wheel and tournament selection. I use roulette-wheel selection here, a strategy that selects in proportion to fitness. Since a smaller fitness value F_i is better, each individual's fitness is first inverted before selection:

f_i = k / F_i

where F_i is the fitness value of individual i and k is a coefficient. The selection probability p_i of each individual i in a population of N individuals is then:

p_i = f_i / sum_{j=1}^{N} f_j
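A minimal sketch of this inverted roulette wheel (the function name and the example error values are my own illustration):

```python
import numpy as np

def roulette_select(F, k=1.0, rng=None):
    """Roulette-wheel selection when a smaller fitness F_i is better.
    Invert first (f_i = k / F_i), then draw one index i with
    probability p_i = f_i / sum_j f_j."""
    rng = np.random.default_rng() if rng is None else rng
    f = k / np.asarray(F, dtype=float)
    return rng.choice(len(f), p=f / f.sum())

# Individual 0 has the smallest error, so it wins the most draws.
rng = np.random.default_rng(0)
picks = [roulette_select([0.4, 1.5, 0.8], rng=rng) for _ in range(1000)]
```

Inverting before normalizing is what lets a minimization objective (error) drive a selection scheme that favors large values.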
- Crossover operation
Since my project uses real-number coding, the crossover is performed directly on real values. The j-th genes of the k-th chromosome a_k and the l-th chromosome a_l are crossed as:

a_kj = a_kj * (1 - b) + a_lj * b
a_lj = a_lj * (1 - b) + a_kj * b

where b is a random number in [0, 1].
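This arithmetic crossover can be sketched as follows; the function name and the sample chromosomes are my own illustration. Note that the new gene values are computed from the old ones simultaneously, so the per-gene sum a_kj + a_lj is preserved.

```python
import numpy as np

def crossover(a_k, a_l, rng=None):
    """Real-coded arithmetic crossover at one random gene position j:
        a_kj <- a_kj * (1 - b) + a_lj * b
        a_lj <- a_lj * (1 - b) + a_kj * b
    with b drawn uniformly from [0, 1)."""
    rng = np.random.default_rng() if rng is None else rng
    a_k, a_l = np.array(a_k, dtype=float), np.array(a_l, dtype=float)
    j = rng.integers(a_k.size)   # random crossover position
    b = rng.random()             # random mixing coefficient
    a_k[j], a_l[j] = (a_k[j] * (1 - b) + a_l[j] * b,
                      a_l[j] * (1 - b) + a_k[j] * b)
    return a_k, a_l

child1, child2 = crossover([1.0, 2.0, 3.0], [4.0, 5.0, 6.0],
                           rng=np.random.default_rng(1))
```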
After that, we apply the back-propagation method described earlier to the optimized network to recognize the same characters and obtain the recognition results.
The experimental results are shown below.
Original image:
BP result:
GA-BP result: