TY - GEN
T1 - A performance comparison of crossover variations in differential evolution for training multi-layer perceptron neural networks
AU - Choi, Tae Jong
AU - Cheong, Yun Gyung
AU - Ahn, Chang Wook
N1 - Publisher Copyright:
© Springer Nature Singapore Pte Ltd. 2018.
PY - 2018
Y1 - 2018
N2 - Artificial neural networks (ANNs) are well-known machine learning techniques that require adjusting the weights of their neurons to learn a given task, which is usually done with a gradient-based optimization algorithm. However, gradient-based optimization algorithms are likely to get stuck in a local optimum, and therefore researchers have attempted to apply population-based metaheuristics instead. In this paper, we compare the performance of various crossover operators in differential evolution (DE) for training ANNs. We investigated the classification performance of three crossover operators, the binomial crossover, the exponential crossover, and the multiple exponential recombination (MER), on medical datasets. The experimental results show that the binomial crossover and the MER perform better than the exponential crossover, and that the performance of the exponential crossover varies significantly depending on the network architecture. We also found that dependent variables in ANN training may not be located close to each other, which renders the advantage of the exponential crossover and the MER ineffective.
AB - Artificial neural networks (ANNs) are well-known machine learning techniques that require adjusting the weights of their neurons to learn a given task, which is usually done with a gradient-based optimization algorithm. However, gradient-based optimization algorithms are likely to get stuck in a local optimum, and therefore researchers have attempted to apply population-based metaheuristics instead. In this paper, we compare the performance of various crossover operators in differential evolution (DE) for training ANNs. We investigated the classification performance of three crossover operators, the binomial crossover, the exponential crossover, and the multiple exponential recombination (MER), on medical datasets. The experimental results show that the binomial crossover and the MER perform better than the exponential crossover, and that the performance of the exponential crossover varies significantly depending on the network architecture. We also found that dependent variables in ANN training may not be located close to each other, which renders the advantage of the exponential crossover and the MER ineffective.
KW - Artificial neural networks
KW - Crossover operator
KW - Differential evolution algorithm
KW - Feed-forward neural network
KW - Neural network training
UR - https://www.scopus.com/pages/publications/85055803926
U2 - 10.1007/978-981-13-2829-9_44
DO - 10.1007/978-981-13-2829-9_44
M3 - Conference contribution
AN - SCOPUS:85055803926
SN - 9789811328282
T3 - Communications in Computer and Information Science
SP - 477
EP - 488
BT - Bio-inspired Computing
A2 - Qiao, Jianyong
A2 - Zhao, Xinchao
A2 - Zuo, Xingquan
A2 - Huang, Shanguo
A2 - Pan, Linqiang
A2 - Zhang, Xingyi
A2 - Zhang, Qingfu
PB - Springer Verlag
T2 - 13th International Conference on Bio-Inspired Computing: Theories and Applications, BIC-TA 2018
Y2 - 2 November 2018 through 4 November 2018
ER -