TY - JOUR
T1 - A robust multilayer extreme learning machine using kernel risk-sensitive loss criterion
AU - Luo, Xiong
AU - Li, Ying
AU - Wang, Weiping
AU - Ban, Xiaojuan
AU - Wang, Jenq-Haur
AU - Zhao, Wenbing
PY - 2020/1/1
Y1 - 2020/1/1
N2 - Recently, the extreme learning machine (ELM) has emerged as a novel computing paradigm that enables neural network (NN) based learning with fast training speed and good generalization performance. However, a single-hidden-layer NN trained with ELM may not be effective for some large-scale problems, which demand greater computational effort. To overcome this limitation, we employ a multilayer ELM architecture in this article to reduce the computational complexity without being constrained by physical memory limitations. Meanwhile, noise is prevalent in practical applications, and the traditional ELM may not perform well in such cases. Considering the presence of noise or outliers in the training dataset, we develop a more practical approach by incorporating the kernel risk-sensitive loss (KRSL) criterion into ELM, exploiting the efficient performance surface of KRSL, which achieves high accuracy while maintaining robustness to outliers. A robust multilayer ELM, i.e., the stacked ELM using the minimum KRSL criterion (SELM-MKRSL), is accordingly proposed in this article to enhance outlier robustness on large-scale and complicated datasets. Simulation results on several synthetic datasets indicate that the proposed SELM-MKRSL achieves higher classification accuracy and is more robust to noise than other state-of-the-art multilayer ELM algorithms.
AB - Recently, the extreme learning machine (ELM) has emerged as a novel computing paradigm that enables neural network (NN) based learning with fast training speed and good generalization performance. However, a single-hidden-layer NN trained with ELM may not be effective for some large-scale problems, which demand greater computational effort. To overcome this limitation, we employ a multilayer ELM architecture in this article to reduce the computational complexity without being constrained by physical memory limitations. Meanwhile, noise is prevalent in practical applications, and the traditional ELM may not perform well in such cases. Considering the presence of noise or outliers in the training dataset, we develop a more practical approach by incorporating the kernel risk-sensitive loss (KRSL) criterion into ELM, exploiting the efficient performance surface of KRSL, which achieves high accuracy while maintaining robustness to outliers. A robust multilayer ELM, i.e., the stacked ELM using the minimum KRSL criterion (SELM-MKRSL), is accordingly proposed in this article to enhance outlier robustness on large-scale and complicated datasets. Simulation results on several synthetic datasets indicate that the proposed SELM-MKRSL achieves higher classification accuracy and is more robust to noise than other state-of-the-art multilayer ELM algorithms.
KW - Deep learning
KW - Extreme learning machine (ELM)
KW - Kernel risk-sensitive loss (KRSL)
KW - Multilayer perceptron
UR - https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85067232216&origin=inward
UR - https://www.scopus.com/inward/citedby.uri?partnerID=HzOxMe3b&scp=85067232216&origin=inward
U2 - 10.1007/s13042-019-00967-w
DO - 10.1007/s13042-019-00967-w
M3 - Article
SN - 1868-8071
VL - 11
SP - 197
EP - 216
JO - International Journal of Machine Learning and Cybernetics
JF - International Journal of Machine Learning and Cybernetics
IS - 1
ER -