Abstract
The kernel least-mean mixed-norm (KLMMN) algorithm is a kernel adaptive filtering method that performs well when the measurement noise follows a linear combination of long-tailed and short-tailed distributions. To reduce computational cost and improve accuracy, this paper proposes a novel entropy-optimized kernel learning algorithm, called E-KLMMN, built on information entropy and KLMMN. E-KLMMN first computes the entropy weights of the input vectors in a training set contaminated by a mixture of long-tailed and short-tailed noise. It then removes the input vectors, together with their corresponding outputs, whose entropy weights fall below the average. Finally, the KLMMN model is trained on the reduced training set and used to predict subsequent data points. Through this entropy-based pruning, E-KLMMN achieves high accuracy at low computational cost in noisy environments. Experiments on real-world data comparing E-KLMMN with KLMS and KLMMN demonstrate the effectiveness and superiority of the proposed algorithm.
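The three steps above can be sketched in Python. The abstract does not give the exact entropy-weight formula, so the weighting below (treating each normalized input vector as a probability distribution and taking its Shannon entropy) is an assumption, as are the Gaussian kernel and all parameter names; the KLMMN update uses the standard mixed-norm cost, a blend of the squared (LMS) and fourth-power (LMF) error criteria:

```python
import numpy as np

def entropy_weights(X, eps=1e-12):
    """Entropy weight of each input vector (ASSUMED construction:
    shift each row to be non-negative, normalize it to a probability
    distribution, take its Shannon entropy, then normalize the
    entropies to sum to one). The paper's exact formula is not given."""
    X = np.asarray(X, dtype=float)
    P = X - X.min(axis=1, keepdims=True) + eps   # make rows non-negative
    P /= P.sum(axis=1, keepdims=True)            # row-wise distributions
    H = -(P * np.log(P)).sum(axis=1)             # Shannon entropy per sample
    return H / H.sum()                           # normalized entropy weights

def filter_training_set(X, y):
    """Step 2 of E-KLMMN: drop samples whose entropy weight
    falls below the average weight."""
    w = entropy_weights(X)
    keep = w >= w.mean()
    return np.asarray(X)[keep], np.asarray(y)[keep]

def train_klmmn(X, y, step=0.5, delta=0.5, width=1.0):
    """Step 3: online KLMMN training (sketch). The mixed-norm cost
    delta*e^2 + (1-delta)*e^4 blends LMS and LMF; its stochastic
    gradient yields the coefficient e*(delta + 2*(1-delta)*e^2)
    attached to each new Gaussian-kernel center."""
    centers, coeffs = [], []

    def predict(x):
        return sum(a * np.exp(-np.linalg.norm(x - c) ** 2 / (2 * width ** 2))
                   for a, c in zip(coeffs, centers))

    for x, t in zip(X, y):
        e = t - predict(x)                        # prediction error
        centers.append(np.asarray(x, dtype=float))
        coeffs.append(step * e * (delta + 2 * (1 - delta) * e ** 2))
    return predict
```

A typical use would be `Xf, yf = filter_training_set(X, y)` followed by `f = train_klmmn(Xf, yf)`, with `f(x_new)` giving predictions; the pruning shrinks the kernel expansion, which is where the computational saving comes from.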
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the International Joint Conference on Neural Networks |
| Place of Publication | USA |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 1716-1722 |
| Number of pages | 7 |
| Volume | 2016-October |
| ISBN (Electronic) | 9781509006199 |
| DOIs | |
| State | Published - Oct 31 2016 |
| Event | 2016 International Joint Conference on Neural Networks, IJCNN 2016 - Vancouver, Canada Duration: Jul 24 2016 → Jul 29 2016 |
Conference
| Conference | 2016 International Joint Conference on Neural Networks, IJCNN 2016 |
|---|---|
| Country/Territory | Canada |
| City | Vancouver |
| Period | 07/24/16 → 07/29/16 |
Keywords
- Data prediction
- Entropy weight
- Kernel least-mean mixed-norm (KLMMN) algorithm
- Kernel method