Abstract
The kernel least mean squares (KLMS) algorithm is a computationally efficient nonlinear adaptive filtering method that 'kernelizes' the celebrated (linear) least mean squares algorithm. We demonstrate that the least mean squares algorithm is closely related to Kalman filtering, and thus the KLMS can be interpreted as an approximate Bayesian filtering method. This allows us to systematically develop extensions of the KLMS by modifying the underlying state-space and observation models. The resulting extensions introduce many desirable properties, such as 'forgetting' and the ability to learn from discrete data, while retaining the computational simplicity and time complexity of the original algorithm.
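For context, below is a minimal sketch of the standard KLMS recursion that the abstract builds on, assuming a Gaussian kernel; the class name, step size `eta`, and kernel width `sigma` are illustrative choices rather than the paper's notation, and the Bayesian state-space extensions described in the abstract are not implemented here.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Gaussian (RBF) kernel between two input vectors.
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

class KLMS:
    """Minimal kernel least mean squares filter (illustrative sketch)."""

    def __init__(self, eta=0.2, sigma=1.0):
        self.eta = eta        # step size (learning rate)
        self.sigma = sigma    # kernel bandwidth
        self.centers = []     # stored input samples
        self.coeffs = []      # expansion coefficients

    def predict(self, x):
        # Current estimate: weighted sum of kernels over stored centers.
        return sum(a * gaussian_kernel(c, x, self.sigma)
                   for a, c in zip(self.coeffs, self.centers))

    def update(self, x, d):
        # As in linear LMS, the prediction error scaled by the step size
        # gives the new coefficient; the input becomes a new kernel center.
        e = d - self.predict(x)
        self.centers.append(np.asarray(x, dtype=float))
        self.coeffs.append(self.eta * e)
        return e

# Usage: track a simple nonlinear mapping from streaming data.
rng = np.random.default_rng(0)
f = KLMS(eta=0.5, sigma=0.5)
for _ in range(200):
    x = rng.uniform(-1, 1, size=2)
    d = np.sin(3 * x[0]) * x[1] + 0.01 * rng.standard_normal()
    f.update(x, d)
print("prediction at [0.3, -0.4]:", f.predict(np.array([0.3, -0.4])))
```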
Original language | English |
---|---|
Title of host publication | 2014 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2014 |
Publisher | IEEE |
Pages | 8272-8276 |
Number of pages | 5 |
ISBN (Electronic) | 978-1-4799-2893-4 |
ISBN (Print) | 978-1-4799-2892-7 |
DOIs | |
Publication status | Published - 2014 |
MoE publication type | A4 Conference publication |
Event | IEEE International Conference on Acoustics, Speech, and Signal Processing - Florence, Italy; Duration: 4 May 2014 → 9 May 2014; Conference number: 39 |
Publication series
Name | IEEE International Conference on Acoustics, Speech and Signal Processing |
---|---|
Publisher | IEEE |
ISSN (Print) | 1520-6149 |
ISSN (Electronic) | 2379-190X |
Conference
Conference | IEEE International Conference on Acoustics, Speech, and Signal Processing |
---|---|
Abbreviated title | ICASSP |
Country/Territory | Italy |
City | Florence |
Period | 04/05/2014 → 09/05/2014 |
Keywords
- kernel adaptive filtering
- KLMS
- sequential Bayesian learning
- state-space model