Abstract
Huber's criterion can be used for robust joint estimation of the regression and scale parameters in the linear model. Huber's [1] motivation for introducing the criterion stemmed from the nonconvexity of the joint maximum likelihood objective function as well as the non-robustness (unbounded influence function) of the associated ML estimate of scale. In this paper, we illustrate how the original algorithm proposed by Huber can be set within the block-wise minimization-majorization framework. In addition, we propose novel data-adaptive step sizes for both the location and scale updates, which further improve convergence. We then illustrate how Huber's criterion can be used for sparse learning of an underdetermined linear model using the iterative hard thresholding approach. We illustrate the usefulness of the algorithms in an image denoising application and in simulation studies.
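For readers unfamiliar with the quantities involved, the sketch below is a minimal NumPy illustration of one common form of Huber's criterion for joint regression and scale estimation, together with the hard-thresholding operator used in iterative hard thresholding. It is not the paper's algorithm; the tuning constants `c` and `alpha`, the function names, and the synthetic data are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the paper's algorithm): Huber's loss, one common form of
# Huber's criterion for joint (beta, sigma) estimation, and the hard-thresholding
# operator used in iterative hard thresholding. The constants c and alpha are
# illustrative choices.

def huber_rho(r, c=1.345):
    """Huber's loss applied elementwise to standardized residuals r."""
    absr = np.abs(r)
    return np.where(absr <= c, 0.5 * r**2, c * absr - 0.5 * c**2)

def huber_criterion(y, X, beta, sigma, c=1.345, alpha=0.5):
    """L(beta, sigma) = sum_i [ sigma * rho((y_i - x_i'beta)/sigma) + alpha * sigma ],
    which is jointly convex in (beta, sigma) for sigma > 0."""
    r = y - X @ beta
    return np.sum(sigma * huber_rho(r / sigma, c) + alpha * sigma)

def hard_threshold(beta, k):
    """Keep the k largest-magnitude coefficients of beta, zero out the rest."""
    out = np.zeros_like(beta)
    keep = np.argsort(np.abs(beta))[-k:]
    out[keep] = beta[keep]
    return out

# Toy usage on synthetic, underdetermined data (n < p).
rng = np.random.default_rng(0)
n, p, k = 50, 100, 5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:k] = 3.0
y = X @ beta_true + 0.1 * rng.standard_normal(n)
print(huber_criterion(y, X, beta_true, sigma=0.1))    # criterion at the true parameters
print(np.nonzero(hard_threshold(X.T @ y / n, k))[0])  # support of a thresholded gradient step
```

In a block-wise minimization-majorization scheme of the kind described in the paper, updates of the regression and scale parameters would alternate against such a criterion; the sketch above provides only the building blocks.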
Original language | English |
---|---|
Title of host publication | Proceedings of the 2020 IEEE 30th International Workshop on Machine Learning for Signal Processing, MLSP 2020 |
Publisher | IEEE |
Number of pages | 6 |
ISBN (Electronic) | 9781728166629 |
DOIs | |
Publication status | Published - Sep 2020 |
MoE publication type | A4 Article in a conference publication |
Event | IEEE International Workshop on Machine Learning for Signal Processing, Aalto University, Espoo, Finland. Duration: 21 Sep 2020 → 24 Sep 2020. Conference number: 30. https://ieeemlsp.cc |
Publication series
Name | IEEE International Workshop on Machine Learning for Signal Processing |
---|---|
ISSN (Print) | 2161-0363 |
ISSN (Electronic) | 2161-0371 |
Workshop
Workshop | IEEE International Workshop on Machine Learning for Signal Processing |
---|---|
Abbreviated title | MLSP |
Country/Territory | Finland |
City | Espoo |
Period | 21/09/2020 → 24/09/2020 |
Internet address | https://ieeemlsp.cc |
Keywords
- Huber's criterion
- Minimization-majorization algorithm
- Robust regression
- Sparse learning