TY - JOUR
T1 - Minimum variance estimation of a sparse vector within the linear Gaussian model
T2 - An RKHS approach
AU - Jung, Alexander
AU - Schmutzhard, Sebastian
AU - Hlawatsch, Franz
AU - Ben-Haim, Zvika
AU - Eldar, Yonina C.
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2014/10/1
Y1 - 2014/10/1
AB - We consider minimum variance estimation within the sparse linear Gaussian model (SLGM). A sparse vector is to be estimated from a linearly transformed version embedded in Gaussian noise. Our analysis is based on the theory of reproducing kernel Hilbert spaces (RKHS). After a characterization of the RKHS associated with the SLGM, we derive a lower bound on the minimum variance achievable by estimators with a prescribed bias function, including the important special case of unbiased estimation. This bound is obtained via an orthogonal projection of the prescribed mean function onto a subspace of the RKHS associated with the SLGM. It provides an approximation to the minimum achievable variance (Barankin bound) that is tighter than any known bound. Our bound holds for an arbitrary system matrix, including the overdetermined and underdetermined cases. We specialize it to compressed sensing measurement matrices and express it in terms of the restricted isometry constant. For the special case of the SLGM given by the sparse signal in noise model, we derive closed-form expressions of the Barankin bound and of the corresponding locally minimum variance estimator. Finally, we compare our bound with the variance of several well-known estimators, namely, the maximum-likelihood estimator, the hard-thresholding estimator, and compressive reconstruction using orthogonal matching pursuit and approximate message passing.
KW - Barankin
KW - compressed sensing
KW - Cramér-Rao bound
KW - denoising
KW - RKHS
KW - sparsity
KW - unbiased estimation
UR - http://www.scopus.com/inward/record.url?scp=84907202538&partnerID=8YFLogxK
U2 - 10.1109/TIT.2014.2346508
DO - 10.1109/TIT.2014.2346508
M3 - Article
AN - SCOPUS:84907202538
SN - 0018-9448
VL - 60
SP - 6555
EP - 6575
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 10
M1 - 6874571
ER -