Robust likelihood ratio test using α-divergence

Published in IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2020

Paper

The problem of detecting a subspace signal in the presence of subspace interference and contaminated Gaussian noise with unknown variance is investigated. The target signal is assumed to lie in a subspace spanned by the columns of a known matrix. The test is developed by following the same steps as the generalized likelihood ratio test (GLRT), except that the maximum likelihood (ML) estimator of the parameters is replaced by a minimum α-divergence estimator, which increases the robustness of the test against contamination in the noise. The resulting test depends on a single parameter α and includes the well-known GLRT as a special case. Numerical examples illustrate that the proposed test can achieve better detection rates in such scenarios. Moreover, the test is applied to a real fMRI dataset to detect the brain areas activated by task-related inputs.
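The core idea of substituting a robust divergence-based estimator for the ML estimator can be illustrated with a toy example. The sketch below (assuming Python with NumPy) is not the paper's exact estimator: it uses the closely related density power divergence to estimate the variance of zero-mean Gaussian noise under contamination, and compares it with the ML estimate. As the divergence parameter α tends to zero the objective approaches the ML one, mirroring how the proposed test contains the GLRT as a special case.

```python
import numpy as np

# Illustrative sketch only (hypothetical setup, not the paper's estimator):
# fit the variance of zero-mean Gaussian noise by minimizing a density power
# divergence objective, and compare with the ML (sample-variance) estimate
# when a fraction of the samples is grossly contaminated.

rng = np.random.default_rng(0)
n = 1000
x = rng.normal(0.0, 1.0, n)   # nominal N(0, 1) noise
x[:50] = 15.0                 # 5% gross contamination (outliers)

alpha = 0.5                   # robustness parameter; alpha -> 0 recovers ML

def dpd_objective(v, x, alpha):
    """Density power divergence objective for N(0, v), up to constants."""
    # Integral term: \int f^{1+alpha} dx for a N(0, v) density.
    term1 = (2.0 * np.pi * v) ** (-alpha / 2.0) / np.sqrt(1.0 + alpha)
    # Empirical term: average of f(x_i)^alpha, which downweights outliers.
    dens = (2.0 * np.pi * v) ** (-alpha / 2.0) * np.exp(-alpha * x**2 / (2.0 * v))
    return term1 - (1.0 + 1.0 / alpha) * dens.mean()

# ML estimate: sample variance (mean known to be zero).
ml_var = np.mean(x**2)

# Robust estimate: simple grid search over candidate variances.
v_grid = np.linspace(0.2, 25.0, 4000)
obj = np.array([dpd_objective(v, x, alpha) for v in v_grid])
robust_var = v_grid[np.argmin(obj)]

print(f"ML variance estimate:     {ml_var:.2f}")
print(f"Robust variance estimate: {robust_var:.2f}")
```

With 5% of the samples replaced by large outliers, the ML estimate is inflated well above the true noise variance of 1, while the divergence-based estimate stays close to it, which is the mechanism the proposed test exploits for robustness.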