
The problem of detecting a subspace signal in the presence of subspace interference and contaminated Gaussian noise with unknown variance is investigated. The target signal is assumed to lie in a subspace spanned by the columns of a known matrix. The test is developed by following the same steps as the generalized likelihood ratio test (GLRT), except that the maximum likelihood (ML) estimator of the parameters is replaced by the minimum α-divergence estimator, which increases the robustness of the test against contamination in the noise. The resulting test depends on a single parameter α and, as a special case, reduces to the well-known GLRT. Numerical examples are presented illustrating that the proposed test can achieve better detection rates in such scenarios. Moreover, the test is applied to a real fMRI dataset to detect the active areas of the brain for task-related inputs.
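To give a sense of why swapping the ML estimator for a minimum α-divergence estimator adds robustness, the sketch below estimates the variance of zero-mean Gaussian data by minimizing the density power divergence (one common α-divergence family) over a grid. This is only an illustrative sketch, not the authors' algorithm: the function name `mdpde_variance`, the grid-search strategy, and the contamination model in the usage example are all assumptions made for the demonstration. As α → 0 the objective approaches the ML criterion, mirroring how the proposed test reduces to the GLRT.

```python
import numpy as np

def mdpde_variance(x, alpha, sigma2_grid=None):
    """Minimum density-power-divergence estimate of the variance of
    zero-mean Gaussian data (alpha > 0).  Hypothetical helper for
    illustration only; not the estimator used in the paper."""
    x = np.asarray(x, dtype=float)
    if sigma2_grid is None:
        s2_ml = np.mean(x**2)  # ML estimate, used to bracket the search
        sigma2_grid = np.linspace(0.05 * s2_ml, 2.0 * s2_ml, 4000)
    best_s2, best_obj = None, np.inf
    for s2 in sigma2_grid:
        # Gaussian density evaluated at the samples
        f = np.exp(-x**2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
        # closed form of the integral of f^(1+alpha) for a Gaussian
        integral_term = (2.0 * np.pi * s2) ** (-alpha / 2.0) / np.sqrt(1.0 + alpha)
        # density power divergence objective (up to an additive constant)
        obj = integral_term - (1.0 + 1.0 / alpha) * np.mean(f**alpha)
        if obj < best_obj:
            best_s2, best_obj = s2, obj
    return best_s2

# Usage: 5% of the samples are drawn from a heavy contaminating component.
rng = np.random.default_rng(0)
n = 2000
clean = rng.normal(0.0, 1.0, size=n)
mask = rng.random(n) < 0.05
x = np.where(mask, rng.normal(0.0, 10.0, size=n), clean)

var_ml = np.mean(x**2)                  # inflated by the outliers
var_robust = mdpde_variance(x, alpha=0.5)
```

Because the objective downweights low-density (outlying) samples through the factor `f**alpha`, the robust estimate stays near the nominal variance of 1 while the ML estimate is pulled upward by the contamination.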

Original publication

DOI

10.1109/ICASSP40776.2020.9053881

Type

Conference paper

Publication Date

01/05/2020

Volume

2020-May

Pages

1150-1154