This paper describes a new framework that exploits the Kullback-Leibler divergence to address the design of one-stage adaptive detectors for multiple hypothesis testing problems. Specifically, at the design stage the problem is formulated in terms of multiple alternative hypotheses competing with the null hypothesis. A one-stage decision scheme is then derived both for the case of a known model with unknown parameters and for the most general case of an unknown model with unknown parameters. Interestingly, the resulting detectors take the form of the sum of the compressed log-likelihood ratio based on the available data and a penalty term depending on the number of unknown parameters. This general architecture is then particularized to the problem of subspace target detection, and its effectiveness is assessed through simulations, also in comparison with its counterpart based on the two-stage paradigm.
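To give a flavor of the decision rule described above, the following is a minimal illustrative sketch of a penalized compressed log-likelihood ratio test for a single Gaussian alternative with an unknown mean. The specific penalty weight, noise model, and single-parameter alternative are assumptions for illustration in the spirit of information-criterion rules, not the paper's actual KL-divergence-based detector.

```python
import math

def gaussian_loglik(x, mean, var):
    # Log-likelihood of i.i.d. Gaussian samples with given mean and variance.
    n = len(x)
    return (-0.5 * n * math.log(2 * math.pi * var)
            - sum((xi - mean) ** 2 for xi in x) / (2 * var))

def penalized_decision(x, var=1.0, penalty_weight=1.0):
    """Decide between H0 (zero mean) and H1 (unknown mean).

    The unknown mean under H1 is replaced by its MLE, yielding the
    "compressed" log-likelihood; the decision statistic is this
    compressed log-likelihood ratio minus a penalty proportional to
    the number of estimated parameters (one here). The unit penalty
    weight is an illustrative assumption.
    """
    ll0 = gaussian_loglik(x, 0.0, var)       # null hypothesis: no unknowns
    mu_hat = sum(x) / len(x)                 # MLE of the unknown mean
    ll1 = gaussian_loglik(x, mu_hat, var)    # compressed likelihood under H1
    llr = ll1 - ll0                          # compressed log-likelihood ratio
    score = llr - penalty_weight * 1         # penalize the 1 unknown parameter
    return 1 if score > 0 else 0

print(penalized_decision([0.1, -0.2, 0.05, 0.0]))  # small sample mean: decides 0 (H0)
print(penalized_decision([1.2, 0.9, 1.1, 1.0]))    # sample mean near 1: decides 1 (H1)
```

With more alternatives, the same idea extends by maximizing the penalized score over all candidate hypotheses, so that richer models must overcome a larger penalty before being selected.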