Statistical Estimation of the Kullback–Leibler Divergence
The article is published in a high-ranking journal indexed in Web of Science and/or Scopus.
Citation information obtained from Scopus.
Date of the last search for the article in external sources: June 4, 2021.
Abstract: Asymptotic unbiasedness and L2-consistency are established, under mild conditions, for the estimates of the Kullback–Leibler divergence between two probability measures in R^d, absolutely continuous with respect to (w.r.t.) the Lebesgue measure. These estimates are based on certain k-nearest neighbor statistics for a pair of independent identically distributed (i.i.d.) vector samples. The novelty of the results is also in treating mixture models. In particular, they cover mixtures of nondegenerate Gaussian measures. The mentioned asymptotic properties of the related estimators for the Shannon entropy and cross-entropy are strengthened. Some applications are indicated.
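The estimators described in the abstract belong to the well-known family of k-nearest-neighbor divergence estimates (in the spirit of Wang, Kulkarni, and Verdú). The sketch below is not the authors' exact construction, only a minimal illustration of the standard k-NN estimate of D(P||Q) from two i.i.d. vector samples, assuming samples `x ~ P` and `y ~ Q` with no tied points; the function name `knn_kl_divergence` is hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    """Standard k-NN sketch of the KL divergence D(P||Q), in nats.

    x : (n, d) array of i.i.d. samples from P.
    y : (m, d) array of i.i.d. samples from Q.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]
    # rho_i: distance from x_i to its k-th nearest neighbor among the
    # other x's (the query returns x_i itself at distance 0, so ask for k+1).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # nu_i: distance from x_i to its k-th nearest neighbor in the y sample.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]
    # d * mean log(nu/rho) + log(m/(n-1)) is the classical k-NN estimate.
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))
```

For example, for two univariate Gaussians N(0, 1) and N(1, 1) the true divergence is 0.5, and with a few thousand samples the estimate lands close to that value; the abstract's point is that, under mild conditions, such estimates are asymptotically unbiased and L2-consistent even for mixture models.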