ИСТИНА (Intelligent System for Thematic Investigation of Scientometric Data)
Statistical estimates of the Shannon entropy constructed from observations X_1, …, X_N having the same law as X are very important: they allow one to estimate the mutual information and other related characteristics of a random vector X. Such estimates are widely used in machine learning, are essential for tests of independence hypotheses for collections of random variables, and are employed in feature selection theory and various applications. The behavior of the Kozachenko–Leonenko estimates of the (differential) Shannon entropy, as the number of i.i.d. vector-valued observations tends to infinity, has been studied by various authors. D. Pal et al. pointed out defects in the previously existing proofs of the asymptotic unbiasedness and L^2-consistency of these estimates. In a recent paper we revisit these results and establish them under broad conditions. To this end, analogues of the Hardy–Littlewood maximal function are proposed and employed. It is shown that our approach applies, in particular, to entropy estimation for any nondegenerate Gaussian distribution. Moreover, we provide conditions guaranteeing the validity of these new results for mixtures of distributions.
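The abstract refers to the Kozachenko–Leonenko estimator without writing it out. A minimal NumPy sketch of its classical nearest-neighbor (k = 1) form is shown below; the function name, the brute-force distance computation, and the Gaussian sanity check are illustrative assumptions, not taken from the paper:

```python
import numpy as np
from math import gamma, log, pi

def kl_entropy(X):
    """Kozachenko-Leonenko nearest-neighbor estimate of differential
    Shannon entropy (in nats) from an (N, d) sample matrix X.

    H_hat = (d/N) * sum_i ln(rho_i) + ln(V_d) + ln(N - 1) + gamma_E,
    where rho_i is the Euclidean distance from X_i to its nearest
    neighbor and V_d is the volume of the unit ball in R^d.
    """
    N, d = X.shape
    # Brute-force pairwise Euclidean distances (fine for small N;
    # a k-d tree would be used in practice).
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)          # exclude self-distances
    rho = D.min(axis=1)                  # nearest-neighbor distances
    V_d = pi ** (d / 2) / gamma(d / 2 + 1)   # volume of the unit d-ball
    euler_gamma = 0.5772156649015329         # Euler-Mascheroni constant
    return d * np.mean(np.log(rho)) + log(V_d) + log(N - 1) + euler_gamma

# Sanity check on a nondegenerate Gaussian, where the true differential
# entropy is known in closed form: H = (d/2) * ln(2*pi*e) for N(0, I_d).
rng = np.random.default_rng(0)
N, d = 2000, 2
X = rng.standard_normal((N, d))
est = kl_entropy(X)
true_H = 0.5 * d * log(2 * pi * np.e)
# est should be close to true_H for moderately large N
```

For k > 1 neighbors the constant terms change (digamma functions replace the Euler constant and ln(N − 1)), but the structure of the estimate is the same.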