Consider i.i.d. random vectors X_1, X_2, ... and i.i.d. random vectors Y_1, Y_2, ... with law(X_1) = law(X) and law(Y_1) = law(Y) for some random vectors X and Y taking values in R^d and having distributions P_X and P_Y, respectively. Assume that X and Y have densities p and q with respect to Lebesgue measure. Suppose also that {X_i, Y_i, i ∈ N} are independent. We are interested in statistical estimation of the Kullback-Leibler divergence D(P_X || P_Y) by means of the observations {X_1, ..., X_n} and {Y_1, ..., Y_m}, n, m ∈ N. The proposed estimates involve certain nearest neighbour statistics. Broad conditions are provided that guarantee asymptotic unbiasedness and L^2-consistency of these estimates. In particular, the established results are valid for estimates of the Kullback-Leibler divergence between any two Gaussian mixtures in R^d whose components have nondegenerate covariance matrices. Applications of the studied estimates are given to the detection of inhomogeneities in the fiber material filling a parallelepiped U in R^3.
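
The abstract does not spell out the estimator itself, so the following is only an illustrative sketch of a classical k-nearest-neighbour estimator of D(P_X || P_Y) of the Wang-Kulkarni-Verdú type, which compares, for each X_i, its k-th nearest-neighbour distance within the X-sample to that within the Y-sample. The function name kl_divergence_knn, the parameter k, and the Gaussian sanity check are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def kl_divergence_knn(x, y, k=1):
    """Illustrative k-NN estimate of D(P_X || P_Y) from samples x ~ P_X and y ~ P_Y."""
    n, d = x.shape
    m = y.shape[0]
    # rho_k(i): distance from X_i to its k-th nearest neighbour among the other X_j
    # (query k+1 neighbours and drop the zero distance of X_i to itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # nu_k(i): distance from X_i to its k-th nearest neighbour among the Y_j.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]
    # d * mean log(nu/rho) + log(m/(n-1)); positive distances are assumed,
    # which holds almost surely when P_X and P_Y have Lebesgue densities.
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))

# Sanity check against a closed form: for N(0, I_3) versus N(0.5*1, I_3) in R^3
# the true divergence is 3 * 0.5**2 / 2 = 0.375.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(5000, 3))
y = rng.normal(0.5, 1.0, size=(5000, 3))
print(kl_divergence_knn(x, y, k=5))  # should land near 0.375 at this sample size
```

Larger k reduces the variance of the estimate at the cost of some bias; the asymptotic unbiasedness and L^2-consistency results of the paper concern estimators of this nearest-neighbour kind under the conditions stated there.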