The information entropy h was introduced into the theory of information and communication by the pioneers of the field, Claude Shannon and Ralph Hartley, in the first half of the twentieth century. It describes the complexity of a message made up of n symbols, which appear in the message with probabilities p_j:

h = -\sum_{j=1}^{n} p_j \log_2 p_j.

The p_j values are also called the statistical weights of the symbols in the message. If the base of the logarithm equals two, the information entropy is expressed in bits. Mathematically, the above equation relates to a set of elements that can be divided into non-intersecting subsets; in this sense, the p_j are the relative cardinalities of the subsets.

The information entropy, in its original and modified forms, has entered the chemical sciences, but the chemical audience at large is unfamiliar with these concepts. The talk is devoted to different ways of applying information-entropy concepts to solving chemical problems. These applications deal with quantifying the chemical and electronic structures of molecules, signal processing, and structural studies of crystals and molecular ensembles. Advances in these fields make information entropy a central concept for interdisciplinary studies on digitalizing chemical reactions, chemico-information synthesis, and crystal engineering, as well as for digitally rethinking basic notions of structural chemistry. In general, applications of h-based quantities become efficient when the chemical objects and phenomena being described have a probabilistic nature or are representable as sets.
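To make the definition concrete, the following is a minimal sketch of computing h in bits for a set partitioned into non-intersecting subsets, as in the formula above. The example partitions the atoms of ethanol (C2H6O) by element so that the p_j are the relative cardinalities of the element subsets; the example molecule and the function name are illustrative choices, not taken from the talk.

```python
import math
from collections import Counter

def shannon_entropy_bits(weights):
    """Information entropy h = -sum(p_j * log2(p_j)), in bits,
    where p_j are the (normalized) cardinalities of the
    non-intersecting subsets given by `weights`."""
    total = sum(weights)
    probs = (w / total for w in weights if w > 0)
    return -sum(p * math.log2(p) for p in probs)

# Illustrative example: atoms of ethanol (C2H6O) partitioned by
# element, giving statistic weights p = {C: 2/9, H: 6/9, O: 1/9}.
atoms = Counter({"C": 2, "H": 6, "O": 1})
print(f"h = {shannon_entropy_bits(atoms.values()):.3f} bits")  # ~1.224 bits
```

The same function applies to any of the probabilistic or set-based chemical descriptions mentioned above, since only the statistical weights of the subsets enter the calculation.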