**INFORMATION THEORY** — the branch of mathematics devoted to questions of measuring the amount of information and to methods of its coding and transmission. Information theory is often regarded as a part of cybernetics (see), but it also has independent value for fields not connected with control processes (e.g., biology, medicine, psychology, art, education).

At the heart of information theory lies a definite method of measuring the amount of information contained in any data (messages). The words of a language, the results of measurements, medical documentation, etc. can serve as a source of data, or messages. The achievements of information theory have found application in the creation of information-processing algorithms (see Algorithm), various automated control systems (see), information retrieval systems (see), etc.

The theoretical foundations of information theory were laid in 1948–1949 by the American scientist C. Shannon, who proposed a statistical measure of the amount of information and proved a number of theorems concerning information transmission. An object is considered as an information source which, during a certain time interval (a transmission step), can with some probability be in one of N possible states. The state of the information source arrives, in the form of signals, over a communication channel at the receiver of information. If all states of the information source are equiprobable and the current state does not depend on the previous states, then the amount of information I transmitted in one step (measured in binary units of information, bits) is maximal and is calculated by the formula I = log_2 N.

If the source has two possible and equally probable states (N = 2), one unit of information, 1 bit, is transmitted per step. As the number of states grows, the amount of transmitted information grows.
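As an illustration, the per-step formula I = log_2 N can be computed directly (a minimal Python sketch; the function name is ours, not from the article):

```python
import math

def info_per_step(n_states: int) -> float:
    """Amount of information (in bits) transmitted in one step by a
    source with n_states equiprobable, independent states: I = log2(N)."""
    return math.log2(n_states)

print(info_per_step(2))   # a two-state source carries 1 bit per step
print(info_per_step(8))   # eight equiprobable states carry 3 bits
```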

Information about one of two possible states of a source (N = 2) can be transmitted by a single binary symbol (0 or 1), which corresponds to the transfer of 1 bit of information (the unit of information). Using combinations of two binary symbols (00, 01, 10 and 11), one can transmit information about a source with four possible states. In general, n binary symbols suffice to transmit information about a source with 2^n states. The logarithmic measure of information has the property of additivity, i.e., the amount of information received from two independent information sources is equal to the sum of the amounts of information received from each source.
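The relation between binary symbols and source states, and the additivity of the logarithmic measure, can be sketched as follows (the function name is an illustrative assumption):

```python
import math

def bits_needed(n_states: int) -> int:
    """Number of binary symbols sufficient to encode one of n_states
    states (for n_states = 2**n this is exactly n)."""
    return math.ceil(math.log2(n_states))

# Additivity: two independent sources with N1 and N2 states form a
# combined source with N1 * N2 states, and the information adds up.
n1, n2 = 4, 8
assert math.log2(n1 * n2) == math.log2(n1) + math.log2(n2)

print(bits_needed(4))        # 2 binary symbols for four states
print(bits_needed(2 ** 10))  # 10 binary symbols for 1024 states
```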

The statistical measure of information is widely used; it is based on the idea that less probable (i.e., more unexpected) messages carry more information than more probable (less unexpected) ones. The amount of information contained in the message that the source is in the i-th state is calculated by the formula:

I_i = log_2 (1/P_i) = -log_2 P_i, where P_i is the prior probability that the source is in the i-th state.

Since P_i < 1, the amount of information is always positive, and it is greater the smaller P_i is. The average amount of information contained in one message is found by averaging over all possible states, taking their probabilities into account:

I = -Σ P_i log_2 P_i, where the sum runs over all N states of the source.

If all states of the object are equiprobable (i.e., P_i = 1/N), then I = -log_2 (1/N) = log_2 N.
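The averaging described above (the Shannon entropy of the source) can be sketched in Python (a minimal illustration; the function name is ours):

```python
import math

def avg_information(probs) -> float:
    """Average amount of information per message (Shannon entropy):
    I = -sum(P_i * log2(P_i)) over all states with P_i > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Equiprobable states: the average reaches its maximum, log2(N).
print(avg_information([0.25] * 4))   # 2.0 bits
# A skewed source carries less information per message on average.
print(avg_information([0.9, 0.1]))   # about 0.469 bits
```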

In the presence of noise in the communication channel, the amount of received information is always less than the amount of transmitted information. If the i-th message was transmitted and the j-th message was received, the amount of received information I is equal to:

I = log_2 (P(i/j) / P_i),

where P(i/j) is the probability that message i was transmitted given that message j was received, and P_i is the absolute (prior) probability that the transmitted message was the i-th. The average amount of received information is found by averaging over all possible pairs of received and transmitted messages.

The amount of transmitted information decreases if the probability of a given message's appearance depends on previously received messages. In that case the amount of information is calculated by more complicated formulas.
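The formula for information received over a noisy channel can be sketched from the posterior probability P(i/j) and the prior P_i (the function name is ours):

```python
import math

def received_information(p_i_given_j: float, p_i: float) -> float:
    """Information received when message j is accepted: I = log2(P(i|j) / P_i),
    where p_i_given_j is the probability that message i was sent given that
    j was received, and p_i is the prior probability of message i."""
    return math.log2(p_i_given_j / p_i)

# Noiseless channel: receiving j identifies i with certainty, so the
# full -log2(P_i) bits get through.
print(received_information(1.0, 0.25))   # 2.0 bits
# A noisy channel leaves residual uncertainty, so less information arrives.
print(received_information(0.8, 0.25))   # about 1.678 bits
```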

Among the basic concepts of information theory are the rate of information transmission R (equal to the amount of information transmitted over a communication channel per unit time) and the channel capacity C (the maximum rate at which information can be transmitted over a given channel). If the transmission rate R is less than the channel capacity C, information can be transmitted without errors. In practice, at R close to C, high reliability can be achieved only by encoding sufficiently long sequences of messages with error-correcting codes; the probability of error then tends to zero as the length of the encoded sequence tends to infinity. Besides error-correcting codes, repetition of signals, coding with retransmission requests, special broadband signals and other methods are used to increase noise immunity.
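As a concrete illustration of channel capacity, the standard textbook capacity of a binary symmetric channel, C = 1 - H(p), can be sketched (this particular channel model is our assumption; the article does not single it out):

```python
import math

def h(p: float) -> float:
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(error_prob: float) -> float:
    """Capacity (bits per channel use) of a binary symmetric channel
    with the given bit-error probability: C = 1 - H(p)."""
    return 1.0 - h(error_prob)

print(bsc_capacity(0.0))              # 1.0: a noiseless binary channel
print(bsc_capacity(0.5))              # 0.0: pure noise, nothing gets through
print(round(bsc_capacity(0.11), 3))   # about 0.5
```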

Information theory is often taken to include the rather well-developed theory of message transmission (communication theory), which studies the construction of signals by means of which a large amount of information can be transmitted over a channel of limited bandwidth at a given reliability. Various types of modulation and coding are used for information transmission. In the course of modulation the message acts on one of the parameters of a high-frequency or pulse signal, e.g., its amplitude, frequency, or pulse duration. One of the main problems is the creation of multichannel communication, i.e., transmission over a single channel of messages about the states of many sources. A large contribution to the development of this direction of information theory has been made by the Soviet scientists V. A. Kotelnikov, A. A. Harkevich, and others.
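A toy sketch of amplitude modulation, in which the message varies the amplitude of a high-frequency carrier (the names and the specific scheme are illustrative assumptions, not taken from the article):

```python
import math

def am_sample(t: float, carrier_freq: float, message) -> float:
    """One sample of an amplitude-modulated signal: the message m(t)
    varies the amplitude of a high-frequency cosine carrier."""
    return (1.0 + message(t)) * math.cos(2 * math.pi * carrier_freq * t)

# At t = 0 the carrier is at its peak, so the sample equals 1 + m(0).
print(am_sample(0.0, 1000.0, lambda t: 0.5))   # 1.5
```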

It should be noted that the source model developed by Shannon is not suitable for measuring the amount of information in all fields. Thus, in the process of cognizing the world we face problems in which not all the states of the objects of interest, nor their statistical characteristics, are known in advance; in training, the amount of information obtained depends on the learner's level of knowledge; and so on.

In 1965 the Soviet mathematician A. N. Kolmogorov proposed a method of determining the amount of information contained in an object x about an object y, based on the concept of the complexity of a program for obtaining y from x. Yu. A. Shreyder introduced the concept of the "amount of semantic information", connected with a change in the structure of the system of concepts, or of their relations, in the language of the recipient of the information. Here it must be borne in mind that if an incoming statement is understandable and already known to the recipient, it carries no information, i.e., the amount of semantic information in the message is zero; if the incoming set of words has no connection at all with the words in the recipient's vocabulary, the amount of semantic information is likewise zero, since the message will not be understood by the recipient. Only when an associative connection can be established between the received sequence of words and the concepts and statements already stored in the recipient's memory does the amount of semantic information differ from zero.

A number of works in information theory are devoted to the problem of the so-called value of information. A. A. Harkevich proposed measuring the value of information by the increment in the probability of achieving a goal. In technical and economic systems the usefulness of information is in some cases estimated by the magnitude of the economic losses arising from various errors in the process of obtaining information.
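One common formalization of Harkevich's proposal takes the value of information as the logarithm of the ratio of goal-achievement probabilities after and before the message arrives; the exact formula is not given in the article, so this is a hedged sketch:

```python
import math

def information_value(p_before: float, p_after: float) -> float:
    """Value of information measured through the change in the probability
    of achieving a goal (one common formalization of Harkevich's idea,
    assumed here): V = log2(p_after / p_before)."""
    return math.log2(p_after / p_before)

# A message that doubles the chance of reaching the goal is worth 1 bit.
print(information_value(0.25, 0.5))   # 1.0
# Misinformation that lowers the chance has negative value.
print(information_value(0.5, 0.25))   # -1.0
```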

In biology and medicine the ideas and methods of information theory have found application in the study of information transmission in the nervous system (see Nervous impulse), in the study of the mechanisms of heredity (see Genetic code, Heredity), in the analysis of the mechanisms of image perception (see Recognition of images), etc. The methods and achievements of information theory also find wide application in psychology, sociology, pedagogy and other areas of human knowledge.

**Bibliography:** Goldman S. Information Theory, translated from English, Moscow, 1957; Kogan I. M. Applied Information Theory, Moscow, 1981, bibliogr.; Seravin L. N. Information Theory from the Point of View of a Biologist, Leningrad, 1973; Information Theory in Biology, translated from English, edited by L. A. Blyumenfeld, Moscow, 1960; Ursul A. D. Information, Moscow, 1971, bibliogr.; Shannon C. E. Works on Information Theory and Cybernetics, translated from English, Moscow, 1963; Shreyder Yu. A. On One Model of the Semantic Theory of Information, in: Problems of Cybernetics, edited by A. A. Lyapunov, v. 13, p. 233, Moscow, 1965.

*G. A. Shastova*