PROBABILITY THEORY

From Big Medical Encyclopedia

PROBABILITY THEORY is the branch of mathematics that studies the regularities of mass random phenomena; its methods have found broad application in medical science and practice. The methods of probability theory are called probabilistic, or statistical. They are used in a variety of medical investigations, e.g., in the processing of laboratory and clinical data, in anthropometric and epidemiological studies, in the diagnosis of diseases, etc. Statistical methods form the foundation of sanitary statistics (see). With the broad introduction of electronic diagnostic equipment, computers, and cybernetic methods, the flow of information requiring statistical processing increases sharply, which gives the methods of statistical analysis a special role.

The initial data for statistical calculations are obtained experimentally. The methods for analyzing the results of experiments and estimating statistical characteristics from them, e.g., probabilities of events, mathematical expectations, variances, and correlation moments, are drawn from mathematical statistics, which is one of the branches of probability theory.

Basic concepts and laws. In probability theory, an experiment (trial) is a set of actions as a result of which a random phenomenon is observed, and any qualitative outcome of an experiment is called an event. The frequency h of some event A is the ratio of the number m of trials in which the event A occurred to the number n of all trials conducted (h = m/n). With a large number of trials (see Law of large numbers), experiment reveals the fundamental regularity of mass random phenomena, which consists in a certain stability of the frequencies of events. The probability of an event A, denoted P(A), is the number characterizing the event near which the frequency of the event stabilizes as the number of trials increases without limit. The probability of an impossible event is taken equal to 0, and that of a certain event equal to 1. In general, 0 ≤ P(A) ≤ 1. Probability is an objective numerical characteristic of an event that exists independently of any particular trials.
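
The stabilization of frequencies can be illustrated numerically. The following Python sketch (an illustration added here, not part of the original article; the experiment is assumed to be a toss of a fair coin) simulates series of trials of growing length and prints the frequency of the event "heads", which settles near its probability 0.5.

import random

# Minimal sketch (assumed example): simulate n tosses of a fair coin and
# compute the frequency h = m/n of the event "heads" for growing n.
random.seed(1)

for n in (10, 100, 1_000, 10_000, 100_000):
    m = sum(random.random() < 0.5 for _ in range(n))
    print(f"n = {n:>6}  h = m/n = {m / n:.4f}")   # h approaches P = 0.5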

Probabilities of events obey the laws of addition and multiplication of probabilities. The sum of events A1, A2, ..., An is the event consisting in the occurrence of some one of these events, no matter which. The law of addition of probabilities states that the probability of the sum of mutually exclusive events is equal to the sum of the probabilities of these events:

P(A1 + A2 + ... + An) = P(A1) + P(A2) + ... + P(An).

A set of events is called a complete group of events if at least one of them necessarily occurs as a result of the experiment.
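
As a concrete illustration of the addition law and of a complete group (an example assumed here, not taken from the article), consider one roll of a fair die and the mutually exclusive events A1 = {1, 2}, A2 = {3, 4}, A3 = {5, 6}:

# Assumed example: the three events are mutually exclusive and form a
# complete group, so by the addition law their probabilities sum to 1.
p = {"A1": 2 / 6, "A2": 2 / 6, "A3": 2 / 6}

total = sum(p.values())
print(f"P(A1 + A2 + A3) = {total:.3f}")   # 1.000: some event of the group always occurs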

The product of two events A1 and A2 is the event consisting in the joint occurrence of both events. The conditional probability of an event A2 with respect to an event A1 is the number P(A2 | A1) near which the frequency of occurrence of the event A2, calculated only over that part of the trials in which the event A1 was observed, tends to stabilize.

The law of multiplication of probabilities states that the probability of the product of two events is equal to the probability of one of them multiplied by the conditional probability of the other: P(A1A2) = P(A1)P(A2 | A1). The law of multiplication of probabilities for any finite number of events is derived from the law of multiplication for two events by induction (see Inductive inference). Events A1, A2, ..., An are called independent if each of them does not depend on each of the others or on any of the products that can be formed from them. The probability of the joint occurrence of independent events is equal to the product of their probabilities:

P(A1A2...An) = P(A1)P(A2)...P(An).
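
A brief simulation (an assumed illustration, not from the article) checks the multiplication law for two independent events, taking two tosses of a fair coin and the event "heads on both tosses":

import random

# Assumed example: A1 = "heads on the first toss", A2 = "heads on the second".
# The tosses are independent, so P(A1A2) = P(A1)P(A2) = 0.5 * 0.5 = 0.25.
random.seed(2)

n = 100_000
both = sum(random.random() < 0.5 and random.random() < 0.5 for _ in range(n))
print(f"simulated P(A1A2) = {both / n:.3f}, theoretical = {0.5 * 0.5:.3f}")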

To calculate the probability of some event A when the conditional probabilities P(A | Hi), i = 1, 2, ..., n, of this event with respect to the mutually exclusive events H1, H2, ..., Hn are known, together with the probabilities P(Hi), i = 1, 2, ..., n, of the events H1, H2, ..., Hn, the formula of total probability is applied:

P(A) = P(H1)P(A | H1) + P(H2)P(A | H2) + ... + P(Hn)P(A | Hn).

When it is required to find the conditional probabilities of the mutually exclusive events forming a complete group with respect to some observed event, Bayes's formula is used (see Bayes rule).
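
A short numerical sketch may help fix both formulas; the disease prevalence and test characteristics below are purely illustrative assumptions, not data from the article:

# Hypothetical diagnostic example (all numbers assumed). The events
# H1 = "patient has the disease" and H2 = "patient does not" form a
# complete group; B = "test result is positive".
p_h = {"H1": 0.01, "H2": 0.99}            # P(Hi)
p_b_given_h = {"H1": 0.95, "H2": 0.05}    # P(B | Hi)

# Formula of total probability: P(B) = sum over i of P(Hi) P(B | Hi).
p_b = sum(p_h[h] * p_b_given_h[h] for h in p_h)

# Bayes's formula: P(H1 | B) = P(H1) P(B | H1) / P(B).
p_h1_given_b = p_h["H1"] * p_b_given_h["H1"] / p_b

print(f"P(B) = {p_b:.4f}")                # 0.0590
print(f"P(H1 | B) = {p_h1_given_b:.4f}")  # about 0.16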

A random variable X is a quantity that can take various values as a result of an experiment, and it is impossible to predict in advance which value it will take. For a discrete random variable the possible values are separated from one another by intervals free of other possible values. The distribution law of such a variable expresses the correspondence between its possible values x1, x2, ..., xN and the probabilities P1, P2, ..., PN of the events that the random variable X takes these values. For a continuous random variable the possible values fill the entire number axis or some interval of it. In this case the law of distribution of probabilities is given by the probability density f(x), which is the limit of the ratio of the probability that the value falls in an infinitesimal interval (x, x + Δx) to the length Δx of this interval as the interval contracts to a point. The curve representing the function f(x) is called the distribution curve. The most widespread is the "normal distribution". When several random variables are studied simultaneously, the joint law of distribution of probabilities of these variables is used. For the solution of many practical problems it is not necessary to know the distribution law of a random variable at all; it is sufficient to know only its mean value and a measure of its dispersion about that mean.
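
To make the two cases concrete (an assumed illustration, not part of the article), the sketch below tabulates the distribution law of a discrete random variable, the number of heads in two tosses of a fair coin, and evaluates the normal probability density at a few points:

import math

# Discrete case (assumed example): X = number of heads in two fair coin
# tosses; the distribution law pairs each possible value with its probability.
law = {0: 0.25, 1: 0.50, 2: 0.25}
print("distribution law:", law, " sum of probabilities =", sum(law.values()))

# Continuous case: normal density with mean m and standard deviation s.
def normal_density(x, m=0.0, s=1.0):
    return math.exp(-((x - m) ** 2) / (2 * s ** 2)) / (s * math.sqrt(2 * math.pi))

for x in (-1.0, 0.0, 1.0):
    print(f"f({x:+.1f}) = {normal_density(x):.4f}")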

The mean value, or mathematical expectation, of a random variable X (denoted MX or mx) is the sum of its possible values multiplied by their probabilities. The dispersion of a random variable X is characterized by the variance Dx or by the standard deviation σx = √Dx; the variance is defined as the mathematical expectation of the square of the deviation from the mathematical expectation: Dx = M(X − mx)². For the calculation of mx and Dx of discrete and continuous random variables, respectively, the following formulas are applied:

mx = Σ xi Pi,   Dx = Σ (xi − mx)² Pi   (discrete case);
mx = ∫ x f(x) dx,   Dx = ∫ (x − mx)² f(x) dx   (continuous case),

where the sums are taken over all possible values and the integrals over the whole range of possible values.
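
For the discrete case these formulas reduce to simple weighted sums; the brief sketch below (with assumed values, continuing the two-coin example) computes mx, Dx, and σx:

import math

# Assumed discrete distribution: X = number of heads in two fair coin tosses.
law = {0: 0.25, 1: 0.50, 2: 0.25}          # value -> probability

m_x = sum(x * p for x, p in law.items())                  # mathematical expectation
d_x = sum((x - m_x) ** 2 * p for x, p in law.items())     # variance
sigma_x = math.sqrt(d_x)                                  # standard deviation

print(f"m_x = {m_x}, D_x = {d_x}, sigma_x = {sigma_x:.4f}")
# m_x = 1.0, D_x = 0.5, sigma_x ≈ 0.7071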

For two random variables X1 and X2, one additionally defines their correlation moment (covariance) Kx1x2, which is the mathematical expectation of the product of the deviations of the random variables from their mathematical expectations: Kx1x2 = M[(X1 − mx1)(X2 − mx2)]. The correlation moment characterizes the linear relation between the random variables. When there are several random variables, the correlation moments between all of them are determined.
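
A small numerical sketch (with sample values assumed here for illustration) shows how an estimate of the correlation moment from paired observations reflects a linear relation between two variables:

# Assumed paired observations of two random variables X1 and X2.
x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [2.1, 3.9, 6.2, 8.1, 9.8]             # roughly 2 * x1: a strong linear relation

m1 = sum(x1) / len(x1)                     # estimate of mx1
m2 = sum(x2) / len(x2)                     # estimate of mx2

# Correlation moment (covariance): mean product of deviations from the means.
k12 = sum((a - m1) * (b - m2) for a, b in zip(x1, x2)) / len(x1)
print(f"K_x1x2 = {k12:.3f}")               # positive: X2 tends to grow with X1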

Probability theory, along with random variables, also considers random processes (see), i.e., functions of time whose values at each moment of time are random variables. The branch of probability theory in which consideration is limited to mathematical expectations, variances, and correlation moments only is called correlation theory.



V. S. Pugachev, I. N. Sinitsyn.
