Random variables. Discrete random variables. A random variable and its distribution function




Discrete random variables are random variables that take only isolated values, separated from one another, which can be listed in advance. Examples:
- the number of heads in three tosses of a coin;
- the number of hits on a target in 10 shots;
- the number of calls received at an ambulance station per day.




The distribution law of a random variable is any relation that establishes a connection between the possible values of the random variable and the corresponding probabilities. The distribution law of a random variable can be specified in the form of a table, a graph, or a formula (analytically).




Calculating the probabilities of the values of a random variable (the number of heads in two tosses of a fair coin):
Number of heads 0 — event TT — probability 0.5 · 0.5 = 0.25
Number of heads 1 — events HT or TH — probability 0.5 · 0.5 + 0.5 · 0.5 = 0.5
Number of heads 2 — event HH — probability 0.5 · 0.5 = 0.25
Sum of probabilities: 0.25 + 0.50 + 0.25 = 1
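The enumeration above can be checked mechanically: list all equally likely outcomes of two tosses, count heads in each, and divide by the number of outcomes. A minimal sketch:

```python
from itertools import product
from collections import Counter

# All 4 equally likely outcomes of two tosses of a fair coin (H = heads, T = tails).
outcomes = ["".join(t) for t in product("HT", repeat=2)]  # ['HH', 'HT', 'TH', 'TT']
counts = Counter(o.count("H") for o in outcomes)

# P(number of heads = k) = favourable outcomes / total outcomes
probs = {k: counts[k] / len(outcomes) for k in sorted(counts)}
print(probs)  # {0: 0.25, 1: 0.5, 2: 0.25}
```

The probabilities sum to one, as the slide's check requires.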




Calculating the values of a distribution series. Problem: A shooter fires 3 shots at a target. The probability of hitting the target with each shot is 0.4. For each hit the shooter is awarded 5 points. Construct the distribution series for the number of points scored. The probabilities of the events follow the binomial distribution: P(k hits) = C(3, k) · 0.4^k · 0.6^(3−k), k = 0, 1, 2, 3. Notation: hit = 1, miss = 0. The complete group of events: 000, 100, 010, 001, 110, 101, 011, 111.


Distribution series of the number of points scored:
Number of points:  0      5      10     15
Probability:       0.216  0.432  0.288  0.064
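The series can be computed directly from the binomial formula with n = 3 and p = 0.4, mapping k hits to 5k points:

```python
from math import comb

# Binomial probabilities for n = 3 shots with hit probability p = 0.4;
# k hits correspond to 5k points.
n, p = 3, 0.4
series = {5 * k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

print({pts: round(pr, 3) for pts, pr in series.items()})
# {0: 0.216, 5: 0.432, 10: 0.288, 15: 0.064}
```

The four probabilities sum to one, confirming the series is a complete distribution.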


Addition and multiplication operations on random variables. The sum of two random variables X and Y is a random variable whose values are obtained by adding every value of X to every value of Y; the corresponding probabilities are multiplied.
X: 0, 1, 2    p: 0.2, 0.7, 0.1
Y: 1, 2, 3    p: 0.3, 0.5, 0.2


Operations of addition of random variables: Z = X + Y
x + y: 0+1=1, 0+2=2, 0+3=3, 1+1=2, 1+2=3, 1+3=4, 2+1=3, 2+2=4, 2+3=5
p:     0.06, 0.10, 0.04, 0.21, 0.35, 0.14, 0.03, 0.05, 0.02
Collecting equal sums gives the distribution series of Z:
Z: 1, 2, 3, 4, 5
p: 0.06, 0.31, 0.42, 0.19, 0.02
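The bookkeeping above (pairwise sums, products of probabilities, merging of equal sums) is a small convolution and can be sketched in a few lines:

```python
from collections import defaultdict

X = {0: 0.2, 1: 0.7, 2: 0.1}
Y = {1: 0.3, 2: 0.5, 3: 0.2}

# Z = X + Y: add every value of X to every value of Y, multiply the
# corresponding probabilities, and merge equal sums.
Z = defaultdict(float)
for x, px in X.items():
    for y, py in Y.items():
        Z[x + y] += px * py

print({z: round(p, 2) for z, p in sorted(Z.items())})
# {1: 0.06, 2: 0.31, 3: 0.42, 4: 0.19, 5: 0.02}
```

The same loop with `x * y` as the key gives the product of the two random variables.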


Operations of multiplication of random variables. The product of two random variables X and Y is a random variable whose values are obtained by multiplying every value of X by every value of Y; the corresponding probabilities are multiplied.
X: 0, 1, 2    p: 0.2, 0.7, 0.1
Y: 1, 2, 3    p: 0.3, 0.5, 0.2








Properties of the distribution function:
0 ≤ F(x) ≤ 1;
F(x) is a non-decreasing function;
the probability that the random variable X falls into the interval (a, b) equals the difference between the values of the distribution function at the right and left ends of the interval: P(a ≤ X < b) = F(b) − F(a).
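The interval property can be illustrated on a small, hypothetical distribution series (the values and probabilities below are chosen only for the example):

```python
# Hypothetical distribution series; F(x) = P(X < x).
dist = {1: 0.3, 2: 0.5, 3: 0.2}

def F(x):
    """Distribution function of the discrete variable: sum of p over values < x."""
    return sum(p for v, p in dist.items() if v < x)

# P(a <= X < b) = F(b) - F(a); here P(2 <= X < 4) = 0.5 + 0.2.
a, b = 2, 4
print(round(F(b) - F(a), 2))  # 0.7
```

The same difference of F values works for any pair of interval endpoints.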


Basic characteristics of discrete random variables. The mathematical expectation (mean value) of a random variable equals the sum of the products of the values taken by this variable and the corresponding probabilities: M(X) = x1P1 + x2P2 + … + xnPn = Σ xiPi.




EXAMPLE: Calculate the main numerical characteristics for the number of drug orders received in 1 hour.

xi | pi  | xi·pi | (xi − M)²          | (xi − M)²·pi
2  | 0.1 | 0.2   | (2 − 3.6)² = 2.56  | 0.256
3  | 0.3 | 0.9   | (3 − 3.6)² = 0.36  | 0.108
4  | 0.5 | 2.0   | (4 − 3.6)² = 0.16  | 0.080
5  | 0.1 | 0.5   | (5 − 3.6)² = 1.96  | 0.196

M(x) = 3.6, D(x) = 0.64
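The table's column sums are easy to reproduce programmatically, which is a useful check on hand calculations:

```python
# Distribution series from the drug-orders example.
xi = [2, 3, 4, 5]
pi = [0.1, 0.3, 0.5, 0.1]

M = sum(x * p for x, p in zip(xi, pi))             # mathematical expectation
D = sum((x - M) ** 2 * p for x, p in zip(xi, pi))  # variance (by definition)
print(round(M, 2), round(D, 2))  # 3.6 0.64
```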
RECOMMENDED READING
Basic literature:
- Ganicheva A.V., Kozlov V.P. Mathematics for Psychologists. M.: Aspect-Press, 2005.
- Pavlushkov I.V. Fundamentals of Higher Mathematics and Mathematical Statistics. M.: GEOTAR-Media.
- Zhurbenko L. Mathematics in Examples and Problems. M.: Infra-M.
Teaching aids:
- Shapiro L.A., Shilina N.G. Guide to Practical Classes in Medical and Biological Statistics. Krasnoyarsk: Polikom LLC, 2003.


2 RANDOM VARIABLES AND THEIR DISTRIBUTION LAWS. Distribution series. Distribution polygon. The distribution law of a random variable is any relation that establishes a connection between the possible values of the random variable and the corresponding probabilities.


3 Consider a discrete random variable X with possible values x1, x2, …, xn. Each of these values is possible but not certain: X can take each of them with some probability. As a result of the experiment X will take one of these values, i.e., one event from a complete group of incompatible events will occur. Denote the probabilities of these events by the letter p with the corresponding indices: P(X = x1) = p1; P(X = x2) = p2; …; P(X = xn) = pn. Since the incompatible events form a complete group, the sum of the probabilities of all possible values of the random variable equals one: p1 + p2 + … + pn = 1.


4 The distribution series of the random variable X has the form:

xi | x1 | x2 | … | xn
pi | p1 | p2 | … | pn

To give the distribution series a more visual form, it is often represented graphically: the possible values of the random variable are plotted along the abscissa axis, and the probabilities of these values along the ordinate axis. The resulting figure is called a distribution polygon.






7 The distribution function F(x) is sometimes also called the cumulative distribution function or the cumulative distribution law. The distribution function is the most universal characteristic of a random variable: it exists for all random variables, both discrete and continuous. The distribution function fully characterizes a random variable from the probabilistic point of view, i.e., it is one of the forms of the distribution law.


8 Let us formulate some general properties of the distribution function.
1. The distribution function F(x) is a non-decreasing function of its argument: for x2 > x1, F(x2) ≥ F(x1).
2. At minus infinity the distribution function equals zero: F(−∞) = 0.
3. At plus infinity the distribution function equals one: F(+∞) = 1.




10 Without giving a rigorous proof of these properties, we will illustrate them using a visual geometric interpretation. To do this, we will consider the random variable X as a random point X on the Ox axis, which as a result of experiment can take one or another position. Then the distribution function F(x) is the probability that a random point X as a result of the experiment will fall to the left of point x.


11 Distribution density. The function f(x), the derivative of the distribution function, characterizes the density with which the values of the random variable are distributed at a given point. This function is called the distribution density (otherwise the "probability density") of a continuous random variable. Sometimes f(x) is also called the "differential distribution function" or the "differential distribution law" of the variable X.






14 Consider a continuous random variable X with distribution density f(x) and an elementary interval dx adjacent to the point x. The probability that the random variable X falls on this elementary interval (to within infinitesimals of higher order) is f(x)dx. The quantity f(x)dx is called the probability element. Geometrically, it is the area of the elementary rectangle whose base is the segment dx.




16 Let us express the probability that X falls on the segment from α to β through the distribution density. It is equal to the sum of the probability elements over the whole segment, i.e., the integral: P(α ≤ X < β) = ∫αβ f(x) dx. Geometrically, the probability that X falls on the segment (α, β) equals the area under the distribution curve over this segment.
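The integral can be approximated numerically for any given density. A minimal sketch, using the hypothetical triangular density f(x) = 2x on [0, 1] (chosen only so the answer is easy to check by hand, since ∫2x dx = x²):

```python
def f(x):
    # Hypothetical density: f(x) = 2x on [0, 1], zero elsewhere.
    return 2 * x if 0 <= x <= 1 else 0.0

def prob(alpha, beta, n=100_000):
    """Midpoint-rule approximation of the integral of f over [alpha, beta]."""
    h = (beta - alpha) / n
    return sum(f(alpha + (i + 0.5) * h) for i in range(n)) * h

print(round(prob(0.0, 1.0), 4))  # 1.0   (total probability)
print(round(prob(0.2, 0.5), 4))  # 0.21  (= 0.5^2 - 0.2^2)
```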


17 Basic properties of the distribution density.
1. The distribution density is a non-negative function: f(x) ≥ 0. This follows directly from the fact that the distribution function F(x) is non-decreasing.
2. The integral of the distribution density over infinite limits equals one: ∫−∞+∞ f(x) dx = 1.




19 The mathematical expectation of a random variable is the sum of the products of all possible values of the random variable and the probabilities of these values: M[X] = Σ xi pi, where the pi are the probabilities of the values xi.




21 The concept of moment is widely used in mechanics to describe the distribution of masses. Exactly the same techniques are used in probability theory to describe the basic properties of the distribution of a random variable. Most often, two types of moments are used in practice: initial and central.






24 The central moment of order s of a random variable X is the mathematical expectation of the s-th power of the corresponding centered random variable: μs = M[(X − mx)^s]. For any random variable the first-order central moment equals zero, since the mathematical expectation of a centered random variable is always zero.


25 Of all the moments, the first initial moment (the mathematical expectation) and the second central moment are most often used as characteristics of a random variable. The second central moment is called the variance of the random variable, D[X]. By the definition of the central moment: D[X] = M[(X − mx)²], i.e., the variance of a random variable X is the mathematical expectation of the square of the corresponding centered variable.



29 NORMAL DISTRIBUTION LAW. The normal distribution law (often called Gauss's law) plays an extremely important role in probability theory and occupies a special position among other distribution laws. It is the distribution law most frequently encountered in practice. The main feature that distinguishes the normal law from other laws is that it is a limiting law: other distribution laws approach it under very common, typical conditions.


30 The distribution curve of the normal law has a symmetric bell shape. The maximum ordinate of the curve, equal to 1/(σ√(2π)), corresponds to the point x = m; as we move away from the point m the distribution density decreases, and as x → ±∞ the curve asymptotically approaches the abscissa axis.


33 RAYLEIGH DISTRIBUTION LAW. The modulus of a vector in the plane whose coordinates are independent random variables with a normal distribution, zero mean and unit variance is described by the Rayleigh distribution. The Rayleigh distribution arises when the measurement errors along the x and y coordinates are independent and normally distributed with equal variances.

Test questions
1. What is called a random variable?
2. What types of random variables do you know?
3. What is called a discrete random variable?
4. What is called the distribution law of a random variable?
5. How can one specify the distribution law of a random variable?
6. How can one specify the distribution law of a DSV?
7. Name the main numerical characteristics of a DSV and write down the formulas for calculating them.

1. Types of random variables

One of the most important concepts in probability theory is the concept of a random variable. A quantity is called random if, as a result of an experiment, it can take any value not known in advance.

Random variables (RV) are divided into discrete random variables (DSV) and continuous random variables (NSV).

A discrete random variable (DSV) is a random variable that takes separate, isolated values forming a countable set. Example: the number of visitors to a clinic during a day.

A continuous random variable (NSV) is a random variable that takes any value from some interval. Example: the weight of a randomly selected tablet of some drug.

Random variables are denoted by capital letters of the Latin alphabet: X, Y, Z, etc., and their values by the corresponding lowercase letters: x, y, z, etc. Example: if the random variable X has three possible values, they can be denoted as follows: X: x1, x2, x3.

2. Distribution of a discrete random variable

The distribution law of a DSV is the correspondence between its possible values and their probabilities. The distribution law can be presented in the form of a table, a formula, or graphically.

When the distribution law of a DSV is given as a table, the first row contains the possible values and the second their probabilities:

X | x1 | x2 | … | xn
P | p1 | p2 | … | pn

Taking into account that in one trial the random variable takes one and only one possible value, we conclude that the events X = x1, X = x2, …, X = xn form a complete group; therefore the sum of the probabilities of these events, i.e., the sum of the probabilities in the second row of the table, equals one: p1 + p2 + … + pn = 1.

For clarity, the distribution law of a DSV can be represented graphically: in a rectangular coordinate system the points with coordinates (xi; pi) are plotted and then connected by straight-line segments. The resulting figure is called a distribution polygon.

3. Distribution function

The distribution function of a random variable X is the function of a real variable x defined by the equality F(x) = P(X < x). It is also called the integral distribution function, and it is defined for both DSV and NSV.

Since to the left of the value x1 the random variable X takes no values, for x ≤ x1 the probability of the event X < x equals zero. For x1 < x ≤ x2 the event X < x occurs only when X = x1, so its probability is p1. For x > x2 the random variable can already take the two possible values x1 and x2, so the probability of the event X < x equals the sum p1 + p2, and so on.

If the discrete values of the random variable x1, x2, …, xn are arranged in ascending order, then each value xi is put into correspondence with the sum of the probabilities of all previous values plus the probability pi:

x1 | x2      | x3           | … | xn
p1 | p1 + p2 | p1 + p2 + p3 | … | p1 + p2 + p3 + … + pn

F(x) =
  0            for x ≤ x1;
  p1           for x1 < x ≤ x2;
  p1 + p2      for x2 < x ≤ x3;
  …
  1            for x > xn.
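The step levels of F are just the partial sums of the probabilities. A minimal sketch with a hypothetical distribution series (the values below are illustrative only):

```python
from itertools import accumulate

xs = [1, 2, 3]        # hypothetical DSV values, in ascending order
ps = [0.2, 0.5, 0.3]  # their probabilities

# Step levels of F: the partial sums p1, p1+p2, ..., 1.
levels = [round(s, 2) for s in accumulate(ps)]
print(levels)  # [0.2, 0.7, 1.0]

def F(x):
    # F(x) = sum of p_i over all x_i < x (a step function).
    return sum((p for xi, p in zip(xs, ps) if xi < x), 0.0)

print([round(F(t), 2) for t in (0.5, 1.5, 2.5, 3.5)])  # [0.0, 0.2, 0.7, 1.0]
```

Evaluating F just below and just above each xi reproduces the step figure described next.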

By plotting the possible values of the DSV X along the abscissa axis and the corresponding sums of probabilities along the ordinate axis, we obtain a step figure, which is the graph of the probability distribution function.

Properties of the distribution function of a random variable X:
1) 0 ≤ F(x) ≤ 1;
2) x1 < x2 ⇒ F(x1) ≤ F(x2).

4. Numerical characteristics of discrete random variables

1) Mathematical expectation and its properties

The mathematical expectation of a DSV X is the sum of the products of all its values and the corresponding probabilities:
M(X) = x1p1 + x2p2 + … + xnpn = Σ (i = 1 to n) xi pi

Probabilistic meaning of the mathematical expectation: the mathematical expectation is approximately equal to the arithmetic mean of the observed values of the random variable. (On the number axis the possible values lie to the left and to the right of the mathematical expectation, i.e., the mathematical expectation is greater than the least and less than the greatest possible value.)

Properties of the mathematical expectation
1. The mathematical expectation of a constant equals that constant: M(C) = C.
2. A constant factor can be taken outside the expectation sign: M(CX) = C·M(X).
3. The mathematical expectation of the sum of a finite number of random variables equals the sum of their mathematical expectations: M(X + Y) = M(X) + M(Y).
4. The mathematical expectation of the product of a finite number of independent random variables equals the product of their mathematical expectations: M(X·Y) = M(X)·M(Y). (Two random variables are called independent if the distribution law of one of them does not depend on which possible values the other takes.)

2) Variance and its properties

The variance (scattering) of a DSV is the mathematical expectation of the squared deviation of the random variable from its mathematical expectation:
D(X) = M[(X − M(X))²]

Properties of the variance:
1. The variance of a constant equals zero: D(C) = 0.
2. A constant factor can be taken outside the variance sign by squaring it: D(CX) = C²·D(X).
3. The variance of the sum of a finite number of independent random variables equals the sum of their variances: D(X + Y) = D(X) + D(Y).

Theorem. The variance of a DSV equals the difference between the mathematical expectation of the square of the DSV X and the square of its mathematical expectation:
D(X) = M(X²) − [M(X)]²
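The theorem is easy to verify numerically: compute the variance both by the definition and by the shortcut and compare. The distribution series below is hypothetical, chosen only for the check:

```python
# Hypothetical distribution series.
xs = [0, 1, 2]
ps = [0.2, 0.7, 0.1]

M = sum(x * p for x, p in zip(xs, ps))
D_def = sum((x - M) ** 2 * p for x, p in zip(xs, ps))      # definition
D_short = sum(x * x * p for x, p in zip(xs, ps)) - M ** 2  # M(X^2) - [M(X)]^2

print(round(D_def, 2), round(D_short, 2))  # 0.29 0.29
```

In hand calculations the shortcut usually involves less arithmetic, which is why the worked examples below use it.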

3) Standard deviation

The standard deviation of a random variable X is the arithmetic (non-negative) value of the square root of its variance:
σ(X) = √D(X)

Example. Calculate the mathematical expectation, variance and standard deviation of the discrete random variable X, defined as the number of students in a randomly selected group, using the following data:

X | 8   | 9   | 10  | 11  | 12
P | 0.2 | 0.1 | 0.3 | 0.2 | 0.2

M(X) = 8·0.2 + 9·0.1 + 10·0.3 + 11·0.2 + 12·0.2 = 1.6 + 0.9 + 3 + 2.2 + 2.4 = 10.1;
D(X) = 8²·0.2 + 9²·0.1 + 10²·0.3 + 11²·0.2 + 12²·0.2 − 10.1² = 103.9 − 102.01 = 1.89;
σ(X) = √1.89 ≈ 1.37.
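The same numbers fall out of a direct computation over the distribution series:

```python
# Distribution series from the students example.
X = [8, 9, 10, 11, 12]
P = [0.2, 0.1, 0.3, 0.2, 0.2]

M = sum(x * p for x, p in zip(X, P))
D = sum(x * x * p for x, p in zip(X, P)) - M ** 2  # D = M(X^2) - [M(X)]^2
sigma = D ** 0.5

print(round(M, 2), round(D, 2), round(sigma, 2))  # 10.1 1.89 1.37
```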

Remark. Expectation and variance of the number of occurrences of an event in independent trials.

If the probability of occurrence of event A in each trial does not depend on the outcomes of the other trials, then the trials are independent. Let these probabilities be the same and equal to p. Then the probability that event A does not occur in a trial is q = 1 − p.

Theorem. The mathematical expectation of the number of occurrences of event A in n independent trials equals the product of the number of trials and the probability of occurrence of event A in each trial: M(X) = n·p.

Theorem. The variance of the number of occurrences of event A in n independent trials equals the product of the number of trials and the probabilities of occurrence and non-occurrence of event A in one trial: D(X) = n·p·q.

Example. Five pharmacies are checked for their annual balance. The probability of a correctly drawn-up balance sheet at each pharmacy is 0.7. Find the mathematical expectation and the variance of the number of correctly drawn-up balance sheets.
Solution. By the condition, n = 5; p = 0.7; q = 1 − 0.7 = 0.3. Hence M(X) = n·p = 5·0.7 = 3.5 and D(X) = n·p·q = 5·0.7·0.3 = 1.05.
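The shortcut formulas M = np and D = npq can be cross-checked against a direct computation over the binomial distribution series:

```python
from math import comb

n, p = 5, 0.7
q = 1 - p

# Binomial distribution series for the number of correct balance sheets.
pmf = [comb(n, k) * p**k * q**(n - k) for k in range(n + 1)]

M = sum(k * pk for k, pk in enumerate(pmf))
D = sum(k * k * pk for k, pk in enumerate(pmf)) - M ** 2

print(round(M, 2), round(D, 2))  # 3.5 1.05  (agrees with np and npq)
```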


Suppose that n independent trials are performed, in each of which event A may or may not occur. Let the probability of occurrence of event A in each trial be p. Consider the random variable equal to the number of occurrences of event A in the n independent trials. Its range consists of all integers from 0 to n inclusive. The probability distribution law p(m) is determined by the Bernoulli formula: p(m) = C(n, m) · p^m · (1 − p)^(n−m).


The probabilities p(xi) are calculated using the Bernoulli formula for n = 10. For x > 6 they are practically equal to zero. The graph of the function p(x) is shown in Fig. 3.
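The "practically zero" tail can be checked numerically. The slide does not state the success probability p, so the value p = 0.2 below is an assumption chosen purely for illustration:

```python
from math import comb

# The slide does not state p; take p = 0.2 purely for illustration.
n, p = 10, 0.2
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

# Total probability mass beyond k = 6.
tail = sum(pk for k, pk in pmf.items() if k > 6)
print(tail < 1e-3)  # True: the probabilities beyond k = 6 are practically zero
```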