Definition of a random process. Realization, cross section of a random process

Before defining a random process, let us recall the basic concepts of the theory of random variables. As is known, a random variable is a quantity that, as a result of an experiment, takes one or another value that is not known in advance. Random variables are either discrete or continuous. The main characteristic of a random variable is its distribution law, which can be given graphically or analytically. In the integral form, the distribution law is given by the distribution function F(x) = P(X < x), where P(X < x) is the probability that the current value of the random variable X is less than some value x. In the differential form, the probability density f(x) = dF(x)/dx is used. The numerical characteristics of random variables are the so-called moments, of which the most commonly used are the first-order moment, the mean value (expectation) of the random variable, and the second-order central moment, the variance. If there are several random variables (a system of random variables), the concept of the correlation moment is introduced.

A generalization of the concept of a random variable is the concept of a random function, i.e. a function that, as a result of an experiment, can take one or another form that is not known in advance. If the argument of the function is time t, it is called a random, or stochastic, process.

A specific form of a random process obtained as a result of an experiment is called a realization of the random process; it is an ordinary non-random (deterministic) function. On the other hand, at a fixed moment of time we obtain the so-called cross section of the random process, which is a random variable.

To describe random processes, the concepts of the theory of random variables are generalized in a natural way. For a fixed moment of time t₁ the random process turns into a random variable, for which one can introduce the function F₁(x₁; t₁) = P(X(t₁) < x₁), called the one-dimensional distribution law of the random process. The one-dimensional distribution law is not an exhaustive characteristic of a random process: for example, it does not characterize the correlation (connection) between individual sections of the process. Taking two different moments of time t₁ and t₂, one can introduce a two-dimensional distribution law F₂(x₁, x₂; t₁, t₂), and so on. In what follows we restrict ourselves mainly to one-dimensional and two-dimensional laws.

Consider the simplest characteristics of a random process, analogous to the numerical characteristics of a random variable: the expectation (the average over the set)

m_x(t) = M[X(t)],    (3.1)

and the variance

D_x(t) = M[(X(t) - m_x(t))²].    (3.2)

The expectation is a certain average curve around which the individual realizations of the random process are grouped, and the variance characterizes the spread of the possible realizations at each moment of time. Sometimes the standard deviation σ_x(t) = √D_x(t) is used.
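These set-average characteristics are easy to estimate numerically. The following sketch (Python with NumPy; the example process X(t) = A·sin t + B and all parameter values are illustrative assumptions, not taken from the text) simulates an ensemble of realizations, takes a cross section at a fixed moment, and estimates m_x(t) and D_x(t) by averaging over the set:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 200)        # time grid
n_real = 5000                          # number of realizations

# Each realization: x(t) = A*sin(t) + B with random A, B (hypothetical process)
A = rng.normal(1.0, 0.2, size=(n_real, 1))
B = rng.normal(0.0, 0.5, size=(n_real, 1))
X = A * np.sin(t) + B                  # shape (n_real, len(t)); each row is one realization

# Cross section at a fixed moment t0: an ordinary random variable
t0_idx = 50
section = X[:, t0_idx]

# Set-average characteristics estimated over the ensemble
m_t = X.mean(axis=0)                   # estimate of m_x(t), here ~ sin(t)
D_t = X.var(axis=0)                    # estimate of D_x(t), here ~ 0.04*sin(t)**2 + 0.25
print(m_t[t0_idx], np.sin(t[t0_idx]))
print(D_t[t0_idx], 0.04 * np.sin(t[t0_idx])**2 + 0.25)
```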

To characterize the internal structure of a random process, the concept of the correlation (autocorrelation) function is introduced:

K_x(t₁, t₂) = M[(X(t₁) - m_x(t₁))(X(t₂) - m_x(t₂))].    (3.3)

Along with the expectation (the average over the set) (3.1), one more characteristic of a random process is introduced: the mean of the random process for a separate realization (the average over time)

x̄ = lim (T→∞) (1/T) ∫₀ᵀ x(t) dt.

For two random processes, one can also introduce the concept of a cross-correlation function by analogy with (3.3).

One special case of a random process that is widely used in practice is the stationary random process: a random process whose probabilistic characteristics do not depend on time. Thus, for a stationary random process m_x(t) = m_x = const, D_x(t) = D_x = const, and the correlation function depends only on the difference τ = t₂ - t₁, i.e. K_x(t₁, t₂) = K_x(τ) is a function of one argument.

A stationary random process is to some extent analogous to ordinary steady-state processes in control systems.

Stationary random processes often possess an important property expressed by the ergodic hypothesis: for an ergodic stationary random process, any average over the set is equal to the corresponding average over time; in particular, m_x = x̄. This property often makes it possible to simplify the physical and mathematical modelling of systems under random influences.
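The ergodic property can be checked numerically. A minimal sketch, assuming the standard ergodic example X(t) = cos(t + φ) with φ uniform on [0, 2π) (this concrete process is an assumption for illustration): the time average of one long realization should approach the set average at a fixed moment.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 2000.0, 200_000)

# Stationary process X(t) = cos(t + phi), phi ~ U[0, 2*pi): a classic ergodic example
phi = rng.uniform(0.0, 2 * np.pi, size=1000)

time_avg_one = np.mean(np.cos(t + phi[0]))    # average over time, one realization
set_avg_fixed_t = np.mean(np.cos(0.0 + phi))  # average over the set at t = 0

print(time_avg_one, set_avg_fixed_t)          # both are close to the true mean m_x = 0
```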

As is known, in the analysis of deterministic signals their spectral characteristics, based on the Fourier series or the Fourier integral, are widely used. A similar concept can be introduced for stationary random processes. The difference is that for a random process the amplitudes of the harmonic components are random, and the spectrum of a stationary random process describes the distribution of the dispersion over frequencies.

The spectral density of a stationary random process is related to its correlation function by the Fourier transform pair:

S_x(ω) = ∫ K_x(τ) e^(-iωτ) dτ,    K_x(τ) = (1/2π) ∫ S_x(ω) e^(iωτ) dω,

where the integrals are taken over the whole real axis, the correlation function K_x(τ) is interpreted as the original, and S_x(ω) as its image (transform).

There are tables linking the originals and the images. For example, if K_x(τ) = D e^(-α|τ|), then S_x(ω) = 2Dα/(α² + ω²).

Let us note the connection of the spectral density and the correlation function with the dispersion D:

D = K_x(0) = (1/2π) ∫ S_x(ω) dω.
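Both relations can be verified numerically for the transform pair cited above. The sketch below (the values of D and α are arbitrary assumptions) computes S(ω) by direct numerical integration of K(τ) and then recovers the dispersion from the spectral density:

```python
import numpy as np

D, alpha = 2.0, 0.5
tau = np.linspace(-200.0, 200.0, 400_001)
K = D * np.exp(-alpha * np.abs(tau))          # K(tau) = D * exp(-alpha*|tau|)

# S(w) = integral of K(tau)*exp(-i*w*tau) dtau, expected: 2*D*alpha/(alpha^2 + w^2)
w = 1.3
S_num = np.trapz(K * np.exp(-1j * w * tau), tau).real
S_theory = 2 * D * alpha / (alpha**2 + w**2)
print(S_num, S_theory)

# Dispersion recovered from the spectral density: D = (1/2pi) * integral of S(w) dw
w_grid = np.linspace(-2000.0, 2000.0, 400_001)
S_grid = 2 * D * alpha / (alpha**2 + w_grid**2)
print(np.trapz(S_grid, w_grid) / (2 * np.pi))  # ~ D
```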

Three types of random processes, Gaussian, stationary and Markov, are widely used in practice in studying the state of various technical objects.

A Gaussian random process is a random process X(t) whose probability distributions obey the normal law. The expectation M[X(t)] and the correlation function K_x(t₁, t₂) uniquely determine all its distributions, and hence the process as a whole.

A stationary random process (a random process homogeneous in time) is a random process X(t) whose statistical characteristics are constant in time, that is, invariant under time shifts: t → t + τ, X(t) → X(t + τ) for any fixed value of τ. The process is completely determined by the expectation M[X(t)] = m = const and the correlation function

K_x(τ) = M[(X(t) - m)(X(t + τ) - m)].

A Markov random process is a random process in which the probability of the system being in any state in the future depends only on its state at the present moment of time and does not depend on how the system arrived at that state. In short, given a known "present", the "future" and the "past" of the process are independent of each other. A Markov process is often characterized by the probabilities of the system's transition from one state to another (transition probabilities).
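A minimal numerical sketch of this idea (the 3-state transition matrix below is an arbitrary assumption): each next state is drawn using only the current state, so the "future" depends on the "past" only through the "present".

```python
import numpy as np

rng = np.random.default_rng(2)

# Transition probability matrix: P[i, j] = P(next state = j | current state = i)
P = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1],
              [0.2, 0.3, 0.5]])

state = 0
path = [state]
for _ in range(10_000):
    state = rng.choice(3, p=P[state])  # depends only on the present state
    path.append(state)

# Empirical occupation frequencies approach the stationary distribution of the chain
print(np.bincount(path, minlength=3) / len(path))
```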

Changing the technical state of the system

As already mentioned, the task of predicting the technical state is, in the most general sense, to obtain certain probabilistic characteristics of the system's future performance based on inspection data about its present and past states.

Depending on which characteristic of the random process is determined during forecasting, one distinguishes reliability forecasting (determining the conditional probability of the system's failure-free operation after inspection) and technical-state forecasting (determining the conditional probability distribution density of the values of the defining parameter), both based on the past and present states. Figure 8.1 illustrates the difference between these characteristics. In the figure, x(t) is a segment of a realization of the random process X(t) describing the change in time of some defining parameter of the system, which has admissible boundaries (a, b) of variation. The realization segment is obtained by observing a specific system instance from a given class of systems over the time interval (0, t_k2). At the moment t_k2 the last inspection of the system was carried out, and on its basis one must decide whether the system is fit for operation until the next inspection moment t_k3.



Fig. 8.1. Conditional probability of failure-free operation p(x(t)) and conditional probability distribution density f(x(t)) of the values of the defining parameter

Since the external influences perceived by the system are of a random nature, the random process after the moment t_k2 can evolve in different ways (see the dotted lines in Fig. 8.1). A process that is a continuation of some initial process, given that on the interval (0, t_k2) its realization had the specific form x(t), is called a conditional, or a posteriori, random process:

X_ps(t) = X(t) | X(s) = x(s), 0 ≤ s ≤ t_k2.    (8.5)

Therefore, in order to make an informed decision on setting the date of the next inspection of the system, it is necessary to know the characteristics of the a posteriori random process. A system is considered fit for performing its task if its defining parameters are within the admissible limits (a, b) at the time of the last inspection and do not go beyond these limits until the end of the specified period of operation. Since the defining parameters going beyond the admissible boundaries is a random event, the system's performance can be assessed by the conditional probability of its failure-free operation after inspection. This is the probability that the random process does not cross the boundaries (a, b) after the moment of inspection; it is called the predicted reliability of the system and is denoted

P(x(t)=<<(ba)/X(t)=x(t), 0<

Thus, reliability prediction is the determination of the conditional probability of the system's failure-free operation, given that at the moment of inspection it was in some fixed operable state.
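A Monte Carlo sketch of predicted reliability (the boundaries, the value measured at the inspection moment, and the drift-free random-walk model of the a posteriori process are all illustrative assumptions, since the text does not fix a model): simulate continuations of the process from the last inspection and count the fraction that never leave (a, b).

```python
import numpy as np

rng = np.random.default_rng(3)

a, b = 0.0, 10.0            # admissible boundaries of the defining parameter
x_last = 7.0                # value measured at the inspection moment t_k2
n_steps, sigma = 100, 0.15  # hypothetical random-walk model of the future process

n_mc = 20_000
paths = x_last + np.cumsum(rng.normal(0.0, sigma, size=(n_mc, n_steps)), axis=1)
inside = np.all((paths > a) & (paths < b), axis=1)

# Conditional probability of failure-free operation up to the next inspection moment
print(inside.mean())
```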

The most complete characteristic of the future technical state of the system is the conditional probability distribution density of its defining parameters, that is, of the future values of the random process

f(x(t_k3) | X(t) = x(t), 0 ≤ t ≤ t_k2),

given that on the interval (0, t_k2) the realization of the process had the specific form x(t) (Fig. 8.1).

Chapter 1. Basic concepts of the theory of random processes

Definition of a random process. Basic approaches to specifying random processes. The concepts of realization and section. Elementary random processes.

A random (stochastic, probabilistic) process is a function X(t) of a real variable t whose values are random variables.

In the theory of random processes, t is treated as time, taking values from some subset T of the set of real numbers (t ∈ T, T ⊆ R).

Within classical mathematical analysis, a function y = f(t) is understood as a dependence between the variables t and y in which a specific numerical value of the argument t corresponds to a unique numerical value of the function y. For random processes the situation is fundamentally different: fixing a specific argument t produces a random variable X(t) with a known distribution law (if it is discrete) or with a given distribution density (if it is continuous). In other words, the characteristic under study is random at each moment of time, with a non-random distribution.

The values that an ordinary function y = f(t) takes at each moment of time completely determine its structure and properties. For random processes the situation is entirely different: here it is not enough to know the distribution of the random variable X(t) for each value of t; information is also needed about the expected changes and their probabilities, that is, about the degree of dependence of the upcoming values of the random process on its history.

The most general approach to describing a random process is to specify all of its finite-dimensional distributions, in which the probability of the simultaneous occurrence of the following events is given:

∀ t₁, t₂, …, t_n ∈ T, ∀ n ∈ N: X(t_i) ≤ x_i, i = 1, 2, …, n;

F(t₁; t₂; …; t_n; x₁; x₂; …; x_n) = P(X(t₁) ≤ x₁; X(t₂) ≤ x₂; …; X(t_n) ≤ x_n).

This way of describing random processes is universal but very cumbersome. To obtain substantial results, the most important special cases are singled out, which admit a more developed analytical apparatus. In particular, it is convenient to consider a random process X(t, ω) as a function of two variables t ∈ T, ω ∈ Ω, which for any fixed value t ∈ T becomes a random variable defined on the probability space (Ω, 𝒜, P), where Ω is a non-empty set of elementary events ω; 𝒜 is a σ-algebra of subsets of the set Ω, that is, the set of events; P is a probability measure defined on 𝒜.

A non-random numerical function x(t) = X(t, ω₀) is called a realization (trajectory) of the random process X(t, ω).

The cross section of a random process X(t, ω) is the random variable X(t₀, ω) corresponding to a fixed value t = t₀.

If the argument t takes all real values ​​or all values ​​from some interval T of the real axis, then one speaks of a random process with continuous time. If t takes only fixed values, then one speaks of a random process with discrete time.

If every cross section of a random process is a discrete random variable, the process is called a discrete-state process. If every section is a continuous random variable, the random process is called a continuous-state process.

In the general case, a random process cannot be specified analytically. The exception is the so-called elementary random processes, whose form is known and into which random variables enter as parameters:

X(t) = X(t, A₁, …, A_n), where A_i, i = 1, …, n, are arbitrary random variables with specific distributions.

Example 1. Consider the random process X(t) = A·e^(-t), where A is a uniformly distributed discrete random variable taking the values (-1; 0; 1); t ≥ 0. Depict all realizations of the random process X(t) and show its sections at the moments t₀ = 0, t₁ = 1, t₂ = 2.

Solution.

This random process is a process with continuous time and discrete states. At t = 0, the section of the random process X(t) is the discrete random variable A with values (-1; 0; 1), distributed uniformly.


At t = 1, the section of the random process X(t) is a discrete random variable with values (-1/e; 0; 1/e), distributed uniformly.

At t = 2, the section of the random process X(t) is a discrete random variable with values (-1/e²; 0; 1/e²), distributed uniformly.
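A short script reproduces these realizations and sections (plotting omitted, NumPy only):

```python
import numpy as np

t = np.linspace(0.0, 3.0, 300)

# The three realizations of X(t) = A*exp(-t), one per value of A in {-1, 0, 1}
realizations = {a: a * np.exp(-t) for a in (-1, 0, 1)}

# Sections at t = 0, 1, 2: discrete random variables with equally likely values
for t0 in (0.0, 1.0, 2.0):
    values = [a * np.exp(-t0) for a in (-1, 0, 1)]
    print(f"t = {t0}: section takes values {values} with probability 1/3 each")
```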

Example 2. Consider the random process X(t) = sin At, where A is a discrete random variable taking the values (0; 1; 2); the argument t takes the discrete values (0; π/4; π/2; π). Show graphically all realizations and sections of this random process.

Solution.

This random process is a process with discrete time and discrete states.

Example 3. Find the main characteristics of the random process Y(t) = X·e^(-t), t ≥ 0, where X is a normally distributed random variable with expectation m and standard deviation σ.

Solution.

Expectation: m_Y(t) = M[X e^(-t)] = e^(-t) M[X] = m e^(-t).

Variance: D_Y(t) = D[X e^(-t)] = e^(-2t) D[X] = σ² e^(-2t).

Standard deviation: σ_Y(t) = σ e^(-t).

Correlation function:

K_Y(t₁; t₂) = M[(X e^(-t₁) - m e^(-t₁))(X e^(-t₂) - m e^(-t₂))] = e^(-(t₁+t₂)) M[(X - m)²] = σ² e^(-(t₁+t₂)).

Normalized correlation function: r_Y(t₁; t₂) = K_Y(t₁; t₂) / (σ_Y(t₁) σ_Y(t₂)) = 1.

By the condition of the problem, the random variable X is normally distributed; for a fixed value of t, the section Y(t) depends linearly on the random variable X, and, by the property of the normal distribution, the section Y(t) is also normally distributed, with the one-dimensional distribution density

f(y; t) = 1/(σ e^(-t) √(2π)) · exp(-(y - m e^(-t))² / (2σ² e^(-2t))).
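These formulas are easy to confirm by simulation (the numerical values of m and σ below are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
m, sigma = 2.0, 0.5                     # assumed parameters of X ~ N(m, sigma)
X = rng.normal(m, sigma, size=100_000)

t1, t2 = 0.5, 1.5
Y1, Y2 = X * np.exp(-t1), X * np.exp(-t2)

print(Y1.mean(), m * np.exp(-t1))                   # m_Y(t) = m*exp(-t)
print(Y1.var(), sigma**2 * np.exp(-2 * t1))         # D_Y(t) = sigma^2*exp(-2t)
K12 = np.mean((Y1 - Y1.mean()) * (Y2 - Y2.mean()))
print(K12, sigma**2 * np.exp(-(t1 + t2)))           # K_Y(t1,t2) = sigma^2*exp(-(t1+t2))
```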

Example 4. Find the main characteristics of the random process Y(t) = W·e^(-Ut) (t > 0), where W and U are independent random variables; U is uniformly distributed on a segment; W has expectation m_W and standard deviation σ_W.

Solution.

Expectation: m_Y(t) = M[W e^(-Ut)] = M[W]·M[e^(-Ut)] = m_W·M[e^(-Ut)], t > 0.

Correlation function (using the independence of W and U):

K_Y(t₁; t₂) = M[W²]·M[e^(-U(t₁+t₂))] - m_W²·M[e^(-Ut₁)]·M[e^(-Ut₂)], where M[W²] = σ_W² + m_W².

Variance: D_Y(t) = K_Y(t; t) = M[W²]·M[e^(-2Ut)] - m_W²·(M[e^(-Ut)])².

Example 5. Find the one-dimensional distribution law of the random process Y(t) = V·cos(Ψt - U), where V and U are independent random variables; V is normally distributed with parameters (m_V; σ_V); Ψ = const; U is uniformly distributed on a segment.

Solution.

Mathematical expectation of a random process Y(t):

Dispersion:

Standard deviation:

We now derive the one-dimensional distribution law. Let t be a fixed moment of time, and let the random variable U take a fixed value U = u = const. Then we obtain the following conditional characteristics of the random process Y(t):

M(Y(t) | U = u) = m_V·cos(Ψt - u);

D(Y(t) | U = u) = σ_V²·cos²(Ψt - u);

σ(Y(t) | U = u) = σ_V·|cos(Ψt - u)|.

Since the random variable V is normally distributed and, for a given value of the random variable U = u, every section depends linearly on V, the conditional distribution in each section is normal and has the density:

f(y | u; t) = 1/(σ_V |cos(Ψt - u)| √(2π)) · exp(-(y - m_V cos(Ψt - u))² / (2σ_V² cos²(Ψt - u))).

The unconditional one-dimensional density of the random process Y(t) is obtained by averaging this conditional density over the distribution of U:

Obviously, this distribution is no longer normal.
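This can be seen numerically. Assuming, for illustration only, that U is uniform on [0; 2π] and picking specific values of m_V, σ_V, Ψ and t, the sampled values of Y(t) show a clearly nonzero excess kurtosis, which a normal distribution cannot have:

```python
import numpy as np

rng = np.random.default_rng(5)
m_V, sigma_V, Psi, t = 1.0, 0.8, 2.0, 0.3   # assumed parameter values
n = 200_000

V = rng.normal(m_V, sigma_V, n)
U = rng.uniform(0.0, 2 * np.pi, n)          # assumption: U ~ U[0, 2*pi]
Y = V * np.cos(Psi * t - U)

# For a normal law the excess kurtosis is 0; here it is noticeably positive
Yc = Y - Y.mean()
kurt = np.mean(Yc**4) / np.mean(Yc**2)**2 - 3.0
print(Y.mean(), Y.var(), kurt)
```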

Convergence and continuity

1. Convergence in probability.

A sequence of random variables (X_n) is said to converge in probability to a random variable X as n → ∞ if, for every ε > 0, P(|X_n - X| > ε) → 0 as n → ∞.

Notation: X_n →(P) X.

Note that as n → ∞ there is classical convergence of the probability to 1; that is, as n increases, one can guarantee probabilities arbitrarily close to 1. At the same time, one cannot guarantee the closeness of the values of the random variables X_n to the values of the random variable X for any arbitrarily large n, since we are dealing with random variables.

2. Stochastic continuity.

A random process X(t), t ∈ T, is called stochastically continuous at a point t₀ ∈ T if, for every ε > 0, P(|X(t) - X(t₀)| > ε) → 0 as t → t₀.

3. Convergence in mean of order p ≥ 1.

A sequence of random variables (X_n) is said to converge in mean of order p to a random variable X if M|X_n - X|^p → 0 as n → ∞.

Notation: X_n →(L^p) X.

In particular, (X_n) converges in mean square to a random variable X if M(X_n - X)² → 0 as n → ∞.

Notation: X = l.i.m. X_n.

A random process X(t), t ∈ T, is called mean-square continuous at a point t₀ ∈ T if M(X(t) - X(t₀))² → 0 as t → t₀.

4. Convergence almost surely (convergence with probability one).

A sequence of random variables (X_n) is said to converge almost surely to a random variable X if

P(ω : lim (n→∞) X_n(ω) = X(ω)) = 1,

where ω ∈ Ω is an elementary event of the probability space (Ω, 𝒜, P).

Notation: X_n →(a.s.) X.

5. Weak convergence.

A sequence (F_Xn(x)) of distribution functions of random variables X_n is said to converge weakly to the distribution function F_X(x) of a random variable X if there is pointwise convergence at every point of continuity of the function F_X(x).

Notation: F_Xn(x) ⇒ F_X(x).

Example 6. For the random process X(t) of Example 3, find the main characteristics of its derivative X′(t) and of its integral Y(t) = ∫₀ᵗ X(s) ds.

Solution.

1) The expectation, variance, standard deviation, correlation function and normalized correlation function of the random process X(t) have the form found in Example 3:

m_X(t) = m e^(-t);  D_X(t) = σ² e^(-2t);  σ_X(t) = σ e^(-t);  K_X(t₁; t₂) = σ² e^(-(t₁+t₂));  r_X(t₁; t₂) = 1.

2) We proceed to the characteristics of the random process X′(t). In accordance with Theorems 1-3 we get:

m_X′(t) = -m e^(-t);  D_X′(t) = σ² e^(-2t);  σ_X′(t) = σ e^(-t);  K_X′(t₁; t₂) = σ² e^(-(t₁+t₂)).

With the exception of the expectation (which changed sign), all other characteristics are fully preserved. The cross-correlation functions of the random process X(t) and its derivative X′(t) have the form:

R_XX′(t₁; t₂) = R_X′X(t₁; t₂) = -σ² e^(-(t₁+t₂)).

3) According to Theorems 4-6, the main characteristics of the integral Y(t) = ∫₀ᵗ X(s) ds of the random process X(t) are:

m_Y(t) = m(1 - e^(-t));  K_Y(t₁; t₂) = σ²(1 - e^(-t₁))(1 - e^(-t₂));  D_Y(t) = σ²(1 - e^(-t))².

The cross-correlation functions of the random process X(t) and its integral Y(t):

R_XY(t₁; t₂) = σ² e^(-t₁)(1 - e^(-t₂));  R_YX(t₁; t₂) = σ² e^(-t₂)(1 - e^(-t₁)).

An expression of the form

X(t) = m_X(t) + Σ_k V_k φ_k(t),  k = 1, 2, …,

where the φ_k(t) are non-random functions and the V_k are uncorrelated centered random variables, is called a canonical expansion of the random process X(t); the random variables V_k are called the coefficients of the canonical expansion, and the non-random functions φ_k(t) the coordinate functions of the canonical expansion.

Consider the characteristics of such a random process. Since by the condition M[V_k] = 0 and the V_k are uncorrelated, we obtain

M[X(t)] = m_X(t);  K_X(t₁; t₂) = Σ_k D_Vk φ_k(t₁) φ_k(t₂);  D_X(t) = Σ_k D_Vk φ_k²(t).

Obviously, the same random process has different canonical expansions depending on the choice of coordinate functions. Moreover, even after the coordinate functions have been chosen, there is arbitrariness in the distributions of the random variables V_k. In practice, estimates of the expectation and the correlation function are obtained from experimental results. After expanding the correlation-function estimate into a double Fourier series in the coordinate functions φ_k(t),

K_X(t₁; t₂) = Σ_k D_Vk φ_k(t₁) φ_k(t₂),

one obtains the values of the dispersions D_Vk of the random variables V_k.
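The following sketch builds a canonical expansion with assumed coordinate functions and coefficient dispersions (all specific choices are illustrative) and checks the formula K_X(t₁; t₂) = Σ_k D_Vk φ_k(t₁)φ_k(t₂) against an ensemble estimate:

```python
import numpy as np

rng = np.random.default_rng(6)
t = np.linspace(0.0, 1.0, 101)

# Hypothetical canonical expansion: X(t) = m_x(t) + sum_k V_k * phi_k(t)
m_x = 2.0 * t                                  # assumed non-random mean function
phis = [np.ones_like(t), t, t**2]              # assumed coordinate functions
D_V = [1.0, 0.5, 0.25]                         # assumed dispersions of the coefficients

n_real = 50_000
V = rng.normal(0.0, np.sqrt(D_V), size=(n_real, 3))  # uncorrelated centered coefficients
X = m_x + V @ np.vstack(phis)                        # shape (n_real, len(t))

# Check K_X(t1, t2) = sum_k D_Vk * phi_k(t1) * phi_k(t2) at two grid points
i, j = 20, 80
K_emp = np.mean((X[:, i] - X[:, i].mean()) * (X[:, j] - X[:, j].mean()))
K_th = sum(d * p[i] * p[j] for d, p in zip(D_V, phis))
print(K_emp, K_th)
```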

Example 7. The random process X(t) has the canonical expansion X(t) = m₀(t) + Σ_k V_k φ_k(t), where the V_k are normally distributed uncorrelated random variables with parameters (0; σ_k), and m₀(t) is a non-random function. Find the main characteristics of the random process X(t), including its distribution densities.

Solution.

From the general formulas obtained earlier we have:

m_X(t) = m₀(t);  K_X(t₁; t₂) = Σ_k σ_k² φ_k(t₁) φ_k(t₂);  D_X(t) = Σ_k σ_k² φ_k²(t).

In each section the random process X(t) has a normal distribution, since it is a linear combination of uncorrelated normally distributed random variables V_k, and the one-dimensional distribution density has the form:

f(x; t) = 1/√(2π D_X(t)) · exp(-(x - m₀(t))² / (2 D_X(t))).

The two-dimensional distribution law is also normal and has the following two-dimensional distribution density:

Example 8. The expectation m_X(t) and the correlation function K_X(t₁; t₂) = t₁t₂ of a random process X(t) are known. Find a canonical expansion of X(t) in coordinate functions, given that the expansion coefficients V_k are normally distributed random variables.

Solution.

The correlation function admits the expansion

K_X(t₁; t₂) = t₁t₂ = φ₁(t₁)·φ₁(t₂),

i.e. a single-term expansion with the coordinate function φ₁(t) = t and the dispersion D_V1 = 1. Consequently,

X(t) - m_X(t) = V₁·t, where M[V₁] = 0, D_V1 = 1.

Distribution density of the random variable V₁:

f(v) = (1/√(2π)) e^(-v²/2).

The canonical expansion of the random process X(t) has the form:

X(t) = m_X(t) + V₁·t.

Stationary random processes in the narrow and broad sense.

A significant number of events occurring in nature, in particular those related to the operation of technical devices, are "almost" steady-state: the pattern of such events, subject to minor random fluctuations, is nevertheless generally preserved in time. In these cases it is customary to speak of stationary random processes.

For example, a pilot maintains a given flight altitude, but various external factors (gusts of wind, updrafts, changes in engine thrust, etc.) cause the altitude to fluctuate around the given value. Another example is the trajectory of a pendulum: if it were left to itself and there were no systematic factors leading to the damping of the oscillations, the pendulum would remain in a mode of steady oscillation. But various external factors (gusts of wind, random fluctuations of the suspension point, etc.), without changing the parameters of the oscillatory mode as a whole, make the characteristics of the motion random rather than deterministic.

A stationary (time-homogeneous) random process is a random process whose statistical characteristics do not change in time, i.e. are invariant under time shifts.

One distinguishes random processes stationary in the broad sense and in the narrow sense.

A random process X(t) is called stationary in the narrow sense if, for any n ∈ N, any t₁, t₂, …, t_n ∈ T and any τ such that t_i + τ ∈ T, the condition

F(t₁; t₂; …; t_n; x₁; x₂; …; x_n) = F(t₁ + τ; t₂ + τ; …; t_n + τ; x₁; x₂; …; x_n)

is satisfied; consequently, all n-dimensional distributions depend not on the time moments t₁; t₂; …; t_n themselves, but only on the n - 1 durations of the time intervals τ_i = t_(i+1) - t_i.

In particular, the one-dimensional distribution density does not depend on the time t at all: f(x; t) = f(x).

The two-dimensional density of the sections at the moments t₁ and t₂ depends only on τ = t₂ - t₁: f(x₁; x₂; t₁; t₂) = f(x₁; x₂; τ).

The n-dimensional density of the sections at the moments t₁; t₂; …; t_n: f(x₁; …; x_n; t₁; …; t_n) = f(x₁; …; x_n; τ₁; …; τ_(n-1)).

A random process X(t) is called stationary in the broad sense if its first- and second-order moments are invariant under a time shift, that is, its expectation does not depend on time t and is constant, m_X(t) = m_X = const, and its correlation function depends only on the length of the time interval between the sections: K_X(t₁; t₂) = k_X(τ), τ = t₂ - t₁.

Obviously, a random process stationary in the narrow sense is also stationary in the broad sense; the converse is not true.
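A numerical check of broad-sense stationarity for a classical process of this kind, X(t) = U₁ cos ωt + U₂ sin ωt with centered uncorrelated U₁, U₂ (this concrete form is an assumption made for illustration): the estimated correlation function should depend only on τ = t₂ - t₁.

```python
import numpy as np

rng = np.random.default_rng(7)
omega, sigma = 1.0, 1.0
n = 100_000

# Assumed broad-sense-stationary example: X(t) = U1*cos(w*t) + U2*sin(w*t)
U1 = rng.normal(0.0, sigma, n)
U2 = rng.normal(0.0, sigma, n)

def K(t1, t2):
    """Ensemble estimate of the correlation function K_X(t1, t2)."""
    X1 = U1 * np.cos(omega * t1) + U2 * np.sin(omega * t1)
    X2 = U1 * np.cos(omega * t2) + U2 * np.sin(omega * t2)
    return np.mean((X1 - X1.mean()) * (X2 - X2.mean()))

# Same tau = 0.7 at different absolute times: both ~ sigma^2 * cos(omega * tau)
print(K(0.0, 0.7), K(5.0, 5.7), sigma**2 * np.cos(0.7))
```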

Properties of a stationary random process X(t):

1. The correlation function of a stationary random process is even, k_X(τ) = k_X(-τ), since K_X(t₁; t₂) has the symmetry K_X(t₁; t₂) = K_X(t₂; t₁).

2. The variance of a stationary random process is a constant equal to the value of its correlation function at the point τ = 0: D_X = k_X(0).

3. |k_X(τ)| ≤ k_X(0).

4. The correlation function of a stationary random process is positive definite, that is, Σ_i Σ_j k_X(t_i - t_j) z_i z_j ≥ 0 for any moments t₁, …, t_n and any real numbers z₁, …, z_n.

The normalized correlation function ρ_X(τ) = k_X(τ)/D_X of a stationary random process is also even and positive definite, and, moreover, |ρ_X(τ)| ≤ 1.

Example 11. Find the characteristics of the random process X(t) and draw a conclusion about its type:

where U₁ and U₂ are uncorrelated random variables;

Solution.

Therefore, the random process X(t) is stationary in the broad sense. As follows from Example 10, if U₁ and U₂ are independent, centered and normally distributed random variables, then the random process is also stationary in the narrow sense.

Example 12. Prove that the random process X(t) is stationary in the broad sense:

where V and U are independent random variables; M[V] = m_V = const; U is uniformly distributed on a segment;

Solution.

We write X(t) as follows:

Since the random variable U is uniformly distributed on the segment, its distribution density has the form:

Consequently,

We get

Since the random process X(t) has a constant expectation and variance, and its correlation function is a function of τ, then, regardless of the distribution law of the random variable V, the random process X(t) is stationary in the broad sense.

Stationarily connected random processes

Random processes X(t) and Y(t) are called stationarily connected if their cross-correlation function depends only on the difference of the arguments τ = t₂ - t₁:

R_XY(t₁; t₂) = r_XY(τ).

Stationarity of the random processes X(t) and Y(t) themselves does not imply that they are stationarily connected.

Let us note the main properties of stationary random processes and of the derivative and integral of a stationary random process:

1) r_XY(τ) = r_YX(-τ);

2) k_X′(τ) = -k_X″(τ), where the derivatives are taken with respect to τ;

3) r_XX′(τ) = k_X′(τ);

4) r_X′X(τ) = -k_X′(τ);

5) K_Y(t₁; t₂) = ∫₀^t₁ ∫₀^t₂ k_X(s₂ - s₁) ds₂ ds₁, where Y(t) = ∫₀ᵗ X(s) ds;

6) R_XY(t₁; t₂) = ∫₀^t₂ k_X(s - t₁) ds.

Example 13. The correlation function of a stationary random process X(t) has the form:

Find the correlation functions, variances and cross-correlation functions of the random processes X(t), X′(t), Y(t) = ∫₀ᵗ X(s) ds.

Solution.

We confine ourselves to the case D_X(t) = 1.

Let's use the following relation:

We get:

Note that, as a result, under differentiation the stationary random process X(t) passes into the stationary random process X′(t), and X(t) and X′(t) are stationarily connected. Under integration of the stationary random process X(t), a non-stationary random process Y(t) arises, and X(t) and Y(t) are not stationarily connected.

Ergodic random processes and their characteristics

Among stationary random processes there is a special class of processes, called ergodic, which have the following property: the characteristics obtained by averaging over the set of all realizations coincide with the corresponding characteristics obtained by averaging over time a single realization observed on an interval (0, T) of sufficiently long duration. That is, over a sufficiently long time interval any realization passes through any state, regardless of the initial state of the system at t = 0; in this sense any single realization fully represents the entire set of realizations.

There are non-stationary, stationary and ergodic random processes. The most general random process is non-stationary.

A random process is stationary if its multidimensional probability density depends only on the sizes of the intervals τ_i = t_(i+1) - t_i and not on the position of these intervals on the axis of the argument t. It follows, first, that for a stationary process the one-dimensional probability density does not depend on time, i.e. f(x; t) = f(x); second, the two-dimensional probability density depends on the difference τ = t₂ - t₁, i.e. f(x₁, x₂; t₁, t₂) = f(x₁, x₂; τ), and so on. Consequently, all moments of the one-dimensional distribution, including the expectation and the variance, are constant. It is often sufficient to define a process as stationary by the constancy of the first two moments. Thus, for a stationary process:

m_x(t) = m_x = const;  D_x(t) = D_x = const;  K_x(t₁, t₂) = K_x(τ).

A stationary random process is called ergodic if, in determining any statistical characteristic, averaging over the set of realizations is equivalent to averaging over time of one infinitely long realization; in this case

m_x = lim (T→∞) (1/T) ∫₀ᵀ x(t) dt.

1.1.1. Gaussian stochastic processes

A random process X(t) is called Gaussian if all of its finite-dimensional distributions are normal, that is, for any t₁, t₂, …, t_n ∈ T the random vector

(X(t₁); X(t₂); …; X(t_n))

has the distribution density

f(x₁, …, x_n) = (2π)^(-n/2) Δ^(-1/2) · exp( -(1/(2Δ)) Σ_(i,j) A_ij (x_i - a_i)(x_j - a_j) ),

where a_i = M[X(t_i)]; σ_i² = M[(X(t_i) - a_i)²]; c_ij = M[(X(t_i) - a_i)(X(t_j) - a_j)]; Δ = det(c_ij); A_ij is the algebraic complement (cofactor) of the element c_ij.

1.1.2. Random processes with independent increments

A random process X(t) is called a process with independent increments if its increments on non-overlapping time intervals are independent of one another: for any t₁, t₂, …, t_n ∈ T, t₁ ≤ t₂ ≤ … ≤ t_n, the random variables

X(t₂) - X(t₁); X(t₃) - X(t₂); …; X(t_n) - X(t_(n-1))

are independent.
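A Wiener-type process gives a simple simulated illustration (the Gaussian increment model is an assumption made for this sketch): increments over non-overlapping intervals are generated independently, and their sample correlation is close to zero.

```python
import numpy as np

rng = np.random.default_rng(8)
n_real, n_steps, dt = 50_000, 100, 0.01

# Wiener-type process: X(t) built by summing independent Gaussian increments
dX = rng.normal(0.0, np.sqrt(dt), size=(n_real, n_steps))
X = np.cumsum(dX, axis=1)

# Increments over the non-overlapping index ranges (10, 30] and (60, 90]
inc1 = X[:, 30] - X[:, 10]
inc2 = X[:, 90] - X[:, 60]
print(np.corrcoef(inc1, inc2)[0, 1])   # ~ 0, as independence requires
```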

1.1.3. Random processes with uncorrelated increments

A random process X(t) is called a process with uncorrelated increments if the following conditions are satisfied:

1) ∀t ∈ T: M[X²(t)] < ∞;

2) ∀t₁, t₂, t₃, t₄ ∈ T, t₁ ≤ t₂ ≤ t₃ ≤ t₄: M[(X(t₂) - X(t₁))(X(t₄) - X(t₃))] = 0.

1.1.4. Stationary Stochastic Processes (see Chapter 5)

1.1.5. Markov stochastic processes

We confine ourselves to the definition of a Markov random process with discrete states and discrete time (a Markov chain).

Let a system A be in one of the incompatible states A₁; A₂; …; A_n, and let the probability P_ij(s) that in the s-th trial the system passes from state A_i to state A_j not depend on the states of the system in the trials preceding the (s - 1)-th. A random process of this type is called a Markov chain.

1.1.6. Poisson random processes

A random process X(t) is called a Poisson process with parameter a (a > 0) if it has the following properties:

1) ∀t ∈ T; T = [0; ∞).