Downloaded From : www.EasyEngineering.net


CONTENTS

UNIT I RANDOM VARIABLES
    Introduction
    Discrete Random Variables
    Continuous Random Variables
    Moments
    Moment generating functions
    Binomial distribution
    Poisson distribution
    Geometric distribution
    Uniform distribution
    Exponential distribution
    Gamma distribution

UNIT II TWO-DIMENSIONAL RANDOM VARIABLES
    Introduction
    Joint distribution
    Marginal and Conditional Distribution
    Covariance
    Correlation Coefficient
    Problems
    Linear Regression
    Transformation of random variables
    Problems

UNIT III RANDOM PROCESSES
    Introduction
    Classification
    Stationary processes
    Markov processes
    Poisson processes
    Random Telegraph processes

UNIT IV CORRELATION AND SPECTRAL DENSITIES
    Introduction
    Auto Correlation functions
    Properties
    Cross Correlation functions
    Properties
    Power spectral density
    Properties
    Cross spectral density
    Properties

UNIT V LINEAR SYSTEMS WITH RANDOM INPUTS
    Introduction
    Linear time invariant systems
    Problems
    Linear systems with random inputs
    Auto Correlation and Cross Correlation functions of inputs and outputs
    System transfer function
    Problems


MA6451

PROBABILITY AND RANDOM PROCESSES

L T P C  3 1 0 4

OBJECTIVES:
To provide the necessary basic concepts in probability and random processes for applications such as random signals, linear systems, etc., in communication engineering.

UNIT I RANDOM VARIABLES 9+3
Discrete and continuous random variables – Moments – Moment generating functions – Binomial, Poisson, Geometric, Uniform, Exponential, Gamma and Normal distributions.

UNIT II TWO-DIMENSIONAL RANDOM VARIABLES 9+3
Joint distributions – Marginal and conditional distributions – Covariance – Correlation and Linear regression – Transformation of random variables.

UNIT III RANDOM PROCESSES 9+3
Classification – Stationary process – Markov process – Poisson process – Random telegraph process.

UNIT IV CORRELATION AND SPECTRAL DENSITIES 9+3
Auto correlation functions – Cross correlation functions – Properties – Power spectral density – Cross spectral density – Properties.

UNIT V LINEAR SYSTEMS WITH RANDOM INPUTS 9+3
Linear time invariant system – System transfer function – Linear systems with random inputs – Auto correlation and Cross correlation functions of input and output.

TOTAL (L:45 + T:15): 60 PERIODS

OUTCOMES:
The students will have an exposure to various distribution functions and will acquire skills in handling situations involving more than one random variable. They will be able to analyse the response of linear time invariant systems to random inputs.


TEXT BOOKS:
1. Ibe, O.C., "Fundamentals of Applied Probability and Random Processes", Elsevier, 1st Indian Reprint, 2007.
2. Peebles, P.Z., "Probability, Random Variables and Random Signal Principles", Tata McGraw Hill, 4th Edition, New Delhi, 2002.


REFERENCES:
1. Yates, R.D. and Goodman, D.J., "Probability and Stochastic Processes", 2nd Edition, Wiley India Pvt. Ltd., Bangalore, 2012.
2. Stark, H. and Woods, J.W., "Probability and Random Processes with Applications to Signal Processing", 3rd Edition, Pearson Education, Asia, 2002.
3. Miller, S.L. and Childers, D.G., "Probability and Random Processes with Applications to Signal Processing and Communications", Academic Press, 2004.
4. Hwei Hsu, "Schaum's Outline of Theory and Problems of Probability, Random Variables and Random Processes", Tata McGraw Hill Edition, New Delhi, 2004.
5. Cooper, G.R. and McGillem, C.D., "Probabilistic Methods of Signal and System Analysis", 3rd Indian Edition, Oxford University Press, New Delhi, 2012.


UNIT - I RANDOM VARIABLES

Introduction
Consider an experiment of throwing a coin twice. The outcomes {HH, HT, TH, TT} constitute the sample space. Each of these outcomes can be associated with a number by specifying a rule of association (e.g. the number of heads). Such a rule of association is called a random variable. We denote a random variable by a capital letter (X, Y, etc.) and any particular value of the random variable by a lower-case letter (x, y, etc.).


Thus a random variable X can be considered as a function that maps all elements of the sample space S into points on the real line. The notation X(s) = x means that x is the value associated with the outcome s by the random variable X.

1.1 SAMPLE SPACE
Consider an experiment of throwing a coin twice. The outcomes S = {HH, HT, TH, TT} constitute the sample space.

1.2 RANDOM VARIABLE
In this sample space each of these outcomes can be associated with a number by specifying a rule of association (e.g. the number of heads). Such a rule of association is called a random variable. We denote a random variable by a letter (X, Y, etc.) and any particular value of the random variable by x or y.
S = {HH, HT, TH, TT}
X(S) = {2, 1, 1, 0}
Thus a random variable X can be considered as a function that maps all elements of the sample space S into points on the real line. The notation X(s) = x means that x is the value associated with the outcome s by the R.V. X.
Example
In the experiment of throwing a coin twice, the sample space is S = {HH, HT, TH, TT}. Let X be the random variable given by X(s) = x, the number of heads in s.
Note
Any random variable whose only possible values are 0 and 1 is called a Bernoulli random variable.


1.2.1 DISCRETE RANDOM VARIABLE
Definition: A discrete random variable is a R.V. X whose possible values constitute a finite set of values or a countably infinite set of values.
Examples


All the R.V.'s from Example 1 are discrete R.V.'s.
Remark
The meaning of P(X ≤ a): P(X ≤ a) is simply the probability of the set of outcomes s in the sample space for which X(s) ≤ a, i.e.
P(X ≤ a) = P{s : X(s) ≤ a}
In Example 1 above we would write
P(X ≤ 1) = P(HT, TH, TT) = 3/4
Here P(X ≤ 1) = 3/4 means that the probability that the R.V. X (the number of heads) is less than or equal to 1 is 3/4.
Distribution function of the random variable X (cumulative distribution function of X)
Def: The distribution function of a random variable X defined on (-∞, ∞) is given by
F(x) = P(X ≤ x) = P{s : X(s) ≤ x}
Note
Let the random variable X take the values x1, x2, …, xn with probabilities p1, p2, …, pn, and let x1 < x2 < … < xn.

1.2.3 PROBABILITY MASS FUNCTION (OR) PROBABILITY FUNCTION
Let X be a one-dimensional discrete R.V. which takes the values x1, x2, … To each possible outcome xi we can associate a number pi, i.e.
P(X = xi) = p(xi) = pi,
called the probability of xi. The numbers pi = p(xi) satisfy the following conditions:
(i) p(xi) ≥ 0 for all i
(ii) Σ_{i=1}^{∞} p(xi) = 1


The function p(x) satisfying the above two conditions is called the probability mass function (or) probability distribution of the R.V. X. The probability distribution {xi, pi} can be displayed in the form of a table as shown below.

X = xi            x1    x2    …    xi    …
P(X = xi) = pi    p1    p2    …    pi    …

Notation
Let S be a sample space. The set of all outcomes s in S such that X(s) = x is denoted by writing X = x.
P(X = x) = P{s : X(s) = x}
Similarly, P(X ≤ a) = P{s : X(s) ∈ (-∞, a]}
and P(a < X ≤ b) = P{s : X(s) ∈ (a, b]}
P(X = a or X = b) = P{(X = a) ∪ (X = b)}
P(X = a and X = b) = P{(X = a) ∩ (X = b)}, and so on.


Theorem 1
If X1 and X2 are random variables and K is a constant, then KX1, X1 + X2, X1X2, K1X1 + K2X2 and X1 - X2 are also random variables.


Theorem 2
If X is a random variable and f(•) is a continuous function, then f(X) is a random variable.
Note
If F(x) is the distribution function of a one-dimensional random variable, then
I. 0 ≤ F(x) ≤ 1
II. If x < y, then F(x) ≤ F(y)
III. F(-∞) = lim_{x→-∞} F(x) = 0
IV. F(∞) = lim_{x→∞} F(x) = 1
V. If X is a discrete R.V. taking values x1, x2, x3, … with x1 < x2 < … < x_{i-1} < x_i < …, then P(X = xi) = F(xi) - F(x_{i-1})

Example 1.2.1
A random variable X has the following probability function:

Values of X         0    1     2     3     4     5      6      7      8
Probability P(X)    a    3a    5a    7a    9a    11a    13a    15a    17a

(i) Determine the value of a.
(ii) Find P(X < 3), P(X ≥ 3) and P(0 < X < 5).
(iii) Find the distribution function of X.
Solution

Table 1
Values of X    0    1     2     3     4     5      6      7      8
p(x)           a    3a    5a    7a    9a    11a    13a    15a    17a

(i) We know that if p(x) is a probability mass function, then
Σ_{i=0}^{8} p(xi) = 1
p(0) + p(1) + p(2) + p(3) + p(4) + p(5) + p(6) + p(7) + p(8) = 1
a + 3a + 5a + 7a + 9a + 11a + 13a + 15a + 17a = 1
81a = 1
a = 1/81
Putting a = 1/81 in Table 1, we get Table 2.

Table 2
X = x    0       1       2       3       4       5        6        7        8
P(x)     1/81    3/81    5/81    7/81    9/81    11/81    13/81    15/81    17/81

(ii) P(X < 3) = p(0) + p(1) + p(2) = 1/81 + 3/81 + 5/81 = 9/81
P(X ≥ 3) = 1 - P(X < 3) = 1 - 9/81 = 72/81
P(0 < X < 5) = p(1) + p(2) + p(3) + p(4)    [here 0 and 5 are not included]
= 3/81 + 5/81 + 7/81 + 9/81 = 24/81
(iii) To find the distribution function of X, using Table 2, we get

X = x    F(x) = P(X ≤ x)
0        F(0) = p(0) = 1/81
1        F(1) = P(X ≤ 1) = p(0) + p(1) = 1/81 + 3/81 = 4/81
2        F(2) = P(X ≤ 2) = p(0) + p(1) + p(2) = 4/81 + 5/81 = 9/81
3        F(3) = P(X ≤ 3) = p(0) + … + p(3) = 9/81 + 7/81 = 16/81
4        F(4) = P(X ≤ 4) = p(0) + … + p(4) = 16/81 + 9/81 = 25/81
5        F(5) = P(X ≤ 5) = p(0) + … + p(5) = 25/81 + 11/81 = 36/81
6        F(6) = P(X ≤ 6) = p(0) + … + p(6) = 36/81 + 13/81 = 49/81
7        F(7) = P(X ≤ 7) = p(0) + … + p(7) = 49/81 + 15/81 = 64/81
8        F(8) = P(X ≤ 8) = p(0) + … + p(8) = 64/81 + 17/81 = 81/81 = 1
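The arithmetic of this example can be reproduced with exact fractions. The following is an illustrative sketch (not part of the original text; the variable names are mine):

```python
from fractions import Fraction

# Coefficients of 'a' from Table 1: P(X = x) = a, 3a, 5a, ..., 17a for x = 0..8.
coeffs = [1, 3, 5, 7, 9, 11, 13, 15, 17]

# (i) Normalisation: the coefficients sum to 81, so a = 1/81.
a = Fraction(1, sum(coeffs))
p = [c * a for c in coeffs]

# (ii) The probabilities worked out above.
p_lt_3 = sum(p[:3])        # P(X < 3)  = 9/81
p_ge_3 = 1 - p_lt_3        # P(X >= 3) = 72/81
p_0_5 = sum(p[1:5])        # P(0 < X < 5) = 24/81, endpoints excluded

# (iii) Distribution function F(x) = P(X <= x) as running sums.
F = []
total = Fraction(0)
for pi in p:
    total += pi
    F.append(total)

print(a, p_lt_3, p_ge_3, p_0_5, F[8])
```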


1.3 CONTINUOUS RANDOM VARIABLE
Def: A R.V. X which takes all possible values in a given interval is called a continuous random variable.
Example: Age, height and weight are continuous R.V.'s.


1.3.1 PROBABILITY DENSITY FUNCTION
Consider a continuous R.V. X specified on a certain interval (a, b) (which can also be an infinite interval (-∞, ∞)). If there is a function y = f(x) such that
f(x) = lim_{Δx→0} P(x < X < x + Δx) / Δx,
then this function f(x) is termed the probability density function (or simply the density function) of the R.V. X. It is also called the frequency function, distribution density or probability density function. The curve y = f(x) is called the probability curve or the distribution curve.
Remark
If f(x) is the p.d.f. of the R.V. X, then the probability that a value of the R.V. X will fall in some interval (a, b) is equal to the definite integral of the function f(x) from a to b:

P(a < X < b) = ∫_a^b f(x) dx    (or)    P(a ≤ X ≤ b) = ∫_a^b f(x) dx

1.3.2 PROPERTIES OF P.D.F.
The p.d.f. f(x) of a R.V. X has the following properties:
(i) f(x) ≥ 0, -∞ < x < ∞
(ii) ∫_{-∞}^{∞} f(x) dx = 1
Remark


1. In the case of a discrete R.V., the probability at a point, say at x = c, is not zero. But in the case of a continuous R.V. X the probability at a point is always zero:
P(X = c) = ∫_c^c f(x) dx = 0
2. If X is a continuous R.V., then
P(a ≤ X ≤ b) = P(a ≤ X < b) = P(a < X ≤ b) = P(a < X < b)

IMPORTANT DEFINITIONS IN TERMS OF P.D.F.
If f(x) is the p.d.f. of a random variable X which is defined on the interval (a, b), then

(i)    Arithmetic mean:  ∫_a^b x f(x) dx
(ii)   Harmonic mean H:  1/H = ∫_a^b (1/x) f(x) dx
(iii)  Geometric mean G:  log G = ∫_a^b log x f(x) dx
(iv)   Moments about the origin:  ∫_a^b x^r f(x) dx
(v)    Moments about any point A:  ∫_a^b (x - A)^r f(x) dx
(vi)   Moment about the mean, μr:  ∫_a^b (x - mean)^r f(x) dx
(vii)  Variance, μ2:  ∫_a^b (x - mean)^2 f(x) dx
(viii) Mean deviation about the mean, M.D.:  ∫_a^b |x - mean| f(x) dx

1.3.3 Mathematical Expectation
Def: Let X be a continuous random variable with probability density function f(x). Then the mathematical expectation of X is denoted by E(X) and is given by
E(X) = ∫_{-∞}^{∞} x f(x) dx
More generally, the rth raw moment is denoted by μ'r:
μ'r = E(X^r) = ∫_{-∞}^{∞} x^r f(x) dx
Thus


μ'1 = E(X)    (first moment about the origin)
μ'2 = E(X²)   (second moment about the origin)
∴ Mean = X̄ = μ'1 = E(X)
and Variance = μ'2 - (μ'1)²
i.e. Variance = E(X²) - [E(X)]²

(a) rth moment about the mean
Now
E{X - E(X)}^r = ∫_{-∞}^{∞} {x - E(X)}^r f(x) dx = ∫_{-∞}^{∞} (x - X̄)^r f(x) dx
Thus
μr = ∫_{-∞}^{∞} (x - X̄)^r f(x) dx,  where μr = E[{X - E(X)}^r]    (B)
This gives the rth moment about the mean, denoted by μr.
Put r = 1 in (B); we get
μ1 = ∫_{-∞}^{∞} (x - X̄) f(x) dx
   = ∫_{-∞}^{∞} x f(x) dx - X̄ ∫_{-∞}^{∞} f(x) dx
   = X̄ - X̄    [since ∫_{-∞}^{∞} f(x) dx = 1]
μ1 = 0
Put r = 2 in (B); we get
μ2 = ∫_{-∞}^{∞} (x - X̄)² f(x) dx
Variance = μ2 = E[X - E(X)]²,
which gives the variance in terms of expectations.
Note
Let g(X) = K (a constant); then


E g (= X )     E (= K)

∫ K f (x) dx

−∞

=



K ∫ f (x) dx −∞

 ∞ f (x) dx =1  −∞∫ 

= K.1 = K Thus E(K) = K ⇒ E[a constant] = constant. 1.3.4 EXPECTATIONS (Discrete R.V.’s) Let ‘X’ be a discrete random variable with P.M.F p(x) Then

E(X) = Σ_x x p(x)
For a discrete random variable X,
E(X^r) = Σ_x x^r p(x)    (by definition)
If we denote E(X^r) = μ'r, then
μ'r = E[X^r] = Σ_x x^r p(x)
Put r = 1; we get
Mean = μ'1 = Σ_x x p(x)
Put r = 2; we get
μ'2 = E[X²] = Σ_x x² p(x)
∴ μ2 = μ'2 - (μ'1)² = E(X²) - {E(X)}²
The rth moment about the mean:
μr = E[{X - E(X)}^r] = Σ_x (x - X̄)^r p(x),  where X̄ = E(X)
Put r = 2; we get
Variance = μ2 = Σ_x (x - X̄)² p(x)

1.3.5 ADDITION THEOREM (EXPECTATION)
Theorem 1
If X and Y are two continuous random variables with p.d.f.'s fX(x) and fY(y), then
E(X + Y) = E(X) + E(Y)


1.3.6 MULTIPLICATION THEOREM OF EXPECTATION
Theorem 2
If X and Y are independent random variables, then E(XY) = E(X) · E(Y).
Note: If X1, X2, …, Xn are n independent random variables, then
E[X1 X2 … Xn] = E(X1) E(X2) … E(Xn)
Theorem 3
If X is a random variable with p.d.f. f(x) and a is a constant, then
(i) E[a G(X)] = a E[G(X)]
(ii) E[G(X) + a] = E[G(X)] + a,
where G(X) is a function of X which is also a random variable.


Theorem 4
If X is a random variable with p.d.f. f(x) and a and b are constants, then
E[aX + b] = a E(X) + b
Cor 1: If we take a = 1 and b = -E(X) = -X̄, then we get
E(X - X̄) = E(X) - E(X) = 0
Note
E[1/X] ≠ 1/E(X)
E[log X] ≠ log E(X)
E(X²) ≠ [E(X)]²

1.3.7 EXPECTATION OF A LINEAR COMBINATION OF RANDOM VARIABLES
Let X1, X2, …, Xn be any n random variables and let a1, a2, …, an be constants. Then
E[a1X1 + a2X2 + … + anXn] = a1E(X1) + a2E(X2) + … + anE(Xn)
Result
If X is a random variable, then Var(aX + b) = a² Var(X), where a and b are constants.
Covariance
If X and Y are random variables, then the covariance between them is defined as
Cov(X, Y) = E{[X - E(X)][Y - E(Y)]}
          = E{XY - X E(Y) - E(X) Y + E(X) E(Y)}
Cov(X, Y) = E(XY) - E(X) · E(Y)    (A)
If X and Y are independent, then E(XY) = E(X) E(Y)    (B)
Substituting (B) in (A), we get Cov(X, Y) = 0.
∴ If X and Y are independent, then


Cov(X, Y) = 0
Note
(i) Cov(aX, bY) = ab Cov(X, Y)
(ii) Cov(X + a, Y + b) = Cov(X, Y)
(iii) Cov(aX + b, cY + d) = ac Cov(X, Y)
(iv) Var(X1 + X2) = Var(X1) + Var(X2) + 2 Cov(X1, X2)
If X1 and X2 are independent,
Var(X1 + X2) = Var(X1) + Var(X2)

EXPECTATION TABLE

Discrete R.V.'s                                  Continuous R.V.'s
1. E(X) = Σ x p(x)                               1. E(X) = ∫_{-∞}^{∞} x f(x) dx
2. E(X^r) = μ'r = Σ x^r p(x)                     2. E(X^r) = μ'r = ∫_{-∞}^{∞} x^r f(x) dx
3. Mean = μ'1 = Σ x p(x)                         3. Mean = μ'1 = ∫_{-∞}^{∞} x f(x) dx
4. μ'2 = Σ x² p(x)                               4. μ'2 = ∫_{-∞}^{∞} x² f(x) dx
5. Variance = μ'2 - (μ'1)² = E(X²) - {E(X)}²     5. Variance = μ'2 - (μ'1)² = E(X²) - {E(X)}²

SOLVED PROBLEMS ON DISCRETE R.V.'s
Example 1
When a die is thrown, X denotes the number that turns up. Find E(X), E(X²) and Var(X).
Solution
Let X be the R.V. denoting the number that turns up in a die. X takes the values 1, 2, 3, 4, 5, 6, each with probability 1/6:

X = x    1        2        3        4        5        6
p(x)     1/6      1/6      1/6      1/6      1/6      1/6
         p(x1)    p(x2)    p(x3)    p(x4)    p(x5)    p(x6)

Now
E(X) = Σ_{i=1}^{6} xi p(xi)
     = x1 p(x1) + x2 p(x2) + x3 p(x3) + x4 p(x4) + x5 p(x5) + x6 p(x6)
     = 1 × (1/6) + 2 × (1/6) + 3 × (1/6) + 4 × (1/6) + 5 × (1/6) + 6 × (1/6)
     = 21/6 = 7/2    (1)


E(X²) = Σ_{i=1}^{6} xi² p(xi)
      = x1² p(x1) + x2² p(x2) + x3² p(x3) + x4² p(x4) + x5² p(x5) + x6² p(x6)
      = 1(1/6) + 4(1/6) + 9(1/6) + 16(1/6) + 25(1/6) + 36(1/6)
      = (1 + 4 + 9 + 16 + 25 + 36)/6 = 91/6    (2)
Variance: Var(X) = E(X²) - [E(X)]²
                 = 91/6 - (7/2)² = 91/6 - 49/4 = 35/12
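The die computation above can be reproduced with exact rational arithmetic (an illustrative sketch, not part of the original text):

```python
from fractions import Fraction

faces = range(1, 7)
p = Fraction(1, 6)                       # each face is equally likely

mean = sum(x * p for x in faces)         # E(X)  = 7/2
ex2 = sum(x * x * p for x in faces)      # E(X^2) = 91/6
var = ex2 - mean ** 2                    # Var(X) = E(X^2) - [E(X)]^2 = 35/12

print(mean, ex2, var)
```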

Example 2
Find (i) the value of C and (ii) the mean of the following distribution:
f(x) = C(x - x²) for 0 < x < 1, and f(x) = 0 otherwise.
Solution
Given f(x) = C(x - x²), 0 < x < 1; 0 otherwise    (1)
Since ∫_{-∞}^{∞} f(x) dx = 1, using (1),
∫_0^1 C(x - x²) dx = 1
C [x²/2 - x³/3]_0^1 = 1
C [1/2 - 1/3] = 1
C [(3 - 2)/6] = 1
C/6 = 1  ⇒  C = 6    (2)
Substituting (2) in (1): f(x) = 6(x - x²), 0 < x < 1    (3)
Mean = E(X) = ∫_{-∞}^{∞} x f(x) dx
= ∫_0^1 x · 6(x - x²) dx    [from (3), since 0 < x < 1]
= ∫_0^1 (6x² - 6x³) dx
= [6x³/3 - 6x⁴/4]_0^1 = 2 - 3/2 = 1/2
∴ Mean = 1/2

C = 6,  Mean = 1/2
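The two results can be cross-checked by numerical integration. The midpoint rule and the grid size below are illustrative choices, not part of the original solution:

```python
# Numerical check of Example 2: f(x) = C(x - x^2) on (0, 1).
n = 100_000                      # grid size for the midpoint rule
h = 1.0 / n
xs = [(k + 0.5) * h for k in range(n)]

# Normalisation: C * integral of (x - x^2) over (0, 1) must equal 1.
integral = sum(x - x * x for x in xs) * h
C = 1.0 / integral

# Mean = integral of x * C * (x - x^2) over (0, 1).
mean = sum(x * C * (x - x * x) for x in xs) * h

print(round(C, 4), round(mean, 4))  # approximately 6.0 and 0.5
```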

1.4 CONTINUOUS DISTRIBUTION FUNCTION
Def: If f(x) is the p.d.f. of a continuous random variable X, then the function
F_X(x) = F(x) = P(X ≤ x) = ∫_{-∞}^{x} f(t) dt,  -∞ < x < ∞,
is called the distribution function or cumulative distribution function of the random variable.
* PROPERTIES OF THE CDF OF A R.V. X
(i) 0 ≤ F(x) ≤ 1, -∞ < x < ∞
(ii) lim_{x→-∞} F(x) = 0,  lim_{x→∞} F(x) = 1
(iii) P(a ≤ X ≤ b) = ∫_a^b f(x) dx = F(b) - F(a)
(iv) F'(x) = dF(x)/dx = f(x) ≥ 0

(v) P(X = xi) = F(xi) - F(x_{i-1})

Example 1.4.1
Given that the p.d.f. of a continuous random variable X is
f(x) = 6x(1 - x), 0 < x < 1; 0 otherwise,
find the c.d.f. of X.
Solution
Given f(x) = 6x(1 - x), 0 < x < 1; 0 otherwise.
The c.d.f. is F(x) = ∫_{-∞}^{x} f(t) dt, -∞ < x < ∞.
(i) When x < 0:
F(x) = ∫_{-∞}^{x} 0 dt = 0    (1)
(ii) When 0 < x < 1:
F(x) = ∫_{-∞}^{x} f(t) dt = ∫_{-∞}^{0} f(t) dt + ∫_{0}^{x} f(t) dt
     = 0 + ∫_0^x 6t(1 - t) dt = 6 [t²/2 - t³/3]_0^x
     = 3x² - 2x³    (2)
(iii) When x > 1:
F(x) = ∫_{-∞}^{0} 0 dt + ∫_0^1 6t(1 - t) dt + ∫_1^x 0 dt = 6 ∫_0^1 (t - t²) dt = 1    (3)
Using (1), (2) and (3), we get

0,  F(x) = 3x 2 − 2x 3 , 1, 

Example:1.4.2

e − x ,

x<0

0 < x <1 x >1

x≥0 defined as follows a density function ? x<0

(i) If f (x) = 

0,

rin g.n et

(ii) If so determine the probability that the variate having this density will fall in the interval (1, 2). Solution Given

e − x , f (x) =  0,

x≥0 x<0

(a) In (0, ∞), e^{-x} is positive, so f(x) ≥ 0 in (0, ∞).
(b) ∫_{-∞}^{∞} f(x) dx = ∫_{-∞}^{0} f(x) dx + ∫_{0}^{∞} f(x) dx
= ∫_{-∞}^{0} 0 dx + ∫_{0}^{∞} e^{-x} dx
= [-e^{-x}]_0^{∞} = -e^{-∞} + 1 = 1
Hence f(x) is a p.d.f.
(ii) We know that P(a ≤ X ≤ b) = ∫_a^b f(x) dx.
P(1 ≤ X ≤ 2) = ∫_1^2 f(x) dx = ∫_1^2 e^{-x} dx = [-e^{-x}]_1^2
= -e^{-2} + e^{-1} = -0.135 + 0.368 = 0.233
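The value 0.233 can be confirmed both from the closed form and by numerical integration of the density. A sketch (the grid size is an arbitrary choice of mine):

```python
import math

# Closed form from the worked example: P(1 <= X <= 2) = e^{-1} - e^{-2}.
exact = math.exp(-1) - math.exp(-2)

# Cross-check with midpoint-rule integration of f(x) = e^{-x} over (1, 2).
n = 10_000
h = 1.0 / n
approx = sum(math.exp(-(1 + (k + 0.5) * h)) for k in range(n)) * h

print(round(exact, 3), round(approx, 3))  # both 0.233
```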

Example 1.4.3
A probability curve y = f(x) has a range from 0 to ∞. If f(x) = e^{-x}, find the mean, the variance and the third moment about the mean.
Solution
Mean = ∫_0^∞ x f(x) dx = ∫_0^∞ x e^{-x} dx = [x(-e^{-x}) - e^{-x}]_0^∞ = 1
Mean = 1
Variance: μ2 = ∫_0^∞ (x - Mean)² f(x) dx = ∫_0^∞ (x - 1)² e^{-x} dx = 1
μ2 = 1
Third moment about the mean (here a = 0, b = ∞):
μ3 = ∫_0^∞ (x - Mean)³ f(x) dx = ∫_0^∞ (x - 1)³ e^{-x} dx
   = [(x - 1)³(-e^{-x}) - 3(x - 1)²(e^{-x}) + 6(x - 1)(-e^{-x}) - 6(e^{-x})]_0^∞
   = -1 + 3 - 6 + 6 = 2
μ3 = 2

1.5 MOMENT GENERATING FUNCTION
Def: The moment generating function (MGF) of a random variable X (about the origin) whose probability function is f(x) is given by
M_X(t) = E[e^{tX}]


M_X(t) = ∫_{-∞}^{∞} e^{tx} f(x) dx  for a continuous probability function,
       = Σ_x e^{tx} p(x)  for a discrete probability function,
where t is a real parameter and the integration or summation extends over the entire range of x.
Example 1.5.1
Prove that the rth moment of the R.V. X about the origin is the coefficient of t^r/r! in M_X(t), i.e.
M_X(t) = Σ_{r=0}^{∞} (t^r/r!) μ'r
Proof
We know that
M_X(t) = E(e^{tX})
= E[1 + tX/1! + (tX)²/2! + (tX)³/3! + … + (tX)^r/r! + …]
= E[1] + t E(X) + (t²/2!) E(X²) + … + (t^r/r!) E(X^r) + …
= 1 + t μ'1 + (t²/2!) μ'2 + (t³/3!) μ'3 + … + (t^r/r!) μ'r + …    [using μ'r = E(X^r)]
Thus the rth moment = coefficient of t^r/r!.

Note
1. The above result gives the MGF in terms of the moments.
2. Since M_X(t) generates moments, it is known as the moment generating function.
Example 1.5.2
Find μ'1 and μ'2 from M_X(t).
Proof
We know that
M_X(t) = Σ_{r=0}^{∞} (t^r/r!) μ'r
M_X(t) = μ'0 + (t/1!) μ'1 + (t²/2!) μ'2 + … + (t^r/r!) μ'r + …    (A)
Differentiating (A) with respect to t, we get
M'_X(t) = μ'1 + (2t/2!) μ'2 + (3t²/3!) μ'3 + …    (B)
Put t = 0 in (B); we get
M'_X(0) = μ'1 = Mean
Mean = M'_X(0)    (or)    [d/dt M_X(t)]_{t=0}


Differentiating again,
M''_X(t) = μ'2 + t μ'3 + …
Put t = 0; we get
M''_X(0) = μ'2    (or)    [d²/dt² M_X(t)]_{t=0}
In general, μ'r = [d^r/dt^r M_X(t)]_{t=0}

Example 1.5.3
Obtain the MGF of X about the point X = a.
Proof
The moment generating function of X about the point X = a is
M_X(t) = E[e^{t(X-a)}]
= E[1 + t(X - a) + (t²/2!)(X - a)² + … + (t^r/r!)(X - a)^r + …]    [formula: e^x = 1 + x/1! + x²/2! + …]
= E(1) + t E(X - a) + (t²/2!) E[(X - a)²] + … + (t^r/r!) E[(X - a)^r] + …
= 1 + t μ'1 + (t²/2!) μ'2 + … + (t^r/r!) μ'r + …,  where μ'r = E[(X - a)^r]
[M_X(t)]_{about X = a} = 1 + t μ'1 + (t²/2!) μ'2 + … + (t^r/r!) μ'r + …
Result
M_{cX}(t) = E[e^{tcX}]    (1)
M_X(ct) = E[e^{ctX}]    (2)
From (1) and (2) we get
M_{cX}(t) = M_X(ct)

Example 1.5.4
If X1, X2, …, Xn are independent random variables, then prove that
M_{X1+X2+…+Xn}(t) = M_{X1}(t) M_{X2}(t) … M_{Xn}(t)
Proof
M_{X1+X2+…+Xn}(t) = E[e^{t(X1+X2+…+Xn)}]
= E[e^{tX1} e^{tX2} … e^{tXn}]
= E(e^{tX1}) E(e^{tX2}) … E(e^{tXn})    [since X1, X2, …, Xn are independent]
= M_{X1}(t) M_{X2}(t) … M_{Xn}(t)

Example 1.5.5

Prove that if U = (X - a)/h, then M_U(t) = e^{-at/h} M_X(t/h), where a and h are constants.
Proof
By definition,
M_U(t) = E[e^{tU}]    [M_X(t) = E[e^{tX}]]
= E[e^{t(X-a)/h}]
= E[e^{tX/h - ta/h}]
= E[e^{tX/h}] · e^{-ta/h}
= e^{-ta/h} E[e^{tX/h}]
= e^{-at/h} M_X(t/h)    [by definition]
∴ M_U(t) = e^{-at/h} M_X(t/h), where U = (X - a)/h and M_X(t) is the MGF about the origin.

Example 1.5.6
Find the MGF of the distribution
f(x) = 2/3 at x = 1;  1/3 at x = 2;  0 otherwise.
Solution
Given f(1) = 2/3, f(2) = 1/3, f(3) = f(4) = … = 0.
The MGF of a R.V. X is given by
M_X(t) = E[e^{tX}] = Σ_{x=0}^{∞} e^{tx} f(x)
= e^0 f(0) + e^t f(1) + e^{2t} f(2) + …
= 0 + (2/3)e^t + (1/3)e^{2t}
∴ The MGF is M_X(t) = (e^t/3)(2 + e^t)
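Since M'_X(0) = E(X) (Example 1.5.2), the MGF just derived can be sanity-checked numerically. The finite-difference step h below is an arbitrary choice of mine:

```python
import math

def M(t):
    # MGF derived above: M_X(t) = (2*e^t + e^{2t}) / 3
    return (2 * math.exp(t) + math.exp(2 * t)) / 3

h = 1e-5
mean = (M(h) - M(-h)) / (2 * h)              # central difference for M'(0) = E(X)
second = (M(h) - 2 * M(0) + M(-h)) / h**2    # second difference for M''(0) = E(X^2)

# Direct values from the distribution:
# E(X) = 1*(2/3) + 2*(1/3) = 4/3,  E(X^2) = 1*(2/3) + 4*(1/3) = 2.
print(round(mean, 4), round(second, 4))
```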

1.6 Discrete Distributions
The important discrete distributions of a random variable X are:
1. Binomial distribution
2. Poisson distribution
3. Geometric distribution

1.6.1 BINOMIAL DISTRIBUTION
Def: A random variable X is said to follow a binomial distribution if its probability law is given by
P(x) = P(X = x successes) = nCx p^x q^{n-x},
where x = 0, 1, 2, …, n and p + q = 1.
Note
Assumptions in the binomial distribution:
i) There are only two possible outcomes for each trial (success or failure).
ii) The probability of a success is the same for each trial.
iii) There are n trials, where n is a constant.
iv) The n trials are independent.

Example 1.6.1
Find the moment generating function (MGF) of a binomial distribution about the origin.
Solution
We know that M_X(t) = Σ_{x=0}^{n} e^{tx} p(x).
Let X be a random variable which follows a binomial distribution. Then the MGF about the origin is given by
M_X(t) = E[e^{tX}] = Σ_{x=0}^{n} e^{tx} p(x)    [p(x) = nCx p^x q^{n-x}]
= Σ_{x=0}^{n} e^{tx} nCx p^x q^{n-x}
= Σ_{x=0}^{n} (e^t)^x p^x nCx q^{n-x}
= Σ_{x=0}^{n} (pe^t)^x nCx q^{n-x}
∴ M_X(t) = (q + pe^t)^n

Example 1.6.2
Find the mean and variance of the binomial distribution.
Solution
M_X(t) = (q + pe^t)^n
∴ M'_X(t) = n(q + pe^t)^{n-1} pe^t
Put t = 0; we get
M'_X(0) = n(q + p)^{n-1} p = np    [since q + p = 1]
Mean = E(X) = M'_X(0) = np
M''_X(t) = np[(q + pe^t)^{n-1} e^t + e^t (n - 1)(q + pe^t)^{n-2} pe^t]
Put t = 0; we get
M''_X(0) = np[(q + p)^{n-1} + (n - 1)(q + p)^{n-2} p]
= np[1 + (n - 1)p]
= np + n²p² - np²
= n²p² + np(1 - p)
= n²p² + npq    [since 1 - p = q]
E(X²) = M''_X(0) = n²p² + npq
Var(X) = E(X²) - [E(X)]² = n²p² + npq - n²p² = npq

Var(X) = npq,  S.D. = √(npq)

Example 1.6.3
Find the moment generating function (MGF) of a binomial distribution about the mean (np).
Solution
We know that the MGF of a random variable X about any point a is
M_X(t) (about X = a) = E[e^{t(X-a)}].
Here a is the mean np of the binomial distribution.
M_X(t) (about X = np) = E[e^{t(X-np)}]
= e^{-tnp} E[e^{tX}]
= e^{-tnp} (q + pe^t)^n
= (e^{-tp})^n (q + pe^t)^n
∴ MGF about the mean = (e^{-tp})^n (q + pe^t)^n
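The binomial results above (mean np, variance npq, MGF (q + pe^t)^n) can be checked numerically for sample parameters. The values n = 10 and p = 0.3 below are illustrative choices of mine, not from the text:

```python
import math

n, p = 10, 0.3                 # illustrative parameters
q = 1 - p

# Moments computed directly from the pmf nCx p^x q^(n-x).
pmf = [math.comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]
mean = sum(x * px for x, px in enumerate(pmf))
var = sum(x * x * px for x, px in enumerate(pmf)) - mean**2

# Mean again via the MGF: M'(0) approximated by a central difference.
h = 1e-6
M = lambda t: (q + p * math.exp(t)) ** n
mgf_mean = (M(h) - M(-h)) / (2 * h)

print(round(mean, 6), round(var, 6), round(mgf_mean, 6))
```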


Example 1.6.4
Additive property of the binomial distribution.
Solution


The sum of two binomial variates need not be a binomial variate.
Let X and Y be two independent binomial variates with parameters (n1, p1) and (n2, p2) respectively. Then
M_X(t) = (q1 + p1e^t)^{n1},  M_Y(t) = (q2 + p2e^t)^{n2}
∴ M_{X+Y}(t) = M_X(t) M_Y(t)    [since X and Y are independent R.V.'s]
= (q1 + p1e^t)^{n1} (q2 + p2e^t)^{n2}
The RHS cannot be expressed in the form (q + pe^t)^n. Hence, by the uniqueness theorem of MGFs, X + Y is not a binomial variate. Hence, in general, the sum of two binomial variates is not a binomial variate.

Example 1.6.5
If M_X(t) = (q + pe^t)^{n1} and M_Y(t) = (q + pe^t)^{n2}, then
M_{X+Y}(t) = (q + pe^t)^{n1+n2},
i.e. when two independent binomial variates have the same p, their sum is a binomial variate with parameters (n1 + n2, p).

Problems on the Binomial Distribution
1. Check whether the following data can follow a binomial distribution: Mean = 3; variance = 4.
Solution
Given: Mean np = 3    (1)
Variance npq = 4    (2)
(2)/(1) ⇒ npq/np = 4/3 ⇒ q = 4/3 > 1
Since q > 1 is not possible (0 < q < 1), the given data cannot follow a binomial distribution.

Example 1.6.6
The mean and S.D. of a binomial distribution are 5 and 2; determine the distribution.
Solution
Given: Mean np = 5    (1)
S.D. = √(npq) = 2, i.e. npq = 4    (2)
(2)/(1) ⇒ npq/np = 4/5 ⇒ q = 4/5
∴ p = 1 - 4/5 = 1/5    (3)
Substituting (3) in (1): n × 1/5 = 5


n = 25
∴ The binomial distribution is
P(X = x) = p(x) = nCx p^x q^{n-x} = 25Cx (1/5)^x (4/5)^{25-x},  x = 0, 1, 2, …, 25
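A quick numeric check of the fitted distribution (a sketch, not part of the original solution): its probabilities should sum to 1, with mean 5 and standard deviation 2.

```python
import math

n, p = 25, 1 / 5               # values found above
q = 1 - p

pmf = [math.comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]
total = sum(pmf)                                   # should be 1
mean = sum(x * px for x, px in enumerate(pmf))     # should be np = 5
sd = math.sqrt(sum(x * x * px for x, px in enumerate(pmf)) - mean**2)  # sqrt(npq) = 2

print(round(total, 6), round(mean, 6), round(sd, 6))
```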

1.7 Poisson Distribution
Def: A random variable X is said to follow a Poisson distribution with parameter λ if its probability law is given by
P(X = x) = p(x) = e^{-λ} λ^x / x!,  x = 0, 1, 2, …

The Poisson distribution is a limiting case of the binomial distribution under the following conditions or assumptions:
1. The number of trials n should be infinitely large, i.e. n → ∞.
2. The probability of success p in each trial is infinitely small.
3. np = λ should be finite, where λ is a constant.
* To find the MGF
M_X(t) = E(e^{tX}) = Σ_{x=0}^{∞} e^{tx} p(x)
= Σ_{x=0}^{∞} e^{tx} e^{-λ} λ^x / x!
= Σ_{x=0}^{∞} e^{-λ} (λe^t)^x / x!
= e^{-λ} Σ_{x=0}^{∞} (λe^t)^x / x!
= e^{-λ} [1 + λe^t + (λe^t)²/2! + …]
= e^{-λ} e^{λe^t} = e^{λ(e^t - 1)}
Hence M_X(t) = e^{λ(e^t - 1)}

* To find the mean and variance
We know that M_X(t) = e^{λ(e^t - 1)}
M'_X(t) = e^{λ(e^t - 1)} · λe^t
M'_X(0) = λ
Alternatively, directly from the definition,
μ'1 = E(X) = Σ_{x=0}^{∞} x p(x) = Σ_{x=0}^{∞} x e^{-λ} λ^x / x!
= 0 + e^{-λ} λ Σ_{x=1}^{∞} λ^{x-1} / (x - 1)!
= λ e^{-λ} [1 + λ + λ²/2! + …]
= λ e^{-λ} e^{λ}
Mean = λ
μ'2 = E[X²] = Σ_{x=0}^{∞} x² p(x) = Σ_{x=0}^{∞} x² e^{-λ} λ^x / x!
= Σ_{x=0}^{∞} {x(x - 1) + x} e^{-λ} λ^x / x!
= Σ_{x=0}^{∞} x(x - 1) e^{-λ} λ^x / x! + Σ_{x=0}^{∞} x e^{-λ} λ^x / x!
= e^{-λ} λ² Σ_{x=2}^{∞} λ^{x-2} / (x - 2)! + λ
= e^{-λ} λ² [1 + λ/1! + λ²/2! + …] + λ
= λ² + λ
Variance: μ2 = E(X²) - [E(X)]² = λ² + λ - λ² = λ
Variance = λ
Hence Mean = Variance = λ.
Note: The sum of independent Poisson variates is also a Poisson variate.
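The identity Mean = Variance = λ can be checked numerically by truncating the series. Both λ = 1.5 and the truncation point below are illustrative choices of mine:

```python
import math

lam = 1.5
N = 60    # truncation point; the tail mass beyond this is negligible for small lambda

pmf = [math.exp(-lam) * lam**x / math.factorial(x) for x in range(N)]
mean = sum(x * px for x, px in enumerate(pmf))
var = sum(x * x * px for x, px in enumerate(pmf)) - mean**2

print(round(mean, 6), round(var, 6))  # both 1.5
```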

rin g.n et

PROBLEMS ON POISSON DISTRIBUTION
Example: 1.7.1
If X is a Poisson variate such that P(X = 1) = 3/10 and P(X = 2) = 1/5, find P(X = 0) and P(X = 3).

Solution
P(X = x) = e^(−λ) λ^x / x!
P(X = 1) = e^(−λ) λ = 3/10 (Given) … (1)
P(X = 2) = e^(−λ) λ²/2! = 1/5 (Given) … (2)
(2) ⇒ e^(−λ) λ² = 2/5 … (3)
(1)/(3) ⇒ 1/λ = 3/4 … (4)
∴ λ = 4/3
∴ P(X = 0) = e^(−λ) λ⁰/0! = e^(−4/3)
P(X = 3) = e^(−λ) λ³/3! = e^(−4/3) (4/3)³/3!
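A quick numerical sketch of Example 1.7.1 (stdlib only; λ = 4/3 is the value found by the ratio method above):

```python
from math import exp, factorial, isclose

# Poisson pmf with lambda = 4/3, as found in Example 1.7.1.
lam = 4/3

def poisson_pmf(x, lam):
    return exp(-lam) * lam**x / factorial(x)

p0 = poisson_pmf(0, lam)
p3 = poisson_pmf(3, lam)
assert isclose(p0, exp(-4/3))                      # P(X=0) = e^(-4/3) ≈ 0.2636
assert isclose(p3, exp(-4/3) * (4/3)**3 / 6)       # P(X=3) ≈ 0.1041
assert isclose(p3, 0.1041, abs_tol=1e-4)
```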

Example: 1.7.2
If X is a Poisson variate such that P(X = 2) = 9 P(X = 4) + 90 P(X = 6), find (i) the mean of X, (ii) the variance of X.
Solution
P(X = x) = e^(−λ) λ^x / x!, x = 0, 1, 2, ……
Given P(X = 2) = 9 P(X = 4) + 90 P(X = 6)
e^(−λ) λ²/2! = 9 e^(−λ) λ⁴/4! + 90 e^(−λ) λ⁶/6!
1/2 = 9λ²/4! + 90λ⁴/6! = 3λ²/8 + λ⁴/8
1 = 3λ²/4 + λ⁴/4
λ⁴ + 3λ² − 4 = 0 ⇒ (λ² − 1)(λ² + 4) = 0
λ² = 1 or λ² = −4, i.e. λ = ±1 or λ = ±2i
Since λ must be a positive real number, λ = 1.
∴ Mean = λ = 1, Variance = λ = 1, and Standard Deviation = 1.

1.7.3 Derive the probability mass function of the Poisson distribution as a limiting case of the binomial distribution.
Solution
We know that the binomial distribution is
P(X = x) = nCx p^x q^(n−x) = [n! / ((n−x)! x!)] p^x (1 − p)^(n−x)
Putting p = λ/n (so that np = λ),
P(X = x) = [n(n−1)(n−2)……(n−x+1) / x!] (λ/n)^x (1 − λ/n)^(n−x)
= [1·(1 − 1/n)(1 − 2/n)……(1 − (x−1)/n)] (λ^x / x!) (1 − λ/n)^(n−x)
When n → ∞,
P(X = x) = (λ^x / x!) lim_{n→∞} (1 − 1/n)(1 − 2/n)……(1 − (x−1)/n) (1 − λ/n)^(n−x)
We know that lim_{n→∞} (1 − λ/n)^(n−x) = e^(−λ)
and lim_{n→∞} (1 − 1/n) = lim_{n→∞} (1 − 2/n) = …… = lim_{n→∞} (1 − (x−1)/n) = 1
∴ P(X = x) = (λ^x / x!) e^(−λ), x = 0, 1, 2, ……, ∞
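The limit above can be illustrated numerically: holding λ = np fixed, the binomial pmf approaches the Poisson pmf as n grows. A minimal sketch (the particular values λ = 2, x = 3 are chosen only for illustration):

```python
from math import comb, exp, factorial

# For fixed lambda = np, Binomial(n, lambda/n) -> Poisson(lambda) as n grows.
lam, x = 2.0, 3

def binom_pmf(n):
    p = lam / n
    return comb(n, x) * p**x * (1 - p)**(n - x)

poisson = exp(-lam) * lam**x / factorial(x)
errors = [abs(binom_pmf(n) - poisson) for n in (10, 100, 1000)]
assert errors[0] > errors[1] > errors[2]   # error shrinks as n increases
assert errors[2] < 1e-3
```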

1.8 GEOMETRIC DISTRIBUTION
Def: A discrete random variable X is said to follow a geometric distribution if it assumes only positive integer values and its probability mass function is given by
P(X = x) = p(x) = q^(x−1) p ; x = 1, 2, ……, 0 < p < 1, where q = 1 − p.
Example: 1.8.1 To find the MGF
M_X(t) = E[e^(tx)] = Σ_{x=1}^∞ e^(tx) p(x)
= Σ_{x=1}^∞ e^(tx) q^(x−1) p
= (p/q) Σ_{x=1}^∞ (q e^t)^x
= (p/q) [(q e^t) + (q e^t)² + (q e^t)³ + ……]
Putting y = q e^t (with |y| < 1),
= (p/q) [y + y² + y³ + ……] = (p/q) y (1 − y)^(−1)
= (p/q) q e^t [1 − q e^t]^(−1) = p e^t / (1 − q e^t)
∴ M_X(t) = p e^t / (1 − q e^t)

* To find the Mean & Variance


M_X'(t) = [(1 − q e^t) p e^t − p e^t (−q e^t)] / (1 − q e^t)² = p e^t / (1 − q e^t)²
∴ E(X) = M_X'(0) = p/(1 − q)² = p/p² = 1/p
∴ Mean = 1/p
M_X''(t) = d/dt [p e^t / (1 − q e^t)²]
= [(1 − q e^t)² p e^t − p e^t · 2(1 − q e^t)(−q e^t)] / (1 − q e^t)⁴
= [(1 − q e^t) p e^t + 2 p q e^(2t)] / (1 − q e^t)³
M_X''(0) = (1 + q)/p²
Var(X) = E(X²) − [E(X)]² = (1 + q)/p² − 1/p² = q/p²
∴ Variance = q/p²

Note: Another form of the geometric distribution is
P[X = x] = q^x p ; x = 0, 1, 2, ……
M_X(t) = p / (1 − q e^t)
Mean = q/p, Variance = q/p²

Example: 1.8.2
If the MGF of X is (5 − 4e^t)^(−1), find the distribution of X and P(X = 5).
Solution
Let the geometric distribution be P(X = x) = q^x p, x = 0, 1, 2, ……
The MGF of this geometric distribution is
M_X(t) = p / (1 − q e^t) … (1)
Here M_X(t) = (5 − 4e^t)^(−1) = [5(1 − (4/5) e^t)]^(−1) = (1/5) / (1 − (4/5) e^t) … (2)
Comparing (1) & (2), we get q = 4/5 and p = 1/5.
∴ P(X = x) = p q^x = (1/5)(4/5)^x, x = 0, 1, 2, 3, ……
P(X = 5) = (1/5)(4/5)⁵ = 4⁵/5⁶

1.9 CONTINUOUS DISTRIBUTIONS
If X is a continuous random variable, then we have the following distributions:
1. Uniform (Rectangular) Distribution 2. Exponential Distribution 3. Gamma Distribution 4. Normal Distribution
1.9.1 Uniform Distribution (Rectangular Distribution)
Def: A random variable X is said to follow a uniform distribution if its probability density function is
f(x) = 1/(b − a), a < x < b
     = 0, otherwise
* To find MGF



M_X(t) = ∫_{−∞}^{∞} e^(tx) f(x) dx = ∫_a^b e^(tx) · 1/(b − a) dx
= (1/(b − a)) [e^(tx)/t]_a^b
= (e^(bt) − e^(at)) / ((b − a) t)
∴ The MGF of the uniform distribution is
M_X(t) = (e^(bt) − e^(at)) / ((b − a) t), t ≠ 0

* To find Mean and Variance
E(X) = ∫_{−∞}^{∞} x f(x) dx = ∫_a^b x/(b − a) dx = (1/(b − a)) [x²/2]_a^b
= (b² − a²) / (2(b − a)) = (a + b)/2
Mean μ₁' = (a + b)/2
Putting r = 2 in (A), we get
μ₂' = ∫_a^b x² f(x) dx = ∫_a^b x²/(b − a) dx = (a² + ab + b²)/3
∴ Variance = μ₂' − (μ₁')² = (a² + ab + b²)/3 − ((a + b)/2)² = (b − a)²/12
Variance = (b − a)²/12
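The two closed forms above can be checked by simulation; a minimal stdlib sketch (the endpoints a = 2, b = 10 are arbitrary illustrative values):

```python
import random
from math import isclose

# Uniform(a, b): mean (a+b)/2 and variance (b-a)^2/12.
a, b = 2.0, 10.0
mean_formula = (a + b) / 2          # 6.0
var_formula = (b - a)**2 / 12       # 16/3

random.seed(1)
samples = [random.uniform(a, b) for _ in range(200_000)]
mean_mc = sum(samples) / len(samples)
var_mc = sum((s - mean_mc)**2 for s in samples) / len(samples)

assert isclose(mean_mc, mean_formula, rel_tol=0.01)
assert isclose(var_mc, var_formula, rel_tol=0.05)
```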

PROBLEMS ON UNIFORM DISTRIBUTION
Example: 1.9.1
If X is uniformly distributed over (−α, α), α > 0, find α so that
(i) P(X > 1) = 1/3 (ii) P(|X| < 1) = P(|X| > 1).
Solution
If X is uniformly distributed in (−α, α), then its p.d.f. is
f(x) = 1/(2α), −α < x < α
     = 0, otherwise
(i) P(X > 1) = 1/3
∫_1^α f(x) dx = 1/3
∫_1^α (1/(2α)) dx = 1/3
(1/(2α)) [x]_1^α = 1/3
(1/(2α)) (α − 1) = 1/3
3α − 3 = 2α ⇒ α = 3
(ii) P(|X| < 1) = P(|X| > 1) = 1 − P(|X| < 1)
2 P(|X| < 1) = 1
P(−1 < X < 1) = 1/2
∫_{−1}^{1} f(x) dx = 1/2
2/(2α) = 1/2 ⇒ α = 2

Note:
1. The distribution function F(x) is given by
F(x) = 0 for x < a; (x − a)/(b − a) for a ≤ x ≤ b; 1 for x > b.
2. For the uniform distribution over (−a, a),
f(x) = 1/(2a), −a < x < a
     = 0, otherwise

1.10 THE EXPONENTIAL DISTRIBUTION
Def: A continuous random variable X is said to follow an exponential distribution with parameter λ > 0 if its probability density function is given by
f(x) = λ e^(−λx), x > 0
     = 0, otherwise
To find MGF
Solution
M_X(t) = ∫_{−∞}^{∞} e^(tx) f(x) dx = ∫_0^∞ e^(tx) λ e^(−λx) dx
= λ ∫_0^∞ e^(−(λ−t)x) dx
= λ [e^(−(λ−t)x) / (−(λ − t))]_0^∞
= λ (e^(−∞) − e^0) / (−(λ − t))
= λ/(λ − t)
∴ The MGF of X is M_X(t) = λ/(λ − t), λ > t

* To find Mean and Variance
We know that the MGF is
M_X(t) = λ/(λ − t) = 1/(1 − t/λ) = (1 − t/λ)^(−1)
= 1 + t/λ + (t/λ)² + …… + (t/λ)^r + ……
= 1 + (1/λ)(t/1!) + (2!/λ²)(t²/2!) + …… + (r!/λ^r)(t^r/r!) + ……
∴ μ₁' = coefficient of t/1! = 1/λ, so Mean = 1/λ
μ₂' = coefficient of t²/2! = 2/λ²
Variance μ₂ = μ₂' − (μ₁')² = 2/λ² − 1/λ² = 1/λ²
Hence Mean = 1/λ and Variance = 1/λ².
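A simulation sketch of these two moments, together with the memoryless property noted later for the exponential distribution (stdlib only; λ = 0.5 and s = 1, t = 2 are arbitrary illustrative values):

```python
import random
from math import exp, isclose

# Exponential(lambda): mean 1/lambda, variance 1/lambda^2,
# and P(X > s+t | X > s) = P(X > t) (memoryless property).
lam = 0.5
random.seed(7)
xs = [random.expovariate(lam) for _ in range(200_000)]

mean = sum(xs) / len(xs)
var = sum((x - mean)**2 for x in xs) / len(xs)
assert isclose(mean, 1/lam, rel_tol=0.02)
assert isclose(var, 1/lam**2, rel_tol=0.05)

s, t = 1.0, 2.0
beyond_s = [x for x in xs if x > s]
cond = sum(x > s + t for x in beyond_s) / len(beyond_s)
assert isclose(cond, exp(-lam * t), abs_tol=0.02)   # = P(X > t)
```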

Example: 1.10.1
Let X be a random variable with p.d.f.
f(x) = (1/3) e^(−x/3), x > 0
     = 0, otherwise.
Find (1) P(X > 3) (2) the MGF of X.
Solution
WKT the exponential distribution is f(x) = λ e^(−λx), x > 0. Here λ = 1/3.
P(X > 3) = ∫_3^∞ f(x) dx = ∫_3^∞ (1/3) e^(−x/3) dx = [−e^(−x/3)]_3^∞ = e^(−1)
∴ P(X > 3) = e^(−1)
The MGF is M_X(t) = λ/(λ − t) = (1/3)/((1/3) − t) = 1/(1 − 3t)
∴ M_X(t) = 1/(1 − 3t)
Note
If X is exponentially distributed, then P(X > s + t / X > s) = P(X > t) for any s, t > 0 (the memoryless property).
1.11 GAMMA DISTRIBUTION
Definition
A continuous random variable X taking non-negative values is said to follow a gamma distribution if its probability density function is given by

f(x) = x^(α−1) e^(−x) / Γ(α), α > 0, 0 < x < ∞
     = 0, elsewhere,
where Γ(α) = ∫_0^∞ x^(α−1) e^(−x) dx
and α is the parameter of the distribution.

Additive Property of Gamma Variates
If X₁, X₂, X₃, …, X_k are independent gamma variates with parameters λ₁, λ₂, …, λ_k respectively, then X₁ + X₂ + X₃ + …… + X_k is also a gamma variate with parameter λ₁ + λ₂ + …… + λ_k.
Example: 1.11.1
Customer demand for milk in a certain locality, per month, is known to be a general gamma RV. If the average demand is a liters and the most likely demand is b liters (b < a), what is the variance of the demand?
Solution
Let X, the demand, be a general gamma RV with parameters k and λ, so that
f(x) = (λ^k / Γ(k)) x^(k−1) e^(−λx), x > 0
f'(x) = (λ^k / Γ(k)) [(k−1) x^(k−2) e^(−λx) − λ x^(k−1) e^(−λx)]
f'(x) = 0 when x = (k−1)/λ, and f''(x) < 0 there.
Therefore f(x) is maximum when x = (k−1)/λ,
i.e., most likely demand = (k−1)/λ = b …… (1)
and E(X) = k/λ = a …… (2)
From (1) and (2), 1/λ = a − b, so λ = 1/(a − b) and k = a/(a − b).
Now V(X) = k/λ² = [a/(a − b)] (a − b)² = a(a − b).

TUTORIAL QUESTIONS
1. It is known that the probability of an item produced by a certain machine being defective is 0.05. If the produced items are sent to the market in packets of 20, find the number of packets containing at least, exactly and at most 2 defective items in a consignment of 1000 packets using (i) the Binomial distribution (ii) the Poisson approximation to the binomial distribution.
2. The daily consumption of milk in excess of 20,000 gallons is approximately exponentially distributed with θ = 3000. The city has a daily stock of 35,000 gallons. What is the probability that of two days selected at random, the stock is insufficient for both days?
3. The density function of a random variable X is given by f(x) = Kx(2 − x), 0 ≤ x ≤ 2. Find K, the mean, the variance and the rth moment.
4. A binomial variable X satisfies the relation 9P(X=4) = P(X=2) when n = 6. Find the parameter p of the binomial distribution.
5. Find the M.G.F. of the Poisson distribution.
6. If X and Y are independent Poisson variates such that P(X=1) = P(X=2) and P(Y=2) = P(Y=3), find V(X − 2Y).
7. A discrete random variable has the following probability distribution:
X:    0   1    2    3    4    5     6     7     8
P(X): a   3a   5a   7a   9a   11a   13a   15a   17a
Find the value of a, P(X<3) and the c.d.f. of X.


8. In a component manufacturing industry, there is a small probability of 1/500 for any component to be defective. The components are supplied in packets of 10. Use the Poisson distribution to calculate the approximate number of packets containing (1) no defective components, (2) two defective components, in a consignment of 10,000 packets.

WORKED OUT EXAMPLES
Example: 1
Given that the p.d.f. of a continuous random variable X is
f(x) = 6x(1 − x), 0 < x < 1
     = 0, otherwise,
find the c.d.f. of X.
Solution
Given f(x) = 6x(1 − x), 0 < x < 1; 0 otherwise.
The c.d.f. is F(x) = ∫_{−∞}^{x} f(x) dx, −∞ < x < ∞
(i) When x < 0:
F(x) = ∫_{−∞}^{x} 0 dx = 0
(ii) When 0 < x < 1:
F(x) = ∫_{−∞}^{0} f(x) dx + ∫_0^x f(x) dx = 0 + ∫_0^x 6x(1 − x) dx
= 6 [x²/2 − x³/3]_0^x = 3x² − 2x³
(iii) When x > 1:
F(x) = ∫_{−∞}^{0} 0 dx + ∫_0^1 6x(1 − x) dx + ∫_1^x 0 dx = 6 ∫_0^1 (x − x²) dx = 1
Using (i), (ii) & (iii), we get
F(x) = 0 for x < 0; 3x² − 2x³ for 0 < x < 1; 1 for x > 1.
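The piecewise c.d.f. can be verified by numerically integrating the density (stdlib only; a simple left Riemann sum is enough here):

```python
from math import isclose

# f(x) = 6x(1-x) on (0,1); its c.d.f. should be F(x) = 3x^2 - 2x^3.
def f(x):
    return 6 * x * (1 - x) if 0 < x < 1 else 0.0

def F_numeric(x, n=20_000):
    h = x / n
    return sum(f(i * h) * h for i in range(n))   # left Riemann sum on (0, x)

for x in (0.25, 0.5, 0.9):
    assert isclose(F_numeric(x), 3*x**2 - 2*x**3, abs_tol=1e-3)
assert isclose(F_numeric(1.0), 1.0, abs_tol=1e-3)
```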

Example: 2
A random variable X has the following probability function:
Values of X:      0   1    2    3    4    5     6     7     8
Probability P(X): a   3a   5a   7a   9a   11a   13a   15a   17a
(i) Determine the value of a.
(ii) Find P(X < 3), P(X ≥ 3), P(0 < X < 5).
(iii) Find the distribution function of X.
Solution
Table 1
X:    0   1    2    3    4    5     6     7     8
p(x): a   3a   5a   7a   9a   11a   13a   15a   17a
(i) We know that if p(x) is a probability mass function, then
Σ_{i=0}^{8} p(x_i) = 1
p(0) + p(1) + p(2) + p(3) + p(4) + p(5) + p(6) + p(7) + p(8) = 1
a + 3a + 5a + 7a + 9a + 11a + 13a + 15a + 17a = 1
81a = 1 ⇒ a = 1/81
Put a = 1/81 in Table 1; we get Table 2:
X = x: 0     1     2     3     4     5      6      7      8
P(x):  1/81  3/81  5/81  7/81  9/81  11/81  13/81  15/81  17/81
(ii) P(X < 3) = p(0) + p(1) + p(2) = 1/81 + 3/81 + 5/81 = 9/81
P(X ≥ 3) = 1 − P(X < 3) = 1 − 9/81 = 72/81
P(0 < X < 5) = p(1) + p(2) + p(3) + p(4)   (here 0 and 5 are not included)
= 3/81 + 5/81 + 7/81 + 9/81 = 24/81
(iii) The distribution function of X, using Table 2:
X = x   F(x) = P(X ≤ x)
0       F(0) = p(0) = 1/81
1       F(1) = p(0) + p(1) = 1/81 + 3/81 = 4/81
2       F(2) = p(0) + p(1) + p(2) = 4/81 + 5/81 = 9/81
3       F(3) = p(0) + …… + p(3) = 9/81 + 7/81 = 16/81
4       F(4) = p(0) + …… + p(4) = 16/81 + 9/81 = 25/81
5       F(5) = p(0) + …… + p(5) = 25/81 + 11/81 = 36/81
6       F(6) = p(0) + …… + p(6) = 36/81 + 13/81 = 49/81
7       F(7) = p(0) + …… + p(7) = 49/81 + 15/81 = 64/81
8       F(8) = p(0) + …… + p(8) = 64/81 + 17/81 = 81/81 = 1
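Example 2 can be reproduced exactly with rational arithmetic (stdlib only; the weights 1, 3, …, 17 are the coefficients of a from the table):

```python
from fractions import Fraction
from itertools import accumulate

# p(x) = (2x+1) * a for x = 0..8; normalisation gives a = 1/81.
weights = [2*x + 1 for x in range(9)]        # 1, 3, 5, ..., 17
a = Fraction(1, sum(weights))
pmf = [w * a for w in weights]
cdf = list(accumulate(pmf))

assert a == Fraction(1, 81)
assert sum(pmf) == 1
assert sum(pmf[:3]) == Fraction(9, 81)                # P(X < 3)
assert sum(pmf[1:5]) == Fraction(24, 81)              # P(0 < X < 5)
assert cdf[4] == Fraction(25, 81) and cdf[8] == 1     # F(4), F(8)
```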

Example: 3
The mean and SD of a binomial distribution are 5 and 2; determine the distribution.
Solution
Given Mean = np = 5 … (1)
SD = √(npq) = 2 ⇒ npq = 4 … (2)
(2)/(1): npq/np = 4/5 ⇒ q = 4/5
∴ p = 1 − 4/5 = 1/5
Substituting p = 1/5 in (1): n × 1/5 = 5 ⇒ n = 25
∴ The binomial distribution is P(X = x) = p(x) = nCx p^x q^(n−x) = 25Cx (1/5)^x (4/5)^(25−x), x = 0, 1, 2, …, 25.
Example: 4
If X is a Poisson variate such that P(X = 2) = 9 P(X = 4) + 90 P(X = 6), find (i) the mean of X, (ii) the variance of X.
Solution
P(X = x) = e^(−λ) λ^x / x!, x = 0, 1, 2, ……
Given P(X = 2) = 9 P(X = 4) + 90 P(X = 6)
e^(−λ) λ²/2! = 9 e^(−λ) λ⁴/4! + 90 e^(−λ) λ⁶/6!
1/2 = 9λ²/4! + 90λ⁴/6! = 3λ²/8 + λ⁴/8
1 = 3λ²/4 + λ⁴/4
λ⁴ + 3λ² − 4 = 0 ⇒ (λ² − 1)(λ² + 4) = 0
λ² = 1 or λ² = −4, i.e. λ = ±1 or λ = ±2i
Since λ must be a positive real number, λ = 1.
∴ Mean = λ = 1, Variance = λ = 1, and Standard Deviation = 1.
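The root λ = 1 found in Example 4 can be checked directly against the original identity (stdlib only):

```python
from math import exp, factorial, isclose

# With lambda = 1, verify P(X=2) = 9 P(X=4) + 90 P(X=6) for the Poisson pmf.
def pois(x, lam=1.0):
    return exp(-lam) * lam**x / factorial(x)

lhs = pois(2)
rhs = 9 * pois(4) + 90 * pois(6)
assert isclose(lhs, rhs)
# ... and lambda = 1 is a root of lambda^4 + 3 lambda^2 - 4 = 0.
assert 1**4 + 3 * 1**2 - 4 == 0
```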


UNIT – II TWO DIMENSIONAL RANDOM VARIABLES Introduction

In the previous chapter we studied various aspects of the theory of a single R.V. In this chapter we extend our theory to include two R.V.'s, one for each coordinate axis X and Y of the XY plane.
DEFINITION: Let S be the sample space. Let X = X(s) & Y = Y(s) be two functions each assigning a real number to each outcome s ∈ S. Then (X, Y) is a two dimensional random variable.
2.1 Types of random variables
1. Discrete R.V.'s 2. Continuous R.V.'s
Discrete R.V.'s (Two Dimensional Discrete R.V.'s)
If the possible values of (X, Y) are finite, then (X, Y) is called a two dimensional discrete R.V., and it can be represented by (x_i, y_j), i = 1, 2, …, m; j = 1, 2, …, n.
In the study of two dimensional discrete R.V.'s we have the following 5 important terms:
• Joint Probability Function (JPF) (or) Joint Probability Mass Function.
• Joint Probability Distribution.
• Marginal Probability Function of X.
• Marginal Probability Function of Y.
• Conditional Probability Function.


2.1.1 Joint Probability Function of discrete R.V.'s X and Y
The function P(X = x_i, Y = y_j) = P(x_i, y_j) is called the joint probability function for the discrete random variables X and Y, and is denoted by p_ij.
Note
1. P(X = x_i, Y = y_j) = P[(X = x_i) ∩ (Y = y_j)] = p_ij
2. It should satisfy the following conditions:
(i) p_ij ≥ 0 ∀ i, j (ii) Σ_j Σ_i p_ij = 1
2.1.2 Marginal Probability Function of X
If the joint probability distribution of two random variables X and Y is given, then the marginal probability function of X is P_X(x_i) = p_i• = Σ_j p_ij (and similarly for the marginal probability function of Y, P_Y(y_j) = p_•j = Σ_i p_ij).
Conditional Probabilities
The conditional probability function of X given Y = y_j is given by
P[X = x_i / Y = y_j] = P[X = x_i, Y = y_j] / P[Y = y_j] = p_ij / p_•j


The set {x_i, p_ij/p_•j}, i = 1, 2, 3, … is called the conditional probability distribution of X given Y = y_j.
The conditional probability function of Y given X = x_i is given by
P[Y = y_j / X = x_i] = P[X = x_i, Y = y_j] / P[X = x_i] = p_ij / p_i•
The set {y_j, p_ij/p_i•}, j = 1, 2, 3, … is called the conditional probability distribution of Y given X = x_i.
SOLVED PROBLEMS ON MARGINAL DISTRIBUTION
Example: 2.1.1
From the following joint distribution of X and Y, find the marginal distributions.
        X = 0   X = 1   X = 2
Y = 0   3/28    9/28    3/28
Y = 1   3/14    3/14    0
Y = 2   1/28    0       0

Solution
        X = 0          X = 1          X = 2         P_Y(y) = P(Y = y)
Y = 0   3/28 p(0,0)    9/28 p(1,0)    3/28 p(2,0)   15/28 = P_Y(0)
Y = 1   3/14 p(0,1)    3/14 p(1,1)    0    p(2,1)   6/14  = P_Y(1)
Y = 2   1/28 p(0,2)    0    p(1,2)    0    p(2,2)   1/28  = P_Y(2)
P_X(x)  10/28 = 5/14   15/28          3/28          1
The marginal distribution of X:
P_X(0) = P(X = 0) = p(0,0) + p(0,1) + p(0,2) = 5/14
P_X(1) = P(X = 1) = p(1,0) + p(1,1) + p(1,2) = 15/28
P_X(2) = P(X = 2) = p(2,0) + p(2,1) + p(2,2) = 3/28
Marginal probability function of X:
P_X(x) = 5/14 for x = 0; 15/28 for x = 1; 3/28 for x = 2.
The marginal distribution of Y:
P_Y(0) = P(Y = 0) = p(0,0) + p(1,0) + p(2,0) = 15/28
P_Y(1) = P(Y = 1) = p(0,1) + p(1,1) + p(2,1) = 3/7


P_Y(2) = P(Y = 2) = p(0,2) + p(1,2) + p(2,2) = 1/28
Marginal probability function of Y:
P_Y(y) = 15/28 for y = 0; 3/7 for y = 1; 1/28 for y = 2.
2.3 CONTINUOUS RANDOM VARIABLES
• Two dimensional continuous R.V.'s
If (X, Y) can take all the values in a region R of the XY plane, then (X, Y) is called a two dimensional continuous random variable.
• Joint probability density function:

(i) f_XY(x, y) ≥ 0 ; (ii) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dy dx = 1
• Joint probability distribution function
F(x, y) = P[X ≤ x, Y ≤ y] = ∫_{−∞}^{y} [∫_{−∞}^{x} f(x, y) dx] dy
• Marginal probability density function
f(x) = f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy (marginal pdf of X)
f(y) = f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx (marginal pdf of Y)
• Conditional probability density function
(i) P(Y = y / X = x) = f(y/x) = f(x, y)/f(x), f(x) > 0
(ii) P(X = x / Y = y) = f(x/y) = f(x, y)/f(y), f(y) > 0

Example: 2.3.1
Show that the function
f(x, y) = (2/5)(2x + 3y), 0 < x < 1, 0 < y < 1
        = 0, otherwise
is a joint density function of X and Y.
Solution


We know that if f(x, y) satisfies the conditions
(i) f(x, y) ≥ 0 and (ii) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1,
then f(x, y) is a j.d.f.
Given f(x, y) = (2/5)(2x + 3y), 0 < x < 1, 0 < y < 1; 0 otherwise.
(i) f(x, y) ≥ 0 in the given region 0 ≤ x, y ≤ 1.
(ii) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = ∫_0^1 ∫_0^1 (2/5)(2x + 3y) dx dy
= (2/5) ∫_0^1 [x² + 3xy]_{x=0}^{x=1} dy
= (2/5) ∫_0^1 (1 + 3y) dy
= (2/5) [y + 3y²/2]_0^1 = (2/5)(1 + 3/2) = (2/5)(5/2) = 1

Since f(x, y) satisfies both conditions, it is a j.d.f.

Example: 2.3.2
The j.d.f. of the random variables X and Y is given by
f(x, y) = 8xy, 0 < x < 1, 0 < y < x
        = 0, otherwise.
Find (i) f_X(x) (ii) f_Y(y) (iii) f(y/x).
Solution
We know that
(i) The marginal pdf of X is
f_X(x) = f(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^x 8xy dy = 4x³
∴ f(x) = 4x³, 0 < x < 1
(ii) The marginal pdf of Y is
f_Y(y) = f(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_y^1 8xy dx = 4y(1 − y²)
∴ f(y) = 4y(1 − y²), 0 < y < 1
(iii) f(y/x) = f(x, y)/f(x) = 8xy/(4x³) = 2y/x², 0 < y < x, 0 < x < 1
40

Downloaded From : www.EasyEngineering.net

Downloaded From : www.EasyEngineering.net

MA6451

PROBABILITY AND RANDOM PROCESSES

Result Marginal pdf g

Marginal pdf y

F(y/x)

4x3, 0
4y, 0
2y ,0 < y < x, 0 < x < 1 x2

2.4 REGRESSION * Line of regression The line of regression of X on Y is given by

x= − x r.

σy (y − y) σx

The line of regression of Y on X is given by

σy (x − x) σx

ww w.E y= − y r.

* Angle between two lines of Regression.

1 − r 2  σyσx  tan θ = r  σ x 2 + σ y2 

   

asy En gin ee

* Regression coefficient Regression coefficients of Y on X

r.

σy = b YX σx

Regression coefficient of X and Y

r.

σx = b XY σy

rin g.n et

∴ Correlation coefficient r = ± b XY × b YX

Example:2.4.1 1. From the following data, find (i) The two regression equation (ii) The coefficient of correlation between the marks in Economic and Statistics. (iii) The most likely marks in statistics when marks in Economic are 30. Marks in Economics Marks in Statistics

25 40

28 46

35 49

32 41

31 36

36 32

29 31

38 30

34 33

32 39

Solution 41

Downloaded From : www.EasyEngineering.net

Downloaded From : www.EasyEngineering.net

MA6451

PROBABILITY AND RANDOM PROCESSES

X

Y

X − X = X − 32

X − Y = Y − 38

(X − X)

25 28 35 32 31 36 29 38 34 32 320

43 46 4 41 36 32 31 30 33 39 380

-7 -4 3 0 -1 4 -3 6 2 0 0

5 8 11 3 -2 -6 -7 -8 -5 1 0

49 16 9 0 1 16 09 36 4 0 140

ww w.E

Here = X

2

(Y − Y) (X − X) (Y − Y) 2

25 64 121 9 4 36 49 64 25 1 398

2

-35 -32 33 0 2 -24 +21 -48 -48 100 -93

∑ X 320 ∑ Y 380 = = 32 and= Y = = 38 n 10 n 10

Coefficient of regression of Y on X is

b YX =

∑ (X − X)(Y − Y) −93 = = − 0.6643 2 140 ∑ (X − X)

asy En gin ee

Coefficient of regression of X on Y is

b XY =

∑ (X − X)(Y − Y) −93 = = − 0.2337 2 398 ∑ (Y − Y)

Equation of the line of regression of X and Y is

X−X

= b XY (Y − Y)

X – 32

= -0.2337 (y – 38) X = -0.2337 y + 0.2337 x 38 + 32 X = -0.2337 y + 40.8806 Equation of the line of regression of Y on X is

Y−Y

= b YX (X − X)

Y – 38

= -0.6643 (x – 32) = -0.6643 x + 38 + 0.6643 x 32 = -0.6642 x + 59.2576 Coefficient of Correlation r2 = bYX × b XY = -0.6643 x (-0.2337) r = 0.1552 r = ± 0.1552 Y

rin g.n et

r = ± 0.394 Now we have to find the most likely mark, in statistics (Y) when marks in economics (X) are 30. y = -0.6643 x + 59.2575 42

Downloaded From : www.EasyEngineering.net

Downloaded From : www.EasyEngineering.net

MA6451

PROBABILITY AND RANDOM PROCESSES

Put x = 30, we get y = -0.6643 x 30 + 59.2536 = 39.3286 y~ 39 2.5 COVARIANCE Def : If X and Y are random variables, then Covariance between X and Y is defined as Cov (X, Y) = E(XY) – E(X) . E(Y) Cov (X, Y) = 0 [If X & Y are independent]

• •

2.6 CORRELATION Types of Correlation Positive Correlation (If two variables deviate in same direction) Negative Correlation (If two variables constantly deviate in opposite direction)

ww w.E

2.7 KARL-PEARSON’S COEFFICIENT OF CORRELATION Correlation coefficient between two random variables X and Y usually denoted by r(X, Y) is a numerical measure of linear relationship between them and is defined as

asy En gin ee

Cov(X, Y) , σ X .σ Y 1 Where Cov (X, Y) = ∑ XY − X Y n ∑X ∑Y = σX ; = σY n n =

r(X, Y)

* Limits of correlation coefficient -1 ≤ r ≤ 1. X & Y independent, ∴ r(X, Y) = 0. Note :Types of correlation based on ‘r’. Values of ‘r’ Correlation is said to be r=1 perfect and positive 0
rin g.n et

SOLVED PROBLEMS ON CORRELATION
Example: 2.6.1
Calculate the correlation coefficient for the following heights of fathers (X) and their sons (Y).
X: 65  66  67  67  68  69  70  72
Y: 67  68  65  68  72  72  69  71

Solution
X    Y    U = X−68   V = Y−68   UV    U²    V²
65   67   −3         −1         3     9     1
66   68   −2         0          0     4     0
67   65   −1         −3         3     1     9
67   68   −1         0          0     1     0
68   72   0          4          0     0     16
69   72   1          4          4     1     16
70   69   2          1          2     4     1
72   71   4          3          12    16    9
Totals:   ΣU = 0     ΣV = 8     ΣUV = 24   ΣU² = 36   ΣV² = 52
Now Ū = ΣU/n = 0/8 = 0 and V̄ = ΣV/n = 8/8 = 1 … (1)
Cov(X, Y) = Cov(U, V) = ΣUV/n − Ū V̄ = 24/8 − 0 = 3 … (2)
σ_U = √(ΣU²/n − Ū²) = √(36/8 − 0) = 2.121 … (3)
σ_V = √(ΣV²/n − V̄²) = √(52/8 − 1) = 2.345
∴ r(X, Y) = r(U, V) = Cov(U, V) / (σ_U σ_V)   (by (1), (2), (3))
= 3 / (2.121 × 2.345) = 0.6031
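A stdlib sketch reproducing Example 2.6.1 via the shifted variables U = X − 68, V = Y − 68 (shifting both variables by a constant leaves the correlation coefficient unchanged):

```python
from math import isclose, sqrt

X = [65, 66, 67, 67, 68, 69, 70, 72]
Y = [67, 68, 65, 68, 72, 72, 69, 71]
n = len(X)
U = [x - 68 for x in X]
V = [y - 68 for y in Y]

cov = sum(u*v for u, v in zip(U, V))/n - (sum(U)/n)*(sum(V)/n)
su = sqrt(sum(u*u for u in U)/n - (sum(U)/n)**2)
sv = sqrt(sum(v*v for v in V)/n - (sum(V)/n)**2)
r = cov / (su * sv)

assert sum(u*v for u, v in zip(U, V)) == 24   # sum of UV, as in the table
assert isclose(cov, 3.0)
assert isclose(r, 0.6031, abs_tol=1e-3)
```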

Example: 2.6.2
Let X be a random variable with p.d.f. f(x) = 1/2, −1 ≤ x ≤ 1, and let Y = X². Find the correlation coefficient between X and Y.
Solution


E(X) = ∫_{−∞}^{∞} x f(x) dx = ∫_{−1}^{1} x · (1/2) dx = (1/2) [x²/2]_{−1}^{1} = (1/2)(1/2 − 1/2) = 0
∴ E(X) = 0
E(Y) = E(X²) = ∫_{−1}^{1} x² · (1/2) dx = (1/2) [x³/3]_{−1}^{1} = (1/2)(1/3 + 1/3) = 1/3
E(XY) = E(X·X²) = E(X³) = ∫_{−1}^{1} x³ · (1/2) dx = (1/2) [x⁴/4]_{−1}^{1} = 0
∴ E(XY) = 0
∴ r(X, Y) = ρ(X, Y) = Cov(X, Y)/(σ_X σ_Y) = [E(XY) − E(X)E(Y)]/(σ_X σ_Y) = 0
ρ = 0.
Note: Since E(X) and E(XY) are both zero, Cov(X, Y) = 0 and we need not find σ_X and σ_Y.
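Example 2.6.2 can be illustrated by simulation: although Y = X² is completely determined by X, the two are uncorrelated (stdlib only):

```python
import random
from math import isclose, sqrt

# X uniform on (-1, 1), Y = X^2: E(X) = E(XY) = 0, so correlation is 0.
random.seed(3)
xs = [random.uniform(-1, 1) for _ in range(200_000)]
ys = [x*x for x in xs]
n = len(xs)
mx, my = sum(xs)/n, sum(ys)/n
cov = sum(x*y for x, y in zip(xs, ys))/n - mx*my
sx = sqrt(sum(x*x for x in xs)/n - mx*mx)
sy = sqrt(sum(y*y for y in ys)/n - my*my)
r = cov / (sx * sy)

assert isclose(r, 0.0, abs_tol=0.01)      # uncorrelated
assert isclose(my, 1/3, abs_tol=0.01)     # E(Y) = 1/3
```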

2.8 TRANSFORMS OF TWO DIMENSIONAL RANDOM VARIABLES
Formulae:
f_U(u) = ∫_{−∞}^{∞} f_UV(u, v) dv
f_V(v) = ∫_{−∞}^{∞} f_UV(u, v) du
f_UV(u, v) = f_XY(x, y) |∂(x, y)/∂(u, v)|

Example: 1
If the joint pdf of (X, Y) is given by f_XY(x, y) = x + y, 0 ≤ x, y ≤ 1, find the pdf of U = XY.
Solution
Given f_XY(x, y) = x + y and U = XY. Let V = Y.
Then x = u/v and y = v.
∂x/∂u = 1/v, ∂x/∂v = −u/v², ∂y/∂u = 0, ∂y/∂v = 1
J = ∂(x, y)/∂(u, v) = |1/v  −u/v²; 0  1| = 1/v
⇒ |J| = 1/|v|
The joint p.d.f. of (U, V) is
f_UV(u, v) = f_XY(x, y) |J| = (x + y)(1/v) = (1/v)(u/v + v)
The range of v: since 0 ≤ y ≤ 1, we have 0 ≤ v ≤ 1 (∵ V = Y).
The range of u: 0 ≤ x ≤ 1 ⇒ 0 ≤ u/v ≤ 1 ⇒ 0 ≤ u ≤ v.
Hence the p.d.f. of (U, V) is
f_UV(u, v) = (1/v)(u/v + v), 0 ≤ u ≤ v, 0 ≤ v ≤ 1
Now
f_U(u) = ∫ f_UV(u, v) dv = ∫_u^1 (u/v² + 1) dv
= [−u/v + v]_u^1 = (1 − u) + (1 − u)
∴ f_U(u) = 2(1 − u), 0 < u < 1
Summary:
p.d.f. of (U, V): f_UV(u, v) = (1/v)(u/v + v), 0 ≤ u ≤ v, 0 ≤ v ≤ 1
p.d.f. of U = XY: f_U(u) = 2(1 − u), 0 < u < 1
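The marginal f_U(u) = 2(1 − u) can be confirmed by numerically integrating the joint density over v ∈ (u, 1) (stdlib only, midpoint rule):

```python
from math import isclose

# f_UV(u, v) = (1/v)(u/v + v) = u/v^2 + 1 on u <= v <= 1.
def f_uv(u, v):
    return u / v**2 + 1.0

def f_u(u, n=10_000):
    h = (1.0 - u) / n
    # midpoint rule over v in (u, 1)
    return sum(f_uv(u, u + (j + 0.5) * h) * h for j in range(n))

for u in (0.1, 0.5, 0.9):
    assert isclose(f_u(u), 2 * (1 - u), abs_tol=1e-4)
```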

TUTORIAL QUESTIONS
1. The jpdf of r.v.s X and Y is given by f(x, y) = 3(x + y), 0 < x < 1, … Find (i) P(X > 1/2) (ii) P(Y < …). Find Cov(X, Y).
7. If the equations of the two lines of regression of y on x and x on y are respectively 7x − 16y + 9 = 0 and 5y − 4x − 3 = 0, calculate the coefficient of correlation.

WORKED OUT EXAMPLES
Example: 1
The j.d.f. of the random variables X and Y is given by

f(x, y) = 8xy, 0 < x < 1, 0 < y < x
        = 0, otherwise.
Find (i) f_X(x) (ii) f_Y(y) (iii) f(y/x).
Solution
We know that
(i) The marginal pdf of X is
f_X(x) = f(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^x 8xy dy = 4x³
∴ f(x) = 4x³, 0 < x < 1
(ii) The marginal pdf of Y is
f_Y(y) = f(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_y^1 8xy dx = 4y(1 − y²)
∴ f(y) = 4y(1 − y²), 0 < y < 1
(iii) f(y/x) = f(x, y)/f(x) = 8xy/(4x³) = 2y/x², 0 < y < x, 0 < x < 1

Example: 2
Let X be a random variable with p.d.f. f(x) = 1/2, −1 ≤ x ≤ 1, and let Y = X². Find the correlation coefficient between X and Y.
Solution
E(X) = ∫_{−1}^{1} x · (1/2) dx = (1/2) [x²/2]_{−1}^{1} = 0
E(Y) = E(X²) = ∫_{−1}^{1} x² · (1/2) dx = (1/2) [x³/3]_{−1}^{1} = 1/3
E(XY) = E(X³) = ∫_{−1}^{1} x³ · (1/2) dx = (1/2) [x⁴/4]_{−1}^{1} = 0
∴ r(X, Y) = ρ(X, Y) = Cov(X, Y)/(σ_X σ_Y) = 0
ρ = 0.
Note: Since E(X) and E(XY) are both zero, Cov(X, Y) = 0 and we need not find σ_X and σ_Y.

Result
Marginal pdf of X: f(x) = 4x³, 0 < x < 1
Marginal pdf of Y: f(y) = 4y(1 − y²), 0 < y < 1
f(y/x) = 2y/x², 0 < y < x, 0 < x < 1


UNIT - III RANDOM PROCESSES Introduction

In Chapter 1 we discussed random variables. A random variable is a function of the possible outcomes of an experiment, but it does not include the concept of time. In real situations we come across many time-varying functions which are random in nature. In electrical and electronics engineering we study signals, which are generally classified into two types: (i) deterministic and (ii) random. Both deterministic and random signals are functions of time. For a deterministic signal it is possible to determine its value at any given time, but this is not possible for a random signal, since some element of uncertainty is always associated with it. The probability model used for characterizing a random signal is called a random process or stochastic process.


3.1 RANDOM PROCESS CONCEPT
A random process is a collection (ensemble) of random variables {X(s, t)} that are functions of a real variable t, where s ∈ S (S is the sample space) and t ∈ T (T is an index set).
REMARK
i) If t is fixed, then {X(s, t)} is a random variable.
ii) If s and t are both fixed, {X(s, t)} is a number.
iii) If s is fixed, {X(s, t)} is a single time function (a sample function).
NOTATION


Hereafter we denote the random process {X(s, t)} by {X(t)}, where the index set T is assumed to be continuous; a discrete-time process is denoted by {X(n)} or {X_n}.
A comparison between a random variable and a random process:
Random Variable: a function of the possible outcomes of an experiment, X(s); an outcome is mapped into a number x.
Random Process: a function of the possible outcomes of an experiment and also of time, X(s, t); outcomes are mapped into waveforms which are functions of time t.


3.2 CLASSIFICATION OF RANDOM PROCESSES
We can classify a random process according to the characteristics of the time t and the random variable X = X(t); t and x take values in the ranges −∞ < t < ∞ and −∞ < x < ∞.

[Classification chart: a random process is classified according to whether the random variable X and the time t are each discrete or continuous.]

3.2.1 CONTINUOUS RANDOM PROCESS If 'S' is continuous and t takes any value, then X(t) is a continuous random variable. Example Let X(t) = Maximum temperature of a particular place in (0, t). Here 'S' is a continuous set and t ≥ 0 (takes all values), {X(t)} is a continuous random process.


3.2.2 DISCRETE RANDOM PROCESS
If S assumes only discrete values and t is continuous, then we call such a random process {X(t)} a discrete random process.
Example: Let X(t) be the number of telephone calls received in the interval (0, t). Here S = {1, 2, 3, …} and T = {t, t ≥ 0}, so {X(t)} is a discrete random process.
3.2.3 CONTINUOUS RANDOM SEQUENCE
If S is continuous but the time t takes only discrete values, the process is called a continuous random sequence.
3.2.4 DISCRETE RANDOM SEQUENCE
If both S and T are discrete, the process is called a discrete random sequence.
Example: Let X_n denote the outcome of the nth toss of a fair die. Here S = {1, 2, 3, 4, 5, 6} and T = {1, 2, 3, …}, so {X_n, n = 1, 2, 3, …} is a discrete random sequence.


3.3 CLASSIFICATION OF RANDOM PROCESSES BASED ON SAMPLE FUNCTIONS
Non-Deterministic Process
A process is called non-deterministic if the future values of any sample function cannot be predicted exactly from observed values.
Deterministic Process
A process is called deterministic if future values of any sample function can be predicted from past values.
3.3.1 STATIONARY PROCESS
A random process is said to be stationary if its statistical characteristics (mean, variance, moments, etc.) do not change with time. Other processes are called non-stationary.

1. 1st Order Distribution Function of {X(t)}
For a specific t, X(t) is a random variable, as observed earlier.
F(x, t) = P{X(t) ≤ x} is called the first order distribution of the process {X(t)}.

1st Order Density Function of {X(t)}
f(x, t) = ∂F(x, t)/∂x is called the first order density of {X(t)}.
2nd Order Distribution Function of {X(t)}
F(x₁, x₂; t₁, t₂) = P{X(t₁) ≤ x₁; X(t₂) ≤ x₂} is the joint distribution of the random variables X(t₁) and X(t₂) and is called the second order distribution of the process {X(t)}.
2nd Order Density Function of {X(t)}
f(x₁, x₂; t₁, t₂) = ∂²F(x₁, x₂; t₁, t₂)/(∂x₁ ∂x₂) is called the second order density of {X(t)}.

3.3.2 First Order Stationary Process
Definition
A random process is called stationary to order one, or first order stationary, if its first order density function does not change with a shift in the time origin. In other words,
f_X(x₁, t₁) = f_X(x₁, t₁ + C)
must be true for any t₁ and any real number C if {X(t)} is to be a first order stationary process.
Example 3.3.1
Show that a first order stationary process has a constant mean.
Solution
Let us consider the random process {X(t)} at two different times t₁ and t₂.


∴ E[X(t₁)] = ∫_{−∞}^{∞} x f(x, t₁) dx    [f(x, t₁) is the density of the random process at time t₁]
∴ E[X(t₂)] = ∫_{−∞}^{∞} x f(x, t₂) dx    [f(x, t₂) is the density of the random process at time t₂]
Let t₂ = t₁ + C. Then
E[X(t₂)] = ∫_{−∞}^{∞} x f(x, t₁ + C) dx = ∫_{−∞}^{∞} x f(x, t₁) dx = E[X(t₁)]
Thus E[X(t₂)] = E[X(t₁)]

Mean of the random process {X(t₁)} = mean of the random process {X(t₂)}.
Definition 2: If the process is first order stationary, then Mean = E[X(t)] = constant.
3.3.4 Second Order Stationary Process
A random process is said to be second order stationary if its second order density function is invariant under a time shift, i.e.,
f(x₁, x₂; t₁, t₂) = f(x₁, x₂; t₁ + C, t₂ + C) ∀ x₁, x₂ and C.

Then E(X₁²), E(X₂²) and E(X₁X₂) do not change with time, where X₁ = X(t₁) and X₂ = X(t₂).
3.3.5 Strongly Stationary Process
A random process is called a strongly stationary process or strict sense stationary process (SSS process) if all its finite dimensional distributions are invariant under translation of the time t:
f_X(x₁, x₂; t₁, t₂) = f_X(x₁, x₂; t₁ + C, t₂ + C)
f_X(x₁, x₂, x₃; t₁, t₂, t₃) = f_X(x₁, x₂, x₃; t₁ + C, t₂ + C, t₃ + C)
In general,
f_X(x₁, x₂, …, x_n; t₁, t₂, …, t_n) = f_X(x₁, x₂, …, x_n; t₁ + C, t₂ + C, …, t_n + C)
for any t₁, …, t_n and any real number C.

3.3.6 Jointly Stationary in the Strict Sense
{X(t)} and {Y(t)} are said to be jointly stationary in the strict sense if the joint distributions of X(t) and Y(t) are invariant under translation of time.
Definition
Mean: μ_X(t) = E[X(t)], −∞ < t < ∞
μ_X(t) is also called the mean function or ensemble average of the random process.
3.3.7 Auto Correlation of a Random Process


Let X(t₁) and X(t₂) be two members of the random process {X(t)}. The auto correlation is
R_XX(t₁, t₂) = E{X(t₁) X(t₂)}    (1)
Mean Square Value
Putting t₁ = t₂ = t in (1), we get R_XX(t, t) = E[X(t) X(t)]
⇒ R_XX(t, t) = E[X²(t)], the mean square value of the random process.
3.3.8 Auto Covariance of a Random Process
C_XX(t₁, t₂) = E{[X(t₁) − E(X(t₁))] [X(t₂) − E(X(t₂))]} = R_XX(t₁, t₂) − E[X(t₁)] E[X(t₂)]


Correlation Coefficient
The correlation coefficient of the random process {X(t)} is defined as
ρ_XX(t₁, t₂) = C_XX(t₁, t₂) / √(Var[X(t₁)] · Var[X(t₂)])
where C_XX(t₁, t₂) denotes the auto covariance.


3.4 CROSS CORRELATION
The cross correlation of two random processes {X(t)} and {Y(t)} is defined by
R_XY(t₁, t₂) = E[X(t₁) Y(t₂)]
3.5 WIDE SENSE STATIONARY (WSS)
A random process {X(t)} is called a weakly stationary process, covariance stationary process or wide-sense stationary process if
i) E{X(t)} = constant
ii) E[X(t) X(t + τ)] = R_XX(τ) depends only on τ, where τ = t₂ − t₁.
REMARK: An SSS process of order two is a WSS process, but not conversely.


3.6 EVOLUTIONARY PROCESS
A random process that is not stationary in any sense is called an evolutionary process.
SOLVED PROBLEMS ON WIDE SENSE STATIONARY PROCESSES
Example 3.6.1
Give an example of a stationary random process and justify your claim.
Solution:
Let us consider the random process X(t) = A cos(ωt + θ), where A and ω are constants and θ is a uniformly distributed random variable in the interval (0, 2π). Since θ is uniformly distributed in (0, 2π), we have


f(θ) = 1/(2π), 0 < θ < 2π; 0, otherwise
∴ E[X(t)] = ∫_{−∞}^{∞} X(t) f(θ) dθ
= ∫_{0}^{2π} A cos(ωt + θ) · (1/(2π)) dθ
= (A/2π) [sin(ωt + θ)]_{0}^{2π}
= (A/2π) [sin(2π + ωt) − sin(ωt + 0)]
= (A/2π) [sin ωt − sin ωt]
= 0, a constant
Since E[X(t)] = a constant, the process X(t) is a stationary random process.
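The zero-mean result above can be checked numerically by averaging over an ensemble of random phases. This is a sketch, not from the text; the constants A, ω and the sample size are assumed values.

```python
import numpy as np

# Ensemble mean of X(t) = A*cos(w*t + theta), theta ~ Uniform(0, 2*pi).
# For each fixed t, averaging over many draws of theta estimates E[X(t)].
rng = np.random.default_rng(0)
A, w = 2.0, 3.0                                    # assumed constants
theta = rng.uniform(0.0, 2.0*np.pi, size=200_000)  # ensemble of phases

for t in (0.0, 0.7, 1.5):
    ensemble_mean = np.mean(A*np.cos(w*t + theta))
    print(round(ensemble_mean, 3))                 # close to 0 at every t
```

At every fixed t the ensemble mean stays near 0, consistent with E[X(t)] being a constant.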


Example 3.6.2
Examine whether the Poisson process {X(t)}, given by the probability law
P{X(t) = n} = e^{−λt} (λt)ⁿ / n!, n = 0, 1, 2, …
is stationary or not.

Solution
We know that the mean is given by
E[X(t)] = Σ_{n=0}^{∞} n Pₙ(t)
= Σ_{n=0}^{∞} n e^{−λt} (λt)ⁿ / n!
= e^{−λt} Σ_{n=1}^{∞} (λt)ⁿ / (n − 1)!
= λt e^{−λt} [1 + λt/1! + (λt)²/2! + …]
= λt e^{−λt} e^{λt}
= λt, which depends on t.

Hence the Poisson process is not a stationary process.
3.7 ERGODIC RANDOM PROCESS
Time Average
The time average of a random process {X(t)} is defined as

X̄_T = (1/2T) ∫_{−T}^{T} X(t) dt


Ensemble Average
The ensemble average of a random process {X(t)} is the expected value of the random variable X at time t:
Ensemble Average = E[X(t)]
Mean Ergodic Process
{X(t)} is said to be mean ergodic if
lim_{T→∞} X̄_T = μ, i.e., lim_{T→∞} (1/2T) ∫_{−T}^{T} X(t) dt = μ
Mean Ergodic Theorem
Let {X(t)} be a random process with constant mean μ and let X̄_T be its time average. Then {X(t)} is mean ergodic if
lim_{T→∞} Var(X̄_T) = 0
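For the process X(t) = A cos(ωt + θ) of Example 3.6.1, the time average of one long sample path can be compared with the ensemble mean μ = 0. The sketch below uses assumed values for A, ω, T and the grid.

```python
import numpy as np

# Time average (1/2T) * integral_{-T}^{T} X(t) dt for one sample path of
# X(t) = A*cos(w*t + theta); a single random theta fixes the path.
rng = np.random.default_rng(1)
A, w = 2.0, 3.0                       # assumed constants
theta = rng.uniform(0.0, 2.0*np.pi)   # one outcome -> one sample path
T = 2000.0
t = np.linspace(-T, T, 400_001)
x = A*np.cos(w*t + theta)
time_avg = x.mean()                   # grid approximation of the time average
print(round(time_avg, 3))             # close to 0, the ensemble mean
```

The long-run time average of a single path agrees with the ensemble mean, which is what mean ergodicity asserts.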


Correlation Ergodic Process
The stationary process {X(t)} is said to be correlation ergodic if the process {Y(t)} is mean ergodic, where Y(t) = X(t) X(t + λ), i.e.,
E[Y(t)] = lim_{T→∞} Ȳ_T
where Ȳ_T is the time average of Y(t).

3.8 MARKOV PROCESS
Definition
A random process {X(t)} is said to be Markovian if
P[X(t_{n+1}) ≤ x_{n+1} | X(t_n) = x_n, X(t_{n−1}) = x_{n−1}, …, X(t₀) = x₀] = P[X(t_{n+1}) ≤ x_{n+1} | X(t_n) = x_n]
where t₀ ≤ t₁ ≤ t₂ ≤ … ≤ t_n ≤ t_{n+1}.
Examples of Markov Process


1. The probability of rain today depends only on the weather conditions that existed during the last two days and not on earlier weather conditions.
2. A difference equation is Markovian.
Classification of Markov Processes

Markov processes are classified into four types:
i) Continuous parameter Markov process (continuous state space, continuous time)
ii) Discrete parameter Markov process (continuous state space, discrete time)
iii) Continuous parameter Markov chain (discrete state space, continuous time)
iv) Discrete parameter Markov chain (discrete state space, discrete time)


3.9 MARKOV CHAIN
Definition
If
P{Xₙ = aₙ | Xₙ₋₁ = aₙ₋₁, Xₙ₋₂ = aₙ₋₂, …, X₀ = a₀} = P{Xₙ = aₙ | Xₙ₋₁ = aₙ₋₁}
for all n, the process {Xₙ}, n = 0, 1, 2, … is called a Markov chain.
1. a₁, a₂, a₃, …, aₙ are called the states of the Markov chain.
2. The conditional probability P{Xₙ = aⱼ | Xₙ₋₁ = aᵢ} = P_ij(n − 1, n) is called the one step transition probability from state aᵢ to state aⱼ at the nth step.
3. The t.p.m. (transition probability matrix) of a Markov chain is a stochastic matrix:
i) P_ij ≥ 0
ii) Σⱼ P_ij = 1 [the sum of the elements of any row is 1]

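The stochastic-matrix conditions can be verified directly. The matrix below is the one-step t.p.m. of the ball-throwing chain from the tutorial questions (A always throws to B, B always to C, C to A or B with probability 1/2 each); it is used here only as an illustrative sketch.

```python
import numpy as np

# One-step t.p.m. over states (A, B, C)
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.5, 0.5, 0.0]])

assert (P >= 0).all()                    # (i)  P_ij >= 0
assert np.allclose(P.sum(axis=1), 1.0)   # (ii) each row sums to 1

# n-step transition probabilities are matrix powers of P, and every
# power of a stochastic matrix is again stochastic.
P3 = np.linalg.matrix_power(P, 3)
print(P3.sum(axis=1))
```

Checking that each row of Pⁿ still sums to 1 is a quick sanity test for any transition matrix computation.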


3.10 POISSON PROCESS
The Poisson process is a continuous parameter, discrete state process which is a very useful model for many practical situations. It describes the number of times a specified event has occurred in (0, t), when the experiment is conducted as a function of time.
Probability Law for the Poisson Process
Let λ be the rate of occurrence (number of occurrences per unit time) and Pₙ(t) the probability of n occurrences of the event in the interval (0, t). Then X(t) follows a Poisson distribution with parameter λt:
Pₙ(t) = P[X(t) = n] = e^{−λt} (λt)ⁿ / n!, n = 0, 1, 2, …
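The probability law can be checked empirically: for a fixed t, sampling X(t) from the Poisson(λt) law should give mean and variance both close to λt. A sketch with assumed λ and t:

```python
import numpy as np

rng = np.random.default_rng(2)
lam, t = 4.0, 2.5                           # assumed rate and time
counts = rng.poisson(lam*t, size=200_000)   # samples of X(t) ~ Poisson(lam*t)

print(round(counts.mean(), 2))              # near lam*t = 10
print(round(counts.var(), 2))               # near lam*t too (mean = variance)
```

The equality of mean and variance (both λt) is a distinguishing property of the Poisson law.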


Second Order Probability Function of a Homogeneous Poisson Process
P[X(t₁) = n₁, X(t₂) = n₂] = P[X(t₁) = n₁] · P[X(t₂) = n₂ | X(t₁) = n₁]
= P[X(t₁) = n₁] · P[the event occurs n₂ − n₁ times in the interval (t₁, t₂)], t₂ > t₁
= (e^{−λt₁} (λt₁)^{n₁} / n₁!) · (e^{−λ(t₂−t₁)} {λ(t₂ − t₁)}^{n₂−n₁} / (n₂ − n₁)!), n₂ ≥ n₁
= e^{−λt₂} λ^{n₂} t₁^{n₁} (t₂ − t₁)^{n₂−n₁} / (n₁! (n₂ − n₁)!), for n₂ ≥ n₁
= 0, otherwise


3.11 SEMI-RANDOM TELEGRAPH SIGNAL PROCESS
If N(t) represents the number of occurrences of a specified event in (0, t) and X(t) = (−1)^{N(t)}, then {X(t)} is called a semi-random telegraph signal process.
3.11.1 RANDOM TELEGRAPH SIGNAL PROCESS
Definition
A random telegraph process is a discrete random process X(t) satisfying the following:
i. X(t) assumes only one of the two possible values 1 or −1 at any time t.
ii. X(0) = 1 or −1 with equal probability 1/2.
iii. The number of transitions N(t) from one value to the other occurring in any interval of length t is a Poisson process with rate λ, so that the probability of exactly r transitions is


P[N(t) = r] = e^{−λt} (λt)^r / r!, r = 0, 1, 2, …


[Figure: a typical sample function of the telegraph process, switching between the levels +1 and −1.]
Note: The process is an example of a discrete random process.
Mean and Auto Correlation
P{X(t) = 1} = P{X(t) = −1} = 1/2 for any t.


TUTORIAL QUESTIONS
1. The t.p.m of a Markov chain with three states 0, 1, 2 is P = and the initial state distribution is . Find (i) P[X₂ = 3] (ii) P[X₃ = 2, X₂ = 3, X₁ = 3, X₀ = 2].
2. Three boys A, B, C are throwing a ball to each other. A always throws the ball to B and B always throws the ball to C, but C is just as likely to throw the ball to B as to A. Show that the process is Markovian. Find the transition matrix and classify the states.
3. A housewife buys 3 kinds of cereals A, B, C. She never buys the same cereal in successive weeks. If she buys cereal A, the next week she buys cereal B. However, if she buys B or C, the next week she is 3 times as likely to buy A as the other cereal. How often does she buy each of the cereals?
4. A man either drives a car or catches a train to go to office each day. He never goes 2 days in a row by train, but if he drives one day, then the next day he is just as likely to drive again as he is to travel by train. Suppose that on the first day of the week the man tossed a fair die and drove to work if a 6 appeared. Find (1) the probability that he takes a train on the 3rd day, (2) the probability that he drives to work in the long run.


UNIT - 4 CORRELATION AND SPECTRAL DENSITY
Introduction
The power spectrum of a time series x(t) describes how the variance of the data x(t) is distributed over the frequency components into which x(t) may be decomposed. This distribution of the variance may be described either by a measure μ or by a statistical cumulative distribution function S(f) = the power contributed by frequencies from 0 up to f. Given a band of frequencies [a, b), the amount of variance contributed to x(t) by frequencies lying within the interval [a, b) is given by S(b) − S(a). Then S is called the spectral distribution function of x. The spectral density at a frequency f gives the rate of variance contributed by frequencies in the immediate neighbourhood of f to the variance of x, per unit frequency.


4.1 Auto Correlation of a Random Process
Let X(t₁) and X(t₂) be two random variables of the process {X(t)}. Then the auto correlation is
R_XX(t₁, t₂) = E[X(t₁) X(t₂)]    (1)
Mean Square Value
Putting t₁ = t₂ = t in (1), R_XX(t, t) = E[X(t) X(t)]
⇒ R_XX(t, t) = E[X²(t)]
which is called the mean square value of the random process.
Auto Correlation Function
Definition: The auto correlation function of the (WSS) random process {X(t)} is
R_XX(τ) = E{X(t) X(t + τ)}
Note: R_XX(τ) = R(τ) = R_X(τ)


PROPERTY 1: The mean square value of the random process may be obtained from the auto correlation function R_XX(τ) by putting τ = 0:
R_XX(0) = E[X²(t)]
which is known as the average power of the random process {X(t)}.
PROPERTY 2: R_XX(τ) is an even function of τ:
R_XX(τ) = R_XX(−τ)
PROPERTY 3: If the process X(t) contains a periodic component, then R_XX(τ) also contains a periodic component of the same period.
PROPERTY 4: If a random process {X(t)} has no periodic components, and E[X(t)] = X̄, then
lim_{|τ|→∞} R_XX(τ) = X̄²

i.e., when τ → ∞, the auto correlation function approaches the square of the mean of the random process.
PROPERTY 5: The auto correlation function of a random process cannot have an arbitrary shape.
SOLVED PROBLEMS ON AUTO CORRELATION
Example 1
Check whether the following functions are valid auto correlation functions: (i) 5 sin nπ (ii) 1/(1 + 9τ²)


Solution:
(i) Given R_XX(τ) = 5 sin nπ
R_XX(−τ) = 5 sin n(−π) = −5 sin nπ
Since R_XX(τ) ≠ R_XX(−τ), the given function is not an auto correlation function.
(ii) Given R_XX(τ) = 1/(1 + 9τ²)
R_XX(−τ) = 1/(1 + 9(−τ)²) = R_XX(τ)
∴ The given function is an auto correlation function.


Example 2
Find the mean and variance of a stationary random process whose auto correlation function is given by
R_XX(τ) = 18 + 2/(6 + τ²)
Solution
Given R_XX(τ) = 18 + 2/(6 + τ²)
X̄² = lim_{|τ|→∞} R_XX(τ) = lim_{|τ|→∞} [18 + 2/(6 + τ²)] = 18 + 0 = 18
∴ X̄ = E[X(t)] = √18
We know that
E[X²(t)] = R_XX(0) = 18 + 2/(6 + 0) = 18 + 1/3 = 55/3
Var{X(t)} = E[X²(t)] − {E[X(t)]}² = 55/3 − 18 = 1/3
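The arithmetic in Example 2 can be cross-checked numerically from R_XX(τ) alone; a sketch in which the large argument 1e9 stands in for τ → ∞:

```python
import math

# R_XX(tau) = 18 + 2/(6 + tau^2) for the stationary process above
def R(tau):
    return 18 + 2/(6 + tau**2)

mean_sq = R(1e9)             # lim_{|tau| -> inf} R_XX(tau) = (mean)^2
mean = math.sqrt(mean_sq)    # |E[X(t)]| = sqrt(18)
power = R(0.0)               # E[X^2(t)] = R_XX(0) = 55/3
var = power - mean_sq        # Var = 55/3 - 18 = 1/3

print(round(mean, 4), round(var, 4))
```

This mirrors the two readings of the ACF used in the solution: its limit at infinity gives the squared mean, and its value at the origin gives the mean square value.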

Example 3
Express the auto correlation function of the process {X′(t)} in terms of the auto correlation function of the process {X(t)}.
Solution
Consider R_XX′(t₁, t₂) = E{X(t₁) X′(t₂)}
= E{X(t₁) · lim_{h→0} [X(t₂ + h) − X(t₂)] / h}
= lim_{h→0} E{[X(t₁) X(t₂ + h) − X(t₁) X(t₂)] / h}
= lim_{h→0} [R_XX(t₁, t₂ + h) − R_XX(t₁, t₂)] / h
⇒ R_XX′(t₁, t₂) = ∂R_XX(t₁, t₂) / ∂t₂    (1)
Similarly, R_X′X′(t₁, t₂) = ∂R_XX′(t₁, t₂) / ∂t₁
⇒ R_X′X′(t₁, t₂) = ∂²R_XX(t₁, t₂) / (∂t₁ ∂t₂)    by (1)

Auto Covariance
The auto covariance of the process {X(t)}, denoted by C_XX(t₁, t₂) or C(t₁, t₂), is defined as


C_XX(t₁, t₂) = E{[X(t₁) − E(X(t₁))] [X(t₂) − E(X(t₂))]}
4.2 CORRELATION COEFFICIENT
ρ_XX(t₁, t₂) = C_XX(t₁, t₂) / √(Var[X(t₁)] · Var[X(t₂)])
where C_XX(t₁, t₂) denotes the auto covariance.
4.3 CROSS CORRELATION
Cross correlation between the two random processes {X(t)} and {Y(t)} is defined as
R_XY(t₁, t₂) = E[X(t₁) Y(t₂)]
where X(t₁) and Y(t₂) are random variables.
4.4 CROSS COVARIANCE
Let {X(t)} and {Y(t)} be any two random processes. Then the cross covariance is defined as
C_XY(t₁, t₂) = E{[X(t₁) − E(X(t₁))] [Y(t₂) − E(Y(t₂))]}
The relation between cross correlation and cross covariance is as follows:
C_XY(t₁, t₂) = R_XY(t₁, t₂) − E[X(t₁)] E[Y(t₂)]
Definition
Two random processes {X(t)} and {Y(t)} are said to be uncorrelated if
C_XY(t₁, t₂) = 0, ∀ t₁, t₂
Hence, from the above remark, for uncorrelated processes R_XY(t₁, t₂) = E[X(t₁)] E[Y(t₂)].
4.4.1 CROSS CORRELATION COEFFICIENT
ρ_XY(t₁, t₂) = C_XY(t₁, t₂) / √(Var[X(t₁)] · Var[Y(t₂)])


4.4.2 CROSS CORRELATION AND ITS PROPERTIES
Let {X(t)} and {Y(t)} be two random processes. Then the cross correlation between them is also defined as
R_XY(t, t + τ) = E[X(t) Y(t + τ)] = R_XY(τ)
PROPERTY 1: R_XY(τ) = R_YX(−τ)
PROPERTY 2: If {X(t)} and {Y(t)} are two random processes, then
|R_XY(τ)| ≤ √(R_XX(0) R_YY(0))
where R_XX(τ) and R_YY(τ) are their respective auto correlation functions.

PROPERTY 3: If {X(t)} and {Y(t)} are two random processes, then
|R_XY(τ)| ≤ (1/2) [R_XX(0) + R_YY(0)]
SOLVED PROBLEMS ON CROSS CORRELATION
Example 4.4.1
Two random processes {X(t)} and {Y(t)} are given by X(t) = A cos(ωt + θ), Y(t) = A sin(ωt + θ), where A and ω are constants and θ is a uniform random variable over 0 to 2π. Find the cross correlation function.
Solution
By definition we have R_XY(τ) = R_XY(t, t + τ)
Now R_XY(t, t + τ) = E[X(t) · Y(t + τ)]
= E[A cos(ωt + θ) · A sin(ω(t + τ) + θ)]
= A² E[sin(ω(t + τ) + θ) cos(ωt + θ)]    (1)


Since θ is a uniformly distributed random variable, we have
f(θ) = 1/(2π), 0 ≤ θ ≤ 2π
Now E[sin{ω(t + τ) + θ} cos(ωt + θ)]
= ∫_{−∞}^{∞} sin(ωt + ωτ + θ) cos(ωt + θ) f(θ) dθ
= ∫_{0}^{2π} sin(ωt + ωτ + θ) cos(ωt + θ) (1/(2π)) dθ
= (1/2π) ∫_{0}^{2π} (1/2) {sin(ωt + ωτ + θ + ωt + θ) + sin(ωt + ωτ + θ − ωt − θ)} dθ
= (1/4π) ∫_{0}^{2π} {sin(2ωt + ωτ + 2θ) + sin(ωτ)} dθ
= (1/4π) [−cos(2ωt + ωτ + 2θ)/2 + sin(ωτ) · θ]_{0}^{2π}
= (1/4π) [−cos(2ωt + ωτ + 4π)/2 + cos(2ωt + ωτ)/2 + sin(ωτ)(2π − 0)]
= (1/4π) [0 + 2π sin ωτ]
= (1/2) sin ωτ    (3)
Substituting (3) in (1) we get
R_XY(t, t + τ) = (A²/2) sin ωτ
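The closed form R_XY(t, t + τ) = (A²/2) sin ωτ can be verified by Monte Carlo averaging over the random phase θ. A sketch with assumed values for A, ω, t and τ:

```python
import numpy as np

rng = np.random.default_rng(3)
A, w, t, tau = 2.0, 1.5, 0.4, 0.9                # assumed values
theta = rng.uniform(0.0, 2.0*np.pi, size=500_000)

x = A*np.cos(w*t + theta)                        # X(t)
y = A*np.sin(w*(t + tau) + theta)                # Y(t + tau)
estimate = np.mean(x*y)                          # E[X(t) Y(t + tau)]
exact = (A**2/2)*np.sin(w*tau)

print(round(estimate, 3), round(exact, 3))
```

The estimate is independent of t (only τ matters), which is exactly what the derivation shows.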


4.5 SPECTRAL DENSITIES (POWER SPECTRAL DENSITY)
INTRODUCTION
The following tools are used in this section:
(i) Fourier transformation
(ii) Inverse Fourier transform
(iii) Properties of the auto correlation function
(iv) Basic trigonometric formulae
(v) Basic integration


4.5.1 SPECTRAL REPRESENTATION
Let x(t) be a deterministic signal. The Fourier transform of x(t) is defined as
F[x(t)] = X(ω) = ∫_{−∞}^{∞} x(t) e^{−iωt} dt
Here X(ω) is called the "spectrum of x(t)". Hence
x(t) = Inverse Fourier transform of X(ω) = (1/2π) ∫_{−∞}^{∞} X(ω) e^{iωt} dω
Definition
The average power P(T) of x(t) over the interval (−T, T) is given by
P(T) = (1/2T) ∫_{−T}^{T} x²(t) dt


= (1/2π) ∫_{−∞}^{∞} |X_T(ω)|² / (2T) dω    (1)
Definition
The average power P_XX for the random process {X(t)} is given by
P_XX = lim_{T→∞} (1/2T) ∫_{−T}^{T} E[X²(t)] dt
= (1/2π) ∫_{−∞}^{∞} lim_{T→∞} E[|X_T(ω)|²] / (2T) dω    (2)

4.6 POWER SPECTRAL DENSITY FUNCTION
Definition
If {X(t)} is a stationary process (either in the strict sense or the wide sense) with auto correlation function R_XX(τ), then the Fourier transform of R_XX(τ) is called the power spectral density function of {X(t)} and is denoted by S_XX(ω), S(ω) or S_X(ω):
S_XX(ω) = Fourier transform of R_XX(τ) = ∫_{−∞}^{∞} R_XX(τ) e^{−iωτ} dτ
Thus,
S_XX(f) = ∫_{−∞}^{∞} R_XX(τ) e^{−i2πfτ} dτ

4.6.1 WIENER-KHINCHINE RELATION
S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^{−iωτ} dτ
S_XX(f) = ∫_{−∞}^{∞} R_XX(τ) e^{−i2πfτ} dτ
To find R_XX(τ) if S_XX(ω) or S_XX(f) is given:
R_XX(τ) = (1/2π) ∫_{−∞}^{∞} S_XX(ω) e^{iωτ} dω    [inverse Fourier transform of S_XX(ω)]
(or) R_XX(τ) = ∫_{−∞}^{∞} S_XX(f) e^{i2πfτ} df    [inverse Fourier transform of S_XX(f)]
4.7 PROPERTIES OF POWER SPECTRAL DENSITY FUNCTION
Property 1:


The value of the spectral density function at zero frequency is equal to the total area under the graph of the auto correlation function:
S_XX(f) = ∫_{−∞}^{∞} R_XX(τ) e^{−i2πfτ} dτ
Taking f = 0, we get
S_XX(0) = ∫_{−∞}^{∞} R_XX(τ) dτ
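Property 1 can be illustrated numerically with an assumed valid auto correlation function, R_XX(τ) = e^{−|τ|}, whose total area is 2 (a sketch; the grid and test frequency are arbitrary choices):

```python
import numpy as np

tau = np.linspace(-40.0, 40.0, 400_001)
dtau = tau[1] - tau[0]
R = np.exp(-np.abs(tau))                  # assumed ACF, total area = 2

def S(f):
    # trapezoidal approximation of the Fourier integral of R_XX
    g = R*np.exp(-1j*2*np.pi*f*tau)
    return float(np.real((g[:-1] + g[1:]).sum()/2*dtau))

print(round(S(0.0), 4))                   # near 2.0 = area under R_XX
print(round(S(0.5), 4))                   # near 2/(1 + (2*pi*0.5)**2)
```

At f = 0 the exponential factor is 1, so the spectral integral reduces to the plain area under R_XX, which is the content of Property 1.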

TUTORIAL QUESTIONS
1. Find the ACF of {Y(t)} = A X(t) cos(w₀t + θ), where X(t) is a zero mean stationary random process with a given ACF, A and w₀ are constants, and θ is uniformly distributed over (0, 2π) and independent of X(t).
2. Find the ACF of the periodic time function X(t) = A sin wt.
3. If X(t) is a WSS process and if Y(t) = X(t + a) − X(t − a), prove that
4. If X(t) = A sin(ωt + θ), where A and ω are constants and θ is a random variable uniformly distributed over (−π, π), find the ACF of {Y(t)} where Y(t) = X²(t).
5. Let X(t) and Y(t) be defined by X(t) = A cos ωt + B sin ωt and Y(t) = B cos ωt − A sin ωt, where ω is a constant and A and B are independent random variables both having zero mean and variance σ². Find the cross correlation of X(t) and Y(t). Are X(t) and Y(t) jointly W.S.S processes?
6. Two random processes X(t) and Y(t) are given by X(t) = A cos(ωt + θ), Y(t) = A sin(ωt + θ), where A and ω are constants and θ is uniformly distributed over (0, 2π). Find the cross correlation of X(t) and Y(t) and verify that .
7. If U(t) = X cos t + Y sin t and V(t) = Y cos t + X sin t, where X and Y are independent random variables such that E(X) = 0 = E(Y), E[X²] = E[Y²] = 1, show that U(t) and V(t) are not jointly W.S.S but they are individually stationary in the wide sense.
8. Random processes X(t) and Y(t) are defined by X(t) = A cos(ωt + θ), Y(t) = B cos(ωt + θ), where A, B and ω are constants and θ is uniformly distributed over (0, 2π). Find the cross correlation and show that X(t) and Y(t) are jointly W.S.S.


UNIT – 5 LINEAR SYSTEMS WITH RANDOM INPUTS
Introduction
Mathematically, a "system" is a functional relationship between the input x(t) and the output y(t). We can write the relationship as y(t) = f[x(t) : −∞ < t < ∞]. Let x(t) represent a sample function of a random process {X(t)}. Suppose the system produces an output or response y(t) and the ensemble of the output functions forms a random process {Y(t)}. Then the process {Y(t)} can be considered as the output of the system or transformation f with {X(t)} as the input, and the system is completely specified by the operator f.


5.1 LINEAR TIME INVARIANT SYSTEM
Mathematically, a "system" is a functional relationship between the input x(t) and the output y(t). We can write the relationship as
y(t) = f[x(t) : −∞ < t < ∞]


5.2 CLASSIFICATION OF SYSTEMS
1. Linear System: f is called a linear system if it satisfies
f[a₁X₁(t) ± a₂X₂(t)] = a₁ f[X₁(t)] ± a₂ f[X₂(t)]
2. Time Invariant System: Let Y(t) = f[X(t)]. If Y(t + h) = f[X(t + h)], then f is called a time invariant system, or X(t) and Y(t) are said to form a time invariant system.
3. Causal System: Suppose the value of the output Y(t) at t = t₀ depends only on the past values of the input X(t), t ≤ t₀. In other words, if Y(t₀) = f[X(t) : t ≤ t₀], then such a system is called a causal system.
4. Memoryless System: If the output Y(t) at a given time t = t₀ depends only on X(t₀) and not on any other past or future values of X(t), then the system f is called a memoryless system.
5. Stable System: A linear time invariant system is said to be stable if its response to any bounded input is bounded.
REMARK:
i) Note that when we write X(t) we mean X(s, t), where s ∈ S, S being the sample space. If the system operates only on the variable t, treating s as a parameter, it is called a deterministic system.
[Figure: (a) a general single input-output linear system: Input X(t) → [Linear System h(t)] → Output Y(t); (b) a linear time invariant system: Input X(t) → [LTI System h(t)] → Output Y(t).]
5.3 REPRESENTATION OF A SYSTEM IN THE FORM OF CONVOLUTION

Y(t) = h(t) * X(t)
Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du = ∫_{−∞}^{∞} h(t − u) X(u) du
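On a discrete time grid the convolution integral becomes a sum. The sketch below approximates Y(t) with numpy.convolve; the impulse response and input are assumed illustrative choices, not from the text:

```python
import numpy as np

dt = 0.01
t = np.arange(0.0, 5.0, dt)          # time grid
h = np.exp(-2.0*t)                   # assumed causal impulse response h(t)
x = np.sin(3.0*t)                    # assumed input X(t)

# Riemann-sum approximation of Y(t) = integral h(u) X(t - u) du
y = np.convolve(x, h)[:t.size]*dt

print(y.size)                        # output sampled on the same grid
```

Multiplying the discrete convolution by dt turns the sum into an approximation of the integral; keeping only the first t.size samples restricts the output to the input's time grid.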

5.4 UNIT IMPULSE RESPONSE OF THE SYSTEM
If the input of the system is the unit impulse function, then the output or response is the system weighting function:
Y(t) = h(t)
which is the system weighting function.
5.4.1 PROPERTIES OF LINEAR SYSTEMS WITH RANDOM INPUT
Property 1: If the input X(t) and its output Y(t) are related by
Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du
then the system is a linear time-invariant system.

Property 2: If the input to a time-invariant, stable linear system is a WSS process, then the output will also be a WSS process, i.e., if {X(t)} is a WSS process, then the output {Y(t)} is a WSS process.
Property 3: If {X(t)} is a WSS process and if Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, then
R_XY(τ) = R_XX(τ) * h(τ)
Property 4: If {X(t)} is a WSS process and if Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, then
R_YY(τ) = R_XY(τ) * h(−τ)
Property 5: If {X(t)} is a WSS process and if Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, then
R_YY(τ) = R_XX(τ) * h(τ) * h(−τ)
Property 6: If {X(t)} is a WSS process and if Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, then
S_XY(ω) = S_XX(ω) H(ω)
Property 7: If {X(t)} is a WSS process and if Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, then
S_YY(ω) = S_XX(ω) |H(ω)|²
(Here * denotes convolution.)

Note:

Instead of taking R XY ( τ )= E  X ( t ) Y ( t + τ )  in properties (3), (4) & (5), if we start with R XY ( τ )= E  X ( t − τ ) Y ( t )  , then the above property can also stated as

asy En gin ee

a) R XY= ( τ ) R XY ( τ ) xh ( −τ )

b) R YY= ( τ ) R XY ( τ ) xh ( −τ )

c) R YY ( τ ) = R XX ( τ ) x h ( −τ ) x h ( τ ) REMARK :

(i) We have written H(ω) H*(ω) = |H(ω)|², because
    H(ω) = F[h(τ)] and H*(ω) = F[h(−τ)] = (F[h(τ)])*,
so that H(ω) H*(ω) = |H(ω)|².
(ii) Equation (c) gives a relationship between the spectral densities of the input and output processes in the system.
(iii) System transfer function: We call H(ω) = F{h(τ)} the power transfer function or system transfer function.

SOLVED PROBLEMS ON AUTO AND CROSS CORRELATION FUNCTIONS OF INPUT AND OUTPUT

Example 5.4.1
Find the power spectral density of the random telegraph signal.
Solution
We know that the autocorrelation of the telegraph signal process X(t) is


R_XX(τ) = e^{−2λ|τ|}

∴ The power spectral density is

S_XX(ω) = ∫−∞^∞ R_XX(τ) e^{−iωτ} dτ

        = ∫−∞^0 e^{2λτ} e^{−iωτ} dτ + ∫_0^∞ e^{−2λτ} e^{−iωτ} dτ        [ |τ| = −τ when τ < 0, |τ| = τ when τ > 0 ]

        = ∫−∞^0 e^{(2λ−iω)τ} dτ + ∫_0^∞ e^{−(2λ+iω)τ} dτ

        = [ e^{(2λ−iω)τ}/(2λ−iω) ]_{−∞}^0 + [ e^{−(2λ+iω)τ}/(−(2λ+iω)) ]_0^∞

        = (1/(2λ−iω)) (e^0 − e^{−∞}) − (1/(2λ+iω)) (e^{−∞} − e^0)

        = (1/(2λ−iω)) (1 − 0) − (1/(2λ+iω)) (0 − 1)

        = 1/(2λ−iω) + 1/(2λ+iω)

        = (2λ+iω + 2λ−iω) / ((2λ−iω)(2λ+iω))

S_XX(ω) = 4λ/(4λ² + ω²)
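This closed form can be sanity-checked numerically (an illustration, not part of the original solution): integrating e^{−2λ|τ|} e^{−iωτ} over a wide grid should reproduce 4λ/(4λ² + ω²) at any fixed ω. The values of λ and ω below are arbitrary.

```python
import numpy as np

# Riemann-sum approximation of S_XX(w) = ∫ e^{-2*lam*|tau|} e^{-i*w*tau} dtau
lam, w = 1.5, 2.0
dtau = 0.001
tau = np.arange(-20, 20, dtau)          # integrand is negligible beyond |tau| ~ 10
integrand = np.exp(-2 * lam * np.abs(tau)) * np.exp(-1j * w * tau)
S_numeric = (integrand.sum() * dtau).real
S_closed = 4 * lam / (4 * lam**2 + w**2)
```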

Example 5.4.2
A linear time-invariant system has an impulse response h(t) = e^{−βt} U(t). Find the power spectral density of the output Y(t) corresponding to the input X(t).
Solution:
Given X(t) – input, Y(t) – output, and S_YY(ω) = |H(ω)|² S_XX(ω).


Now H(ω) = ∫−∞^∞ h(t) e^{−iωt} dt

         = ∫−∞^0 h(t) e^{−iωt} dt + ∫_0^∞ e^{−βt} e^{−iωt} dt

         = 0 + ∫_0^∞ e^{−(β+iω)t} dt

         = [ e^{−(β+iω)t}/(−(β+iω)) ]_0^∞

         = −(1/(β+iω)) (e^{−∞} − e^0)

         = 1/(β+iω)

|H(ω)|² = |1/(β+iω)|² = 1/(β² + ω²)

∴ S_YY(ω) = (1/(β² + ω²)) S_XX(ω)
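As a numerical sketch (not part of the original solution), H(ω) = 1/(β + iω) for h(t) = e^{−βt}U(t) can be checked by evaluating the Fourier integral directly; β and ω below are arbitrary illustrative values.

```python
import numpy as np

# H(w) = ∫_0^∞ e^{-beta*t} e^{-i*w*t} dt, approximated by a Riemann sum
beta, w = 2.0, 3.0
dt = 0.0005
t = np.arange(0, 30, dt)                  # e^{-beta*t} is negligible well before t = 30
H_numeric = np.sum(np.exp(-beta * t) * np.exp(-1j * w * t)) * dt
H_closed = 1.0 / (beta + 1j * w)
H2_closed = 1.0 / (beta**2 + w**2)        # |H(w)|^2, as used in S_YY above
```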

TUTORIAL QUESTIONS

1. State and prove the power spectral density of system response theorem.
2. Suppose that X(t) is the input to an LTI system with impulse response h₁(t) and that Y(t) is the input to another LTI system with impulse response h₂(t). It is assumed that X(t) and Y(t) are jointly wide sense stationary. Let V(t) and Z(t) denote the random processes at the respective system outputs. Find the cross correlation of X(t) and Y(t).
3. The input to the RC filter is a white noise process with the given ACF. If the frequency response is as given, find the autocorrelation and the mean square value of the output process Y(t).
4. A random process X(t) having the given ACF (with real positive constants) is applied to the input of the system with impulse response
   h(t) = λe^{−λt}, t > 0
        = 0,        t < 0
   where λ is a real positive constant. Find the ACF of the network's response Y(t). Find the cross correlation.



Probability & Random Process Formulas

UNIT-I (RANDOM VARIABLES)

1) Discrete random variable: A random variable whose set of possible values is either finite or countably infinite is called a discrete random variable.
   Eg: (i) Let X represent the sum of the numbers on the two dice when two dice are thrown. X takes the values 2, 3, 4, …, 12, so X is a discrete random variable.
   (ii) Number of transmitted bits received in error.

2) Continuous random variable: A random variable X is said to be continuous if it takes all possible values between certain limits.
   Eg: The length of time during which a vacuum tube installed in a circuit functions is a continuous random variable.

3) Comparison:

   Sl.No. | Discrete random variable                          | Continuous random variable
   1      | Σᵢ p(xᵢ) = 1                                      | ∫−∞^∞ f(x) dx = 1
   2      | F(x) = P[X ≤ x]                                   | F(x) = P[X ≤ x] = ∫−∞^x f(x) dx
   3      | Mean = E[X] = Σᵢ xᵢ p(xᵢ)                         | Mean = E[X] = ∫−∞^∞ x f(x) dx
   4      | E[X²] = Σᵢ xᵢ² p(xᵢ)                              | E[X²] = ∫−∞^∞ x² f(x) dx
   5      | Var(X) = E(X²) − [E(X)]²                          | Var(X) = E(X²) − [E(X)]²
   6      | Moment = E[Xʳ] = Σᵢ xᵢʳ pᵢ                        | Moment = E[Xʳ] = ∫−∞^∞ xʳ f(x) dx
   7      | M.G.F: M_X(t) = E[e^{tX}] = Σₓ e^{tx} p(x)        | M.G.F: M_X(t) = E[e^{tX}] = ∫−∞^∞ e^{tx} f(x) dx

4) E(aX + b) = a E(X) + b
5) Var(aX + b) = a² Var(X)
6) Var(aX ± bY) = a² Var(X) + b² Var(Y)
7) Standard deviation = √Var(X)
8) f(x) = F′(x)
9) P(X > a) = 1 − P(X ≤ a)
10) P(A/B) = P(A ∩ B)/P(B), P(B) ≠ 0
11) If A and B are independent, then P(A ∩ B) = P(A)·P(B).
12) 1st moment about the origin = E[X] = [M_X′(t)]_{t=0}  (mean)
    2nd moment about the origin = E[X²] = [M_X″(t)]_{t=0}
    The coefficient of tʳ/r! in M_X(t) = E[Xʳ]  (rth moment about the origin)
13) Limitations of M.G.F:
    i) A random variable X may have no moments although its m.g.f exists.
    ii) A random variable X can have its m.g.f and some or all moments, yet the m.g.f does not generate the moments.
    iii) A random variable X can have all or some moments, but the m.g.f does not exist except perhaps at one point.
14) Properties of M.G.F:
    i) If Y = aX + b, then M_Y(t) = e^{bt} M_X(at).
    ii) M_{cX}(t) = M_X(ct), where c is a constant.
    iii) If X and Y are two independent random variables, then M_{X+Y}(t) = M_X(t)·M_Y(t).
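Property 14(i) can be sketched numerically (an illustration, not part of the formula sheet) using an exponential X, whose m.g.f. is λ/(λ − t) for t < λ; the constants below are arbitrary.

```python
import numpy as np

# Check M_Y(t) = e^{bt} M_X(at) for Y = aX + b, with X ~ Exponential(lam).
lam, a, b, t = 2.0, 0.5, 1.0, 0.8
M_X = lambda s: lam / (lam - s)           # closed-form m.g.f. of X (valid for s < lam)

# Direct computation: M_Y(t) = ∫_0^∞ e^{t(ax+b)} lam e^{-lam x} dx (Riemann sum)
dx = 0.0005
x = np.arange(0, 40, dx)
M_Y_numeric = np.sum(np.exp(t * (a * x + b)) * lam * np.exp(-lam * x)) * dx
M_Y_property = np.exp(b * t) * M_X(a * t)
```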

15) P.D.F, M.G.F, Mean and Variance of the standard distributions:

    Distribution | P(X = x) or f(x)                                   | M.G.F                        | Mean    | Variance
    Binomial     | nCx pˣ qⁿ⁻ˣ                                        | (q + pe^t)ⁿ                  | np      | npq
    Poisson      | e^{−λ} λˣ / x!                                     | e^{λ(e^t − 1)}               | λ       | λ
    Geometric    | q^{x−1} p (or) qˣ p                                | pe^t/(1 − qe^t)              | 1/p     | q/p²
    Uniform      | f(x) = 1/(b − a), a < x < b; 0 otherwise           | (e^{bt} − e^{at})/((b − a)t) | (a+b)/2 | (b−a)²/12
    Exponential  | f(x) = λe^{−λx}, x > 0, λ > 0; 0 otherwise         | λ/(λ − t)                    | 1/λ     | 1/λ²
    Gamma        | f(x) = e^{−x} x^{λ−1}/Γ(λ), 0 < x < ∞, λ > 0       | 1/(1 − t)^λ                  | λ       | λ
    Normal       | f(x) = (1/(σ√(2π))) e^{−(1/2)((x−µ)/σ)²}           | e^{µt + t²σ²/2}              | µ       | σ²

16) Memoryless property of the exponential distribution: P(X > s + t / X > s) = P(X > t).

17) Function of a random variable: f_Y(y) = f_X(x) |dx/dy|
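The Mean and Variance columns of the table in item 15 can be spot-checked by direct summation of the p.m.f. This standard-library sketch (illustration only) verifies the Binomial row for one arbitrary choice of n and p.

```python
import math

# Binomial(n, p): mean should be np and variance npq, per the table.
n, p = 10, 0.3
q = 1 - p
pmf = [math.comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]  # nCx p^x q^(n-x)
mean = sum(x * pmf[x] for x in range(n + 1))
ex2 = sum(x * x * pmf[x] for x in range(n + 1))
var = ex2 - mean**2          # Var(X) = E(X^2) - [E(X)]^2, item 5 of the comparison table
```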

UNIT-II (TWO-DIMENSIONAL RANDOM VARIABLES)

1) Σᵢ Σⱼ pᵢⱼ = 1  (discrete random variables)
   ∫−∞^∞ ∫−∞^∞ f(x, y) dx dy = 1  (continuous random variables)

2) Conditional probability function of X given Y: P{X = xᵢ / Y = yⱼ} = P(x, y)/P(y)
   Conditional probability function of Y given X: P{Y = yⱼ / X = xᵢ} = P(x, y)/P(x)
   P{X < a / Y < b} = P(X < a, Y < b)/P(Y < b)

3) Conditional density function of X given Y: f(x/y) = f(x, y)/f(y)
   Conditional density function of Y given X: f(y/x) = f(x, y)/f(x)

4) If X and Y are independent random variables, then
   f(x, y) = f(x)·f(y)  (for continuous random variables)


P(X = x, Y = y) = P(X = x)·P(Y = y)  (for discrete random variables)

5) Joint probability: P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_c^d ∫_a^b f(x, y) dx dy
   P(X < a, Y < b) = ∫_0^b ∫_0^a f(x, y) dx dy

6) Marginal density function of X: f(x) = f_X(x) = ∫−∞^∞ f(x, y) dy
   Marginal density function of Y: f(y) = f_Y(y) = ∫−∞^∞ f(x, y) dx

7) P(X + Y ≥ 1) = 1 − P(X + Y < 1)

8) Correlation coefficient (discrete): ρ(x, y) = Cov(X, Y)/(σ_X σ_Y),
   Cov(X, Y) = (1/n) Σ XY − X̄ Ȳ,  σ_X = √((1/n) Σ X² − X̄²),  σ_Y = √((1/n) Σ Y² − Ȳ²)

9) Correlation coefficient (continuous): ρ(x, y) = Cov(X, Y)/(σ_X σ_Y),
   Cov(X, Y) = E(XY) − E(X) E(Y),  σ_X = √Var(X),  σ_Y = √Var(Y)

10) If X and Y are uncorrelated random variables, then Cov(X, Y) = 0.

11) E(X) = ∫−∞^∞ x f(x) dx,  E(Y) = ∫−∞^∞ y f(y) dy,  E(XY) = ∫−∞^∞ ∫−∞^∞ xy f(x, y) dx dy

12) Regression for discrete random variables:
    Regression line X on Y: x − x̄ = b_xy (y − ȳ),  b_xy = Σ(x − x̄)(y − ȳ)/Σ(y − ȳ)²
    Regression line Y on X: y − ȳ = b_yx (x − x̄),  b_yx = Σ(x − x̄)(y − ȳ)/Σ(x − x̄)²
    Correlation through regression: ρ = ±√(b_xy · b_yx)
    Note: ρ(x, y) = r(x, y)


13) Regression for continuous random variables:
    Regression line X on Y: x − E(x) = b_xy (y − E(y)),  b_xy = r σ_x/σ_y
    Regression line Y on X: y − E(y) = b_yx (x − E(x)),  b_yx = r σ_y/σ_x
    Regression curve X on Y: x = E(x/y) = ∫−∞^∞ x f(x/y) dx
    Regression curve Y on X: y = E(y/x) = ∫−∞^∞ y f(y/x) dy

14) Transformation of random variables:
    f_Y(y) = f_X(x) |dx/dy|  (one-dimensional random variable)
    f_UV(u, v) = f_XY(x, y) |∂(x, y)/∂(u, v)|  (two-dimensional random variables)

15) Central limit theorem (Liapounoff's form): If X₁, X₂, …, Xₙ is a sequence of independent R.V.s with E[Xᵢ] = µᵢ and Var(Xᵢ) = σᵢ², i = 1, 2, …, n, and if Sₙ = X₁ + X₂ + … + Xₙ, then under certain general conditions Sₙ follows a normal distribution with mean µ = Σᵢ µᵢ and variance σ² = Σᵢ σᵢ² as n → ∞.

16) Central limit theorem (Lindeberg–Levy's form): If X₁, X₂, …, Xₙ is a sequence of independent identically distributed R.V.s with E[Xᵢ] = µ and Var(Xᵢ) = σ², i = 1, 2, …, n, and if Sₙ = X₁ + X₂ + … + Xₙ, then under certain general conditions Sₙ follows a normal distribution with mean nµ and variance nσ² as n → ∞.
    Note: z = (Sₙ − nµ)/(σ√n) (for n variables),  z = (X̄ − µ)/(σ/√n) (for the sample mean)
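A small simulation sketch of the Lindeberg–Levy form (illustration only; the uniform distribution and sample sizes are arbitrary choices): the standardized sum z = (Sₙ − nµ)/(σ√n) should be close to N(0, 1).

```python
import numpy as np

# i.i.d. Uniform(0,1): mu = 1/2, sigma^2 = 1/12
rng = np.random.default_rng(0)
n, trials = 100, 20_000
mu, sigma = 0.5, np.sqrt(1 / 12)
S = rng.random((trials, n)).sum(axis=1)      # one S_n per trial
z = (S - n * mu) / (sigma * np.sqrt(n))      # standardized sums
```

The sample mean of `z` should be near 0 and its standard deviation near 1, as the theorem predicts.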


UNIT-III (MARKOV PROCESSES AND MARKOV CHAINS)

1) Random process: A random process is a collection of random variables {X(s,t)} that are functions of a real variable, namely time t, where s ∈ S and t ∈ T.

2) Classification of random processes: We can classify a random process according to the characteristics of the time t and the random variable X. We consider only four cases based on t and X having values in the ranges −∞ < t < ∞ and −∞ < x < ∞:
   continuous random process, continuous random sequence, discrete random process, discrete random sequence.

   Continuous random process: If X and t are continuous, then X(t) is called a continuous random process.
   Example: If X(t) represents the maximum temperature at a place in the interval (0, t), {X(t)} is a continuous random process.

   Continuous random sequence: A random process for which X is continuous but time takes only discrete values is called a continuous random sequence.
   Example: If Xₙ represents the temperature at the end of the nth hour of a day, then {Xₙ, 1 ≤ n ≤ 24} is a continuous random sequence.

   Discrete random process: If X assumes only discrete values and t is continuous, then {X(t)} is called a discrete random process.
   Example: If X(t) represents the number of telephone calls received in the interval (0, t), then {X(t)} is a discrete random process, since S = {0, 1, 2, 3, …}.

   Discrete random sequence: A random process in which both the random variable and time are discrete is called a discrete random sequence.
   Example: If Xₙ represents the outcome of the nth toss of a fair die, then {Xₙ : n ≥ 1} is a discrete random sequence, since T = {1, 2, 3, …} and S = {1, 2, 3, 4, 5, 6}.


3) Condition for a stationary process: E[X(t)] = constant, Var[X(t)] = constant. If the process is not stationary, it is called evolutionary.

4) Wide sense stationary (or weak sense stationary, or covariance stationary): A random process is said to be WSS or covariance stationary if it satisfies the following conditions:
   i) The mean of the process is constant, i.e. E(X(t)) = constant.
   ii) The autocorrelation function depends only on τ, i.e. R_XX(τ) = E[X(t)·X(t + τ)].

5) Time average: The time average of a random process {X(t)} is defined as
   X̄_T = (1/2T) ∫_{−T}^{T} X(t) dt.
   If the interval is (0, T), then the time average is X̄_T = (1/T) ∫_0^T X(t) dt.

6) Ergodic process: A random process {X(t)} is called ergodic if all its ensemble averages are interchangeable with the corresponding time averages X̄_T.

7) Mean ergodic: Let {X(t)} be a random process with mean E[X(t)] = µ and time average X̄_T. Then {X(t)} is said to be mean ergodic if X̄_T → µ as T → ∞, i.e. E[X(t)] = lim_{T→∞} X̄_T.
   Note: lim_{T→∞} Var(X̄_T) = 0 (by the mean ergodic theorem).

8) Correlation ergodic process: The stationary process {X(t)} is said to be correlation ergodic if the process {Y(t)} is mean ergodic, where Y(t) = X(t) X(t + τ), i.e. E(Y(t)) = lim_{T→∞} Ȳ_T, where Ȳ_T is the time average of Y(t).

9) Auto covariance function: C_XX(τ) = R_XX(τ) − E(X(t)) E(X(t + τ))

10) Mean and variance of the time average:
    Mean: E[X̄_T] = (1/T) ∫_0^T E[X(t)] dt
    Variance: Var(X̄_T) = (1/2T) ∫_{−2T}^{2T} (1 − |τ|/2T) C_XX(τ) dτ


11) Markov process: A random process in which the future value depends only on the present value and not on the past values is called a Markov process. It is symbolically represented by
    P[X(t_{n+1}) ≤ x_{n+1} / X(t_n) = x_n, X(t_{n−1}) = x_{n−1}, …, X(t₀) = x₀] = P[X(t_{n+1}) ≤ x_{n+1} / X(t_n) = x_n],
    where t₀ ≤ t₁ ≤ t₂ ≤ … ≤ t_n ≤ t_{n+1}.

12) Markov chain: If, for all n,
    P[X_n = a_n / X_{n−1} = a_{n−1}, X_{n−2} = a_{n−2}, …, X₀ = a₀] = P[X_n = a_n / X_{n−1} = a_{n−1}],
    then the process {X_n}, n = 0, 1, 2, …, is called a Markov chain, where a₀, a₁, a₂, …, a_n, … are called the states of the Markov chain.

13) Transition probability matrix (tpm): When the Markov chain is homogeneous, the one-step transition probability is denoted by P_ij. The matrix P = {P_ij} is called the transition probability matrix.

14) Chapman–Kolmogorov theorem: If P is the tpm of a homogeneous Markov chain, then the n-step tpm P⁽ⁿ⁾ is equal to Pⁿ, i.e. [P_ij⁽ⁿ⁾] = [P_ij]ⁿ.

15) Markov chain property: If Π = (Π₁, Π₂, Π₃), then ΠP = Π and Π₁ + Π₂ + Π₃ = 1.

16) Poisson process: If X(t) represents the number of occurrences of a certain event in (0, t), then the discrete random process {X(t)} is called a Poisson process, provided the following postulates are satisfied:
    (i) P[1 occurrence in (t, t + ∆t)] = λ∆t + O(∆t)
    (ii) P[0 occurrences in (t, t + ∆t)] = 1 − λ∆t + O(∆t)
    (iii) P[2 or more occurrences in (t, t + ∆t)] = O(∆t)
    (iv) X(t) is independent of the number of occurrences of the event in any other interval.

17) Probability law of the Poisson process:
    P{X(t) = x} = e^{−λt} (λt)ˣ / x!,  x = 0, 1, 2, …
    Mean E[X(t)] = λt,  E[X²(t)] = λ²t² + λt,  Var[X(t)] = λt.
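A quick simulation sketch of item 17 (illustration only; λ and t are arbitrary): Poisson counts with parameter λt should have sample mean and sample variance both near λt.

```python
import numpy as np

# X(t) ~ Poisson(lam * t): mean and variance both equal lam * t
rng = np.random.default_rng(1)
lam, t = 3.0, 2.0
counts = rng.poisson(lam * t, size=200_000)   # samples of X(t)
sample_mean = counts.mean()
sample_var = counts.var()
```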

UNIT-IV (CORRELATION AND SPECTRAL DENSITY)

R_XX(τ) – auto correlation function


S_XX(ω) – power spectral density (or spectral density)
R_XY(τ) – cross correlation function
S_XY(ω) – cross power spectral density

1) Auto correlation to power spectral density (spectral density):
   S_XX(ω) = ∫−∞^∞ R_XX(τ) e^{−iωτ} dτ

2) Power spectral density to auto correlation:
   R_XX(τ) = (1/2π) ∫−∞^∞ S_XX(ω) e^{iωτ} dω

3) Condition for X(t) and X(t + τ) to be uncorrelated random processes:
   C_XX(τ) = R_XX(τ) − E[X(t)]·E[X(t + τ)] = 0

4) Cross power spectrum to cross correlation:
   R_XY(τ) = (1/2π) ∫−∞^∞ S_XY(ω) e^{iωτ} dω

5) General formulas:
   i) ∫ e^{ax} cos bx dx = e^{ax}(a cos bx + b sin bx)/(a² + b²)
   ii) ∫ e^{ax} sin bx dx = e^{ax}(a sin bx − b cos bx)/(a² + b²)
   iii) x² + ax = (x + a/2)² − a²/4
   iv) sin θ = (e^{iθ} − e^{−iθ})/2i
   v) cos θ = (e^{iθ} + e^{−iθ})/2


UNIT-V (LINEAR SYSTEMS WITH RANDOM INPUTS)

1) Linear system: f is called a linear system if it satisfies
   f[a₁X₁(t) ± a₂X₂(t)] = a₁ f[X₁(t)] ± a₂ f[X₂(t)]

2) Time-invariant system: Let Y(t) = f[X(t)]. If Y(t + h) = f[X(t + h)], then f is called a time-invariant system.

3) Relation between input X(t) and output Y(t):
   Y(t) = ∫−∞^∞ h(u) X(t − u) du,
   where h(u) is the system weighting function.

4) Relation between the power spectra of X(t) and Y(t):
   S_YY(ω) = S_XX(ω) |H(ω)|²
   If H(ω) is not given, use H(ω) = ∫−∞^∞ e^{−jωt} h(t) dt (from the Fourier transform).

5) Contour integral (one useful result):
   ∫−∞^∞ e^{imx}/(a² + x²) dx = (π/a) e^{−ma}

6) F⁻¹[1/(a² + ω²)] = e^{−a|τ|}/(2a)  (from the Fourier transform)
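Formula 6 can be sanity-checked numerically (an illustration, with arbitrary a and τ) by evaluating the inverse-transform integral (1/2π) ∫ e^{iωτ}/(a² + ω²) dω over a wide grid.

```python
import numpy as np

# (1/2π) ∫ e^{i w tau} / (a^2 + w^2) dw should equal e^{-a|tau|} / (2a)
a, tau = 1.5, 0.7
dw = 0.005
w = np.arange(-500, 500, dw)              # integrand decays like 1/w^2
lhs = (np.exp(1j * w * tau) / (a**2 + w**2)).sum().real * dw / (2 * np.pi)
rhs = np.exp(-a * abs(tau)) / (2 * a)
```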


UNIT I: PROBABILITY AND RANDOM VARIABLES
PART B QUESTIONS

1. A random variable X has the following probability distribution:
   X    : 0   1   2    3    4    5    6     7
   P(X) : 0   k   2k   2k   3k   k²   2k²   7k² + k
   Find (i) the value of k, (ii) P[1.5 < X < 4.5 / X > 2] and (iii) the smallest value of λ for which P(X ≤ λ) > 1/2.
2. A bag contains 5 balls and it is not known how many of them are white. Two balls are drawn at random from the bag and they are noted to be white. What is the chance that all the balls in the bag are white?
3. Let the random variable X have the PDF f(x) = (1/2) e^{−x/2}, x > 0. Find the moment generating function, mean and variance.
4. A die is tossed until 6 appears. What is the probability that it must be tossed more than 4 times?
5. A man draws 3 balls from an urn containing 5 white and 7 black balls. He gets Rs. 10 for each white ball and Rs. 5 for each black ball. Find his expectation.
6. In a certain binary communication channel, owing to noise, the probability that a transmitted zero is received as zero is 0.95 and the probability that a transmitted one is received as one is 0.9. If the probability that a zero is transmitted is 0.4, find the probability that (i) a one is received, (ii) a one was transmitted given that a one was received.
7. Find the MGF and the rth moment for the distribution whose PDF is f(x) = k e^{−x}, x > 0. Find also the standard deviation.
8. The first bag contains 3 white balls, 2 red balls and 4 black balls. The second bag contains 2 white, 3 red and 5 black balls and the third bag contains 3 white, 4 red and 2 black balls. One bag is chosen at random and from it 3 balls are drawn. Out of the 3 balls, 2 balls are white and 1 is red. What are the probabilities that they were taken from the first bag, the second bag and the third bag?
9. A random variable X has the PDF f(x) = 2x, 0 < x < 1. Find (i) P(X < ½), (ii) P(¼ < X < ½), (iii) P(X > ¾ / X > ½).
10. If the density function of a continuous random variable X is given by
    f(x) = ax,        0 ≤ x ≤ 1
         = a,         1 ≤ x ≤ 2
         = 3a − ax,   2 ≤ x ≤ 3
         = 0,         otherwise


    (1) Find a. (2) Find the cdf of X.
11. If the moments of a random variable X are defined by E(Xʳ) = 0.6, r = 1, 2, …, show that P(X = 0) = 0.4, P(X = 1) = 0.6, P(X ≥ 2) = 0.


12. In a continuous distribution, the probability density is given by f(x) = kx(2 − x), 0 < x < 2. Find k, the mean, the variance and the distribution function.

13. The cumulative distribution function of a random variable X is given by
    F(x) = 0,                   x < 0
         = x²,                  0 ≤ x ≤ ½
         = 1 − (3/25)(3 − x)²,  ½ ≤ x ≤ 3
         = 1,                   x ≥ 3
    Find the pdf of X and evaluate P(|X| ≤ 1) using both the pdf and the cdf.
14. Find the moment generating function of the geometric random variable with the pdf f(x) = p q^{x−1}, x = 1, 2, 3, …, and hence find its mean and variance.

15. A box contains 5 red and 4 white balls. A ball from the box is taken out at random and kept outside. If once again a ball is drawn from the box, what is the probability that the drawn ball is red?
16. A discrete random variable X has moment generating function M_X(t) = (1/4 + (3/4)e^t)⁵. Find E(X), Var(X) and P(X = 2).
17. The pdf of the samples of the amplitude of a speech waveform is found to decay exponentially at rate α, so the following pdf is proposed: f(x) = C e^{−α|x|}, −∞ < x < ∞. Find C and E(X).

18. Find the MGF of a binomial distribution and hence find the mean and variance. Find the recurrence relation of central moments for a binomial distribution.
19. The number of monthly breakdowns of a computer is a RV having a Poisson distribution with mean equal to 1.8. Find the probability that this computer will function for a month (a) without a breakdown, (b) with only one breakdown, (c) with at least one breakdown.
20. Find the MGF and hence find the mean and variance of the binomial distribution.
21. State and prove the additive property of Poisson random variables.
22. If X and Y are two independent Poisson random variables, then show that the probability distribution of X given X + Y follows a binomial distribution.
23. Find the MGF and hence find the mean and variance of a geometric distribution.
24. State and prove the memoryless property of a geometric distribution.
25. Find the mean and variance of a uniform distribution.


UNIT-II: TWO-DIMENSIONAL RANDOM VARIABLES
PART B

1. If f(x, y) = x + y, 0 < x < 1, 0 < y < 1, and 0 otherwise, compute the correlation coefficient between X and Y.
2. The joint p.d.f. of a two-dimensional random variable (X, Y) is given by f(x, y) = (8/9)xy, 1 ≤ x ≤ y ≤ 2. Find the marginal density functions of X and Y. Find also the conditional density functions of Y / X = x and X / Y = y.
3. The joint probability density function of X and Y is given by f(x, y) = (x + y)/3, 0 ≤ x ≤ 1, 0 ≤ y ≤ 2.
4. Two random variables X₁ and X₂ have a joint p.d.f. defined for x₁ > 0, x₂ > 0. Find the probability that the first random variable will take on a value between 1 and 2 and the second random variable will take on a value between 2 and 3. Also find the probability that the first random variable will take on a value less than 2 and the second random variable will take on a value greater than 2.
5. Two random variables have the joint p.d.f. f(x₁, x₂) = (2/3)(x₁ + 2x₂), 0 < x₁ < 1, 0 < x₂ < 1.
6. Find the value of k, if f(x, y) = kxy for 0 < x, y < 1 is to be a joint density function. Find P(X + Y < 1). Are X and Y independent?
7. If two random variables have the joint p.d.f. f(x, y) = (6/5)(x + y²), 0 < x < 1, 0 < y < 1, find P(0.2 < X < 0.5) and P(0.4 < Y < 0.6).
8. Two random variables X and Y have the p.d.f. f(x, y) = x² + xy/3, 0 ≤ x ≤ 1, 0 ≤ y ≤ 2. Prove that X and Y are not independent. Find the conditional density function.
9. X and Y are two random variables with joint p.d.f. f(x, y) = 4xy e^{−(x²+y²)}, x, y ≥ 0. Find the p.d.f. of √(X² + Y²).
10. Two random variables X and Y have the joint p.d.f. f(x, y) = 2 − x − y, 0 < x < 1, 0 < y < 1. Find the marginal probability density functions of X and Y. Also find the conditional density functions and the covariance between X and Y.
11. Let X and Y be two random variables each taking three values −1, 0 and 1 and having the joint probability distribution
         X :  −1    0     1
    Y = −1 :   0    0.1   0.1
    Y =  0 :   0.2  0.2   0.2
    Y =  1 :   0    0.1   0.1
    Prove that X and Y have different expectations. Also prove that X and Y are uncorrelated, and find Var X and Var Y.


12. 20 dice are thrown. Find the approximate probability that the sum obtained is between 65 and 75 using the central limit theorem.
13. Examine whether the variables X and Y are independent, whose joint density is f(x, y) = x e^{−xy−x}, 0 < x, y < ∞.
14. Let X and Y be independent standard normal random variables. Find the pdf of Z = X/Y.
15. Let X and Y be independent uniform random variables over (0,1). Find the PDF of Z = X + Y.
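A Monte Carlo sketch of question 15 (an illustration, not a derivation): the sum of two independent U(0,1) variables has the triangular density f(z) = z on (0,1) and f(z) = 2 − z on (1,2), so in particular P(Z ≤ 1) = 1/2 and f(0.5) = 0.5.

```python
import numpy as np

# Simulate Z = X + Y for independent U(0,1) samples
rng = np.random.default_rng(2)
z = rng.random(1_000_000) + rng.random(1_000_000)

p_half = (z <= 1).mean()                       # should be near 1/2
eps = 0.01
f_est = (np.abs(z - 0.5) < eps).mean() / (2 * eps)   # density estimate near z = 0.5
```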


UNIT-III CLASSIFICATION OF RANDOM PROCESS PART B

asy   E ngi   nee

1. The process {X(t)} whose probability distribution is given by
   P[X(t) = n] = (at)^{n−1}/(1 + at)^{n+1},  n = 1, 2, …
               = at/(1 + at),                n = 0
   Show that it is not stationary.


2. A raining process is considered as a two-state Markov chain. If it rains, it is considered to be in state 0 and if it does not rain, the chain is in state 1. The transition probability matrix of the Markov chain is
   P = | 0.6  0.4 |
       | 0.2  0.8 |
   Find the probability that it will rain three days from today, assuming that it is raining today. Find also the unconditional probability that it will rain after three days with the initial probabilities of state 0 and state 1 as 0.4 and 0.6 respectively.
3. Let X(t) be a Poisson process with arrival rate λ. Find E{(X(t) − X(s))²} for t > s.
4. Let {Xₙ; n = 1, 2, …} be a Markov chain on the space S = {1, 2, 3} with one-step transition probability matrix
   P = |  0    1    0  |
       | 1/2   0   1/2 |
       |  1    0    0  |
   (1) Sketch the transition diagram. (2) Is the chain irreducible? Explain. (3) Is the chain ergodic? Explain.
5. Consider a random process X(t) defined by X(t) = U cos t + (V + 1) sin t, where U and V are independent random variables for which E(U) = E(V) = 0; E(U²) = E
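As an illustration of question 2 above (assuming "rain three days from today" means the 3-step transition from state 0, which is one common reading), the computation reduces to powers of the tpm:

```python
import numpy as np

# State 0 = rain, state 1 = no rain
P = np.array([[0.6, 0.4],
              [0.2, 0.8]])
P3 = np.linalg.matrix_power(P, 3)        # 3-step transition probabilities

rain_given_rain = P3[0, 0]               # P(rain in 3 days | raining today) = 0.376

pi0 = np.array([0.4, 0.6])               # initial distribution over (state 0, state 1)
rain_uncond = (pi0 @ P3)[0]              # unconditional P(rain after 3 days) = 0.3376
```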


(V²) = 1. (1) Find the auto covariance function of X(t). (2) Is X(t) wide sense stationary? Explain your answer.
6. Discuss the pure birth process and hence obtain its probabilities, mean and variance.
7. At the receiver of an AM radio, the received signal contains a cosine carrier signal at the carrier frequency ω₀ with a random phase θ that is uniformly distributed over (0, 2π). The received carrier signal is X(t) = A cos(ω₀t + θ). Show that the process is second-order stationary.
8. Assume that a computer system is in any one of the three states: busy, idle and under repair, respectively denoted by 0, 1, 2. Observing its state at 2 pm each day, we get the transition probability matrix
   P = | 0.6  0.2  0.2 |
       | 0.1  0.8  0.1 |
       | 0.6   0   0.4 |
   Find out the 3rd


step transition probability matrix. Determine the limiting probabilities.

9. Given a random variable Ω with density f(ω) and another random variable θ uniformly distributed in (−π, π) and independent of Ω, and X(t) = a cos(Ωt + θ),


Prove that { X (t)} is a WSS Process.

10. A man either drives a car or catches a train to go to office each day. He never goes 2 days in a row by train but if he drives one day, then the next day he is just as likely to drive again as he is to travel by train. Now suppose that on the first day of the week, the man tossed a fair die and drove to work iff a 6 appeared.

rin g.n et

Find (1) the probability that he takes a train on the third day, and (2) the probability that he drives to work in the long run.

11. Show that the process X(t) = A cos t + B sin t (where A and B are random variables) is wide sense stationary if (1) E(A) = E(B) = 0, (2) E(A²) = E(B²) and E(AB) = 0.
12. Find the probability distribution of the Poisson process.
13. Prove that the sum of two Poisson processes is again a Poisson process.
14. Write the classifications of random processes with examples.

Unit 4 Correlation and Spectrum Densities


B.E./B.Tech. DEGREE EXAMINATION, APRIL/MAY 2010 Fourth Semester Electronics and Communication Engineering MA2261 – PROBABILITY AND RANDOM PROCESS (Common to Bio – Medical Engineering) (Regulation 2008) Time : Three hours

Maximum : 100 marks

Use of Statistical Tables is permitted
Answer ALL Questions

PART A – (10 x 2 = 20 Marks)

1. If the p.d.f of a random variable X is f(x) = x/2 in 0 ≤ x ≤ 2, find P(X > 1.5 / X > 1).

2. If the MGF of a uniform distribution for a random variable X is (1/t)(e^{5t} − e^{4t}), find E(X).

3. Find the value of k, if f(x, y) = k(1 − x)(1 − y) in 0 < x, y < 1 and f(x, y) = 0 otherwise, is to be the joint density function.

rin g.n et

4. A random variable X has mean 10 and variance 16. Find the lower bound for P (5 < X < 15) . 5. Define a wide sense stationary process. 6. Define a Markov chain and give an example.

7. Find the mean of the stationary process {X(t)} whose autocorrelation function is given by R(τ) = 16 + 9/(1 + 16τ²).

8. Find the power spectral density function of the stationary process whose autocorrelation function is given by e^(−|τ|).

9. Define a time-invariant system.

10. State the autocorrelation function of white noise.


PART B – (5 x 16 = 80 marks)

11. (a) (i) The probability mass function of a random variable X is defined as P(X = 0) = 3C², P(X = 1) = 4C − 10C², P(X = 2) = 5C − 1, where C > 0, and P(X = r) = 0 if r ≠ 0, 1, 2. Find
(1) The value of C
(2) P(0 < X < 2 / X > 0)
(3) The distribution function of X
(4) The largest value of X for which F(x) < 1/2.


(ii) If the probability that an applicant for a driver's license will pass the road test on any given trial is 0.8, what is the probability that he will finally pass the test (1) on the fourth trial and (2) in less than 4 trials?

Or


(b) (i) Find the MGF of the two-parameter exponential distribution whose density function is given by f(x) = λe^(−λ(x − a)), x ≥ a, and hence find the mean and variance.

(ii) The marks obtained by a number of students in a certain subject are assumed to be normally distributed with mean 65 and standard deviation 5. If 3 students are selected at random from this group, what is the probability that two of them will have marks over 70?
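The required probability can be checked numerically: P(mark > 70) = P(Z > 1) ≈ 0.1587, then apply the binomial law for exactly 2 of 3 (illustrative Python sketch, not part of the paper):

```python
import math

def phi_tail(z):
    """P(Z > z) for a standard normal, via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2))

p = phi_tail((70 - 65) / 5)              # P(one student scores over 70) ≈ 0.1587
prob = math.comb(3, 2) * p**2 * (1 - p)  # exactly 2 of the 3 students
print(prob)  # ≈ 0.0635
```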

12. (a) (i) The bivariate probability distribution of (X, Y) is given below:


   Y →      1      2      3      4      5      6
X = 0       0      0     1/32   2/32   2/32   3/32
X = 1      1/16   1/16   1/8    1/8    1/8    1/8
X = 2      1/32   1/32   1/64   1/64    0     2/64

Find the marginal distributions, the conditional distribution of X given Y = 1, and the conditional distribution of Y given X = 0.


(ii) Find the covariance of X and Y, if the random variable (X, Y) has the joint p.d.f f(x, y) = x + y, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and f(x, y) = 0, otherwise.

Or

(b) (i) The joint p.d.f of the two-dimensional random variable (X, Y) is given by f(x, y) = 8xy/9, 0 ≤ x ≤ y ≤ 2, and f(x, y) = 0, otherwise. Find the densities of X and Y, and the conditional densities f(x/y) and f(y/x).


(ii) A sample of size 100 is taken from a population whose mean is 60 and variance is 400. Using the Central Limit Theorem, find the probability with which the mean of the sample will not differ from 60 by more than 4.
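A numeric check (illustrative sketch, not part of the paper): the standard error is √(400/100) = 2, so the required probability is P(|Z| ≤ 2):

```python
import math

n, mu, var = 100, 60.0, 400.0
se = math.sqrt(var / n)            # standard error of the sample mean = 2
z = 4.0 / se                       # = 2
prob = math.erf(z / math.sqrt(2))  # P(|Z| <= 2) for a standard normal Z
print(prob)  # ≈ 0.9545
```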

13. (a) (i) Examine whether the random process {X(t)} = A cos(ωt + θ) is wide sense stationary, if A and ω are constants and θ is a uniformly distributed random variable in (0, 2π).
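The WSS property can be checked numerically by averaging over θ ~ Uniform(0, 2π) (illustrative sketch; A, ω, t and τ below are arbitrary test values, not from the paper):

```python
import math

A, w = 2.0, 3.0   # arbitrary constants for the check
N = 100000        # midpoint-rule panels over theta in (0, 2*pi)

def E_over_theta(f):
    """Average f(theta) for theta ~ Uniform(0, 2*pi), by the midpoint rule."""
    h = 2 * math.pi / N
    return sum(f((k + 0.5) * h) for k in range(N)) * h / (2 * math.pi)

t, tau = 1.3, 0.7
mean = E_over_theta(lambda th: A * math.cos(w * t + th))
acf = E_over_theta(lambda th: A * math.cos(w * t + th) * A * math.cos(w * (t + tau) + th))
print(mean)                                # ≈ 0 for every t
print(acf - A**2 / 2 * math.cos(w * tau))  # ≈ 0: R(t, t+tau) = (A^2/2) cos(w*tau)
```

The mean is constant (zero) and the autocorrelation depends only on τ, which is exactly the WSS condition.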

(ii) Assume that the number of messages input to a communication channel in an interval of duration t seconds is a Poisson process with mean λ = 0.3. Compute (1) the probability that exactly 3 messages will arrive during a 10-second interval and (2) the probability that the number of message arrivals in an interval of duration 5 seconds is between 3 and 7.

Or


(b) (i) The random binary transmission process {X(t)} is a wide sense stationary process with zero mean and autocorrelation function R(τ) = 1 − |τ|/T, where T is a constant. Find the mean and variance of the time average of {X(t)} over (0, T). Is {X(t)} mean-ergodic?

(ii) The transition probability matrix of a Markov chain {X_n}, n = 1, 2, 3, ..., having three states 1, 2, 3 is

        | 0.1  0.5  0.4 |
    P = | 0.6  0.2  0.2 |
        | 0.3  0.4  0.3 |

and the initial distribution is P(0) = [0.7  0.2  0.1]. Find P(X₂ = 3) and P(X₃ = 2, X₂ = 3, X₁ = 3, X₀ = 2).
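The Markov-chain parts reduce to direct arithmetic, which can be checked as follows (illustrative sketch; states 1, 2, 3 are indexed 0, 1, 2 in the lists):

```python
# Row i of P holds the transition probabilities out of state i+1.
P = [[0.1, 0.5, 0.4],
     [0.6, 0.2, 0.2],
     [0.3, 0.4, 0.3]]
p0 = [0.7, 0.2, 0.1]

def step(dist, P):
    """One step of the chain: next distribution = dist * P."""
    return [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]

p1 = step(p0, P)
p2 = step(p1, P)
print(p2[2])  # P(X2 = 3) = 0.279

# P(X3=2, X2=3, X1=3, X0=2) = P(X0=2) * p(2->3) * p(3->3) * p(3->2)
path = p0[1] * P[1][2] * P[2][2] * P[2][1]
print(path)  # 0.0048
```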

14. (a) (i) Find the autocorrelation function of the periodic time function {X(t)} = A sin ωt.


(ii) The autocorrelation function of the random binary transmission {X(t)} is given by R(τ) = 1 − |τ|/T for |τ| < T and R(τ) = 0 for |τ| ≥ T. Find the power spectrum of the process {X(t)}.

Or

(b) (i) {X(t)} and {Y(t)} are zero mean and stochastically independent random processes having autocorrelation functions R_XX(τ) = e^(−|τ|) and R_YY(τ) = cos 2πτ respectively. Find
(1) The autocorrelation function of W(t) = X(t) + Y(t) and Z(t) = X(t) − Y(t)
(2) The cross correlation function of W(t) and Z(t).

(ii) Find the autocorrelation function of the process {X(t)} for which the power spectral density is given by S_XX(ω) = 1 + ω² for |ω| < 1 and S_XX(ω) = 0 for |ω| > 1.

15. (a) (i) A wide sense stationary random process {X(t)} with autocorrelation R_XX(τ) = A e^(−a|τ|), where A and a are real positive constants, is applied to the input of a linear time-invariant system with impulse response h(t) = e^(−bt) u(t), where b is a real positive constant. Find the autocorrelation of the output Y(t) of the system.


(ii) If {X(t)} is a band limited process such that S_XX(ω) = 0 when |ω| > σ, prove that 2[R_XX(0) − R_XX(τ)] ≤ σ²τ² R_XX(0).

Or

(b) (i) Assume a random process X(t) is given as input to a system with transfer function H(ω) = 1 for −ω₀ < ω < ω₀. If the autocorrelation function of the input process is (N₀/2) δ(τ), find the autocorrelation function of the output process.

(ii) If Y(t) = A cos(ωt + θ) + N(t), where A is a constant, θ is a random variable with a uniform distribution in (−π, π) and {N(t)} is a band limited Gaussian white noise with power spectral density S_NN(ω) = N₀/2 for |ω − ω₀| < ω_B and S_NN(ω) = 0 elsewhere, find the power spectral density of Y(t), assuming that N(t) and θ are independent.


B.E./B.Tech. DEGREE EXAMINATIONS, APRIL/MAY 2011
Fourth Semester
Electronics and Communication Engineering
MA2261 – PROBABILITY AND RANDOM PROCESS
(Common to Bio-Medical Engineering)
(Regulation 2008)
Time : Three hours                                Maximum : 100 marks

Answer ALL Questions

PART A – (10 x 2 = 20 Marks)

1. The CDF of a continuous random variable is given by F(x) = 0 for x < 0 and F(x) = 1 − e^(−x/5) for 0 ≤ x < ∞. Find the PDF and mean of X.
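Differentiating the CDF gives f(x) = (1/5)e^(−x/5), an exponential density with mean 5, which can be verified numerically (illustrative sketch; the step size and integration grid are arbitrary choices):

```python
import math

def F(x):
    """Given CDF: F(x) = 1 - exp(-x/5) for x >= 0."""
    return 1 - math.exp(-x / 5) if x >= 0 else 0.0

# pdf by numerically differentiating the CDF: f(x) = (1/5) e^(-x/5)
h = 1e-6
f_at_2 = (F(2 + h) - F(2 - h)) / (2 * h)
print(f_at_2)   # ≈ 0.2 * exp(-0.4) ≈ 0.1341

# mean by midpoint-rule integration of x f(x) on [0, 200]
dx, mean = 0.001, 0.0
for k in range(200000):
    x = (k + 0.5) * dx
    mean += x * (math.exp(-x / 5) / 5) * dx
print(mean)     # ≈ 5.0
```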

2. Establish the memoryless property of the exponential distribution. 

3. Let X and Y be continuous random variables with joint probability density function f_XY(x, y) = x(x − y)/8, 0 < x < 2, −x < y < x, and f_XY(x, y) = 0 elsewhere. Find f_{Y/X}(y/x).


4. Find the acute angle between the two lines of regression.

5. Prove that a first order stationary process has a constant mean.

6. State the postulates of a Poisson process.

7. The autocorrelation function of a stationary random process is R(τ) = 16 + 9/(1 + 16τ²). Find the mean and variance of the process.

8. Prove that for a WSS process {X(t)}, R_XX(t, t + τ) is an even function of τ.

9. Find the system transfer function, if a linear time invariant system has an impulse response h(t) = 1/(2c) for |t| ≤ c and h(t) = 0 for |t| > c.


10. Define white noise.

PART B – (5 x 16 = 80 marks)

11. (a) The probability density function of a random variable X is given by

    f_X(x) = x           for 0 < x < 1
           = k(2 − x)    for 1 ≤ x ≤ 2
           = 0           otherwise.

(1) Find the value of k.
(2) Find P(0.2 < X < 1.2).
(3) What is P[0.5 < X < 1.5 / X ≥ 1]?
(4) Find the distribution function of X.
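The answers to parts (1)–(3) can be checked numerically (illustrative sketch, not part of the paper; k = 1 is the value that makes the pdf integrate to 1):

```python
def f(x, k=1.0):
    """The given pdf with the normalizing value k = 1."""
    if 0 < x < 1:
        return x
    if 1 <= x <= 2:
        return k * (2 - x)
    return 0.0

def integrate(a, b, n=20000):
    """Midpoint-rule integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

total = integrate(0, 2)                            # = 1, confirming k = 1
pa = integrate(0.2, 1.2)                           # P(0.2 < X < 1.2) = 0.66
pb = integrate(1.0, 1.5) / integrate(1.0, 2.0)     # P(0.5 < X < 1.5 | X >= 1) = 0.75
print(total, pa, pb)
```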


Or 

(b) (i) Derive the m.g.f of the Poisson distribution and hence or otherwise deduce its mean and variance.

(ii) The marks obtained by a number of students in a certain subject are assumed to be normally distributed with mean 65 and standard deviation 5. If 3 students are selected at random from this set, what is the probability that exactly 2 of them will have marks over 70?

12. (a) (i) If X and Y are independent Poisson random variables with respective parameters λ₁ and λ₂, calculate the conditional distribution of X, given that X + Y = n.
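The known result is that X | X + Y = n is Binomial(n, λ₁/(λ₁ + λ₂)), which can be verified term by term from the Poisson pmfs (illustrative sketch; λ₁ = 2, λ₂ = 3, n = 5 are arbitrary test values):

```python
import math

def pois(lam, k):
    """Poisson pmf P(N = k) with parameter lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

lam1, lam2, n = 2.0, 3.0, 5
p = lam1 / (lam1 + lam2)
for k in range(n + 1):
    cond = pois(lam1, k) * pois(lam2, n - k) / pois(lam1 + lam2, n)
    binom = math.comb(n, k) * p**k * (1 - p)**(n - k)
    assert abs(cond - binom) < 1e-12
print("conditional law matches Binomial(n, lam1/(lam1+lam2))")
```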

(ii) The regression equation of X on Y is 3Y − 5X + 108 = 0. If the mean value of Y is 44 and the variance of X is 9/16th of the variance of Y, find the mean value of X and the correlation coefficient.
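The arithmetic can be checked directly: rewriting the line as X = 0.6Y + 21.6 and using the fact that regression lines pass through the means (illustrative sketch, not part of the paper):

```python
import math

mean_y = 44.0
mean_x = (3 * mean_y + 108) / 5   # regression line passes through (mean_x, mean_y)
print(mean_x)                     # 48.0

b_xy = 3 / 5                      # slope of the X-on-Y regression
sx_over_sy = math.sqrt(9 / 16)    # sigma_x / sigma_y = 3/4
r = b_xy / sx_over_sy             # since b_xy = r * sigma_x / sigma_y
print(r)                          # 0.8
```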


Or 


(b) (i) If X and Y are independent random variables with density functions f_X(x) = 1 for 1 ≤ x ≤ 2, f_X(x) = 0 otherwise, and f_Y(y) = y/6 for 2 ≤ y ≤ 4, f_Y(y) = 0 otherwise, find the density function of Z = XY.

(ii) The life time of a particular variety of electric bulb may be considered as a random variable with mean 1200 hours and standard deviation 250 hours. Using the central limit theorem, find the probability that the average life time of 60 bulbs exceeds 1250 hours.

13. (a) (i) A random process X(t) is defined by X(t) = A cos t + B sin t, −∞ < t < ∞, where A and B are independent random variables each of which takes a value −2 with probability 1/3 and a value 1 with probability 2/3. Show that X(t) is wide-sense stationary.


(ii) A random process has sample functions of the form X(t) = A cos(ωt + θ), where ω is constant, A is a random variable with mean zero and variance one, and θ is a random variable that is uniformly distributed between 0 and 2π. Assume that the random variables A and θ are independent. Is X(t) a mean-ergodic process?


Or 

(b) (i) If {X(t)} is a Gaussian process with μ(t) = 10 and C(t₁, t₂) = 16 e^(−|t₁ − t₂|), find the probability that
(1) X(10) ≤ 8
(2) |X(10) − X(6)| ≤ 4.

(ii) Prove that the interval between two successive occurrences of a Poisson process with parameter λ has an exponential distribution with mean 1/λ.
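The Gaussian-process probabilities in (b)(i) can be checked numerically: X(10) ~ N(10, 16), and D = X(10) − X(6) has variance 16 + 16 − 2·16e^(−4) (illustrative sketch, not part of the paper):

```python
import math

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mu, C0 = 10.0, 16.0
# (1) X(10) ~ N(10, 16): P(X(10) <= 8)
p1 = Phi((8 - mu) / math.sqrt(C0))
print(p1)   # ≈ 0.3085

# (2) D = X(10) - X(6): Var(D) = 2*16 - 2*16*exp(-|10 - 6|)
var_d = 2 * C0 - 2 * C0 * math.exp(-4)
p2 = 2 * Phi(4 / math.sqrt(var_d)) - 1   # P(|D| <= 4)
print(p2)   # ≈ 0.52
```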


14. (a) (i) The power spectral density function of a zero mean WSS process X(t) is given by S(ω) = 1 for |ω| < ω₀ and S(ω) = 0 otherwise. Find R(τ) and show that X(t) and X(t + π/ω₀) are uncorrelated.

(ii) The autocorrelation function of a WSS process is given by R(τ) = α e^(−2λ|τ|). Determine the power spectral density of the process.

Or

(b) (i) State and prove the Wiener–Khinchine theorem.

(ii) The cross-power spectrum of real random processes {X(t)} and {Y(t)} is given by S_xy(ω) = a + bjω for |ω| < 1 and S_xy(ω) = 0 elsewhere. Find the cross correlation function.
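For 14(a)(i) above, inverting the rectangular spectrum gives R(τ) = sin(ω₀τ)/(πτ), which vanishes at τ = π/ω₀, so the two samples are uncorrelated. A numeric check of the inverse transform (illustrative sketch; ω₀ = 2 is an arbitrary test value):

```python
import math

w0 = 2.0

def R(tau, n=20000):
    """R(tau) = (1/2pi) * integral of cos(w*tau) over (-w0, w0), midpoint rule."""
    h = 2 * w0 / n
    s = sum(math.cos((-w0 + (k + 0.5) * h) * tau) for k in range(n)) * h
    return s / (2 * math.pi)

tau0 = math.pi / w0
print(R(tau0))                                        # ≈ 0: samples pi/w0 apart are uncorrelated
print(R(0.3) - math.sin(w0 * 0.3) / (math.pi * 0.3))  # ≈ 0: matches sin(w0 tau)/(pi tau)
```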


15. (a) (i) Consider a system with transfer function H(ω) = 1/(1 + jω). An input signal with autocorrelation function mδ(τ) + m² is fed as input to the system. Find the mean and mean-square value of the output.

(ii) A stationary random process X(t) having the autocorrelation function R_XX(τ) = A δ(τ), where δ(τ) represents the impulse function, is applied to a linear system at time t = 0. The linear system has the impulse response h(t) = e^(−bt) u(t), where u(t) represents the unit step function. Find R_YY(τ). Also find the mean and variance of Y(t).


Or 

(b) (i) A linear system is described by the impulse response h(t) = (1/RC) e^(−t/RC) u(t). Assume an input process whose autocorrelation function is B δ(τ). Find the mean and autocorrelation function of the output process.


(ii) If {N(t)} is a band limited white noise centered at a carrier frequency ω₀ such that S_NN(ω) = N₀/2 for |ω − ω₀| < ω_B and S_NN(ω) = 0 elsewhere, find the autocorrelation of {N(t)}.

