Unit – III
Joint Probability and Markov Chain
Joint Probability distribution: Joint probability distribution for two random
variables (both discrete and continuous cases), expectation, covariance, correlation coefficient.
Stochastic processes: stochastic processes, probability vector, stochastic matrices, fixed
points, regular stochastic matrices, Markov chains, higher transition probabilities; simple
problems.
Joint Probability Distribution
If X and Y are two discrete random variables, we define the joint probability function of X and
Y by P(X = x, Y = y) = f(x, y), where f(x, y) satisfies the conditions
(i) f(x, y) ≥ 0, (ii) Σ_x Σ_y f(x, y) = 1.
Also, if X and Y are two continuous random variables, we define the joint probability function
for the random variables X and Y, also called the joint density function, by f(x, y), where
(i) f(x, y) ≥ 0, (ii) ∬ f(x, y) dx dy = 1 (the integral taken over the entire xy-plane).
Suppose X takes any one of the values {x1, x2, x3, …, xm} and Y takes any one of the values
{y1, y2, y3, …, yn}; then P(X = xi, Y = yj) = f(xi, yj) is represented in the
following two-way table:

X \ Y    y1           y2           --    yn           Total
x1       f(x1, y1)    f(x1, y2)    --    f(x1, yn)    f1(x1)
x2       f(x2, y1)    f(x2, y2)    --    f(x2, yn)    f1(x2)
--       --           --           --    --           --
xm       f(xm, y1)    f(xm, y2)    --    f(xm, yn)    f1(xm)
Total    f2(y1)       f2(y2)       --    f2(yn)       1
f1(xi) (or simply f1(x)) and f2(yj) (or simply f2(y)) are known as the marginal probability
functions of X and Y respectively, since it can be observed that
Σ_{i=1}^{m} f1(xi) = 1 and Σ_{j=1}^{n} f2(yj) = 1, which can jointly be written as
Σ_{i=1}^{m} Σ_{j=1}^{n} f(xi, yj) = 1.
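As a quick illustration of the row and column totals (this example is not from the notes; the joint table values are invented, chosen only so that all entries sum to 1), the marginal functions f1 and f2 can be computed directly in Python:

```python
# Hypothetical joint probability table f(xi, yj) for X in {1, 2}, Y in {1, 2, 3};
# the six values are invented and sum to 1.
joint = {
    (1, 1): 0.10, (1, 2): 0.20, (1, 3): 0.10,
    (2, 1): 0.15, (2, 2): 0.30, (2, 3): 0.15,
}

xs = sorted({x for (x, _) in joint})
ys = sorted({y for (_, y) in joint})

# Marginal of X: f1(xi) = sum over j of f(xi, yj)  (row totals)
f1 = {x: sum(joint[(x, y)] for y in ys) for x in xs}
# Marginal of Y: f2(yj) = sum over i of f(xi, yj)  (column totals)
f2 = {y: sum(joint[(x, y)] for x in xs) for y in ys}

print(f1)  # row totals, summing to 1
print(f2)  # column totals, summing to 1
```

Both sets of totals sum to 1, matching the bottom-right entry of the two-way table.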
Expectation (One Variable): The mean value of the distribution of a variate X is commonly
known as its expectation and is denoted by E(X). If f(x) is the probability function of
the variate X, then
E(X) = Σ_i xi f(xi)   (discrete distribution)
E(X) = ∫ x f(x) dx   (continuous distribution)
In general, the expectation of any function φ(X) is given by
E(φ(X)) = Σ_i φ(xi) f(xi)   (discrete distribution)
E(φ(X)) = ∫ φ(x) f(x) dx   (continuous distribution)
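These discrete formulas translate directly into a few lines of code; the pmf below and the choice φ(x) = x² are invented for illustration:

```python
# Invented probability function of a discrete variate X.
pmf = {-1: 0.25, 0: 0.50, 1: 0.25}

# E(X) = sum of xi * f(xi)
EX = sum(x * p for x, p in pmf.items())

# E(phi(X)) = sum of phi(xi) * f(xi), here with phi(x) = x^2
Ephi = sum(x ** 2 * p for x, p in pmf.items())

print(EX)    # 0.0 (the pmf is symmetric about 0)
print(Ephi)  # 0.5
```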
Expectation (Two Variables): If X and Y are two discrete random variables having the joint
probability function f(x, y), then the expectations of X and Y are defined as
Mean of X: μx = E(X) = Σ_x Σ_y x f(x, y) = Σ_i xi f1(xi)
Mean of Y: μy = E(Y) = Σ_x Σ_y y f(x, y) = Σ_j yj f2(yj)
and E(XY) = Σ_i Σ_j xi yj f(xi, yj).
The covariance of X and Y: Cov(X, Y) = E(XY) − E(X)E(Y)
If X and Y are two continuous random variables having the joint density function f(x, y),
then the expectations of X and Y are defined as
μx = E(X) = ∬ x f(x, y) dx dy
μy = E(Y) = ∬ y f(x, y) dx dy
Variance: V(X) = σx² = ∬ (x − μx)² f(x, y) dx dy
V(Y) = σy² = ∬ (y − μy)² f(x, y) dx dy
Cov(X, Y) = ∬ (x − μx)(y − μy) f(x, y) dx dy = E(XY) − E(X)E(Y)
Correlation coefficient: ρ(X, Y) = Cov(X, Y) / (σx σy)
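A numerical sketch of the discrete versions of these formulas, using an invented joint table:

```python
import math

# Hypothetical joint table f(xi, yj); the four values are invented and sum to 1.
joint = {
    (0, 0): 0.2, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.6,
}

EX  = sum(x * p for (x, y), p in joint.items())          # mean of X
EY  = sum(y * p for (x, y), p in joint.items())          # mean of Y
EXY = sum(x * y * p for (x, y), p in joint.items())      # E(XY)

cov = EXY - EX * EY                                      # Cov(X, Y)

EX2 = sum(x * x * p for (x, y), p in joint.items())
EY2 = sum(y * y * p for (x, y), p in joint.items())
sx = math.sqrt(EX2 - EX * EX)                            # sigma_x
sy = math.sqrt(EY2 - EY * EY)                            # sigma_y

rho = cov / (sx * sy)                                    # correlation coefficient
print(EX, EY, cov, rho)
```

For this table EX = EY = 0.7, Cov(X, Y) = 0.6 − 0.49 = 0.11, and ρ = 0.11/0.21 ≈ 0.524.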
Note 1: The probability that x ∈ [a, b] and y ∈ [c, d] is defined as
P(a ≤ x ≤ b, c ≤ y ≤ d) = ∫_a^b ∫_c^d f(x, y) dy dx
Note 2: The variables x and y are said to be independent if f(x, y) = f1(x) f2(y).
Note 3: The marginal density function of X is f1(x) = ∫ f(x, y) dy.
The marginal density function of Y is f2(y) = ∫ f(x, y) dx.
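Note 2 gives a direct test of independence in the discrete case: f(xi, yj) must equal f1(xi)·f2(yj) in every cell. A minimal sketch, with an invented table constructed as a product so that the test succeeds:

```python
# Build an invented joint table as a product of two marginals, so X and Y
# are independent by construction.
f1 = {0: 0.3, 1: 0.7}
f2 = {0: 0.5, 1: 0.2, 2: 0.3}
joint = {(x, y): f1[x] * f2[y] for x in f1 for y in f2}

# Recover the marginals from the joint table (row and column totals)...
f1m = {x: sum(joint[(x, y)] for y in f2) for x in f1}
f2m = {y: sum(joint[(x, y)] for x in f1) for y in f2}

# ...and test f(x, y) = f1(x) * f2(y) in every cell (up to float rounding).
independent = all(
    abs(joint[(x, y)] - f1m[x] * f2m[y]) < 1e-12
    for x in f1 for y in f2
)
print(independent)  # True
```

If even one cell fails the product test, the variables are dependent.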
Properties of Expectation:
(i) E(kX) = kE(X)
(ii) E(X + k) = E(X) + k
(iii) E(X ± Y) = E(X) ± E(Y)
(iv) E(XY) = E(X)E(Y), provided X and Y are independent
Properties of Variance:
(i) V(X) = E(X²) − [E(X)]²
(ii) V(kX) = k²V(X)
(iii) V(X + k) = V(X)
(iv) V(X ± Y) = V(X) + V(Y), provided X and Y are independent
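The variance properties can be spot-checked numerically; the two pmfs below are invented, and property (iv) is checked for an independent pair:

```python
# Two independent discrete variables with invented pmfs.
pX = {0: 0.5, 1: 0.5}            # a fair coin
pY = {1: 0.2, 2: 0.3, 3: 0.5}

def E(pmf):
    """E(X) = sum of v * p(v)."""
    return sum(v * p for v, p in pmf.items())

def V(pmf):
    """V(X) = sum of (v - mean)^2 * p(v)."""
    m = E(pmf)
    return sum((v - m) ** 2 * p for v, p in pmf.items())

# pmf of X + Y under independence: convolve the two pmfs.
pSum = {}
for x, px in pX.items():
    for y, py in pY.items():
        pSum[x + y] = pSum.get(x + y, 0.0) + px * py

k = 3
pkX = {k * v: p for v, p in pX.items()}   # pmf of kX
pXk = {v + k: p for v, p in pX.items()}   # pmf of X + k

print(V(pkX), k ** 2 * V(pX))     # (ii)  V(kX) = k^2 V(X)
print(V(pXk), V(pX))              # (iii) V(X + k) = V(X)
print(V(pSum), V(pX) + V(pY))     # (iv)  V(X + Y) = V(X) + V(Y), X, Y independent
```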
Problem: Find E(X), E(X²) and σ² for the probability function p(x) defined by the following
data (here k is an appropriate constant):

xi     : 1    2    3    ---------    n
p(xi)  : k    2k   3k   ---------    nk
Solution: The condition Σ p(xi) = 1 gives
k + 2k + 3k + 4k + ……… + nk = 1
k(1 + 2 + 3 + 4 + ……… + n) = 1
k · n(n + 1)/2 = 1
⇒ k = 2 / (n(n + 1))

E(X) = Σ xi p(xi)
     = (1)(k) + (2)(2k) + (3)(3k) + ……… + (n)(nk)
     = k(1² + 2² + 3² + ……… + n²)
     = [2 / (n(n + 1))] · n(n + 1)(2n + 1)/6
     = (2n + 1)/3

E(X²) = Σ xi² p(xi)
      = (1²)(k) + (2²)(2k) + (3²)(3k) + ……… + (n²)(nk)
      = k(1³ + 2³ + 3³ + ……… + n³)
      = [2 / (n(n + 1))] · n²(n + 1)²/4
      = n(n + 1)/2

σ² = E(X²) − [E(X)]² = n(n + 1)/2 − (2n + 1)²/9
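The closed forms derived above can be checked numerically for a particular n (here n = 5, chosen arbitrarily):

```python
n = 5
k = 2 / (n * (n + 1))                  # from the condition sum p(xi) = 1
pmf = {i: i * k for i in range(1, n + 1)}

EX  = sum(x * p for x, p in pmf.items())
EX2 = sum(x * x * p for x, p in pmf.items())
var = EX2 - EX ** 2

# Compare with the closed forms derived above.
print(EX,  (2 * n + 1) / 3)            # both 11/3 for n = 5
print(EX2, n * (n + 1) / 2)            # both 15 for n = 5
print(var, n * (n + 1) / 2 - ((2 * n + 1) / 3) ** 2)   # both 14/9 for n = 5
```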