
MATH 220 The Transition Matrix
2 Markov Chains

2.1 Basic Principles.
2.1.1 The Transition Matrix.

A stochastic process is a mathematical model of a situation in the real world that evolves in time in a
probabilistic fashion, i.e. we don't seem to be able to completely predict the future. The situation in the
real world is often called a system. For the time being we model the system in discrete time as opposed
to continuous time. For example, this might be because we observe the system at discrete time
intervals. We let n = 0 be the initial time that we observe the system, n = 1 the next time and so on.

We are interested in calculating probabilities that the system is in various states at various times.
Mathematically, we describe the system by a sequence of random variables X0, X1, ..., Xn where

X0 = the state of the system at time n = 0,

X1 = the state of the system at time n = 1,

. . .

Xn = the state of the system at time n

At any particular time the system is in one of s possible states that we number 1, 2, …, s.

Example 1. An office copier is either in

1. good condition (G or 1),

2. poor condition (P or 2), or

3. broken (B or 3).

So the possible states of the system are G = 1, P = 2 and B = 3. At the start of each day we check the
condition of the copier. If we do this for a sequence of four days, we might observe that today it is G,
tomorrow it is G, the next day it is P and the fourth day it is B. In this case we would have X0 = 1, X1 = 1,
X2 = 2 and X3 = 3. Unfortunately, we can't predict the state of the copier in the future.
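The four-day observation above can be encoded directly; a minimal sketch using the numeric state labels G = 1, P = 2, B = 3 from Example 1 (the variable names are our own):

```python
# States of the copier, numbered as in Example 1.
GOOD, POOR, BROKEN = 1, 2, 3
STATE_NAMES = {GOOD: "G", POOR: "P", BROKEN: "B"}

# The four-day observation sequence from Example 1:
# X0 = 1 (G), X1 = 1 (G), X2 = 2 (P), X3 = 3 (B).
observations = [GOOD, GOOD, POOR, BROKEN]

for n, x in enumerate(observations):
    print(f"X{n} = {x} ({STATE_NAMES[x]})")
```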

One of the most basic probabilities that one might want to know is the probability of a sequence of
observations x0, x1, ..., xn. We would denote this probability by

Pr{X0 = x0, X1 = x1, ..., Xn = xn}

For example, in the context of Example 1 we might want to know the probability that today it is G,
tomorrow it is G, the next day it is P and the fourth day it is B. This would be denoted by
Pr{ X0 = 1, X1 = 1, X2 = 2, X3 = 3}.

For a general stochastic process, one might provide the values of Pr{X0 = x0, X1 = x1, ..., Xn = xn} for
each sequence x0, x1, ..., xn. This is a lot of data to provide. Things are a lot easier if we make the
following two simplifying assumptions. First, the probability that the state takes a particular value at
one observation depends only on the state at the previous observation. This can be described by the equation

(1) Pr{Xn+1 = xn+1 | X0 = x0, X1 = x1, ..., Xn = xn} = Pr{Xn+1 = xn+1 | Xn = xn}

This is called the Markov assumption and a stochastic process satisfying this assumption is called a
Markov process.

The second assumption is that the quantity appearing on the right of (1) depends only on xn and xn+1 and
not on n, i.e.

(2) Pr{Xn+1 = j | Xn = i} = pij

A Markov process satisfying (2) is called stationary or time-homogeneous. A stochastic process
satisfying (1) and (2) is called a Markov chain.
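Under assumptions (1) and (2), the probability of an observation sequence factors by the multiplication rule into an initial probability times a product of transition probabilities: Pr{X0 = x0, ..., Xn = xn} = Pr{X0 = x0} p(x0, x1) ··· p(x(n-1), xn). A minimal sketch of this computation, with hypothetical numbers since the text gives none; states are re-indexed 0 = G, 1 = P, 2 = B so they serve directly as list indices:

```python
def path_probability(path, initial, P):
    """Probability of the state sequence `path` for a Markov chain.

    Under assumptions (1) and (2), the multiplication rule collapses to
    Pr{X0=x0, ..., Xn=xn} = Pr{X0=x0} * p(x0,x1) * ... * p(x_{n-1},x_n).
    `initial[i]` is Pr{X0 = i}; `P[i][j]` is the transition probability pij.
    """
    prob = initial[path[0]]
    for i, j in zip(path, path[1:]):
        prob *= P[i][j]
    return prob

# Hypothetical numbers (NOT from the text); rows/columns ordered G, P, B.
P = [[0.7, 0.2, 0.1],
     [0.0, 0.6, 0.4],
     [0.0, 0.0, 1.0]]
initial = [1.0, 0.0, 0.0]   # the copier starts in good condition for sure

# Pr{X0=G, X1=G, X2=P, X3=B} = 1.0 * 0.7 * 0.2 * 0.4, which is about 0.056
print(path_probability([0, 0, 1, 2], initial, P))
```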

As an example of what (1) implies, suppose we wanted to know the probability that tomorrow the
copier is broken. Consider the following two scenarios. First, we are told the copier is in poor
condition today. Then the probability the copier is broken tomorrow would be Pr{ X2 = 3 | X1 = 2}
assuming today is time n = 1 and tomorrow is time n = 2. On the other hand suppose we are told both
the copier was in good condition yesterday and is in poor condition today. Then the probability the
copier is broken tomorrow would be Pr{X2 = 3 | X0 = 1, X1 = 2}, assuming yesterday is time n = 0. The
Markov assumption would imply that knowing the copier was in good condition yesterday didn't add
any information to the fact that it is in poor condition today, i.e.

Pr{X2 = 3 | X0 = 1, X1 = 2} = Pr{X2 = 3 | X1 = 2}.
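A quick simulation can illustrate this empirically: conditioning on yesterday's state in addition to today's should leave the estimate unchanged. The transition probabilities below are hypothetical, chosen only for the sketch (the text does not give numeric values); states are indexed 0 = G, 1 = P, 2 = B:

```python
import random

random.seed(0)

# Hypothetical transition probabilities for the copier (these numbers are
# NOT given in the text); states indexed 0 = G, 1 = P, 2 = B.
P = [[0.6, 0.3, 0.1],
     [0.1, 0.5, 0.4],
     [0.2, 0.2, 0.6]]

def step(state):
    """Draw the next state using row `state` of P as the weights."""
    return random.choices([0, 1, 2], weights=P[state])[0]

# Simulate many three-step runs with X0 chosen uniformly at random.
paths = []
for _ in range(200_000):
    x0 = random.randrange(3)
    x1 = step(x0)
    x2 = step(x1)
    paths.append((x0, x1, x2))

# Estimate Pr{X2 = B | X1 = P} with and without also conditioning on X0 = G.
broken_given_p = [x2 == 2 for _, x1, x2 in paths if x1 == 1]
broken_given_gp = [x2 == 2 for x0, x1, x2 in paths if x0 == 0 and x1 == 1]

est1 = sum(broken_given_p) / len(broken_given_p)
est2 = sum(broken_given_gp) / len(broken_given_gp)
print(round(est1, 3), round(est2, 3))  # both estimates are close to p(P,B) = 0.4
```

Both conditional frequencies converge to the same transition probability, which is exactly what equation (1) asserts.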

The probabilities pij appearing in (2) are called transition probabilities and the s × s matrix

P = (pij)

whose entry in row i and column j is pij is called the transition matrix. Corresponding to the transition
matrix is a transition diagram where the
states are represented by circles and the transitions are represented by arrows from one circle to another
labeled by the transition probabilities.
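Because row i of P lists the probabilities of moving from state i to every possible next state, each entry must lie in [0, 1] and each row must sum to 1. A small validity check, with illustrative numbers that are not taken from the text:

```python
def is_stochastic(P, tol=1e-9):
    """Check that P is a valid transition matrix: every entry is a
    probability and every row sums to 1 (from each state the chain
    must move to *some* state, possibly itself)."""
    return all(
        all(0.0 <= p <= 1.0 for p in row) and abs(sum(row) - 1.0) <= tol
        for row in P
    )

# Illustrative copier matrix (rows and columns ordered G, P, B; the
# numeric values are made up, not taken from the text):
P = [[0.6, 0.3, 0.1],
     [0.1, 0.5, 0.4],
     [0.2, 0.2, 0.6]]
print(is_stochastic(P))                # True
print(is_stochastic([[0.5, 0.4]]))     # False: the row sums to 0.9
```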

Example 1 (continued). In the context of Example 1 suppose the transition matrix is

P=

