Quantitative Finance
Max Batstra
March 29, 2026
1. Stochastic Processes
1. Terminology
2. Poisson process
3. Brownian motion / Wiener processes
4. Martingales
5. Markov processes
1.1 Terminology
Definition 1. [Stochastic Process]
A stochastic process is a collection of random variables/vectors X = (Xt(ω), t ∈ T, ω ∈ Ω), where T is
a given index set and (Ω, F, P) is a probability space.
Remarks
t typically represents time:
• T = N: discrete-time
• T = [0, ∞): continuous-time −→ X is called a continuous-time (stochastic) process
• In case T = N or T = Z −→ X is called a time series
Definition 2. [σ-algebra]
A collection F of subsets of Ω is called a σ-algebra if:
1. ∅ ∈ F and Ω ∈ F
2. For all A ∈ F, Ac ∈ F. Note Ac = Ω\A
3. If A1, A2, . . . ∈ F (a sequence of elements in F), then ⋃_{k=1}^∞ Ak ∈ F
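For a finite sample space these axioms can be checked mechanically. Below is a minimal sketch (the helper is_sigma_algebra is illustrative, not from the text) that verifies the three conditions for the power set of a small Ω; with finitely many sets, countable unions reduce to finite ones.

```python
from itertools import chain, combinations

def is_sigma_algebra(omega, collection):
    """Check the sigma-algebra axioms for a finite collection of frozensets."""
    F = set(collection)
    full = frozenset(omega)
    # 1. Contains the empty set and the whole space
    if frozenset() not in F or full not in F:
        return False
    # 2. Closed under complements: A in F implies omega \ A in F
    if any(full - A not in F for A in F):
        return False
    # 3. Closed under unions (for finite F, countable unions reduce to pairwise ones)
    return all(A | B in F for A in F for B in F)

omega = {1, 2, 3}
power_set = [frozenset(s) for s in
             chain.from_iterable(combinations(omega, r) for r in range(len(omega) + 1))]
print(is_sigma_algebra(omega, power_set))                                        # True
print(is_sigma_algebra(omega, [frozenset(), frozenset(omega), frozenset({1})]))  # False: missing {2,3}
```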
Definition 3. [Filtration]
A collection of σ-algebras F = (Ft )t∈T is called a filtration if:
Fs ⊂ Ft for all s ≤ t
Interpretation: A filtration is a sequence of sub-σ-algebras that are nested and indexed by time. It represents a growing body of information as time progresses.
Definition 4. [Natural Filtration]
A filtration is called a natural filtration if the filtration is defined via
Ft = σ (Xs , s ≤ t) , where (Xt ) is a stochastic process
Interpretation: The natural filtration is, at each point in time, the smallest σ-algebra that captures the past behaviour of, i.e. all information generated by, the stochastic process (Xt) up to that time. Because each Ft contains all past and current information, the σ-algebras automatically form a nested sequence.
Definition 5. [Adapted Process]
A process X = (Xt )t∈T is said to be adapted to a filtration F = (Ft )t∈T if:
Xt is Ft -measurable for all t
Interpretation: The value of Xt is known at time t if the information Ft is provided. The process's value is observable from the information available up to that time, meaning it does not "see into the future".
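In discrete time, adaptedness simply means Xt can be computed from the randomness revealed by time t. A minimal sketch under that reading (variable names are illustrative): a running sum of coin flips is adapted to their natural filtration, while a process that peeks at the next flip is not.

```python
import numpy as np

rng = np.random.default_rng(0)
flips = rng.choice([-1, 1], size=10)  # the underlying source of randomness

# Adapted: X_t depends only on flips[0..t], i.e. on information in F_t
X = np.cumsum(flips)

# NOT adapted: Y_t = flips[t+1] peeks one step ahead, outside F_t
# (np.roll wraps the last entry around; that detail is irrelevant here)
Y = np.roll(flips, -1)

print(X, Y)
```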
1.2 Poisson Process
Definition 6. [Poisson Process]
A stochastic process N = (Nt )t≥0 is a Poisson process with intensity (distribution parameter) λ > 0 if:
1. N0 = 0
2. N has independent increments, i.e., for each choice ti ≥ 0 with t1 < . . . < tn and n ≥ 2:
Nt2 − Nt1 , . . . , Ntn − Ntn−1 are independent random variables
3. ∀t, h ≥ 0 : Nt+h − Nt ∼ Poi(h · λ)
Remarks:
• For Nt ∼ Poi(tλ) : E[Nt ] = tλ and Var(Nt ) = tλ
• For Nt ∼ Poi(tλ) : M_{Nt}(k) = e^{tλ(e^k − 1)}
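These moment formulas are easy to sanity-check by simulation. A minimal sketch using numpy (parameter values are arbitrary):

```python
import numpy as np

t, lam = 2.0, 3.0
rng = np.random.default_rng(1)
samples = rng.poisson(t * lam, size=200_000)  # draws of N_t ~ Poi(t*lambda)

print(samples.mean())  # ~ t*lambda = 6
print(samples.var())   # ~ t*lambda = 6
```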
Theorem 1. [Sum of Exponential RVs]
Let λ > 0 and let A1, A2, . . . be i.i.d. ∼ Exp(λ) (so that E[Ai] = 1/λ). Define A0 = 0 and for t > 0:
Nt = max{k : Σ_{j=1}^k Aj ≤ t}
Then N = (Nt) is a Poisson process with intensity λ.
Interpretation: The exponential distribution models the waiting time until the next occurrence. Therefore, Σ_{j=1}^k Aj represents the time we have to wait for k occurrences to happen. Thus, Nt is the number of occurrences that have happened by time t.
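Theorem 1 also gives a direct recipe for simulating a Poisson process: draw i.i.d. Exp(λ) interarrival times and count how many cumulative arrival times fall before t. A minimal sketch (parameters arbitrary; the cap of 40 arrivals per path is a simulation convenience):

```python
import numpy as np

lam, t, n_paths = 3.0, 2.0, 100_000
rng = np.random.default_rng(2)

# A_1, A_2, ... ~ Exp(lambda): rng.exponential takes the scale 1/lambda.
# 40 interarrivals per path is far more than ever needed when t*lambda = 6.
arrival_times = np.cumsum(rng.exponential(1 / lam, size=(n_paths, 40)), axis=1)

# N_t = number of arrival times <= t
N_t = (arrival_times <= t).sum(axis=1)

print(N_t.mean())  # ~ t*lambda = 6
print(N_t.var())   # ~ t*lambda = 6
```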
1.3 Brownian motion/Wiener process
Definition 7. [Brownian Motion/Wiener Process]
A stochastic process W = (Wt )t≥0 is called a Brownian motion/Wiener process with variance σ 2 per
unit of time, if:
1. W0 = 0
2. W has independent increments, i.e. for each choice ti ≥ 0 with t1 < . . . < tn and n ≥ 2:
Wt2 − Wt1 , . . . , Wtn − Wtn−1 are independent random variables
3. ∀t, h ≥ 0 : Wt+h − Wt ∼ N (0, hσ 2 )
4. The sample paths t 7→ Wt (ω) are continuous
If the stochastic process W is a Brownian motion, then it has the following properties:
• E[Wt ] = 0
• Var(Wt ) = tσ 2
• E[Wt |Ws ] = Ws for t ≥ s
• Cov(Wt , Ws ) = σ 2 · min{t, s}
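A minimal simulation sketch (time grid, σ, and sample sizes are arbitrary choices) that checks the variance and covariance properties empirically:

```python
import numpy as np

sigma, T, n_steps, n_paths = 1.5, 1.0, 500, 50_000
dt = T / n_steps
rng = np.random.default_rng(3)

# Independent N(0, sigma^2 * dt) increments, accumulated into paths
dW = rng.normal(0.0, sigma * np.sqrt(dt), size=(n_paths, n_steps))
W = np.cumsum(dW, axis=1)

s_idx, t_idx = n_steps // 2 - 1, n_steps - 1   # grid times s = 0.5, t = 1.0
print(W[:, t_idx].var())                       # ~ t * sigma^2 = 2.25
print(np.cov(W[:, s_idx], W[:, t_idx])[0, 1])  # ~ sigma^2 * min(s, t) = 1.125
```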
Theorem 2. [Wiener (1923)]
Brownian motion exists
Theorem 3. Let (Ω, F, P) be a probability space. Let X = (Xt )t≥0 be a stochastic process with X0 = 0.
For each t ≥ 0, define σ-algebras Ft and Gt by Ft = σ(Xs , s ≤ t) and Gt = σ(Xt+h − Xt , h ≥ 0).
Then the following conditions are equivalent:
1. X has independent increments (definition from Brownian motion)
2. For each t ≥ 0, Ft and Gt are independent
Interpretation: This theorem gives two different but equivalent ways to state the "independent increments" property of a stochastic process X. The first statement is exactly the definition of independent increments. The second statement lets us verify the property by checking that past and future information are independent:
Ft := information up until time t, Gt := information about the increments after t
If these two σ-algebras are independent, we automatically get independent increments, and vice versa.
Theorem 4. The sample paths of Brownian motion are (almost surely) nowhere differentiable
Note: This is intuitively logical: if the paths were differentiable, we could compute the derivative and know in which direction a stock price will move in the future, which would allow for arbitrage.
Intuitively what happens is:
Wt′ ≈ (Wt+∆t − Wt)/∆t ≈ σ√∆t / ∆t = σ/√∆t −→ ∞ as ∆t → 0
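This blow-up can be observed numerically: a typical increment has size σ√∆t, so the difference quotient scales like σ/√∆t. A minimal sketch (parameters arbitrary):

```python
import numpy as np

sigma = 1.0
rng = np.random.default_rng(4)

for dt in [1e-2, 1e-4, 1e-6]:
    # Increment W_{t+dt} - W_t ~ N(0, sigma^2 * dt); form the difference quotient
    quotients = rng.normal(0.0, sigma * np.sqrt(dt), size=100_000) / dt
    print(dt, np.abs(quotients).mean())  # grows like sigma / sqrt(dt)
```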
1.4 Martingales
Definition 8. [Martingale]
An (Ft)-adapted process M = (Mt)t≥0 is called a martingale (w.r.t. the filtration F = (Ft)t∈T) if:
E[Mt |Fs ] = Ms for all t ≥ s ≥ 0
Technical/additional condition: E[|Mt |] < ∞ for all t ≥ 0
Interpretation: At time s, the best guess for the value of the process at any future time t ≥ s is its current value Ms. In other words, the information available now gives no indication of where the process will move in the future.
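A concrete discrete-time example is the symmetric random walk Mt = ξ1 + . . . + ξt with i.i.d. ξi = ±1, which is a martingale w.r.t. its natural filtration. A minimal simulation sketch (not from the text) that conditions on Ms and averages Mt:

```python
import numpy as np

rng = np.random.default_rng(5)
steps = rng.choice([-1, 1], size=(200_000, 20))
M = np.cumsum(steps, axis=1)  # M[:, k] is the walk after k+1 steps

s_idx, t_idx = 9, 19  # walk values at times s = 10 and t = 20
for m_s in [-4, 0, 4]:
    mask = M[:, s_idx] == m_s
    # For a martingale, E[M_t | M_s = m_s] should be ~ m_s
    print(m_s, M[mask, t_idx].mean())
```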
1.5 Markov Processes
Definition 9. [Markov Process]
An adapted stochastic process X = (Xt )t∈T (w.r.t. the filtration F = (Ft )t∈T ) is a Markov process if
∀t, h > 0:
P (Xt+h ≤ z|Ft ) = P (Xt+h ≤ z|Xt ), ∀z ∈ R
Interpretation: The future depends on the past only through the present. Given the current state Xt, the earlier history carries no additional predictive information; this is the memoryless property that characterises Markov processes.
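The symmetric random walk from the previous section illustrates the Markov property: the next-step distribution given the whole past matches the one given only the present. A minimal empirical sketch (illustrative parameters):

```python
import numpy as np

rng = np.random.default_rng(6)
steps = rng.choice([-1, 1], size=(300_000, 3))
X = np.cumsum(steps, axis=1)  # X_1, X_2, X_3 in columns 0, 1, 2

present = X[:, 1] == 0
print((X[present, 2] == 1).mean())  # P(X_3 = 1 | X_2 = 0) ~ 0.5

# Conditioning additionally on the past X_1 should change nothing
for x1 in [-1, 1]:
    past = present & (X[:, 0] == x1)
    print(x1, (X[past, 2] == 1).mean())  # ~ 0.5 regardless of X_1
```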