Summary - Advanced Econometrics 1 (6414M0005Y)

Extensive summary of the course Advanced Econometrics 1. 37 pages, written in 2023/2024, uploaded on 27-09-2024.

Advanced Econometrics
Review of linear models
Remember the standard regression model y = Xβ + ε.

Conditioning
Conditioning is important in econometrics, e.g. what is the variance today, given yesterday? Remember that an assumption of the classical linear regression model is that X should be fixed; therefore we condition on X.

Some important formulas
· Marginal density: f(y) = ∫ f(x, y) dx, or f(x) = ∫ f(x, y) dy
· Conditional density: f(y|x) = f(x, y) / f(x) = f(x, y) / ∫ f(x, y) dy
· Conditional expectation: E[y|x] = ∫ y f(y|x) dy
· Conditional variance: var(y|x) = E[(y − E[y|x])² | x]
· Law of iterated expectations: E[y] = E_x[E[y|x]]
· Marginal variance: var(y) = E[var(y|x)] + var(E[y|x])
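The last two identities can be checked numerically. The sketch below uses a simple illustrative model of my own choosing (not from the course): y = 0.5x + noise with x and the noise standard normal, so E[y|x] = 0.5x and var(y|x) = 1 are known exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative joint distribution: y = 0.5 x + u, x ~ N(0,1), u ~ N(0,1).
x = rng.normal(0.0, 1.0, n)
y = 0.5 * x + rng.normal(0.0, 1.0, n)

cond_mean = 0.5 * x        # E[y|x], known in closed form for this model
cond_var = np.ones(n)      # var(y|x) = 1 for every x

# Law of iterated expectations: E[y] = E[E[y|x]]
print(y.mean(), cond_mean.mean())                    # both ~ 0

# Marginal variance: var(y) = E[var(y|x)] + var(E[y|x]) = 1 + 0.25
print(y.var(), cond_var.mean() + cond_mean.var())    # both ~ 1.25
```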

Regressions and loss functions
Remember that the residuals are e = y − ŷ.
Predictor: ŷ = Xb
Real value: y = Xβ + ε
Expected loss: E[L(y − ŷ) | x]
We have different loss functions L(e), each with its own optimal predictor:
· Squared error: L(e) = (y − ŷ)², optimal predictor ŷ = E[y|x]
· Absolute error: L(e) = |e|, optimal predictor ŷ = med(y|x)
· Asymmetric absolute error: L(e) = (1 − α)|e| if e ≤ 0 and α|e| if e > 0, optimal predictor ŷ = q_α(y|x), the α-quantile
· Step loss: L(e) = 0 if |e| ≤ δ and 1 otherwise, optimal predictor ŷ = mod(y|x), the mode
The goal is to minimize the expected loss, therefore we need the optimal predictor for the chosen loss function. Every loss function has an optimal predictor.
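A quick numerical sketch of the first two optimal predictors (the skewed distribution and the grid are arbitrary choices of mine): minimizing the empirical squared-error loss over constant predictors recovers the sample mean, and the absolute-error loss recovers the sample median.

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.exponential(1.0, 100_000)    # skewed, so mean and median differ

grid = np.linspace(0.0, 3.0, 601)    # candidate constant predictors
sq_loss = [np.mean((y - c) ** 2) for c in grid]
abs_loss = [np.mean(np.abs(y - c)) for c in grid]

# Squared error is minimized near the mean (1 for exponential(1));
# absolute error near the median (ln 2 ~ 0.69).
print(grid[np.argmin(sq_loss)], y.mean())
print(grid[np.argmin(abs_loss)], np.median(y))
```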

Linear prediction

Ordinary least squares: the goal is again to minimize the errors: min Σ e_i² = min Σ (y_i − ŷ_i)².
In matrix notation y = Xβ + u, where y = (y_1, …, y_n)' is the n×1 vector of observations, X is the n×k matrix with rows x_i' = (x_i1, …, x_ik), and β = (β_1, …, β_k)'.

The OLS estimator minimizes Σ (y_i − x_i'b)² = (y − Xb)'(y − Xb).
First-order condition: −2X'y + 2X'Xb = 0
β̂_OLS = (X'X)⁻¹X'y is the estimator of β.
ŷ = Py, where P = X(X'X)⁻¹X'; matrix P projects y on S(X).
e = My, where M = I − P; matrix M projects y on S⊥(X).
ŷ = Xb = Py. Both P and M are symmetric and idempotent.
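The projection algebra above is easy to verify on simulated data (the dimensions and coefficients below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta = np.array([1.0, 2.0, -0.5])
y = X @ beta + rng.normal(size=n)

# OLS estimator b = (X'X)^{-1} X'y
b = np.linalg.solve(X.T @ X, X.T @ y)

# P projects y onto the column space S(X); M onto its orthogonal complement.
P = X @ np.linalg.inv(X.T @ X) @ X.T
M = np.eye(n) - P

y_hat = P @ y     # fitted values, equal to X @ b
e = M @ y         # residuals, equal to y - X @ b

# Both projection matrices are symmetric and idempotent.
print(np.allclose(P, P.T), np.allclose(P @ P, P))
print(np.allclose(M, M.T), np.allclose(M @ M, M))
print(np.allclose(y_hat, X @ b), np.allclose(e, y - X @ b))
```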

Assumptions OLS
1. Fixed regressors: all elements of matrix X are fixed/non-stochastic, rank(X) = k
2. Random disturbances: E[u_i] = 0
3. Homoskedasticity (disturbances have constant variance): Var(u_i) = σ², i.e. Var(u) = σ²I_n
4. No correlation between disturbances: Cov(u_i, u_j) = 0 for i ≠ j
5. Constant parameters: β constant
6. Linear relation: y = Xβ + u
7. Normality: u is normally distributed, u|X ~ N(0, σ²I_n)

Under these assumptions we have:
· Unbiased: E[β̂|X] = β + (X'X)⁻¹X'E[u|X] = β
· Variance: Var(β̂|X) = (X'X)⁻¹X'Var(u|X)X(X'X)⁻¹ = σ²(X'X)⁻¹
· BLUE: Var(β̂|X) ≤ Var(β̃|X) for any other linear unbiased estimator β̃
· Distribution: β̂|X ~ N(β, σ²(X'X)⁻¹)
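A Monte Carlo sketch of the first two properties, with an arbitrary fixed design and parameter values of my choosing: across repeated samples the OLS estimates average to β, and their covariance matches σ²(X'X)⁻¹.

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps, sigma = 50, 5000, 2.0
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # fixed regressors
beta = np.array([1.0, 0.5])
XtX_inv = np.linalg.inv(X.T @ X)

draws = np.empty((reps, 2))
for r in range(reps):
    u = rng.normal(0.0, sigma, n)      # homoskedastic, uncorrelated disturbances
    y = X @ beta + u
    draws[r] = XtX_inv @ X.T @ y       # OLS estimate for this sample

print(draws.mean(axis=0))              # ~ beta (unbiasedness)
print(np.cov(draws.T))                 # ~ sigma^2 (X'X)^{-1}
print(sigma**2 * XtX_inv)
```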


Asymptotic theory
In asymptotic theory the assumption of normality is dropped; however, we can still get the same result by letting n → ∞ and applying limit theorems.




We first repeat some definitions:
· i.i.d.: independent and identically distributed
· i.n.i.d.: independent and not identically distributed

Modes of convergence
· Converges in distribution: X_n →d X if lim |F_n(x) − F(x)| = 0 at all continuity points x of F
· Converges in probability: X_n →p X, or plim X_n = X, if lim P(|X_n − X| > ε) = 0 for every ε > 0
· Converges almost surely: X_n →a.s. X if P(lim |X_n − X| = 0) = 1
· Converges in mean square: X_n →m.s. X if lim E[(X_n − X)²] = 0

Law of Large Numbers: X̄_n − μ̄_n → 0
· Weak (WLLN): convergence in probability
· Strong (SLLN): convergence almost surely
· Khintchine WLLN: {X_i} i.i.d. with finite mean μ
· Chebyshev WLLN: {X_i} independent, with lim (1/n²) Σ σ_i² = 0
· Markov SLLN: {X_i} independent, with Σ E|X_i − μ_i|^(1+δ) / i^(1+δ) < ∞ for some δ > 0
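A one-line illustration of the Khintchine WLLN (the distribution and mean are arbitrary choices): the sample mean of i.i.d. draws settles down to the population mean as n grows.

```python
import numpy as np

rng = np.random.default_rng(7)
mu = 3.0                              # population mean E[X_i]

# Khintchine WLLN: i.i.d. draws with finite mean => sample mean ->p mu.
for n in (10, 1_000, 100_000):
    x = rng.exponential(mu, n)        # i.i.d. exponential with mean mu
    print(n, x.mean())
```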

Central Limit Theorem: √n (X̄_n − μ̄_n) →d N(0, σ̄²)
· Lindeberg-Lévy CLT: {X_i} i.i.d. with mean μ and variance σ²; then √n (X̄_n − μ) →d N(0, σ²)
· Lindeberg-Feller CLT: {X_i} i.n.i.d. with means μ_i and variances σ_i², σ̄_n² = (1/n) Σ σ_i² → σ̄², plus the Lindeberg condition that no single term dominates: lim (1/(n σ̄_n²)) Σ E[(X_i − μ_i)² · 1{|X_i − μ_i| > ε √n σ̄_n}] = 0
· Liapounov CLT: {X_i} i.n.i.d. satisfying Liapounov's (2 + δ)-th moment condition for some δ > 0, which implies the Lindeberg condition
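A sketch of the Lindeberg-Lévy CLT on visibly non-normal data (the exponential distribution and sample sizes are illustrative choices): the standardized sample mean behaves like a standard normal.

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps = 500, 10_000

# X_i i.i.d. exponential(1): mean mu = 1 and variance sigma^2 = 1.
samples = rng.exponential(1.0, (reps, n))
z = np.sqrt(n) * (samples.mean(axis=1) - 1.0)   # sqrt(n) (Xbar_n - mu)

# By the Lindeberg-Levy CLT, z is approximately N(0, sigma^2) = N(0, 1).
print(z.mean(), z.std())       # ~ 0 and ~ 1
print(np.mean(z < 1.96))       # ~ Phi(1.96) ~ 0.975
```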




Transformation theory
If X_n →d X and Y_n →p c:
· X_n + Y_n →d X + c
· X_n Y_n →d cX
· X_n / Y_n →d X/c (for c ≠ 0)
If X_n →d X and A_n →p A:
· A_n X_n →d AX
· A_n⁻¹ X_n →d A⁻¹X
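A standard application of these rules (the simulation parameters are my own): the t-ratio replaces the unknown σ by the estimate s; since s →p σ and √n (X̄_n − μ) →d N(0, σ²), the ratio rule gives a standard normal limit.

```python
import numpy as np

rng = np.random.default_rng(8)
n, reps = 500, 10_000

samples = rng.uniform(0.0, 1.0, (reps, n))   # mean 1/2, variance 1/12
s = samples.std(axis=1, ddof=1)              # s ->p sigma (in probability)
t = np.sqrt(n) * (samples.mean(axis=1) - 0.5) / s

# Slutsky: Xn / Yn ->d X / c, so t ->d N(0, 1).
print(t.mean(), t.std())       # ~ 0 and ~ 1
```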




Delta method
If √n (θ̂_n − θ₀) →d N(0, Σ), then
√n (g(θ̂_n) − g(θ₀)) →d N(0, G(θ₀) Σ G(θ₀)'),
where G(θ) = ∂g(θ)/∂θ'.
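A numerical check of the delta method in one dimension, with g(θ) = log θ and exponential data as illustrative choices: here G(θ₀) = 1/θ₀, so the limiting variance is σ²/θ₀².

```python
import numpy as np

rng = np.random.default_rng(5)
n, reps = 500, 10_000
theta0, sigma2 = 2.0, 4.0        # exponential(scale=2): mean 2, variance 4

samples = rng.exponential(theta0, (reps, n))
theta_hat = samples.mean(axis=1)                        # sqrt(n)(theta_hat - theta0) ->d N(0, sigma2)
z = np.sqrt(n) * (np.log(theta_hat) - np.log(theta0))   # delta method with g = log

# Limiting variance: G(theta0)^2 * sigma2 = (1/2)^2 * 4 = 1.
print(z.std())    # ~ 1
```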


Instead of the normality assumption we assume that n is large and add new assumptions:
· Stability of X: plim (1/n X'X) = plim (1/n Σ x_i x_i') = M_xx, finite and nonsingular
· Orthogonality of X and u: plim (1/n X'u) = 0
· Stability of u: plim (1/n u'u) = σ²
Using these assumptions we have consistency of OLS: plim β̂ = β.
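Consistency follows because β̂ = β + (X'X/n)⁻¹(X'u/n) →p β + M_xx⁻¹ · 0 = β. A simulation sketch (the design and parameters are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(6)
beta = np.array([1.0, 0.5])

# b = beta + (X'X/n)^{-1} (X'u/n): the second term vanishes as n grows,
# since X'X/n -> Mxx (stability) and X'u/n -> 0 (orthogonality).
for n in (100, 10_000, 1_000_000):
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    u = rng.normal(0.0, 2.0, n)
    y = X @ beta + u
    b = np.linalg.solve(X.T @ X, X.T @ y)
    print(n, np.abs(b - beta).max())
```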
