Panel Data Econometrics: Conditional Mean, Projection, and Regression, Slides of Econometrics and Mathematical Economics

An in-depth analysis of econometric panel data, focusing on the concepts of conditional mean, projection, and regression. It covers topics such as statistical models, causality and covariation, conditional mean function, other conditional characteristics, and using the model for understanding relationships, estimation of quantities of interest, prediction, and control. The document also includes examples and applications to help illustrate these concepts.


Econometric Analysis of Panel Data


2. Statistical Models

Models

 Conditional mean function: E[y|x]
 Other conditional characteristics – what is ‘the model’?
   Conditional variance function: Var[y|x]
   Conditional quantiles, e.g., median[y|x]
   Other conditional moments

Using the Model

 Understanding the relationship

 Estimation of quantities of interest such as elasticities

 Prediction of the outcome of interest

 Control of the path of the outcome of interest

Projection and Regression

 Linear projection is not the regression, and it is not the Taylor series.

Example:

f(y|x) = [1/λ(x)] exp[-y/λ(x)]
λ(x) = exp(α + βx) = E[y|x]
x ~ U[0,1]; f(x) = 1, 0 ≤ x ≤ 1

Taylor series around x = E[x] = 1/2:
ĝ(x | x = E[x]) = δ0 + δ1(x - 1/2)
δ0 = exp(α + β/2)
δ1 = β·exp(α + β/2)
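For instance, with the illustrative values α = 1 and β = 2 used in the commands further below, δ0 = exp(2) ≈ 7.39 and δ1 = 2·exp(2) ≈ 14.78.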

The Linear Projection

g*(x) = E[y] + {Cov[x,y]/Var[x]}(x - E[x]) = γ0 + γ1(x - 1/2)

E[x] = 1/2, Var[x] = 1/12
E[y] = Ex{E[y|x]} = Ex[exp(α + βx)] = ∫₀¹ exp(α + βx)·1 dx
Cov[x,y] = Cov[x, E[y|x]] = Ex{x·E[y|x]} - E[x]·Ex{E[y|x]}
         = ∫₀¹ x·exp(α + βx)·1 dx - (1/2)·∫₀¹ exp(α + βx)·1 dx

Using ∫ exp(βx) dx = exp(βx)/β and ∫ x·exp(βx) dx = [exp(βx)/β](x - 1/β):
γ0 = E[y] = [exp(α)/β][exp(β) - 1]
Cov[x,y] = [exp(α)/β]{exp(β)(1/2 - 1/β) + 1/2 + 1/β}
γ1 = Cov[x,y]/Var[x] = 12·[exp(α)/β]{exp(β)(1/2 - 1/β) + 1/2 + 1/β}

Omitted details of the proof are left for the reader.
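With the same illustrative values α = 1 and β = 2, γ0 = (e/2)(e² - 1) ≈ 8.68 and γ1 = 6e ≈ 16.31 (the exp(β)(1/2 - 1/β) term vanishes when β = 2), so the projection is somewhat steeper at the mean than the Taylor line, whose slope is δ1 ≈ 14.78.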

(FYI) The commands used to generate this example:

Calc ; alpha = 1 ; beta = 2 $
Samp ; 1-200 $
Crea ; x = trn(0,.005) $
Crea ; ey_x = exp(alpha + beta*x) $
Calc ; gamma1 = exp(alpha)/beta * (exp(beta)*(.5-1/beta) + .5 + 1/beta) * 12 $
Calc ; gamma0 = exp(alpha)/beta * (exp(beta)-1) $
Calc ; delta1 = beta*exp(alpha+beta/2) $
Calc ; delta0 = exp(alpha+beta/2) $
Crea ; projectn = gamma0 + gamma1*(x-.5) $
Crea ; taylor = delta0 + delta1*(x-.5) $
Plot ; lhs=x ; endpoints=0,1 ; rhs=ey_x,projectn,taylor ; fill $
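For readers without that package, here is a minimal Python/NumPy sketch of the same comparison (conditional mean vs. linear projection vs. Taylor series), using the closed-form coefficients derived above; the 200-point grid and the values α = 1, β = 2 are meant to mirror the commands just shown.

import numpy as np

alpha, beta = 1.0, 2.0                 # parameters of E[y|x] = exp(alpha + beta*x)
x = np.arange(200) * 0.005             # 200-point grid on [0, 1), mirroring the x grid above

ey_x = np.exp(alpha + beta * x)        # conditional mean function

# Linear projection coefficients (x ~ U[0,1], so E[x] = 1/2 and Var[x] = 1/12)
gamma0 = np.exp(alpha) / beta * (np.exp(beta) - 1.0)
gamma1 = 12.0 * np.exp(alpha) / beta * (np.exp(beta) * (0.5 - 1.0 / beta) + 0.5 + 1.0 / beta)
projectn = gamma0 + gamma1 * (x - 0.5)

# Taylor series of E[y|x] around x = E[x] = 1/2
delta0 = np.exp(alpha + beta / 2.0)
delta1 = beta * np.exp(alpha + beta / 2.0)
taylor = delta0 + delta1 * (x - 0.5)

print(gamma0, gamma1, delta0, delta1)  # the three curves can then be plotted, e.g. with matplotlib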

What About the Linear Projection?

 What we do when we linearly regress a variable on a set of variables
 Assuming there exists a conditional mean
   There usually exists a linear projection (it requires finite variance of y)
   It is an approximation to the conditional mean
 If the conditional mean is linear
   The Taylor series equals the conditional mean
   The linear projection equals the conditional mean

Conditional Mean and Projection

Notice the problem with the linear approach: negative predictions.

Doctor Visits: Conditional Mean and Linear Projection

[Figure: DocVis plotted against INCOME, with the conditional mean (CONDMEAN) and the linear projection (PROJECTN) overlaid. Annotations mark the region where most of the data are and the area outside the range of the data, where the linear projection falls below zero.]

Partial Effects

 What did the model tell us?
 Covariation and partial effects
 Marginal effects: effect on what?
   For continuous variables: δ(x) = ∂E[y|x]/∂x, usually not the coefficients
   For dummy variables: E[y|x,d=1] - E[y|x,d=0]
   Elasticities: ε(x) = δ(x) · x / E[y|x]

APE and PE at the Mean

δ(x) = ∂E[y|x]/∂x,  μ = E[x]
δ(x) ≈ δ(μ) + δ′(μ)(x - μ) + (1/2)δ″(μ)(x - μ)²
APE = E[δ(x)] ≈ δ(μ) + (1/2)δ″(μ)σx²

Implication: the APE can be computed by averaging δ(xi) over the observations (relying on the LLN and the Slutsky theorem) or approximated by the partial effect at the means of the data. In the earlier example: Sample APE = -. Approximation = -.
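The truncated numbers above come from the doctor-visits data. Purely as an illustration of the two calculations, here is a hedged Python sketch using the exponential-mean example from the earlier slides (α = 1, β = 2, so δ(x) = β·exp(α + βx) and δ″(x) = β²·δ(x)); the sample size and seed are arbitrary.

import numpy as np

alpha, beta = 1.0, 2.0
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=100_000)        # x ~ U[0,1]

def delta(v):
    # partial effect dE[y|x]/dx for E[y|x] = exp(alpha + beta*x)
    return beta * np.exp(alpha + beta * v)

ape = delta(x).mean()                          # average partial effect: sample mean of delta(x_i)
pe_at_mean = delta(x.mean())                   # partial effect evaluated at the sample mean of x

# Second-order approximation from the slide: APE ~ delta(mu) + (1/2)*delta''(mu)*Var[x]
mu, var = x.mean(), x.var()
approx = delta(mu) + 0.5 * beta**2 * delta(mu) * var

print(ape, pe_at_mean, approx)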

The Linear Model

 y = Xβ + ε, N observations, K columns in X, including a column of ones
 Standard assumptions about X
 Standard assumptions about ε|X: E[ε|X] = 0, so E[ε] = 0 and Cov[ε,x] = 0
 Regression? If E[y|X] = Xβ
 Approximation? Then this is an LP, not a Taylor series

Endogeneity

 Definition: E[ε|x] ≠ 0
 Why not?
   Omitted variables
   Unobserved heterogeneity (equivalent to omitted variables)
   Measurement error on the RHS (equivalent to omitted variables)
   Simultaneity (?)

Structure and Regression

 Simultaneity?
 y = xβ + ε, x = δy + u, Cov[x,ε] ≠ 0
   xβ is not the regression; what is the regression?
   Reduced form (assume ε and u are uncorrelated):
     y = [β/(1-βδ)]u + [1/(1-βδ)]ε
     x = [1/(1-βδ)]u + [δ/(1-βδ)]ε
   Cov[x,y]/Var[x] = λ
 The regression is y = λx + v, where E[v|x] = 0

λ = [βσu² + δσε²] / [σu² + δ²σε²] = βw + (1-w)(1/δ), where w = σu²/[σu² + δ²σε²]
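As a sanity check on this result, a minimal simulation sketch in Python; the parameter values (β = 0.5, δ = 0.3, σu = σε = 1) are arbitrary choices for illustration and do not come from the slides.

import numpy as np

beta, delta = 0.5, 0.3                  # structural parameters (illustrative values only)
sig_u, sig_e = 1.0, 1.0                 # std. deviations of u and epsilon (assumed uncorrelated)

rng = np.random.default_rng(0)
n = 1_000_000
u = rng.normal(0.0, sig_u, n)
eps = rng.normal(0.0, sig_e, n)

# Reduced forms from the slide
y = (beta * u + eps) / (1.0 - beta * delta)
x = (u + delta * eps) / (1.0 - beta * delta)

ols_slope = np.cov(x, y, ddof=0)[0, 1] / np.var(x)   # what regressing y on x estimates

lam = (beta * sig_u**2 + delta * sig_e**2) / (sig_u**2 + delta**2 * sig_e**2)
print(ols_slope, lam, beta)             # ols_slope is close to lambda, not to beta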