Econometrics – lecture 5 – assumptions

WHAT IF ASSUMPTIONS ARE INVALID?
Chapter 5 (selected), S&W
Dr Tu Thuy Anh, Faculty of International Economics

BASIC ASSUMPTIONS
- E(ui) = 0
- Var(ui) = σ²
- cov(ui, uj) = 0 for i ≠ j
- ui ~ N(0, σ²)
- No perfect collinearity among the independent variables
- Xj: non-random; Y: random

ASSUMPTIONS COVERED
- Multicollinearity (pp. 201-203, 204-207 S&W)
- Normality
- Heteroscedasticity (pp. 158-164; 178-180 S&W)
- Autocorrelation

HIGH COLLINEARITY - EXAMPLE
- Recall: perfect collinearity is ruled out by assumption.
- What if collinearity among the independent variables is high, but not perfect?
- Examples:
  - price of beef / price of pork
  - K and L of a firm
  - VNindex and HSTC index
  - money supply and GDP
- With two independent variables, high collinearity means r23 ≈ ±1.

HIGH COLLINEARITY - CONSEQUENCES
- Large variance of the estimates
- Wide confidence intervals
- Small observed t-statistics => less chance to reject H0
- "Wrong" sign of the estimates
- However, the estimates are still unbiased
- Only a problem if the consequences are serious

HIGH COLLINEARITY - SYMPTOMS
- Example: Exp^ = 1.5 - 0.2*Income + 0.1*Wealth (wrong sign on Income)
- Regression output:

  Variable   Coefficient   Std. Error   t-Statistic   Prob.
  C            1207.06      1575.06        0.77       0.45
  P            -146.90       479.15       -0.31       0.76
  PA              0.15         6.34        0.02       0.98

  R² = 0.91
- R² is large, but few t-ratios are significant.

HIGH COLLINEARITY - DETECTION
- Run an auxiliary regression of one regressor on the others. Dependent variable: P

  Variable   Coefficient   Std. Error   t-Statistic   Prob.
  C             3.28          0.05        61.35       0.00
  PA            0.01          0.00        38.19       0.00

  R-squared            0.99    Mean dependent var      5.24
  Log likelihood      26.27    F-statistic          1458.58
  Durbin-Watson stat   1.27    Prob(F-statistic)       0.00

- Is the auxiliary R² large?
- Is VIF > 10?
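The VIF diagnostic above can be computed by hand. A minimal sketch in Python, with invented data in which one regressor is almost a linear function of the other (variable names and numbers are purely illustrative):

```python
import random

# Hypothetical data: x3 is nearly a linear function of x2 (high collinearity).
random.seed(0)
x2 = [random.uniform(0, 10) for _ in range(100)]
x3 = [2.0 * v + random.gauss(0, 0.1) for v in x2]

def simple_ols_r2(y, x):
    """R-squared of a simple regression of y on x (with intercept)."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx                      # slope
    a = my - b * mx                    # intercept
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

r2_aux = simple_ols_r2(x3, x2)         # auxiliary regression: x3 on x2
vif = 1.0 / (1.0 - r2_aux)             # VIF = 1 / (1 - R^2)
print(f"auxiliary R^2 = {r2_aux:.4f}, VIF = {vif:.1f}")
# A VIF above 10 signals problematic collinearity.
```

With more than two regressors, the same idea applies: regress each Xj on all the other regressors and compute one VIF per regressor.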
  VIF = 1/(1 - R²), where R² comes from the auxiliary regression.

HIGH COLLINEARITY - CAUSE / CURE
- Causes:
  - lack of data
  - the nature of the variables of interest
- Cures:
  - collect more data
  - use other sources of information: CRS, ...
  - delete one or more variables
  - transform the variables: take logs, ratios, growth rates

NORMALITY - CONSEQUENCES
- Assumption: ui ~ N(0, σ²)
- If this assumption does not hold, the estimates are still unbiased, but we cannot assess which parameters are significant:
  - the normality of the estimates will not hold
  - the significance tests will not follow a Student's t distribution
  - the joint significance test will not follow an F distribution
- Visual check: histogram of the residuals (figure on slide)

NORMALITY - DETECTION
- The Jarque-Bera test:
  JB = (N/6) * (S² + (K - 3)²/4)
  where S and K are the sample skewness and kurtosis statistics.
- The JB statistic has an asymptotic chi-square distribution with two degrees of freedom.

NORMALITY - CAUSE / CURE
- Cause: the nature of the data.
- Cure: use the Central Limit Theorem: even if u (and hence Y) does not come from a normal distribution, the parameter estimates will be asymptotically normal, so we can still perform the usual inference.
- But how large does the sample have to be for the parameter estimates to be approximately normal?
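The JB statistic above is easy to compute from the residuals. A minimal sketch in pure Python, using simulated residuals (the data are invented for illustration; for a normal sample K is close to 3, so both terms are near zero):

```python
import math
import random

def jarque_bera(resid):
    """Jarque-Bera statistic: JB = (N/6) * (S^2 + (K - 3)^2 / 4)."""
    n = len(resid)
    mean = sum(resid) / n
    m2 = sum((r - mean) ** 2 for r in resid) / n
    m3 = sum((r - mean) ** 3 for r in resid) / n
    m4 = sum((r - mean) ** 4 for r in resid) / n
    s = m3 / m2 ** 1.5        # sample skewness
    k = m4 / m2 ** 2          # sample kurtosis (equals 3 under normality)
    return (n / 6.0) * (s ** 2 + (k - 3.0) ** 2 / 4.0)

random.seed(1)
normal_resid = [random.gauss(0, 1) for _ in range(5000)]
skewed_resid = [random.expovariate(1.0) for _ in range(5000)]

jb_normal = jarque_bera(normal_resid)
jb_skewed = jarque_bera(skewed_resid)
print("JB, normal residuals:", jb_normal)
print("JB, skewed residuals:", jb_skewed)
# Compare each statistic with the chi-square(2) critical value, about 5.99 at 5%.
```

For the skewed (exponential) residuals the statistic is very large, so normality is clearly rejected; for the truly normal residuals it stays small.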
- Some econometricians propose N > 30, but this may not be enough in some situations.
- The quality of the approximation depends not only on N, but also on N - K, the degrees of freedom.

HETEROSCEDASTICITY - CONSEQUENCES
- What it means: Var(ui) = σi² (the error variance differs across observations)
- OLS estimates are still unbiased
- But the estimate of var(â_j) is biased
  => confidence intervals are invalid
  => invalid t and F tests
- Needs to be cured

HETEROSCEDASTICITY - DETECTION
- White test:
  - H0: Var(ui) = σ² for all i
  - Regress the original model => obtain the residuals e
  - Run the auxiliary regression (with cross term):
    e² = α1 + α2·X2 + α3·X3 + α4·X2² + α5·X3² + α6·X2·X3 + u => R²(1)
  - Reject H0 if nR²(1) > χ²(k - 1), or equivalently if
    F = [R²(1)/(k - 1)] / [(1 - R²(1))/(n - k)] > F(k - 1, n - k),
    where k is the number of coefficients in the auxiliary model.
  - Do the same with the "no cross term" version.

HETEROSCEDASTICITY - CAUSE / CURE
- Causes:
  - nature of the data
  - wrong functional form
  - etc.
- Cures:
  - use the "robust standard errors" option in the software
  - transform the variables, depending on the form of the heteroscedasticity
  - example: if σi² = a·Ki², divide both sides of the model by K

AUTOCORRELATION
- The assumption fails: cov(ui, uj) ≠ 0 for some i ≠ j
- Form of autocorrelation:
  - ut = ρ·ut-1 + vt => AR(1)
  - vt: random error satisfying assumptions 1-4
  - if ρ > 0: positive autocorrelation
  - if ρ < 0: negative autocorrelation
  - with p lags: ut = ρ1·ut-1 + ... + ρp·ut-p + vt => AR(p)

AUTOCORRELATION - CONSEQUENCES
- OLS estimates are still unbiased
- Biased estimate of var(â_j) => invalid confidence intervals
- Invalid t and F tests
- Biased estimate of σ̂²
- Needs to be cured

AUTOCORRELATION - DETECTION
- Durbin-Watson test. Decision bands for the DW statistic d:
  0 to dL:            positive autocorrelation
  dL to dU:           no conclusion
  dU to 4 - dU:       no autocorrelation
  4 - dU to 4 - dL:   no conclusion
  4 - dL to 4:        negative autocorrelation
- The DW test can be used only when:
  - the autocorrelation is AR(1)
  - there is no lagged value of the dependent variable in the model
  - there are no missing observations

AUTOCORRELATION - DETECTION (continued)
- B-G (Breusch-Godfrey) test:
  - et = a1 + a2·Xt + ρ1·et-1 + ... + ρp·et-p + vt => R²(1)
  - et = a1 + a2·Xt + vt => R²(2)
  - If nR²(1) > χ²(p), or
    F = [(R²(1) - R²(2))/p] / [(1 - R²(1))/(n - k*)] > F(p, n - k*),
    there is autocorrelation of order p.

AUTOCORRELATION - CURE
- AR(1): ut = ρ·ut-1 + vt
- Estimate ρ, then run GLS as follows:
  - set Y* = Y - ρ̂·Y(-1) and X* = X - ρ̂·X(-1)
  - run OLS on the transformed regression: Y* = β1* + β2·X* + v, where β1* = β1(1 - ρ̂)
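The two-step cure above (estimate ρ from the OLS residuals, then re-run OLS on the quasi-differenced data) is the Cochrane-Orcutt idea. A minimal sketch with simulated data, where the true model and all numbers are invented for illustration:

```python
import random

random.seed(2)

def ols(y, x):
    """Simple OLS of y on x with intercept; returns (a, b, residuals)."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    return a, b, resid

# Simulate y = 1 + 2x + u with AR(1) errors: u_t = 0.8 u_{t-1} + v_t.
n, rho_true = 500, 0.8
x = [random.uniform(0, 10) for _ in range(n)]
u = [0.0]
for t in range(n):
    u.append(rho_true * u[-1] + random.gauss(0, 1))
y = [1.0 + 2.0 * xi + ui for xi, ui in zip(x, u[1:])]

# Step 1: OLS on the original model; estimate rho from the lagged residuals.
_, _, e = ols(y, x)
rho_hat = sum(e[t] * e[t - 1] for t in range(1, n)) / \
          sum(e[t - 1] ** 2 for t in range(1, n))

# Step 2: quasi-difference (Y* = Y - rho*Y(-1), X* = X - rho*X(-1)), re-run OLS.
y_star = [y[t] - rho_hat * y[t - 1] for t in range(1, n)]
x_star = [x[t] - rho_hat * x[t - 1] for t in range(1, n)]
a_star, b2, _ = ols(y_star, x_star)
b1 = a_star / (1.0 - rho_hat)   # recover beta1 from beta1* = beta1(1 - rho)

print(f"rho_hat = {rho_hat:.2f}, beta1 = {b1:.2f}, beta2 = {b2:.2f}")
```

Note that the transformed intercept estimates β1(1 - ρ̂), which is why it is divided by (1 - ρ̂) at the end; the transformed errors v now satisfy the no-autocorrelation assumption, so the usual t and F tests are valid again.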

Posted: 30/10/2015, 14:35
