Econometrics – lecture 8 – time series


KTEE 310 - FINANCIAL ECONOMETRICS
Lecture 6: TIME SERIES ANALYSIS AND APPLICATIONS IN FINANCE
Dr TU Thuy Anh, Faculty of International Economics

[Figure: quarterly GDP, rising from roughly 20,000 to 160,000 over 76 quarters.]

COMPONENTS OF A TIME SERIES

- Time series: an ordered sequence of values of a variable at equally spaced time intervals, such as a price index, inflation, the GDP growth rate, etc.
- Components: trend (T_t), seasonality (S_t), cycle (C_t), irregular (I_t).
- The components may make up a time series in two ways:
  - additive model: X_t = T_t + S_t + C_t + I_t
  - multiplicative model: X_t = T_t * S_t * C_t * I_t

ASSUMPTIONS FOR TIME SERIES MODELS

C.1 The model is linear in parameters and correctly specified: Y = β1 + β2 X2 + ... + βk Xk + u.
C.2 The time series for the regressors are weakly persistent.
C.3 There does not exist an exact linear relationship among the regressors.
C.4 The disturbance term has zero expectation.
C.5 The disturbance term is homoscedastic.
C.6 The values of the disturbance term have independent distributions: u_t is distributed independently of u_t' for t' ≠ t.
C.7 The disturbance term is distributed independently of the regressors: u_t is distributed independently of X_jt' for all t' (including t) and all j.
C.8 The disturbance term has a normal distribution.

Assumption C.6 is rarely an issue with cross-sectional data: when observations are generated randomly, there is no reason to suppose that there should be any connection between the value of the disturbance term in one observation and its value in any other.

AUTOCORRELATION

[Figure: plot of disturbances against X.] In this graph it is clear that the disturbance terms are not generated independently of each other. Positive values tend to be followed by positive ones, and negative values by negative ones; successive values tend to have the same sign. This is described as positive autocorrelation.

[Figure: a second plot of disturbances against X.] In this graph, positive values tend to be followed by negative ones, and negative values by positive ones. This is an example of negative autocorrelation.

Consider the model Y_t = β1 + β2 X_t + u_t.

First-order autoregressive autocorrelation, AR(1):

    u_t = ρ u_{t-1} + ε_t

Fifth-order autoregressive autocorrelation, AR(5):

    u_t = ρ1 u_{t-1} + ρ2 u_{t-2} + ρ3 u_{t-3} + ρ4 u_{t-4} + ρ5 u_{t-5} + ε_t

A particularly common type of autocorrelation is first-order autoregressive autocorrelation, usually denoted AR(1). It is autoregressive because u_t depends on lagged values of itself, and first-order because it depends only on its previous value. u_t also depends on ε_t, an injection of fresh randomness at time t, often described as the innovation at time t. Under AR(5) autocorrelation, u_t depends on its lagged values up to the fifth lag.

Third-order moving average autocorrelation, MA(3):

    u_t = λ0 ε_t + λ1 ε_{t-1} + λ2 ε_{t-2} + λ3 ε_{t-3}

Under moving average autocorrelation the disturbance term is a linear combination of the current innovation and a finite number of previous ones; under MA(3) it depends on the three previous innovations as well as the current one.

[Figures: simulated series u_t = ρ u_{t-1} + ε_t for ρ = 0.0, 0.1, 0.2, 0.3 and 0.4.] These plots show the patterns generated when the disturbance term is subject to AR(1) autocorrelation; the object is to provide benchmark images to help you assess plots of residuals in time series regressions. We start with ρ equal to 0, so there is no autocorrelation, and increase ρ progressively in steps of 0.1. With ρ equal to 0.3, a pattern of positive autocorrelation is beginning to be apparent.
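Returning briefly to the components of a time series introduced at the start of the lecture, the additive model X_t = T_t + S_t + C_t + I_t can be sketched numerically. This is a minimal illustration, not from the lecture: the particular component shapes (linear trend, sinusoidal seasonal and cyclical terms, zero irregular term) are assumptions chosen for a clean check.

```python
import math

# Illustrative sketch (assumed component shapes, not from the lecture):
# build a synthetic quarterly series from the four components of the
# additive model X_t = T_t + S_t + C_t + I_t.

def trend(t):
    return 100.0 + 2.0 * t                        # T_t: linear upward trend

def seasonal(t):
    return 10.0 * math.sin(2 * math.pi * t / 4)   # S_t: repeats every 4 quarters

def cycle(t):
    return 5.0 * math.sin(2 * math.pi * t / 40)   # C_t: slow business cycle

def irregular(t):
    return 0.0                                    # I_t: zero here, for a noise-free check

def additive(t):
    return trend(t) + seasonal(t) + cycle(t) + irregular(t)

series = [additive(t) for t in range(76)]
print(round(series[0], 1), round(series[40], 1))
```

A multiplicative version would instead multiply factors interpreted as proportional deviations around 1.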
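The benchmark experiment described above, simulating u_t = ρ u_{t-1} + ε_t for increasing ρ, can be reproduced in a few lines. The series length and seed below are arbitrary choices; the same innovation sequence is reused for every ρ so that the series are comparable. The lag-1 sample autocorrelation of the simulated disturbances tracks ρ:

```python
import random

# Sketch (assumed parameters) of the AR(1) benchmark experiment:
# simulate u_t = rho * u_{t-1} + eps_t and measure how the lag-1
# sample autocorrelation of the disturbances rises with rho.

def simulate_ar1(rho, n=2000, seed=42):
    rng = random.Random(seed)               # same innovations for every rho
    u, series = 0.0, []
    for _ in range(n):
        u = rho * u + rng.gauss(0.0, 1.0)   # fresh innovation eps_t each period
        series.append(u)
    return series

def acf1(x):
    # Lag-1 sample autocorrelation.
    m = sum(x) / len(x)
    num = sum((x[t] - m) * (x[t - 1] - m) for t in range(1, len(x)))
    den = sum((v - m) ** 2 for v in x)
    return num / den

for rho in (0.0, 0.3, 0.6, 0.9):
    print(rho, round(acf1(simulate_ar1(rho)), 2))
```

With ρ = 0 the estimated autocorrelation is close to zero; as ρ approaches 1 the estimate rises towards 1 and the plotted series show ever longer runs of same-signed values.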
[Figures: simulated series for ρ = 0.5, 0.6, 0.7, 0.8, 0.9 and 0.95.]

With ρ equal to 0.6, it is obvious that u is subject to positive autocorrelation: positive values tend to be followed by positive ones and negative values by negative ones. With ρ equal to 0.9, the sequences of values with the same sign have become long and the tendency to return to 0 has become weak. With ρ equal to 0.95, the process is approaching what is known as a random walk, where ρ is equal to 1 and the process becomes nonstationary. The terms random walk and nonstationarity are defined in the next chapter; for the time being we will assume |ρ| < 1.

STATIONARY PROCESSES

A time series X_t is said to be stationary if its expected value E(X_t) and population variance var(X_t) are independent of time, and if the population covariance between its values at time t and time t + s depends on s but not on t.

An example of a stationary time series is the AR(1) process

    X_t = β2 X_{t-1} + ε_t,  with -1 < β2 < 1,

where ε_t is a random variable with zero mean and constant variance σ²_ε and is not subject to autocorrelation. Given a fixed initial value X_0:

    E(X_t) = β2^t X_0, which tends to 0
    var(X_t) tends to σ²_ε / (1 - β2²)
    cov(X_t, X_{t+s}) tends to β2^s σ²_ε / (1 - β2²)

NONSTATIONARY PROCESSES

The condition -1 < β2 < 1 was crucial for stationarity. If β2 = 1, the series becomes a nonstationary process known as a random walk:

    X_t = X_{t-1} + ε_t,  ε_t ~ (0, σ²_ε)

Substituting back to the initial value X_0,

    X_t = X_0 + ε_1 + ... + ε_t

so E(X_t) = X_0 + E(ε_1) + ... + E(ε_t) = X_0, which is independent of t, and the first condition for stationarity remains satisfied. However, the condition that the variance of X_t be independent of time is not satisfied:

    var(X_t) = var(X_0 + ε_1 + ... + ε_t) = var(ε_1 + ... + ε_t) = σ²_ε + ... + σ²_ε = t σ²_ε

so the variance grows linearly with time.

[Figure: a typical random walk over roughly 100 periods, drifting between about -15 and 20.] If it were a stationary process, there would be a tendency for the series to return to 0 periodically; here there is no such tendency.
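The stationary AR(1) variance formula can be checked by simulation. The parameter values, sample size and seed below are illustrative assumptions: with β2 = 0.5 and σ_ε = 1, the sample variance of a long simulated series should approach σ²_ε / (1 - β2²) = 4/3.

```python
import random

# Check (illustrative parameters) of the stationary AR(1) variance:
# for X_t = b2 * X_{t-1} + eps_t with |b2| < 1 and sigma_eps = 1,
# the long-run variance should approach 1 / (1 - b2**2).

def ar1_series(b2, n=100_000, seed=1):
    rng = random.Random(seed)
    x, xs = 0.0, []
    for _ in range(n):
        x = b2 * x + rng.gauss(0.0, 1.0)
        xs.append(x)
    return xs

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((v - m) ** 2 for v in xs) / len(xs)

b2 = 0.5
theoretical = 1.0 / (1.0 - b2 ** 2)     # sigma_eps^2 / (1 - b2^2) = 4/3
empirical = variance(ar1_series(b2))
print(round(theoretical, 3), round(empirical, 3))
```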
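The result var(X_t) = t σ²_ε for the random walk can likewise be illustrated by simulating many independent walks and comparing the cross-sectional variance of their endpoints at two horizons; the replication count and seed are arbitrary choices.

```python
import random

# Illustration (assumed replication count and seed) of var(X_t) = t * sigma_eps^2
# for the random walk X_t = X_{t-1} + eps_t: simulate many independent walks
# and compare the cross-sectional variance of the endpoints at two horizons.

def walk_endpoint(t, rng):
    x = 0.0
    for _ in range(t):
        x += rng.gauss(0.0, 1.0)        # eps ~ (0, 1), so var(X_t) should equal t
    return x

rng = random.Random(7)
reps = 5000
var_at = {}
for t in (10, 40):
    ends = [walk_endpoint(t, rng) for _ in range(reps)]
    m = sum(ends) / reps
    var_at[t] = sum((v - m) ** 2 for v in ends) / reps
print({t: round(v, 1) for t, v in var_at.items()})
```

The variance at horizon 40 comes out roughly four times the variance at horizon 10, in line with linear growth in t.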

Posted: 30/10/2015, 15:34
