Unit Roots & Forecasting (Methods of Economic Investigation, Lecture 20)
Transcript
Page 1:

Unit Roots & Forecasting

Methods of Economic Investigation

Lecture 20

Page 2: Last Time

Descriptive time series processes
Estimating with exogenous serial correlation
Estimating with endogenous processes

Page 3: Today’s Class

Non-stationary time series
Unit roots and spurious regressions
Orders of integration
Returning to causal effects
Impulse response functions
Forecasting

Page 4: Random Walk Processes

Definition: Et[xt+1] = xt, that is, today’s value of x is the best predictor of tomorrow’s value.

This looks very similar to our AR(1) process, with φ = 1.

Autocovariances of a random walk are not well defined in a technical sense, but imagine an AR(1) process with φ approaching 1: we have nearly perfect autocorrelation between any two time periods.

Persistence dies out so slowly that most of the variance is due to very low-frequency “shocks.”

Page 5: Permanence of Shocks in a Unit Root

An innovation (a shock at time t) to a stationary AR process dies out eventually (the autocorrelation function declines eventually to zero).

A shock to a random walk is permanent.

Variance is increasing over time: Var(xt) = Var(x0) + tσ2, since

xt = xt-1 + εt = x0 + ε1 + ε2 + ... + εt
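As an illustrative check (not from the slides; σ = 1 and the sample sizes are assumed), simulating many random walks shows both properties: the cross-sectional variance grows roughly like tσ2, and a one-off shock shifts the level permanently.

```python
import numpy as np

# Simulate many independent random walks x_t = x_{t-1} + eps_t with x_0 = 0
# and check that the cross-sectional variance grows roughly like t * sigma^2.
rng = np.random.default_rng(0)
n_paths, T, sigma = 5000, 200, 1.0

eps = rng.normal(0.0, sigma, size=(n_paths, T))
x = eps.cumsum(axis=1)                # x_t = sum of shocks up to t

var_t = x.var(axis=0)                 # empirical Var(x_t) across paths
print(var_t[[9, 49, 199]])            # roughly 10, 50, 200 when sigma = 1

# Permanence of a shock: add a one-off unit shock at t = 100 to one path
# and note that the level stays shifted by ~1 for all later dates.
bumped = x[0].copy()
bumped[100:] += 1.0
print((bumped - x[0])[95:105])        # 0 before t = 100, 1 from t = 100 onward
```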

Page 6:
Page 7: Drifts and Trends

Deterministic trend:

yt = δt + xt + εt

xt is some stationary process

yt is “trend” stationary

It’s easy to add a deterministic trend to a random walk (a drift):

yt = δ + yt-1 + ut = δt + y0 + u1 + u2 + ... + ut
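A quick illustrative sketch (δ, σ and the sample length are assumed) contrasting a trend-stationary series with a random walk with drift: both share the deterministic trend δt, but only the first becomes stationary once that trend is removed.

```python
import numpy as np

rng = np.random.default_rng(1)
T, delta, sigma = 300, 0.1, 1.0
t = np.arange(T)

# Trend stationary: y_t = delta*t + eps_t (the stationary part is just noise here).
trend_stat = delta * t + rng.normal(0.0, sigma, T)

# Random walk with drift: y_t = delta + y_{t-1} + u_t, i.e. delta*t plus cumulated shocks.
drift_walk = delta * t + rng.normal(0.0, sigma, T).cumsum()

# Detrending removes all persistence from the first series but not the second.
print(np.var(trend_stat - delta * t))   # about sigma^2, stable
print(np.var(drift_walk - delta * t))   # much larger, grows with the sample length
```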

Page 8:
Page 9: Orders of Integration

A series is integrated of order p if p differences render it stationary.

If a time series is integrated and differencing once renders it stationary, then it is integrated of order 1, or I(1).

If it is necessary to difference twice before a time series is stationary, then it is I(2), and so forth.
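As a small illustration (simulated data), an I(2) series can be built by cumulating white noise twice; differencing once leaves an I(1) series, and differencing twice recovers the stationary noise.

```python
import numpy as np

rng = np.random.default_rng(2)
u = rng.normal(size=1000)       # white noise, I(0)
i1 = u.cumsum()                 # random walk, I(1)
i2 = i1.cumsum()                # integrated of order 2, I(2)

d1 = np.diff(i2)                # one difference of an I(2) series is still I(1)
d2 = np.diff(i2, n=2)           # two differences recover the I(0) noise

print(np.allclose(d2, u[2:]))   # True: double-differencing undoes the double cumsum
```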

Page 10: Integrated Series

If a time series has a unit root, it is said to be integrated. First differencing the time series removes the unit root. E.g. in the case of a random walk

yt = yt-1 + ut, ut ~ N(0, σ2)

Δyt = ut

the first difference is white noise, which is stationary.

For an AR(p), a unit root implies 1 – β1L – β2L2 – ... – βpLp = (1 – L)(1 – λ1L – λ2L2 – ... – λp-1Lp-1), and as a result first differencing also removes the unit root.
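As a concrete (assumed) numerical example, the AR(2) yt = 1.3yt-1 - 0.3yt-2 + ut has lag polynomial 1 - 1.3L + 0.3L2 = (1 - L)(1 - 0.3L), so it contains a unit root and its first difference follows a stationary AR(1) with coefficient 0.3. A small numpy check of the factorization:

```python
import numpy as np

# AR(2): y_t = 1.3*y_{t-1} - 0.3*y_{t-2} + u_t
# Lag polynomial: 1 - 1.3 L + 0.3 L^2 = (1 - L)(1 - 0.3 L)
beta1, beta2 = 1.3, -0.3

# numpy.roots expects coefficients from the highest power of L downward.
roots = np.roots([-beta2, -beta1, 1.0])
print(np.sort(roots))    # [1.0, 3.333...]: the root at L = 1 is the unit root

# After first differencing, the remaining factor (1 - 0.3 L) describes a
# stationary AR(1) in the differences with coefficient 0.3.
```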

Page 11: Non-stationarity

Non-stationarity can have important consequences for regression models and inference:

Autoregressive coefficients are biased
t-statistics have non-normal distributions even in large samples
Spurious regression

Page 12: Problem: Spurious Regression

Imagine we now have two series generated by independent random walks.

Suppose we regress yt on xt using OLS, that is, we estimate yt = α + βxt + νt.

In this case, you tend to see a “significant” β because the low-frequency changes make it seem as if the two series are in some way associated.
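An illustrative Monte Carlo (the sample size, number of replications and 5% cutoff are assumed) makes the point: regressing one random walk on an independent one produces "significant" slopes far more often than the nominal 5%.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
T, n_sims, rejections = 200, 500, 0

for _ in range(n_sims):
    y = rng.normal(size=T).cumsum()          # independent random walk
    x = rng.normal(size=T).cumsum()          # independent random walk
    res = sm.OLS(y, sm.add_constant(x)).fit()
    if abs(res.tvalues[1]) > 1.96:           # naive 5% critical value
        rejections += 1

# With independent I(1) series this rejection rate is far above 5%
# (typically well over half of the simulations).
print(rejections / n_sims)
```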

Page 13:
Page 14: Unit Root Tests

The standard Dickey-Fuller test is appropriate for AR(1) processes.

Many economic and financial time series have a more complicated dynamic structure than is captured by a simple AR(1) model.

Said and Dickey (1984) augment the basic autoregressive unit root test to accommodate general ARMA(p, q) models with unknown orders.

This is called the augmented Dickey-Fuller (ADF) test.

Page 15: ADF Test – 1

The ADF test tests the null hypothesis that a time series yt is I(1) against the alternative that it is I(0), assuming that the dynamics in the data have an ARMA structure.

The ADF test is based on estimating the test regression

Δyt = β'xt + φyt-1 + ψ1Δyt-1 + ... + ψpΔyt-p + εt

where xt contains the deterministic variables (constant, trend), φyt-1 captures the potential unit root, and the lagged differences Δyt-j pick up other serial correlation.
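In practice this regression is usually run through a packaged routine; the following is a minimal sketch using statsmodels' adfuller on a simulated random walk (the simulated data and the choices regression='c' and autolag='AIC' are assumptions for illustration, not from the slides).

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(4)
y = rng.normal(size=500).cumsum()        # simulated random walk: truly I(1)

# regression='c' includes a constant as the deterministic term;
# autolag='AIC' chooses the number of lagged differences p.
stat, pvalue, usedlag, nobs, crit, icbest = adfuller(y, regression='c', autolag='AIC')

print(stat, pvalue)      # large p-value: cannot reject the unit-root null
print(crit)              # Dickey-Fuller critical values, not normal ones
```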

Page 16: ADF Test - 2

To see why, start from the AR(p) regression

yt = β'xt + α1yt-1 + α2yt-2 + ... + αpyt-p + εt

Subtract yt-1 from both sides and define Φ = (α1 + α2 + ... + αp - 1) to get

Δyt = β'xt + Φyt-1 + ψ1Δyt-1 + ... + ψp-1Δyt-p+1 + εt

Test Φ = 0 against the alternative Φ < 0.
Use the special Dickey-Fuller upper and lower bounds: under the unit-root null the test statistic is not normally distributed, even in large samples.
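To make the reparameterization concrete, here is an illustrative sketch (simulated data; the AR(2) coefficients 0.5 and 0.3 are assumed) that estimates the ADF regression by OLS and checks that the coefficient on yt-1 is close to Φ = α1 + α2 - 1 = -0.2.

```python
import numpy as np
import statsmodels.api as sm

# Stationary AR(2): y_t = 0.5*y_{t-1} + 0.3*y_{t-2} + eps_t,
# so alpha1 + alpha2 - 1 = -0.2 and Phi should be estimated near -0.2.
rng = np.random.default_rng(5)
a1, a2, T = 0.5, 0.3, 5000
y = np.zeros(T)
for t in range(2, T):
    y[t] = a1 * y[t-1] + a2 * y[t-2] + rng.normal()

dy = np.diff(y)                       # delta y_t
X = np.column_stack([
    np.ones(T - 2),                   # constant (deterministic term)
    y[1:-1],                          # y_{t-1}: its coefficient is Phi
    dy[:-1],                          # delta y_{t-1}: remaining serial correlation
])
res = sm.OLS(dy[1:], X).fit()
print(res.params[1])                  # close to -0.2
```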

Page 17: Estimating in Time Series

Non-stationary time series can lead to a lot of problems in econometric analysis. In order to work with time series, particularly in regression models, we should therefore transform our variables into stationary time series first.

First differencing removes unit roots or trends. Hence, difference a time series until it is I(0).

Differencing too often is less of a problem, since a differenced stationary series is still stationary.

Regressions of one stationary variable on another are less problematic: although observations may not be independent, we can expect the regression to have similar properties as with cross-sectional data.
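One way to operationalize "difference until it is I(0)" is to combine differencing with an ADF test; a rough sketch, assuming a 5% cutoff and statsmodels' adfuller:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

def order_of_integration(y, alpha=0.05, max_d=3):
    """Difference y until an ADF test rejects a unit root (illustrative sketch;
    the 5% cutoff and the cap on differencing are assumptions)."""
    for d in range(max_d + 1):
        if adfuller(y)[1] < alpha:   # p-value of the ADF test
            return d
        y = np.diff(y)
    return max_d

rng = np.random.default_rng(6)
walk = rng.normal(size=500).cumsum()          # I(1) by construction
print(order_of_integration(walk))             # typically 1
```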

Page 18: Impulse Response Functions

One of the most interesting things to do with an ARMA model is to form predictions of the variable given its past: we want to know Et(xt+j).

We can do inference with Vart(xt+j).

The impulse response function is a simple way to do that: follow the path that x follows if it is kicked by a unit shock. It is a characterization of the behavior of our models, and it allows us to start thinking about “causes” and “effects”.
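For a concrete case, take an AR(1) with assumed φ = 0.9 and σ = 1: the j-step forecast is Et(xt+j) = φ^j xt and the forecast variance is Vart(xt+j) = σ2(1 + φ2 + ... + φ2(j-1)). A small sketch verifying both against brute-force simulation:

```python
import numpy as np

phi, sigma, x_t, j = 0.9, 1.0, 2.0, 5

forecast = phi**j * x_t                                   # E_t(x_{t+j})
forecast_var = sigma**2 * sum(phi**(2*i) for i in range(j))  # Var_t(x_{t+j})

# Check against a brute-force simulation of many futures starting from x_t.
rng = np.random.default_rng(8)
paths = np.full(100_000, x_t)
for _ in range(j):
    paths = phi * paths + rng.normal(0.0, sigma, paths.size)

print(forecast, paths.mean())        # both close to 0.9**5 * 2
print(forecast_var, paths.var())     # both close to the analytic forecast variance
```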

Page 19: Impulse Response and MA(∞)

The MA(∞) representation is the same thing as the impulse response function, i.e.

yt = θ0εt + θ1εt-1 + θ2εt-2 + ...

The easiest way to calculate an MA(∞) representation is to simulate the impulse-response function.

The impulse response function is the same as Et(xt+j) − Et−1(xt+j).
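A minimal sketch of that simulation, using an assumed ARMA(1,1) with φ = 0.8 and θ = 0.5: feed in a single unit shock at t = 0 and follow the path; the path traced out is exactly the sequence of MA(∞) weights.

```python
import numpy as np

# Impulse response of an ARMA(1,1): x_t = phi*x_{t-1} + eps_t + theta*eps_{t-1}.
phi, theta, horizon = 0.8, 0.5, 10

eps = np.zeros(horizon)
eps[0] = 1.0                     # the unit "kick"

irf = np.zeros(horizon)
for t in range(horizon):
    ar_part = phi * irf[t-1] if t > 0 else 0.0
    ma_part = theta * eps[t-1] if t > 0 else 0.0
    irf[t] = ar_part + eps[t] + ma_part

print(irf)   # 1.0, phi+theta, phi*(phi+theta), ... : the MA(infinity) weights
```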

Page 20: Causality and Impulse Response

We can either forecast or simulate the effect of a given shock: try to pick a shock time and level to simulate, and try to replicate the observed data.

There is an issue of whether that shock is what really happened: we know a shock happened at time t, and we see whether the observed change matches (more on this next time).

Granger causality implies a correlation between the current value of one variable and the past values of others; it does not necessarily imply that changes in one variable “cause” changes in another.

Use an F-test to jointly test for the significance of the lags on the explanatory variables; this in effect tests for ‘Granger causality’ between these variables.

We can visually see the correlation in impulse response functions.
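As an illustrative sketch (simulated data; the lag length of 2 is assumed), a Granger-causality style F-test can be run by comparing an autoregression of y on its own lags with and without the lags of x:

```python
import numpy as np
import statsmodels.api as sm

# Simulated example where lagged x really does help predict y.
rng = np.random.default_rng(7)
T = 500
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.5 * y[t-1] + 0.4 * x[t-1] + rng.normal()

# Unrestricted: lags of y and lags of x.  Restricted: lags of y only.
Y = y[2:]
lags_y = np.column_stack([y[1:-1], y[:-2]])
lags_x = np.column_stack([x[1:-1], x[:-2]])

unrestricted = sm.OLS(Y, sm.add_constant(np.column_stack([lags_y, lags_x]))).fit()
restricted = sm.OLS(Y, sm.add_constant(lags_y)).fit()

# Joint F-test on the lags of x: rejecting means "x Granger-causes y",
# which is a statement about predictive content, not structural causality.
f_stat, p_value, df_diff = unrestricted.compare_f_test(restricted)
print(f_stat, p_value)
```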

Page 21: [Figure]

Source: Cochrane, QJE (1994)

Page 22: Next Time

Estimating causality in time series
Some additional forecasting material
Testing for breaks
Regression discontinuity / event study