
Regression Models with Time Series Errors

What Are Regression Models with Time Series Errors?

Regression models with time series errors attempt to explain the mean behavior of a response series (yt, t = 1,...,T) by accounting for linear effects of predictors (Xt) using multiple linear regression (MLR). However, the errors (ut), called unconditional disturbances, are a time series rather than white noise, which is a departure from the linear model assumptions. Unlike ARIMA models that include exogenous predictors (ARIMAX models), regression models with time series errors preserve the sensitivity interpretation of the regression coefficients (β) [2].

These models are particularly useful for econometric data. Use these models to:

  • Analyze the effects of a new policy on a market indicator (an intervention model).

  • Forecast population size adjusting for predictor effects, such as expected prevalence of a disease.

  • Study the behavior of a process adjusting for calendar effects. For example, you can analyze traffic volume by adjusting for the effects of major holidays. For details, see [3].

  • Estimate the trend by including time (t) in the model.

  • Forecast total energy consumption accounting for current and past prices of oil and electricity (distributed lag model).
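
For concreteness, the sketch below shows one way to encode these kinds of predictors (a time trend, an intervention dummy, and a distributed lag of a price series) as columns of the predictor matrix X. The sample size, breakpoint, and series are illustrative assumptions, not part of the toolbox documentation.

    % Illustrative predictor matrix for the use cases above (assumed data).
    T = 200;
    t = (1:T)';                           % linear time trend
    policy = [zeros(120,1); ones(80,1)];  % intervention dummy: policy in effect from t = 121
    oilPrice = randn(T,1);                % placeholder for an observed price series
    oilLag1 = lagmatrix(oilPrice,1);      % lagged price for a distributed lag model

    X = [t policy oilPrice oilLag1];      % row t of X is the predictor vector Xt

Rows that lagmatrix pads with NaN at the start of the sample can be trimmed before estimation.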

Use these tools in Econometrics Toolbox™ to:

  • Specify a regression model with ARIMA errors (see regARIMA).

  • Estimate parameters using a specified model, and response and predictor data (see estimate).

  • Simulate responses using a model and predictor data (see simulate).

  • Forecast responses using a model and future predictor data (see forecast).

  • Infer residuals and estimated unconditional disturbances from a model using the model and predictor data (see infer).

  • Filter innovations through a model using the model and predictor data (see filter).

  • Generate impulse responses (see impulse).

  • Compare a regression model with ARIMA errors to an ARIMAX model (see arima).
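
The following sketch strings several of these steps together on simulated data. The model orders, coefficient values, and variable names are assumptions chosen for illustration; see the function reference pages for the full syntaxes.

    % Simulate data from a fully specified regression model with ARMA(1,1) errors.
    rng(1);                                   % for reproducibility
    T = 200;
    X = [randn(T,1) randn(T,1)];              % two synthetic predictor series
    TrueMdl = regARIMA('Intercept',1,'Beta',[0.5; -0.3], ...
        'AR',0.6,'MA',0.2,'Variance',0.1);
    y = simulate(TrueMdl,T,'X',X);

    % Specify a model template, estimate it, and infer residuals and
    % unconditional disturbances.
    Mdl = regARIMA(1,0,1);                    % regression model with ARMA(1,1) errors
    EstMdl = estimate(Mdl,y,'X',X);
    [e,u] = infer(EstMdl,y,'X',X);

    % Forecast 10 periods ahead given future predictor data.
    XF = [randn(10,1) randn(10,1)];
    [yF,yMSE] = forecast(EstMdl,10,'Y0',y,'X0',X,'XF',XF);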

Conventions

A regression model with time series errors has the following form (in lag operator notation):

$$y_t = c + X_t\beta + u_t$$
$$a(L)\,A(L)\,(1-L)^D\,(1-L^s)\,u_t = b(L)\,B(L)\,\varepsilon_t, \tag{1}$$
where

  • t = 1,...,T.

  • yt is the response series.

  • Xt is row t of X, which is the matrix of concatenated predictor data vectors. That is, Xt is observation t of each predictor series.

  • c is the regression model intercept.

  • β is the vector of regression coefficients.

  • ut is the unconditional disturbance series.

  • εt is the innovations series.

  • $L^j y_t = y_{t-j}$.

  • $a(L) = (1 - a_1 L - \ldots - a_p L^p)$, which is the degree p, nonseasonal autoregressive polynomial.

  • $A(L) = (1 - A_1 L - \ldots - A_{p_s} L^{p_s})$, which is the degree $p_s$, seasonal autoregressive polynomial.

  • $(1 - L)^D$, which is the degree D, nonseasonal integration polynomial.

  • $(1 - L^s)$, which is the degree s, seasonal integration polynomial.

  • $b(L) = (1 + b_1 L + \ldots + b_q L^q)$, which is the degree q, nonseasonal moving average polynomial.

  • $B(L) = (1 + B_1 L + \ldots + B_{q_s} L^{q_s})$, which is the degree $q_s$, seasonal moving average polynomial.
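
As a point of reference, these polynomial degrees correspond to the lag specifications of regARIMA roughly as sketched below. The particular lags (p = 2, q = 1, seasonal lag 12) are illustrative assumptions.

    % Assumed orders for illustration: nonseasonal AR(2) and MA(1) error
    % polynomials plus seasonal AR and MA terms at lag 12, and two
    % regression coefficients left as NaN so that estimate fits them.
    Mdl = regARIMA('ARLags',1:2, ...     % a(L), degree p = 2
        'MALags',1, ...                  % b(L), degree q = 1
        'SARLags',12, ...                % A(L), seasonal AR at lag 12
        'SMALags',12, ...                % B(L), seasonal MA at lag 12
        'Beta',[NaN NaN]);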

Following Box and Jenkins methodology, ut is a stationary or unit root nonstationary, regular, linear time series. However, if ut is unit root nonstationary, then you do not have to explicitly difference the series as they recommend in [1]. You can simply specify the seasonal and nonseasonal integration degree using the software. For details, see Create Regression Models with ARIMA Errors.
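
For example, a first nonseasonal difference and a seasonal integration at lag 12 (an assumption appropriate for, say, monthly data) can be specified directly, without differencing the series beforehand:

    % Assumed integration orders for illustration: D = 1 and s = 12.
    Mdl = regARIMA('ARLags',1,'MALags',1,'D',1,'Seasonality',12);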

Another deviation from the Box and Jenkins methodology is that ut does not have a constant term (conditional mean), and therefore its unconditional mean is 0. However, the regression model contains an intercept term, c.

Note

If the unconditional disturbance process is nonstationary (i.e., the nonseasonal or seasonal integration degree is greater than 0), then the regression intercept, c, is not identifiable. For details, see Intercept Identifiability in Regression Models with ARIMA Errors.

The software enforces stability and invertibility of the ARMA process. That is,

$$\psi(L) = \frac{b(L)\,B(L)}{a(L)\,A(L)} = 1 + \psi_1 L + \psi_2 L^2 + \ldots,$$

where the series {ψt} must be absolutely summable. The conditions for {ψt} to be absolutely summable are:

  • a(L) and A(L) are stable (i.e., the eigenvalues of a(L) = 0 and A(L) = 0 lie inside the unit circle).

  • b(L) and B(L) are invertible (i.e., the eigenvalues of b(L) = 0 and B(L) = 0 lie inside the unit circle).
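
One way to check these conditions for candidate polynomials is with the toolbox lag operator utilities; the coefficients below are made-up values for illustration.

    % LagOp({c0,c1,...}) builds the polynomial c0 + c1*L + c2*L^2 + ...
    a = LagOp({1,-0.5,0.3});   % a(L) = 1 - 0.5L + 0.3L^2
    b = LagOp({1,0.4});        % b(L) = 1 + 0.4L
    isStable(a)                % true if a(L) is stable
    isStable(b)                % true if b(L) is invertible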

The software uses maximum likelihood for parameter estimation. You can choose either a Gaussian or Student’s t distribution for the innovations, εt.
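
For example, you can switch the innovation distribution of a model from the Gaussian default to Student's t before calling estimate; the model orders below are an assumption for illustration.

    % Request Student's t innovations; the degrees of freedom default to NaN,
    % so estimate fits them along with the other parameters.
    Mdl = regARIMA(1,0,1);
    Mdl.Distribution = 't';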

The software treats predictors as nonstochastic variables for estimation and inference.

References

[1] Box, G. E. P., G. M. Jenkins, and G. C. Reinsel. Time Series Analysis: Forecasting and Control. 3rd ed. Englewood Cliffs, NJ: Prentice Hall, 1994.

[2] Hyndman, R. J. (2010, October). “The ARIMAX Model Muddle.” Rob J. Hyndman. Retrieved May 4, 2017 from https://robjhyndman.com/hyndsight/arimax/.

[3] Tsay, R. S. “Regression Models with Time Series Errors.” Journal of the American Statistical Association. Vol. 79, Number 385, March 1984, pp. 118–124.
