Objective
Even when autocorrelation is present, the OLS coefficient estimates are unbiased, but they are no longer the minimum-variance linear unbiased estimates of the population coefficients. We now demonstrate the generalized least squares (GLS) method for estimating the regression coefficients with the smallest variance.
GLS Approach
Suppose that the population linear regression model is
y = β0 + β1x1 + … + βkxk + ε
and so for each observation i
yi = β0 + β1xi1 + … + βkxik + εi
Now suppose that all the linear regression assumptions hold, except that there is autocorrelation, i.e. E[εiεi+h] ≠ 0 where h ≠ 0. Let’s assume, in particular, that we have first-order autocorrelation, and so for all i, we can express εi by
εi = ρεi-1 + δi
where ρ is the first-order autocorrelation coefficient, i.e. the correlation coefficient between ε1, ε2, …, εn-1 and ε2, ε3, …, εn, and δi is an error term that satisfies the standard OLS assumptions, namely E[δi] = 0, var(δi) = σδ², a constant, and cov(δi, δj) = 0 for all i ≠ j. Note that since ρ is a correlation coefficient, it follows that -1 ≤ ρ ≤ 1.
We will assume further that -1 < ρ < 1, which means that the sequence ε1, ε2, …, εn has some desirable properties (namely that when viewed as a time series it is stationary).
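In practice ρ is rarely known and must be estimated, for example as the lag-1 sample correlation of the residuals, per the definition above. Here is a minimal numpy sketch; the series `e` and the true value ρ = 0.6 are illustrative, not taken from the text:

```python
import numpy as np

# Illustrative residual series generated with first-order autocorrelation
# (true rho = 0.6 is an assumption for this example)
rng = np.random.default_rng(0)
e = np.zeros(200)
for i in range(1, 200):
    e[i] = 0.6 * e[i - 1] + rng.normal()

# Estimate rho as the correlation between e_2,...,e_n and e_1,...,e_{n-1}
rho_hat = np.corrcoef(e[1:], e[:-1])[0, 1]
```

With |ρ| < 1 the simulated series is stationary, so `rho_hat` should land near the true value for a sample this size.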
Differencing
Now
yi = β0 + β1xi1 + … + βkxik + εi
and so
yi-1 = β0 + β1xi-1,1 + … + βkxi-1,k + εi-1
Multiplying both sides of the second equation by ρ and subtracting it from the first equation yields
yi − ρyi-1 = β0(1 − ρ) + β1(xi1 − ρxi-1,1) + … + βk(xik − ρxi-1,k) + εi − ρεi-1
Note that εi − ρεi-1 = δi, and if we set
yi′ = yi − ρyi-1   β0′ = β0(1 − ρ)   xij′ = xij − ρxi-1,j
for all j > 0, then this equation can be expressed as the generalized difference equation:
yi′ = β0′ + β1xi1′ + … + βkxik′ + δi
This equation satisfies all the OLS assumptions, and so estimates of the parameters β0′, β1, …, βk can be found using the standard OLS approach, provided we know the value of ρ. The intercept of the original model can then be recovered as β0 = β0′/(1 − ρ).
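The generalized difference approach can be sketched in a few lines of numpy. The data below are simulated (one regressor, true intercept 2.0, slope 1.5, and ρ = 0.5 are all assumptions for illustration); in practice ρ would be estimated rather than known:

```python
import numpy as np

# Simulated data: y = 2.0 + 1.5*x + eps, where eps follows AR(1) with rho = 0.5
rng = np.random.default_rng(1)
n = 100
x = rng.normal(size=n)
eps = np.zeros(n)
for i in range(1, n):
    eps[i] = 0.5 * eps[i - 1] + rng.normal(scale=0.3)
y = 2.0 + 1.5 * x + eps

rho = 0.5  # assumed known here; normally estimated from OLS residuals

# Generalized differences: y'_i = y_i - rho*y_{i-1}, x'_i = x_i - rho*x_{i-1}
y_star = y[1:] - rho * y[:-1]
x_star = x[1:] - rho * x[:-1]

# OLS on the transformed data; the intercept column estimates beta0' = beta0*(1 - rho)
X = np.column_stack([np.ones(n - 1), x_star])
coef, *_ = np.linalg.lstsq(X, y_star, rcond=None)
b1 = coef[1]
b0 = coef[0] / (1 - rho)  # recover the original intercept beta0
```

Note that the transformed sample has n − 1 observations, which motivates the Prais-Winsten transformation discussed next.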
Prais-Winsten Transformation
Note that we lose one sample element when we use this differencing approach, since y1 and the x1j have no predecessors. For large samples this is not a problem, but it can be for small samples. We can use the Prais-Winsten transformation to retain the first observation, namely
y1′ = √(1 − ρ²)·y1   x1j′ = √(1 − ρ²)·x1j
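The transformation of the first observation can be sketched as follows; the short series and ρ = 0.5 are illustrative values only:

```python
import numpy as np

rho = 0.5                                  # assumed known for this sketch
y = np.array([3.1, 2.9, 3.4, 3.0])         # illustrative series
x = np.array([1.0, 0.8, 1.2, 0.9])

# Observations 2..n use the generalized difference
y_star = y[1:] - rho * y[:-1]
x_star = x[1:] - rho * x[:-1]

# Prais-Winsten: rescale the first observation by sqrt(1 - rho^2)
# instead of dropping it, so no sample element is lost
c = np.sqrt(1 - rho ** 2)
y_star = np.concatenate([[c * y[0]], y_star])
x_star = np.concatenate([[c * x[0]], x_star])
```

The transformed sample now has the same length as the original, which matters most when n is small.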
Second-order Autocorrelation
The same approach can be used with p-order autocorrelation. For example, for second-order autocorrelation we have
εi = ρ1εi-1 + ρ2εi-2 + δi
Thus
yi = β0 + β1xi1 + … + βkxik + εi
yi-1 = β0 + β1xi-1,1 + … + βkxi-1,k + εi-1
yi-2 = β0 + β1xi-2,1 + … + βkxi-2,k + εi-2
We can multiply the second equation by ρ1 and the third equation by ρ2, and then subtract both of these from the first equation to once again get the equation
yi′ = β0′ + β1xi1′ + … + βkxik′ + δi
But this time
yi′ = yi − ρ1yi-1 − ρ2yi-2   β0′ = β0(1 − ρ1 − ρ2)   xij′ = xij − ρ1xi-1,j − ρ2xi-2,j
As before, this equation satisfies all the OLS assumptions, and so estimates of the parameters β0′, β1, …, βk can be found using the standard OLS approach, provided we know the values of ρ1 and ρ2. Except for the intercept, the resulting coefficients are the coefficients of the original regression model. The intercept of the original model is simply β0′/(1 − ρ1 − ρ2). The same approach is used for the standard errors of the original regression model.
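The second-order case is a direct extension of the first-order sketch; the simulated data and the values ρ1 = 0.4, ρ2 = 0.2 (a stationary AR(2) process) are assumptions for illustration:

```python
import numpy as np

rho1, rho2 = 0.4, 0.2   # assumed known AR(2) coefficients (stationary case)
rng = np.random.default_rng(2)
n = 300
x = rng.normal(size=n)
eps = np.zeros(n)
for i in range(2, n):
    eps[i] = rho1 * eps[i - 1] + rho2 * eps[i - 2] + rng.normal(scale=0.3)
y = 2.0 + 1.5 * x + eps   # true intercept 2.0, slope 1.5 (illustrative)

# Generalized differences: the first two observations have no full history
y_star = y[2:] - rho1 * y[1:-1] - rho2 * y[:-2]
x_star = x[2:] - rho1 * x[1:-1] - rho2 * x[:-2]

# OLS on the transformed data; intercept column estimates beta0*(1 - rho1 - rho2)
X = np.column_stack([np.ones(n - 2), x_star])
coef, *_ = np.linalg.lstsq(X, y_star, rcond=None)
b1 = coef[1]
b0 = coef[0] / (1 - rho1 - rho2)  # recover the original intercept
```

As the text notes, only the intercept needs rescaling; the slope coefficients from the transformed regression estimate the original slopes directly.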