**Introduction**

In the classical linear regression model (CLRM), several assumptions are made that are difficult to fulfil in every dataset or regression model. This article examines one of these assumptions, **"independence of the error terms"**. Specifically, it will critically look at the following aspects.

1. Understanding of the problem of autocorrelation.

2. Causes of autocorrelation

3. Consequences of autocorrelation

## **What is Autocorrelation?**

The term autocorrelation means **"correlation between the error terms"**. In other words, it is the violation of one of the crucial assumptions of OLS: that the correlation between any two disturbance terms must be zero. Symbolically,

\[cov(u_{i},u_{j}|X_{i},X_{j})=0\] Equation (1)

where cov means covariance, and i and j denote two different observations.

Autocorrelation occurs when a disturbance in one time period affects the outcome in the next period. In such a situation, the assumption of independent error terms is violated; symbolically,

\[E(u_{i}u_{j})\neq 0\] Equation (2)
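This assumption can be checked numerically. The following is a minimal sketch (pure NumPy, simulated data, hypothetical helper `lag1_autocorr`): for independent errors the sample correlation between successive disturbances is near zero, while for errors built so that each period depends on the previous one it is clearly nonzero.

```python
import numpy as np

rng = np.random.default_rng(42)

# Independent (white-noise) errors: E(u_i u_j) should be ~0 for i != j.
u = rng.normal(0.0, 1.0, size=10_000)

def lag1_autocorr(x):
    """Sample correlation between x_t and x_{t-1}."""
    return np.corrcoef(x[:-1], x[1:])[0, 1]

print(lag1_autocorr(u))  # close to 0: assumption holds

# Errors where each period depends on the previous one violate
# the assumption: E(u_i u_j) != 0 for neighbouring observations.
v = np.empty_like(u)
v[0] = u[0]
for t in range(1, len(u)):
    v[t] = 0.8 * v[t - 1] + u[t]

print(lag1_autocorr(v))  # close to 0.8: autocorrelated errors
```

The same check is often applied to regression residuals as an informal first look before running a formal test.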

**Different terminologies for correlation between the disturbance terms**

There are different terminologies for this phenomenon. **Autocorrelation** is the correlation between the disturbance terms of the same series (as shown in equation 2), whereas **serial correlation** is the correlation between the disturbance terms of two different series (as shown in equation 3).

\[E(u_{i}v_{i})\neq 0\] Equation (3)

Autocorrelation is primarily an issue of time-series data, but dependence can also arise in cross-sectional data, where it is termed **"spatial autocorrelation"**: correlation across space rather than across time.

**Causes of Autocorrelation**

There can be different causes of autocorrelation.

**Inertia**

Variables that follow the business cycle, for example, show an upward trend in recovery and a downward trend in recession. Thus, interdependence between successive observations is likely to occur in such time-series data.

**Specification Bias**

If an important variable is excluded from the model (known as "omitted-variable bias"), the excluded variable appears in the error term; the error term then follows a systematic pattern, which leads to autocorrelation.

Specification bias from the wrong functional form can also create the problem of autocorrelation.

**Cobweb Phenomenon**

The cobweb phenomenon usually occurs in the agricultural sector, where supply reacts to price changes with a lag of one period. As a result, the disturbance term exhibits a systematic pattern.

**Manipulation of Data**

Manipulation that smooths the raw time-series data often smooths the disturbance terms as well, thereby introducing autocorrelation.

**Data Transformation**

Consider the equation

\[Y_{t}=\beta_{1}+\beta_{2}X_{t}+u_{t}\] Equation (4)

Since this equation holds for every period, taking a lag of one period gives

\[Y_{t-1}=\beta_{1}+\beta_{2}X_{t-1}+u_{t-1}\] Equation (5)

By subtracting equation 5 from 4, the resultant equation will be,

\[\Delta Y_{t}=\beta _{2}\Delta X_{t}+\Delta u_{t}\] Equation (6)

Or \[\Delta Y_{t}=\beta _{2}\Delta X_{t}+v_{t}\] Equation (7)

where \[u_{t}-u_{t-1}=\Delta u_{t}=v_{t}\]

Even if the disturbance term in equation 4 exhibits no autocorrelation, the error term in equation 6 is autocorrelated: \[v_{t}=u_{t}-u_{t-1}\] and \[v_{t-1}=u_{t-1}-u_{t-2}\] share the common term \[u_{t-1}\], so their covariance is \[-\sigma^{2}\] rather than zero.
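This differencing effect is easy to verify numerically. In a minimal sketch (pure NumPy, simulated white-noise errors), first-differencing uncorrelated disturbances produces a lag-one autocorrelation in \[v_{t}\] of about -0.5, as implied by \[cov(v_{t},v_{t-1})=-\sigma^{2}\] and \[var(v_{t})=2\sigma^{2}\]:

```python
import numpy as np

rng = np.random.default_rng(0)

# White-noise disturbances u_t for equation (4): no autocorrelation.
u = rng.normal(0.0, 1.0, size=100_000)

# First-differenced disturbances from equation (6): v_t = u_t - u_{t-1}.
v = np.diff(u)

lag1 = np.corrcoef(v[:-1], v[1:])[0, 1]
print(round(lag1, 2))  # theory: corr(v_t, v_{t-1}) = -sigma^2 / (2 sigma^2) = -0.5
```

So the transformation itself, not the data, creates the autocorrelation in the differenced model.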

**Non-Stationarity**

If the dependent and independent variables are non-stationary, their corresponding error term is also likely to be non-stationary; thus, autocorrelation prevails.

Consider the following equation

\[u_{t}=\rho u_{t-1}+\varepsilon _{t}\] Equation (8)

The stochastic error term satisfies the
following OLS assumptions.

\[E(\varepsilon_{t})=0\]

\[var(\varepsilon_{t})=\sigma^{2}\]

\[cov(\varepsilon_{t},\varepsilon_{t+s})=0,\quad s\neq 0\]

Equation (8) is known as a **Markov first-order autoregressive scheme**, or **first-order autoregressive (AR(1)) scheme**. \[\rho\] is the autocorrelation coefficient and lies between -1 and +1. If the absolute value of \[\rho\] is less than 1, equation (8) is stationary; otherwise it is not. If \[|\rho|<1\], the disturbance term is homoscedastic but correlated.

## Consequences of Autocorrelation

In such a situation, using OLS will lead to the following consequences:

1. Underestimation of the true variance.

2. Overestimation of \[R^{2}\].

3. Underestimation of the variances of the coefficients.

4. The F- and t-tests are no longer valid.

“Thus, in the presence of autocorrelation, one should use GLS rather than OLS, as it provides BLUE estimators.”

**Conclusion**

Autocorrelation is fundamentally the issue of correlation between the error terms. It can arise on its own, or other econometric problems and functional-form choices can lead to it. In the presence of autocorrelation, the OLS estimators are no longer BLUE; thus, GLS is used to obtain efficient and reliable estimates.

**Also Read:**

Remedial Measures to Heteroscedasticity

Heteroscedasticity - Causes, Consequences and Detection

Multicollinearity Causes, Detection, Consequences and Remedial Measures
