# 2.1: Estimation of the Mean


Assume that an appropriate guess for the unknown mean $$\mu$$ of some weakly stationary stochastic process $$(X_t\colon t\in\mathbb{Z})$$ has to be found. The sample mean $$\bar{x}$$, easily computed as the average of $$n$$ observations $$x_1,\ldots,x_n$$ of the process, has been identified as suitable in Section 1.2. To investigate its theoretical properties, one needs to analyze the random variable associated with it, that is,

$\bar{X}_n=\frac 1n(X_1+\ldots+X_n).$
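As a numerical illustration, one may simulate a simple weakly stationary process, say the moving-average recursion $$X_t=\mu+Z_t+0.5Z_{t-1}$$ (an assumed example; all parameter values below are chosen for illustration only), and observe by Monte Carlo that the sample mean fluctuates around $$\mu$$:

```python
import random

random.seed(0)
mu = 5.0  # true process mean (an assumed value for this illustration)

def sample_mean(n):
    """Sample mean of one realization of X_t = mu + Z_t + 0.5 * Z_{t-1}."""
    z = [random.gauss(0, 1) for _ in range(n + 1)]
    x = [mu + z[t] + 0.5 * z[t - 1] for t in range(1, n + 1)]
    return sum(x) / n

# Averaging the estimator over many independent realizations
# approximates E[X_bar_n], which should be close to mu.
means = [sample_mean(200) for _ in range(2000)]
print(sum(means) / len(means))  # close to 5.0
```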

Two facts can be quickly established.

• $$\bar{X}_n$$ is an unbiased estimator for $$\mu$$, since

$E[\bar{X}_n]=E\left[\frac 1n\sum_{t=1}^nX_t\right]=\frac 1n\sum_{t=1}^nE[X_t]=\frac 1n n\mu=\mu.$

This means that "on average", the true but unknown $$\mu$$ is correctly estimated. Notice that there is no difference in the computations between the standard case of independent and identically distributed random variables and the more general weakly stationary process considered here.

• If $$\gamma(n)\to 0$$ as $$n\to\infty$$, then $$\bar{X}_n$$ is a consistent estimator for $$\mu$$, since

\begin{align*}
\mathrm{Var}(\bar{X}_n)&=\mathrm{Cov}\left(\frac 1n\sum_{s=1}^nX_s,\frac 1n\sum_{t=1}^nX_t\right)
=\frac{1}{n^2}\sum_{s=1}^n\sum_{t=1}^n\mathrm{Cov}(X_s,X_t)\\[.2cm]
&=\frac{1}{n^2}\sum_{s-t=-n}^n(n-|s-t|)\gamma(s-t)
=\frac 1n\sum_{h=-n}^n\left(1-\frac{|h|}{n}\right)\gamma(h).
\end{align*}

Now, the quantity on the right-hand side converges to zero as $$n\to\infty$$ because $$\gamma(n)\to 0$$ as $$n\to\infty$$ by assumption. The first equality sign in the latter equation array follows from the fact that $$\mathrm{Var}(X)=\mathrm{Cov}(X,X)$$ for any random variable $$X$$, while the second equality sign uses that the covariance function is linear in both arguments. For the third equality, one can use that $$\mathrm{Cov}(X_s,X_t)=\gamma(s-t)$$ and that each $$\gamma(s-t)$$ appears exactly $$n-|s-t|$$ times in the double summation. Finally, the right-hand side is obtained by replacing $$s-t$$ with $$h$$ and pulling one $$n^{-1}$$ inside the summation.

In the standard case of independent and identically distributed random variables, $$n\mathrm{Var}(\bar{X}_n)=\sigma^2$$ and the condition $$\gamma(n)\to 0$$ is automatically satisfied. In the general case of weakly stationary processes, however, it cannot be omitted.

More can be proved using an appropriate set of assumptions. The results are formulated as a theorem without giving the proofs.

Theorem $$\PageIndex{1}$$

Let $$(X_t\colon t\in\mathbb{Z})$$ be a weakly stationary stochastic process with mean $$\mu$$ and ACVF $$\gamma$$. Then, the following statements hold true as $$n\to\infty$$.

1. If $$\sum_{h=-\infty}^\infty|\gamma(h)|<\infty$$, then

$n\mathrm{Var}(\bar{X}_n)\to \sum_{h=-\infty}^\infty\gamma(h)=\tau^2;$
2. If the process is "close to Gaussianity", then $\sqrt{n}(\bar{X}_n-\mu)\sim AN(0,\tau_n^2), \qquad \tau_n^2=\sum_{h=-n}^n\left(1-\frac{|h|}{n}\right)\gamma(h).$

Here, $$\sim AN(0,\tau_n^2)$$ stands for approximately normally distributed with mean zero and variance $$\tau_n^2$$.
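To make statement 1 concrete, the following sketch evaluates $$\tau_n^2=\sum_{h=-n}^n(1-|h|/n)\gamma(h)$$ for a process with absolutely summable ACVF and compares it with the limit $$\tau^2=\sum_h\gamma(h)$$. As an assumed example, the MA(1) covariances $$\gamma(0)=1+\theta^2$$, $$\gamma(\pm 1)=\theta$$ and $$\gamma(h)=0$$ otherwise are used:

```python
# ACVF of an MA(1) process X_t = Z_t + theta * Z_{t-1} with sigma^2 = 1
# (an illustrative choice; any absolutely summable ACVF would do)
theta = 0.5

def gamma(h):
    h = abs(h)
    if h == 0:
        return 1 + theta**2
    if h == 1:
        return theta
    return 0.0

def tau_n_sq(n):
    # tau_n^2 = sum_{h=-n}^{n} (1 - |h|/n) * gamma(h)
    return sum((1 - abs(h) / n) * gamma(h) for h in range(-n, n + 1))

tau_sq = gamma(0) + 2 * gamma(1)  # limiting long-run variance tau^2 = 2.25
for n in (10, 100, 1000):
    print(n, tau_n_sq(n), tau_sq)  # tau_n^2 approaches tau^2 as n grows
```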

Theorem $$\PageIndex{1}$$ can be utilized to construct confidence intervals for the unknown mean parameter $$\mu$$. To do so, one must, however, estimate the unknown variance parameter $$\tau_n$$. For a large class of stochastic processes, it holds that $$\tau_n^2$$ converges to $$\tau^2$$ as $$n\to\infty$$. Therefore, we can use $$\tau^2$$ as an approximation for $$\tau_n^2$$. Moreover, $$\tau^2$$ can be estimated by

$\hat{\tau}_n^2=\sum_{h=-\sqrt{n}}^{\sqrt{n}}\left(1-\frac{|h|}{n}\right)\hat{\gamma}(h),$

where $$\hat{\gamma}(h)$$ denotes the ACVF estimator defined in (1.2.1). An approximate 95% confidence interval for $$\mu$$ can now be constructed as

$\left(\bar{X}_n-1.96\frac{\hat{\tau}_n}{\sqrt{n}},\bar{X}_n+1.96\frac{\hat{\tau}_n}{\sqrt{n}}\right).$
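A direct implementation of this interval can be sketched as follows; here `acvf_hat` implements the ACVF estimator from (1.2.1), the truncation point $$\sqrt{n}$$ follows the display above, and the simulated data at the end are an assumed illustration:

```python
import math
import random

def acvf_hat(x, h):
    """ACVF estimator (1.2.1): (1/n) * sum of (x_{t+h} - xbar)(x_t - xbar)."""
    n, h = len(x), abs(h)
    xbar = sum(x) / n
    return sum((x[t + h] - xbar) * (x[t] - xbar) for t in range(n - h)) / n

def mean_ci(x):
    """Approximate 95% confidence interval for mu based on the tau_n^2 estimator."""
    n = len(x)
    xbar = sum(x) / n
    H = math.isqrt(n)  # truncate the sum at sqrt(n)
    tau_sq = sum((1 - abs(h) / n) * acvf_hat(x, h) for h in range(-H, H + 1))
    half = 1.96 * math.sqrt(tau_sq) / math.sqrt(n)
    return xbar - half, xbar + half

# Illustration on simulated i.i.d. data with true mean 10
random.seed(1)
x = [10 + random.gauss(0, 2) for _ in range(500)]
lo, hi = mean_ci(x)
print(lo, hi)  # a short interval around the sample mean
```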

Example $$\PageIndex{1}$$: Autoregressive Processes

Let $$(X_t\colon t\in\mathbb{Z})$$ be given by the equations

\begin{equation}\label{eq:2.1.1}
X_t=\phi X_{t-1}+\mu(1-\phi)+Z_t,\qquad t\in\mathbb{Z},
\end{equation}

where $$(Z_t\colon t\in\mathbb{Z})\sim\mathrm{WN}(0,\sigma^2)$$ and $$|\phi|<1$$. It will be shown in Chapter 3 that $$(X_t\colon t\in\mathbb{Z})$$ defines a weakly stationary process. Utilizing the stochastic difference Equation \ref{eq:2.1.1}, both mean and autocovariances can be determined. It holds that $$E[X_t]=\phi E[X_{t-1}]+\mu(1-\phi)$$. Since, by stationarity, $$E[X_{t-1}]$$ can be substituted with $$E[X_t]$$, it follows that

$E[X_t]=\mu,\qquad t\in\mathbb{Z}.$

In the following we shall work with the centered process $$(X_t^c\colon t\in\mathbb{Z})$$ given by $$X_t^c=X_t-\mu$$. Clearly, $$E[X_t^c]=0$$. It follows from the definition that the autocovariances of $$(X_t\colon t\in\mathbb{Z})$$ and $$(X_t^c\colon t\in\mathbb{Z})$$ coincide. Computing first the second moment of $$X_t^c$$ gives

$E[\{X_t^c\}^2]=E\big[(\phi X_{t-1}^c+Z_t)^2\big]=\phi^2E[\{X_{t-1}^c\}^2]+\sigma^2$

and consequently, since $$E[\{X_{t-1}^c\}^2]=E[\{X_t^c\}^2]$$ by weak stationarity of $$(X_t^c\colon t\in\mathbb{Z})$$,

$E[\{X_t^c\}^2]=\frac{\sigma^2}{1-\phi^2},\qquad t\in\mathbb{Z}.$

It becomes apparent from the latter equation why the condition $$|\phi|<1$$ was needed in display (2.1.1). In the next step, the autocovariance function is computed. For $$h>0$$, it holds that

$\gamma(h)=E[X_{t+h}^cX_t^c]=E\big[(\phi X_{t+h-1}^c+Z_{t+h})X_t^c\big]=\phi E[X_{t+h-1}^cX_t^c]=\phi\gamma(h-1)=\phi^{h}\gamma(0)$

after $$h$$ iterations. But since $$\gamma(0)=E[\{X_t^c\}^2]$$, by symmetry of the ACVF, it follows that

$\gamma(h)=\frac{\sigma^2\phi^{|h|}}{1-\phi^2},\qquad h\in\mathbb{Z}.$
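This closed form can be checked by simulation; the following sketch uses assumed parameter values $$\phi=0.6$$ and $$\sigma=1$$, with a burn-in period to remove the influence of the arbitrary starting value:

```python
import random

random.seed(3)
phi, sigma = 0.6, 1.0
burn, n = 500, 200_000

# Simulate the centered AR(1) recursion X^c_t = phi * X^c_{t-1} + Z_t
x, xs = 0.0, []
for t in range(burn + n):
    x = phi * x + random.gauss(0, sigma)
    if t >= burn:
        xs.append(x)

def acvf_emp(h):
    """Empirical autocovariance at lag h >= 0."""
    m = sum(xs) / n
    return sum((xs[t + h] - m) * (xs[t] - m) for t in range(n - h)) / n

for h in range(4):
    theory = sigma**2 * phi**h / (1 - phi**2)  # gamma(h) from the display
    print(h, round(acvf_emp(h), 3), round(theory, 3))
```

The empirical values decay geometrically in $$h$$ at rate $$\phi$$, matching the formula.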

After these theoretical considerations, a 95% (asymptotic) confidence interval for the mean parameter $$\mu$$ can be constructed. To check if Theorem 2.1.1 is applicable here, one needs to check if the autocovariances are absolutely summable:

\begin{align*}
\tau^2&=\sum_{h=-\infty}^\infty\gamma(h)=\frac{\sigma^2}{1-\phi^2}\left(1+2\sum_{h=1}^\infty\phi^h\right)
=\frac{\sigma^2}{1-\phi^2}\left(1+\frac{2}{1-\phi}-2\right)\\[.2cm]
&=\frac{\sigma^2}{1-\phi^2}\,\frac{1}{1-\phi}(1+\phi)=\frac{\sigma^2}{(1-\phi)^2}<\infty.
\end{align*}

Therefore, a 95% confidence interval for $$\mu$$ which is based on the observed values $$x_1,\ldots,x_n$$ is given by

$\left(\bar{x}-1.96\frac{\sigma}{\sqrt{n}(1-\phi)},\;\bar{x}+1.96\frac{\sigma}{\sqrt{n}(1-\phi)}\right).$

Therein, the parameters $$\sigma$$ and $$\phi$$ have to be replaced with appropriate estimators. These will be introduced in Chapter 3.
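Treating $$\phi$$ and $$\sigma$$ as known for the moment (an assumption made only for this numerical sketch; in practice they would be estimated), the interval can be computed as follows:

```python
import math
import random

random.seed(4)
phi, sigma, mu = 0.5, 1.0, 3.0  # assumed true values for this sketch
burn, n = 500, 2000

# Simulate X_t = phi * X_{t-1} + (1 - phi) * mu + Z_t and keep n values
x, obs = mu, []
for t in range(burn + n):
    x = phi * x + (1 - phi) * mu + random.gauss(0, sigma)
    if t >= burn:
        obs.append(x)

xbar = sum(obs) / n
half = 1.96 * sigma / (math.sqrt(n) * (1 - phi))  # uses tau = sigma / (1 - phi)
print(xbar - half, xbar + half)  # asymptotic 95% interval for mu = 3
```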

2.1: Estimation of the Mean is shared under a not declared license and was authored, remixed, and/or curated by Alexander Aue.