# 18.4: Geometric Brownian Motion


## Basic Theory

Geometric Brownian motion, and other stochastic processes constructed from it, are often used to model population growth and financial processes (such as the price of a stock over time) that are subject to random noise.

### Definition

Suppose that \( \bs{Z} = \{Z_t: t \in [0, \infty)\} \) is standard Brownian motion and that \( \mu \in \R \) and \( \sigma \in (0, \infty) \). Let \[ X_t = \exp\left[\left(\mu - \frac{\sigma^2}{2}\right) t + \sigma Z_t\right], \quad t \in [0, \infty) \] The stochastic process \( \bs{X} = \{X_t: t \in [0, \infty)\} \) is geometric Brownian motion with drift parameter \( \mu \) and volatility parameter \( \sigma \).

Note that the stochastic process \[ \left\{\left(\mu - \frac{\sigma^2}{2}\right) t + \sigma Z_t: t \in [0, \infty) \right\} \] is Brownian motion with drift parameter \( \mu - \sigma^2 / 2 \) and scale parameter \( \sigma \), so geometric Brownian motion is simply the exponential of this process. In particular, the process is always positive, one of the reasons that geometric Brownian motion is used to model financial and other processes that cannot be negative. Note also that \( X_0 = 1 \), so the process starts at 1, but we can easily change this. For \( x_0 \in (0, \infty) \), the process \(\{x_0 X_t: t \in [0, \infty)\}\) is geometric Brownian motion starting at \( x_0 \). You may well wonder about the particular combination of parameters \( \mu - \sigma^2 / 2 \) in the definition. The short answer to the question is given in the following theorem:
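The definition translates directly into simulation code: exponentiate a simulated Brownian motion with drift \( \mu - \sigma^2/2 \) and scale \( \sigma \). Here is a minimal sketch in Python, assuming NumPy is available; the function name, seed, and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(seed=3)

def gbm_path(mu, sigma, t_max=1.0, n_steps=1000, x0=1.0):
    """Simulate geometric Brownian motion on [0, t_max] by exponentiating
    a Brownian motion with drift mu - sigma**2/2 and scale sigma."""
    dt = t_max / n_steps
    t = np.linspace(0.0, t_max, n_steps + 1)
    # standard Brownian motion: cumulative sum of N(0, dt) increments
    dz = rng.normal(0.0, np.sqrt(dt), size=n_steps)
    z = np.concatenate([[0.0], np.cumsum(dz)])
    return t, x0 * np.exp((mu - sigma**2 / 2) * t + sigma * z)

t, x = gbm_path(mu=0.1, sigma=0.2)
print(x[0])           # the process starts at x0 = 1
print(bool(np.all(x > 0)))  # and is always positive
```

Note that the path is always positive, and that starting at \( x_0 \ne 1 \) only requires the multiplicative factor `x0`, as described above.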

Geometric Brownian motion \( \bs{X} = \{X_t: t \in [0, \infty)\} \) satisfies the stochastic differential equation \[ d X_t = \mu X_t \, dt + \sigma X_t \, dZ_t \]

Note that the deterministic part of this equation is the standard differential equation for exponential growth or decay, with rate parameter \( \mu \).
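To see the stochastic differential equation at work, one can compare a naive Euler–Maruyama discretization of \( d X_t = \mu X_t \, dt + \sigma X_t \, dZ_t \) with the exact solution driven by the same Brownian increments. A minimal sketch, assuming NumPy; all names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(seed=7)
mu, sigma, t_max, n = 0.1, 0.2, 1.0, 10_000
dt = t_max / n
dz = rng.normal(0.0, np.sqrt(dt), size=n)  # Brownian increments

# Euler-Maruyama discretization of dX = mu X dt + sigma X dZ
x_euler = np.empty(n + 1)
x_euler[0] = 1.0
for k in range(n):
    x_euler[k + 1] = x_euler[k] + mu * x_euler[k] * dt + sigma * x_euler[k] * dz[k]

# exact solution at time t_max, driven by the same increments
z_t = dz.sum()
x_exact = np.exp((mu - sigma**2 / 2) * t_max + sigma * z_t)

print(abs(x_euler[-1] - x_exact))  # small when the time step is fine
```

With \( \sigma = 0 \) the recursion reduces to the familiar Euler scheme for the exponential growth equation \( dX_t = \mu X_t \, dt \).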

Run the simulation of geometric Brownian motion several times in single step mode for various values of the parameters. Note the behavior of the process.

### Distributions

For \( t \in (0, \infty) \), \( X_t \) has the lognormal distribution with parameters \( \left(\mu - \frac{\sigma^2}{2}\right)t \) and \( \sigma \sqrt{t} \). The probability density function \( f_t \) is given by \[ f_t(x) = \frac{1}{\sqrt{2 \pi t} \sigma x} \exp \left(-\frac{\left[\ln(x) - \left(\mu - \sigma^2 / 2\right)t \right]^2}{2 \sigma^2 t} \right), \quad x \in (0, \infty) \]

- \( f_t \) increases and then decreases, with mode at \( x = \exp\left[\left(\mu - \frac{3}{2} \sigma^2\right)t\right]\)
- \( f_t \) is concave upward, then downward, then upward again, with inflection points at \( x = \exp\left[(\mu - 2 \sigma^2) t \pm \frac{1}{2} \sigma \sqrt{\sigma^2 t^2 + 4 t}\right] \)

## Proof

Since the variable \(U_t = \left(\mu - \sigma^2 / 2\right) t + \sigma Z_t\) has the normal distribution with mean \( (\mu - \sigma^2/2)t \) and standard deviation \( \sigma \sqrt{t} \), it follows that \( X_t = \exp(U_t) \) has the lognormal distribution with these parameters. The results for the PDF then follow directly from the corresponding results for the lognormal PDF.

In particular, since \( X_t \) has a lognormal rather than a normal distribution, geometric Brownian motion is not a Gaussian process.
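As a sanity check, the explicit density above can be compared with a standard lognormal implementation, for example SciPy's `lognorm`, which is parameterized by shape \( \sigma \sqrt{t} \) and scale \( e^{(\mu - \sigma^2/2)t} \). A sketch assuming NumPy and SciPy are available; the parameter values are arbitrary.

```python
import numpy as np
from scipy.stats import lognorm

mu, sigma, t = 0.1, 0.2, 2.0
m = (mu - sigma**2 / 2) * t   # log-scale mean
s = sigma * np.sqrt(t)        # log-scale standard deviation

def f_t(x):
    """The density of X_t, written exactly as in the display above."""
    return np.exp(-(np.log(x) - m)**2 / (2 * sigma**2 * t)) / (np.sqrt(2 * np.pi * t) * sigma * x)

x = np.linspace(0.1, 3.0, 50)
print(np.allclose(f_t(x), lognorm.pdf(x, s=s, scale=np.exp(m))))  # True

mode = np.exp((mu - 1.5 * sigma**2) * t)
print(f_t(mode) > f_t(0.99 * mode) and f_t(mode) > f_t(1.01 * mode))  # True
```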

Open the simulation of geometric Brownian motion. Vary the parameters and note the shape of the probability density function of \( X_t \). For various values of the parameters, run the simulation 1000 times and compare the empirical density function to the true probability density function.

For \( t \in (0, \infty) \), the distribution function \( F_t \) of \( X_t \) is given by \[ F_t(x) = \Phi\left[\frac{\ln(x) - (\mu - \sigma^2/2)t}{\sigma \sqrt{t}}\right], \quad x \in (0, \infty) \] where \( \Phi \) is the standard normal distribution function.

## Proof

Again, this follows directly from the CDF of the lognormal distribution.
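The distribution function can also be checked numerically against the density, since \( F_t(x) = \int_0^x f_t(y) \, dy \). A sketch assuming NumPy and SciPy; the parameter values are arbitrary.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

mu, sigma, t = 0.1, 0.2, 2.0
m, s = (mu - sigma**2 / 2) * t, sigma * np.sqrt(t)

def f_t(x):
    # the density of X_t from the previous subsection
    return np.exp(-(np.log(x) - m)**2 / (2 * s**2)) / (np.sqrt(2 * np.pi) * s * x)

def F_t(x):
    # the distribution function, via the standard normal CDF
    return norm.cdf((np.log(x) - m) / s)

area, _ = quad(f_t, 0.0, 1.5)   # numerical integral of the density
print(abs(F_t(1.5) - area))     # agreement up to quadrature error
```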

For \( t \in (0, \infty) \), the quantile function \( F_t^{-1} \) of \( X_t \) is given by \[ F_t^{-1}(p) = \exp\left[(\mu - \sigma^2 / 2)t + \sigma \sqrt{t} \Phi^{-1}(p)\right], \quad p \in (0, 1) \] where \( \Phi^{-1} \) is the standard normal quantile function.

## Proof

This follows directly from the lognormal quantile function.
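In code, the quantile function is the formula above with `norm.ppf` playing the role of \( \Phi^{-1} \), and composing it with the distribution function should recover the identity. A sketch assuming NumPy and SciPy; the parameter values are arbitrary.

```python
import numpy as np
from scipy.stats import norm

mu, sigma, t = 0.1, 0.2, 2.0
m, s = (mu - sigma**2 / 2) * t, sigma * np.sqrt(t)

def F_t(x):
    return norm.cdf((np.log(x) - m) / s)

def F_t_inv(p):
    return np.exp(m + s * norm.ppf(p))

x = np.array([0.5, 1.0, 2.0])
print(np.allclose(F_t_inv(F_t(x)), x))  # True: the quantile function inverts the CDF
print(F_t_inv(0.5))                     # the median is exp[(mu - sigma^2/2) t]
```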

### Moments

For \( n \in \N \) and \( t \in [0, \infty) \), \[ \E\left(X_t^n\right) = \exp\left\{\left[n \mu + \frac{\sigma^2}{2}(n^2 - n)\right] t\right\} \]

## Proof

This follows from the formula for the moments of the lognormal distribution.

In terms of the order of the moment \( n \), the dominant term inside the exponential is \( \sigma^2 n^2 / 2 \). If \( n \gt 1 - 2 \mu / \sigma^2 \) then \( n \mu + \frac{\sigma^2}{2}(n^2 - n) \gt 0 \) so \( \E(X_t^n) \to \infty \) as \( t \to \infty \). The mean and variance follow easily from the general moment result.
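The moment formula is just the lognormal moment \( \E\left(e^{n U_t}\right) = \exp\left(n m + n^2 s^2 / 2\right) \) with \( m = (\mu - \sigma^2/2)t \) and \( s^2 = \sigma^2 t \), and the two expressions can be checked against each other numerically. A sketch assuming NumPy; the parameter values are arbitrary.

```python
import numpy as np

mu, sigma, t = 0.1, 0.2, 1.5
m, s2 = (mu - sigma**2 / 2) * t, sigma**2 * t

for n in range(1, 6):
    lognormal_moment = np.exp(n * m + n**2 * s2 / 2)               # E(exp(n U_t))
    stated_formula = np.exp((n * mu + sigma**2 / 2 * (n**2 - n)) * t)
    assert np.isclose(lognormal_moment, stated_formula)
print("moment formula agrees with the lognormal moments")
```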

For \( t \in [0, \infty) \),

- \( \E(X_t) = e^{\mu t} \)
- \( \var(X_t) = e^{2 \mu t} \left(e^{\sigma^2 t} - 1\right) \)

In particular, note that the mean function \( m(t) = \E(X_t) = e^{\mu t} \) for \( t \in [0, \infty) \) satisfies the deterministic part of the stochastic differential equation above. If \( \mu \gt 0 \) then \( m(t) \to \infty \) as \( t \to \infty \). If \( \mu = 0 \) then \( m(t) = 1 \) for all \( t \in [0, \infty) \). If \( \mu \lt 0 \) then \( m(t) \to 0 \) as \( t \to \infty \).
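The mean and variance formulas are easy to confirm by Monte Carlo, sampling \( X_t \) directly from its lognormal representation. A sketch assuming NumPy; the seed, sample size, and parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(seed=11)
mu, sigma, t, n = 0.05, 0.3, 1.0, 200_000

z = rng.normal(0.0, np.sqrt(t), size=n)            # Z_t ~ N(0, t)
x = np.exp((mu - sigma**2 / 2) * t + sigma * z)    # samples of X_t

mean_true = np.exp(mu * t)
var_true = np.exp(2 * mu * t) * (np.exp(sigma**2 * t) - 1)
print(abs(x.mean() - mean_true))  # both differences are near 0 for large n
print(abs(x.var() - var_true))
```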

Open the simulation of geometric Brownian motion. The graph of the mean function \( m \) is shown as a blue curve in the main graph box. For various values of the parameters, run the simulation 1000 times and note the behavior of the random process in relation to the mean function.

Open the simulation of geometric Brownian motion. Vary the parameters and note the size and location of the mean \( \pm \) standard deviation bar for \( X_t \). For various values of the parameters, run the simulation 1000 times and compare the empirical mean and standard deviation to the true mean and standard deviation.

### Properties

The parameter \( \mu - \sigma^2 / 2 \) determines the asymptotic behavior of geometric Brownian motion.

Asymptotic behavior:

- If \( \mu \gt \sigma^2 / 2 \) then \( X_t \to \infty \) as \( t \to \infty \) with probability 1.
- If \( \mu \lt \sigma^2 / 2 \) then \( X_t \to 0 \) as \( t \to \infty \) with probability 1.
- If \( \mu = \sigma^2 / 2 \) then \( X_t \) has no limit as \( t \to \infty \); with probability 1, \( \limsup_{t \to \infty} X_t = \infty \) and \( \liminf_{t \to \infty} X_t = 0 \).

## Proof

These results follow from the law of the iterated logarithm: asymptotically, the term \( \left(\mu - \sigma^2 / 2\right) t \) dominates the term \( \sigma Z_t \) as \( t \to \infty \).

It's interesting to compare this result with the asymptotic behavior of the mean function, given above, which depends only on the parameter \( \mu \). When the drift parameter is 0, geometric Brownian motion is a martingale.
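The contrast is easy to see numerically. Take \( 0 \lt \mu \lt \sigma^2 / 2 \): the mean \( e^{\mu t} \) blows up, yet almost every path dies out, because the mean is driven by ever rarer, ever larger paths. A sketch assuming NumPy; the seed and parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(seed=5)
mu, sigma, t = 0.05, 0.5, 200.0        # mu > 0 but mu < sigma**2 / 2

# sample X_t for many independent paths directly at time t
z = rng.normal(0.0, np.sqrt(t), size=100_000)
x = np.exp((mu - sigma**2 / 2) * t + sigma * z)

print(np.median(x))          # the typical path is essentially 0
print((x < 0.01).mean())     # most paths are below 0.01 ...
print(np.exp(mu * t))        # ... even though E(X_t) = e^{10} is enormous
```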

If \( \mu = 0 \), geometric Brownian motion \( \bs{X} \) is a martingale with respect to the underlying Brownian motion \( \bs{Z} \).

## Proof from stochastic integrals

This is the simplest proof. When \( \mu = 0 \), \( \bs{X} \) satisfies the stochastic differential equation \( d X_t = \sigma X_t \, dZ_t \) and therefore \[ X_t = 1 + \sigma \int_0^t X_s \, dZ_s, \quad t \ge 0 \] The process defined by a stochastic integral is always a martingale, under the usual integrability conditions on the integrand process (which are satisfied here).

## Direct proof

Let \( \mathscr{F}_t = \sigma\{Z_s: 0 \le s \le t\} \) for \( t \in [0, \infty) \), so that \( \mathfrak{F} = \{\mathscr{F}_t: t \in [0, \infty)\} \) is the natural filtration associated with \( \bs{Z} \). Let \( s, \, t \in [0, \infty) \) with \( s \le t \). We use our usual trick of writing \( Z_t = Z_s + (Z_t - Z_s) \), to take advantage of the stationary and independent increments properties of Brownian motion. Thus, \[ X_t = \exp\left[-\frac{\sigma^2}{2} t + \sigma Z_s + \sigma (Z_t - Z_s)\right] \] Since \( Z_s \) is measurable with respect to \( \mathscr{F}_s \) and \( Z_t - Z_s \) is independent of \( \mathscr{F}_s \) we have \[ \E\left(X_t \mid \mathscr{F}_s\right) = \exp\left(-\frac{\sigma^2}{2} t + \sigma Z_s\right) \E\left\{\exp\left[\sigma(Z_t - Z_s)\right]\right\} \] But \( Z_t - Z_s \) has the normal distribution with mean 0 and variance \( t - s \), so from the formula for the moment generating function of the normal distribution, we have \[ \E\left\{\exp\left[\sigma(Z_t - Z_s)\right]\right\} = \exp\left[\frac{\sigma^2}{2}(t - s)\right] \] Substituting gives \[ \E\left(X_t \mid \mathscr{F}_s\right) = \exp\left(-\frac{\sigma^2}{2} s + \sigma Z_s\right) = X_s \]
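The martingale property can also be checked by simulation: condition on a value of \( Z_s \), average \( X_t \) over the independent increment \( Z_t - Z_s \), and compare with \( X_s \). A sketch assuming NumPy; the seed, the conditioning value 0.7, and the other parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(seed=13)
sigma, s, t = 0.4, 1.0, 3.0            # drift mu = 0 throughout

z_s = 0.7                              # condition on Z_s = 0.7
x_s = np.exp(-sigma**2 / 2 * s + sigma * z_s)

# Z_t = Z_s + increment, with the increment ~ N(0, t - s) independent of F_s
inc = rng.normal(0.0, np.sqrt(t - s), size=400_000)
x_t = np.exp(-sigma**2 / 2 * t + sigma * (z_s + inc))

print(abs(x_t.mean() - x_s))  # near 0: E(X_t | F_s) = X_s
```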