
4.4: Expected Value and Variance of Continuous Random Variables

    Definition \(\PageIndex{1}\)

    If \(X\) is a continuous random variable with density function \(f(x)\), then the expected value (or mean) of \(X\) is given by

    $$\mu = \mu_X = E[X] = \int\limits^{\infty}_{-\infty}\! x\cdot f(x)\, dx.\notag$$

    The formula for the expected value of a continuous random variable is the continuous analogue of the expected value of a discrete random variable (recall Section 3.7): instead of summing over all possible values, we integrate. The interpretation of the expected value as a weighted average of the possible values explains why it is also referred to as the mean of the random variable.
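    For comparison, the discrete version from Section 3.7 replaces the integral with a sum over the possible values:
    $$E[X] = \sum_{x} x\cdot p(x),\notag$$
    where \(p(x)\) is the pmf of \(X\).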

    Example \(\PageIndex{1}\)

    Consider again the context of Example 17, where we defined the continuous random variable \(X\) to denote the time a person waits for an elevator to arrive. The pdf of \(X\) was given by
    $$f(x) = \left\{\begin{array}{l l}
    x, & \text{for}\ 0\leq x\leq 1 \\
    2-x, & \text{for}\ 1< x\leq 2 \\
    0, & \text{otherwise}
    \end{array}\right.\notag$$
    Applying Definition 4.4.1, we compute the expected value of \(X\):
    $$E[X] = \int\limits^1_0\! x\cdot x\, dx + \int\limits^2_1\! x\cdot (2-x)\, dx = \int\limits^1_0\! x^2\, dx + \int\limits^2_1\! (2x - x^2)\, dx = \frac{1}{3} + \frac{2}{3} = 1.\notag$$
    Thus, a person waits 1 minute for the elevator on average. Figure 1 shows the graphical interpretation of the expected value as the center of mass of the pdf.


    Figure 1: Graph of \(f\): The red arrow represents the center of mass, or the expected value of \(X\).
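    As a quick numerical check of this computation (a minimal sketch using SciPy's quad; not part of the original text), we can integrate \(x\cdot f(x)\) directly:

        from scipy.integrate import quad

        def f(x):
            """Triangular pdf of the elevator waiting time X from Example 4.4.1."""
            if 0 <= x <= 1:
                return x
            if 1 < x <= 2:
                return 2 - x
            return 0.0

        # E[X] is the integral of x*f(x) over the support [0, 2];
        # points=[1] warns quad about the kink in the pdf at x = 1.
        expected_value, _ = quad(lambda x: x * f(x), 0, 2, points=[1])
        print(expected_value)  # approximately 1.0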

    If a continuous random variable \(X\) has a normal distribution with parameters \(\mu\) and \(\sigma\), then \(E[X] = \mu\). The normal case explains why the notation \(\mu\) is so often used for the expected value. Again, this fact can be derived using Definition 4.4.1; however, the integral calculation requires a couple of standard tricks.
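    For readers who want the derivation, one standard route (a sketch, writing \(\varphi\) for the standard normal pdf and substituting \(z = (x - \mu)/\sigma\)) is
    $$E[X] = \int\limits^{\infty}_{-\infty}\! x\cdot\frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/2\sigma^2}\, dx = \int\limits^{\infty}_{-\infty}\! (\mu + \sigma z)\,\varphi(z)\, dz = \mu\cdot 1 + \sigma\cdot 0 = \mu,\notag$$
    since \(\varphi\) integrates to 1 and \(z\,\varphi(z)\) is an odd function, so its integral over the real line vanishes.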

    The expected value need not equal any single parameter of the probability distribution; it may instead be a function of the parameters, as the next example with the uniform distribution shows.

    Example \(\PageIndex{2}\)

    Suppose the random variable \(X\) has a uniform distribution on the interval \([a,b]\). Then the pdf of \(X\) is given by
    $$f(x) = \frac{1}{b-a}, \quad\text{for}\ a\leq x\leq b.\notag$$
    Applying Definition 4.4.1, we compute the expected value of \(X\):
    $$E[X] = \int\limits^b_a\! x\cdot\frac{1}{b-a}\, dx = \frac{b^2 - a^2}{2}\cdot\frac{1}{b-a} = \frac{(b-a)(b+a)}{2}\cdot\frac{1}{b-a} = \frac{b+ a}{2}.\notag$$
    Thus, the expected value of the uniform\([a,b]\) distribution is given by the average of the parameters \(a\) and \(b\), or the midpoint of the interval \([a,b]\). This is readily apparent when looking at a graph of the pdf. Since the pdf is constant over \([a,b]\), the center of mass is simply given by the midpoint.
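    The midpoint formula is easy to confirm numerically; here is a minimal sketch (the endpoints 2 and 5 are arbitrary choices for illustration, not values from the text):

        from scipy.integrate import quad

        a, b = 2.0, 5.0  # arbitrary illustrative endpoints
        expected_value, _ = quad(lambda x: x / (b - a), a, b)
        print(expected_value, (a + b) / 2)  # both print 3.5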

     

    We can also compute the expected value of a function of one or more continuous random variables; numerical sketches of both formulas follow below.

    1. If \(X\) is a continuous random variable with pdf \(f(x)\), and \(Y = g(X)\) for some function \(g\), then the expected value of \(Y\) is given by

    $$E[Y] = \int\limits^{\infty}_{-\infty}\! g(x)\cdot f(x)\, dx.\notag$$

     

    2. If \(X_1, \ldots, X_n\) are continuous random variables with joint density function \(f(x_1, \ldots, x_n)\), and \(Y = g(X_1, \ldots, X_n)\), then the expected value of \(Y\) is given by
    $$E[Y] = \int\limits^{\infty}_{-\infty}\!\cdots\int\limits^{\infty}_{-\infty}\! g(x_1, \ldots, x_n)\cdot f(x_1, \ldots, x_n)\, dx_1 \cdots dx_n.\notag$$
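    As a sketch of how both formulas can be applied numerically (assuming SciPy; the choices of \(g\) and the joint density below are our own illustrations, not from the text):

        from scipy.integrate import quad, dblquad

        # Case 1: E[g(X)] with g(x) = x^2 and the elevator pdf of Example 4.4.1.
        def f(x):
            if 0 <= x <= 1:
                return x
            if 1 < x <= 2:
                return 2 - x
            return 0.0

        e_x2, _ = quad(lambda x: x**2 * f(x), 0, 2, points=[1])
        print(e_x2)  # approximately 7/6, i.e., about 1.1667

        # Case 2: E[g(X1, X2)] with g(x1, x2) = x1 + x2 and X1, X2 independent
        # uniform on [0, 1], so the joint density f(x1, x2) = 1 on the unit square.
        # dblquad integrates its first argument over the inner variable (x2 here).
        e_sum, _ = dblquad(lambda x2, x1: (x1 + x2) * 1.0, 0, 1, 0, 1)
        print(e_sum)  # approximately 1.0

    Both values agree with the exact integrals: \(E[X^2] = \int_0^1 x^3\, dx + \int_1^2 x^2(2-x)\, dx = \tfrac{1}{4} + \tfrac{11}{12} = \tfrac{7}{6}\), and \(E[X_1 + X_2] = \tfrac{1}{2} + \tfrac{1}{2} = 1\).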