
2.3: Conditional Probability II


    Example 2.55 highlights why doctors often run more tests regardless of a first positive test result. When a medical condition is rare, a single positive test isn't generally definitive. Consider again the last equation of Example 2.55. Using the tree diagram, we can see that the numerator (the top of the fraction) is equal to the following product:

    \[P(has BC and mammogram^+) = P(mammogram^+ | has BC)P(has BC)\]

    The denominator - the probability the screening was positive - is equal to the sum of probabilities for each positive screening scenario:

    \[P(mammogram^+) = P(mammogram^+ and no BC) + P(mammogram^+ and has BC)\]

    In the example, each of the probabilities on the right side was broken down into a product of a conditional probability and marginal probability using the tree diagram.

    \[P(\underline {mammogram^+}) = P(\underline {mammogram^+} and no BC) + P(\underline {mammogram^+} and has BC)\]

    \[ = P(mammogram^+ | no BC)P(no BC) + P(mammogram^+| has BC)P(has BC)\]

    We can see an application of Bayes' Theorem by substituting the resulting probability expressions into the numerator and denominator of the original conditional probability.

    \[ P (has BC | mammogram^+) = \frac {P (mammogram^+ | has BC) P (has BC)}{P(mammogram^+ | no BC)P(no BC) + P(mammogram^+ | has BC)P(has BC)}\]
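
    To make the arithmetic concrete, here is a minimal sketch of this calculation in Python. The three input probabilities below are illustrative assumptions (the actual values come from Example 2.55, which is not restated on this page):

    ```python
    # Sketch of the mammogram calculation via Bayes' Theorem.
    # The three inputs are assumed, illustrative values -- not figures quoted here.
    p_bc = 0.0035              # assumed P(has BC)
    p_pos_given_bc = 0.89      # assumed P(mammogram+ | has BC)
    p_pos_given_no_bc = 0.07   # assumed P(mammogram+ | no BC)

    # Denominator: total probability of a positive screening
    p_pos = p_pos_given_bc * p_bc + p_pos_given_no_bc * (1 - p_bc)

    # P(has BC | mammogram+)
    p_bc_given_pos = p_pos_given_bc * p_bc / p_pos
    print(round(p_bc_given_pos, 4))  # small, despite the positive test
    ```

    Under these assumed inputs the posterior probability remains small even though the test is fairly accurate, because the condition is rare; this is exactly why a single positive screening is not definitive.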

    Bayes' Theorem: inverting probabilities

    Consider the following conditional probability for variable 1 and variable 2:

    \[ P (outcome A_1 of variable 1| outcome B of variable 2)\]

    Bayes' Theorem states that this conditional probability can be identified as the following fraction:

    \[ \frac {P(B|A_1)P(A_1)}{P(B|A_1)P(A_1) + P(B|A_2)P(A_2) + \dots + P(B|A_k)P(A_k)} \label {2.56}\]

    where A2, A3, ..., and Ak represent all other possible outcomes of the first variable.

    Bayes' Theorem is just a generalization of what we have done using tree diagrams. The numerator identifies the probability of getting both A1 and B. The denominator is the marginal probability of getting B. This bottom component of the fraction appears long and complicated since we have to add up probabilities from all of the different ways to get B. We always completed this step when using tree diagrams. However, we usually did it in a separate step so it didn't seem as complex.

    To apply Bayes' Theorem correctly, there are two preparatory steps:

    (1) First identify the marginal probabilities of each possible outcome of the first variable:

    \[P(A_1), P(A_2), \dots, P(A_k).\]

    (2) Then identify the probability of the outcome B, conditioned on each possible scenario for the first variable:

    \[P(B|A_1), P(B|A_2), \dots, P(B|A_k).\]

    Once each of these probabilities is identified, they can be applied directly within the formula.
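
    As a sketch, the two preparatory steps translate directly into code: collect the marginal probabilities P(A1), ..., P(Ak), collect the conditional probabilities P(B|A1), ..., P(B|Ak), and apply Equation (2.56). The helper function below is hypothetical, written only to illustrate the structure of the formula:

    ```python
    def bayes_posterior(priors, likelihoods, i):
        """Return P(A_i | B) via Bayes' Theorem (Equation 2.56).

        priors      -- marginal probabilities P(A_1), ..., P(A_k)
        likelihoods -- conditional probabilities P(B|A_1), ..., P(B|A_k)
        i           -- zero-based index of the outcome of interest
        """
        # Denominator: marginal probability of B, summed over every scenario
        p_b = sum(p * l for p, l in zip(priors, likelihoods))
        # Numerator: probability of both A_i and B
        return likelihoods[i] * priors[i] / p_b
    ```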

    TIP: Only use Bayes' Theorem when tree diagrams are difficult

    Drawing a tree diagram makes it easier to understand how two variables are connected. Use Bayes' Theorem only when there are so many scenarios that drawing a tree diagram would be complex.

    Exercise 2.57 Jose visits campus every Thursday evening. However, some days the parking garage is full, often due to college events. There are academic events on 35% of evenings, sporting events on 20% of evenings, and no events on 45% of evenings. When there is an academic event, the garage fills up about 25% of the time, and it fills up 70% of evenings with sporting events. On evenings when there are no events, it only fills up about 5% of the time. If Jose comes to campus and finds the garage full, what is the probability that there is a sporting event? Use a tree diagram to solve this problem.40

    Example 2.58 Here we solve the same problem presented in Exercise 2.57, except this time we use Bayes' Theorem.

    The outcome of interest is whether there is a sporting event (call this A1), and the condition is that the lot is full (B). Let A2 represent an academic event and A3 represent there being no event on campus. Then the given probabilities can be written as

    \[P (A_1) = 0.2 \qquad P (A_2) = 0.35 \qquad P (A_3) = 0.45 \]

    \[P (B|A_1) = 0.7 \qquad P (B|A_2) = 0.25 \qquad P (B|A_3) = 0.05\]

    Bayes' Theorem can be used to compute the probability of a sporting event (A1) under the condition that the parking lot is full (B):

    \[P(A_1|B) = \frac {P(B|A_1)P(A_1)}{P(B|A_1)P(A_1) + P(B|A_2)P(A_2) + P(B|A_3)P(A_3)}\]

    \[= \frac {(0.7)(0.2)}{(0.7)(0.2) + (0.25)(0.35) + (0.05)(0.45)}\]

    \[= 0.56 \]

    Based on the information that the garage is full, there is a 56% probability that a sporting event is being held on campus that evening.
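
    As a quick check, the arithmetic in Example 2.58 can be reproduced with a few lines of Python, using only the probabilities stated in the example (either directly, as below, or with the hypothetical helper sketched earlier):

    ```python
    priors = [0.2, 0.35, 0.45]        # P(sporting), P(academic), P(no event)
    likelihoods = [0.7, 0.25, 0.05]   # P(garage full | each scenario)

    # Marginal probability the garage is full: 0.14 + 0.0875 + 0.0225 = 0.25
    p_full = sum(p * l for p, l in zip(priors, likelihoods))

    # P(sporting event | garage full) = 0.14 / 0.25
    p_sport_given_full = likelihoods[0] * priors[0] / p_full
    print(round(p_sport_given_full, 2))  # 0.56
    ```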

    Exercise 2.59 Use the information in the previous exercise and example to verify the probability that there is an academic event conditioned on the parking lot being full is 0.35.41

    40The tree diagram, with three primary branches, is shown below. Next, we identify two probabilities from the tree diagram. (1) The probability that there is a sporting event and the garage is full: 0.14. (2) The probability the garage is full: \(0.0875 + 0.14 + 0.0225 = 0.25\). Then the solution is the ratio of these probabilities: \(\frac {0.14}{0.25} = 0.56\). If the garage is full, there is a 56% probability that there is a sporting event.

    [Tree diagram: three primary branches for academic event (0.35), sporting event (0.20), and no event (0.45), each split by whether the garage fills up.]

    41Short answer: \[P(A_2|B) = \frac {P(B|A_2)P(A_2)}{P(B|A_1)P(A_1) + P(B|A_2)P(A_2) + P(B|A_3)P(A_3)}\]

    \[= \frac {(0.25)(0.35)}{(0.7)(0.2) + (0.25)(0.35) + (0.05)(0.45)}\]

    \[= 0.35\]


    Exercise 2.60 In Exercises 2.57 and 2.59, you found that if the parking lot is full, the probability there is a sporting event is 0.56 and the probability there is an academic event is 0.35. Using this information, compute P(no event | the lot is full).42

    The last several exercises offered a way to update our belief about whether there is a sporting event, academic event, or no event going on at the school based on the information that the parking lot was full. This strategy of updating beliefs using Bayes' Theorem is actually the foundation of an entire branch of statistics called Bayesian statistics. While Bayesian statistics is very important and useful, we will not have time to cover much more of it in this book.


    This page titled 2.3: Conditional Probability II is shared under a CC BY-SA 3.0 license and was authored, remixed, and/or curated by David Diez, Christopher Barr, & Mine Çetinkaya-Rundel via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.