
4.2: Theoretical Probability


    It is not always feasible to conduct an experiment over and over again, so it would be better to be able to find the probabilities without conducting the experiment. These probabilities are called Theoretical Probabilities.

    To compute theoretical probabilities, you need one assumption: all of the outcomes in the sample space must be equally likely. This means that every outcome of the experiment has the same chance of happening.

    Example \(\PageIndex{1}\) Equally likely outcomes

    Which of the following experiments have equally likely outcomes?

    1. Rolling a fair die.
    2. Flipping a coin that is weighted so one side comes up more often than the other.
    3. Pulling a ball out of a can containing 6 red balls and 8 green balls, where all balls are the same size.
    4. Picking a card from a deck.
    5. Rolling a die to see if it is fair.

    Solution

    1. Since the die is fair, every side of the die has the same chance of coming up. The outcomes are the different sides, so each outcome is equally likely.
    2. Since the coin is weighted, one side is more likely to come up than the other side. The outcomes are the different sides, so each outcome is not equally likely.
    3. Since each ball is the same size, each ball has the same chance of being chosen. The outcomes of this experiment are the individual balls, so each outcome is equally likely. Do not conclude that because pulling a red ball is less likely than pulling a green ball, the outcomes are not equally likely. The outcomes are the individual balls, and they are equally likely.
    4. If you assume that the deck is fair, then each card has the same chance of being chosen. Thus the outcomes are equally likely outcomes. You do have to make this assumption. For many of the experiments you will do, you do have to make this kind of assumption.
    5. In this case you are not sure the die is fair. The only way to determine if it is fair is to actually conduct the experiment, since you don’t know if the outcomes are equally likely. If the experimental probabilities are fairly close to the theoretical probabilities, then the die is fair.

    If the outcomes are not equally likely, then you must use experimental probabilities. If the outcomes are equally likely, then you can use theoretical probabilities.

    Definition \(\PageIndex{1}\): Theoretical Probabilities

    If the outcomes of an experiment are equally likely, then the probability of event A happening is

    \(P(A)=\dfrac{\# \text { of outcomes in event space }}{\# \text { of outcomes in sample space }}\)
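
    If you want to see this formula in computational form, here is a minimal Python sketch (an illustration, not part of the original text; the function name is ours) that counts the outcomes in the event space and in the sample space.

```python
from fractions import Fraction

def theoretical_probability(event_space, sample_space):
    """P(A) = (# of outcomes in the event space) / (# of outcomes in the sample space).
    Valid only when all outcomes in the sample space are equally likely."""
    return Fraction(len(event_space), len(sample_space))

# Rolling a fair die: P(even number) = 3/6
sample_space = {1, 2, 3, 4, 5, 6}
event_space = {2, 4, 6}
print(theoretical_probability(event_space, sample_space))   # 1/2 (Fraction reduces 3/6 automatically)
```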

    Example \(\PageIndex{2}\) calculating theoretical probabilities

    Suppose you conduct an experiment where you flip a fair coin twice.

    1. What is the sample space?
    2. What is the probability of getting exactly one head?
    3. What is the probability of getting at least one head?
    4. What is the probability of getting a head and a tail?
    5. What is the probability of getting a head or a tail?
    6. What is the probability of getting a foot?
    7. What is the probability of each outcome? What is the sum of these probabilities?

    Solution

    a. There are several different sample spaces you could use. One is SS = {0, 1, 2}, where you count the number of heads. However, these outcomes are not equally likely: there are two ways to get exactly one head (a head on the first flip and a tail on the second, or a tail on the first flip and a head on the second), but only one way to get each of the other outcomes. Instead, it is better to list what can happen on each flip. Let H = head and T = tail.

    SS={HH, HT, TH, TT}

    b. Let A = getting exactly one head. The event space is A = {HT, TH}. So

    \(P(A)=\dfrac{2}{4} \text { or } \dfrac{1}{2}\)

    It may not be advantageous to reduce the fractions to lowest terms, since it is easier to compare fractions if they have the same denominator.

    c. Let B = getting at least one head. At least one head means get one or more. The event space is B = {HT, TH, HH} and

    \(P(B)=\dfrac{3}{4}\)

    Since P(B) is greater than P(A), event B is more likely to happen than event A.

    d. Let C = getting a head and a tail = {HT, TH} and

    \(P(C)=\dfrac{2}{4}\)

    This is the same event space as event A, but it is a different event. Sometimes two different events can give the same event space.

    e. Let D = getting a head or a tail. Since or means one or the other or both and it doesn’t specify the number of heads or tails, then D = {HH, HT, TH, TT} and

    \(P(D)=\dfrac{4}{4}=1\)

    f. Let E = getting a foot. Since you can’t get a foot, E = {} or the empty set and

    \(P(E)=\dfrac{0}{4}=0\)

    g. \(P(H H)=P(H T)=P(T H)=P(T T)=\dfrac{1}{4}\). If you add all of these probabilities together you get 1.
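
    As a quick check, the following Python sketch (an illustration, not part of the original example; the helper names are ours) enumerates the sample space for two flips and recomputes a few of these probabilities. Note that Python's Fraction reduces to lowest terms automatically, so 2/4 is displayed as 1/2.

```python
from itertools import product
from fractions import Fraction

# Sample space for flipping a fair coin twice: HH, HT, TH, TT
sample_space = [''.join(flips) for flips in product('HT', repeat=2)]

def prob(event):
    """Theoretical probability when all outcomes are equally likely."""
    return Fraction(len(event), len(sample_space))

exactly_one_head = [o for o in sample_space if o.count('H') == 1]   # {HT, TH}
at_least_one_head = [o for o in sample_space if o.count('H') >= 1]  # {HT, TH, HH}

print(sample_space)              # ['HH', 'HT', 'TH', 'TT']
print(prob(exactly_one_head))    # 1/2 (i.e. 2/4)
print(prob(at_least_one_head))   # 3/4
```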

    This example illustrated some important concepts. They are summarized below:

    Probability Properties

    1. \(0 \leq P(\text { event }) \leq 1\)
    2. If P(event) = 1, then the event will definitely happen, and it is called the certain event.
    3. If P(event) = 0, then the event cannot happen, and it is called the impossible event.
    4. \(\sum P(\text { outcome })=1\), where the sum is taken over all outcomes in the sample space.

    Example \(\PageIndex{3}\) calculating theoretical probabilities

    Suppose you conduct an experiment where you pull a card from a standard deck.

    1. What is the sample space?
    2. What is the probability of getting a Spade?
    3. What is the probability of getting a Jack?
    4. What is the probability of getting an Ace?
    5. What is the probability of not getting an Ace?
    6. What is the probability of getting a Spade and an Ace?
    7. What is the probability of getting a Spade or an Ace?
    8. What is the probability of getting a Jack and an Ace?
    9. What is the probability of getting a Jack or an Ace?

    Solution

    a. SS = {2S, 3S, 4S, 5S, 6S, 7S, 8S, 9S, 10S, JS, QS, KS, AS, 2C, 3C, 4C, 5C, 6C, 7C, 8C, 9C, 10C, JC, QC, KC, AC, 2D, 3D, 4D, 5D, 6D, 7D, 8D, 9D, 10D, JD, QD, KD, AD, 2H, 3H, 4H, 5H, 6H, 7H, 8H, 9H, 10H, JH, QH, KH, AH}

    b. Let A = getting a spade = {2S, 3S, 4S, 5S, 6S, 7S, 8S, 9S, 10S, JS, QS, KS, AS} so

    \(P(A)=\dfrac{13}{52}\)

    c. Let B = getting a Jack = {JS, JC, JH, JD} so

    \(P(B)=\dfrac{4}{52}\)

    d. Let C = getting an Ace = {AS, AC, AH, AD} so

    \(P(C)=\dfrac{4}{52}\)

    e. Let D = not getting an Ace = {2S, 3S, 4S, 5S, 6S, 7S, 8S, 9S, 10S, JS, QS, KS, 2C, 3C, 4C, 5C, 6C, 7C, 8C, 9C, 10C, JC, QC, KC, 2D, 3D, 4D, 5D, 6D, 7D, 8D, 9D, 10D, JD, QD, KD, 2H, 3H, 4H, 5H, 6H, 7H, 8H, 9H, 10H, JH, QH, KH} so

    \(P(D)=\dfrac{48}{52}\)

    Notice that \(P(D)+P(C)=\dfrac{48}{52}+\dfrac{4}{52}=1\), so you could have found the probability of D by subtracting the probability of C from 1: \(P(D)=1-P(C)=1-\dfrac{4}{52}=\dfrac{48}{52}\).

    f. Let E = getting a Spade and an Ace = {AS} so

    \(P(E)=\dfrac{1}{52}\)

    g. Let F = getting a Spade or an Ace = {2S, 3S, 4S, 5S, 6S, 7S, 8S, 9S, 10S, JS, QS, KS, AS, AC, AD, AH} so

    \(P(F)=\dfrac{16}{52}\)

    h. Let G = getting a Jack and an Ace = { } since you can’t do that with one card. So

    \(P(G)=0\)

    i. Let H = getting a Jack or an Ace = {JS, JC, JD, JH, AS, AC, AD, AH} so

    \(P(H)=\dfrac{8}{52}\)
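
    To verify these counts without writing out the deck by hand, here is a short Python sketch (an illustration, not part of the original example; the variable names are ours) that builds the 52-card sample space using the same rank-and-suit abbreviations.

```python
from fractions import Fraction

ranks = ['2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K', 'A']
suits = ['S', 'C', 'D', 'H']
deck = [r + s for s in suits for r in ranks]   # 52 cards, e.g. 'AS', '10H'

def prob(event):
    return Fraction(len(event), len(deck))

spade = [c for c in deck if c.endswith('S')]
ace = [c for c in deck if c.startswith('A')]
spade_or_ace = [c for c in deck if c.endswith('S') or c.startswith('A')]
jack_or_ace = [c for c in deck if c.startswith('J') or c.startswith('A')]

print(prob(spade))         # 1/4, i.e. 13/52
print(prob(ace))           # 1/13, i.e. 4/52
print(prob(spade_or_ace))  # 4/13, i.e. 16/52
print(prob(jack_or_ace))   # 2/13, i.e. 8/52
```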

    Example \(\PageIndex{4}\) calculating theoretical probabilities

    Suppose you have an iPod Shuffle with the following songs on it: 5 Rolling Stones songs, 7 Beatles songs, 9 Bob Dylan songs, 4 Faith Hill songs, 2 Taylor Swift songs, 7 U2 songs, 4 Mariah Carey songs, 7 Bob Marley songs, 6 Bunny Wailer songs, 7 Elton John songs, 5 Led Zeppelin songs, and 4 Dave Matthews Band songs. The different genres that you have are rock from the 60s, which includes Rolling Stones, Beatles, and Bob Dylan; country, which includes Faith Hill and Taylor Swift; rock of the 90s, which includes U2 and Mariah Carey; Reggae, which includes Bob Marley and Bunny Wailer; rock of the 70s, which includes Elton John and Led Zeppelin; and bluegrass/rock, which includes Dave Matthews Band.

    The way an iPod Shuffle works is that it randomly picks the next song, so you have no idea what the next song will be. Now you would like to calculate the probability that you will hear the type of music or the artist that you are interested in. The sample space is too large to write out, but you can work from the number of songs in each group and the total number of songs. The total number of songs you have is 67.

    1. What is the probability that you will hear a Faith Hill song?
    2. What is the probability that you will hear a Bunny Wailer song?
    3. What is the probability that you will hear a song from the 60s?
    4. What is the probability that you will hear a Reggae song?
    5. What is the probability that you will hear a song from the 90s or a bluegrass/rock song?
    6. What is the probability that you will hear an Elton John or a Taylor Swift song?
    7. What is the probability that you will hear a country song or a U2 song?

    Solution

    a. There are 4 Faith Hill songs out of the 67 songs, so

    \(P(\text { Faith Hill song })=\dfrac{4}{67}\)

    b. There are 6 Bunny Wailer songs, so

    \(P(\text { Bunny Wailer })=\dfrac{6}{67}\)

    c. There are 5, 7, and 9 songs that are classified as rock from the 60s, which is 21 total, so

    \(P(\text { rock from the } 60 \mathrm{s})=\dfrac{21}{67}\)

    d. There are 6 and 7 songs that are classified as Reggae, which is 13 total, so

    \(P(\text { Reggae })=\dfrac{13}{67}\)

    e. There are 7 U2 songs and 4 Mariah Carey songs from the 90s, plus 4 bluegrass/rock songs, for a total of 15, so

    \(P(\text { rock from the } 90 \text { s or bluegrass/rock })=\dfrac{15}{67}\)

    f. There are 7 Elton John songs and 2 Taylor Swift songs, for a total of 9, so

    \(P(\text { Elton John or Taylor Swift song })=\dfrac{9}{67}\)

    g. There are 6 country songs and 7 U2 songs, for a total of 13, so

    \(P(\text { country or } \mathrm{U} 2 \text { song })=\dfrac{13}{67}\)

    Of course you can do any other combinations you would like.
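
    If you would like to compute other combinations quickly, the Python sketch below (an illustration, not part of the original example; the dictionary and function names are ours) stores the song counts and adds up whichever artists you ask about.

```python
from fractions import Fraction

# Number of songs by each artist, taken from the example
songs = {'Rolling Stones': 5, 'Beatles': 7, 'Bob Dylan': 9, 'Faith Hill': 4,
         'Taylor Swift': 2, 'U2': 7, 'Mariah Carey': 4, 'Bob Marley': 7,
         'Bunny Wailer': 6, 'Elton John': 7, 'Led Zeppelin': 5,
         'Dave Matthews Band': 4}
total = sum(songs.values())   # 67

def prob(*artists):
    """P(hearing a song by any one of the listed artists)."""
    return Fraction(sum(songs[a] for a in artists), total)

print(prob('Faith Hill'))                              # 4/67
print(prob('Rolling Stones', 'Beatles', 'Bob Dylan'))  # 21/67, rock from the 60s
print(prob('Faith Hill', 'Taylor Swift', 'U2'))        # 13/67, country or U2
```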

    Notice in Example \(\PageIndex{3}\) part e, it was mentioned that the probability of event D plus the probability of event C was 1. This is because these two events have no outcomes in common, and together they make up the entire sample space. Events that have this property are called complementary events.

    Definition \(\PageIndex{2}\): complementary events

    If two events are complementary, then to find the probability of one, subtract the probability of the other from one. The notation used for the complement of A is "not A" or \(A^{c}\).

    \(P(A)+P\left(A^{c}\right)=1, \text { or } P(A)=1-P\left(A^{c}\right)\)

    Example \(\PageIndex{5}\) complementary events

    1. Suppose you know that the probability of it raining today is 0.45. What is the probability of it not raining?
    2. Suppose you know the probability of not getting the flu is 0.24. What is the probability of getting the flu?
    3. In an experiment of picking a card from a deck, what is the probability of not getting a card that is a Queen?

    Solution

    a. Since not raining is the complement of raining, then

    \(P(\text { not raining })=1-P(\text { raining })=1-0.45=0.55\)

    b. Since getting the flu is the complement of not getting the flu, then

    \(P(\text { getting the flu })=1-P(\text { not getting the flu })=1-0.24=0.76\)

    c. You could do this problem by listing all the ways to not get a queen, but that set is fairly large. One advantage of the complement is that it reduces the workload. You use the complement in many situations to make the work shorter and easier. In this case it is easier to list all the ways to get a Queen, find the probability of the Queen, and then subtract from one. Queen = {QS, QC, QD, QH} so

    \(P(\text { Queen })=\dfrac{4}{52}\) and

    \(P(\text { not Queen })=1-P(\text { Queen })=1-\dfrac{4}{52}=\dfrac{48}{52}\)

    The complement is useful when you are trying to find the probability of an event that involves the words at least or at most. As an example of an at least event, suppose you want to find the probability of making at least $50,000 when you graduate from college. That means you want the probability of your salary being greater than or equal to $50,000. As an example of an at most event, suppose you want to find the probability of rolling a die and getting at most a 4. That means that you want to get less than or equal to a 4 on the die. The reason to use the complement is that sometimes it is easier to find the probability of the complement and then subtract from 1. Example \(\PageIndex{6}\) demonstrates how to do this.

    Example \(\PageIndex{6}\) using the complement to find probabilities

    1. In an experiment of rolling a fair die one time, find the probability of rolling at most a 4 on the die.
    2. In an experiment of pulling a card from a fair deck, find the probability of pulling at least a 5 (ace is a high card in this example).

    Solution

    a. The sample space for this experiment is {1, 2, 3, 4, 5, 6}. You want the event of getting at most a 4, which is the same as thinking of getting 4 or less. The event space is {1, 2, 3, 4}. The probability is

    \(P(\text { at most } 4)=\dfrac{4}{6}\)

    Or you could have used the complement. The complement of rolling at most a 4 is rolling a number bigger than 4. The event space for the complement is {5, 6}. The probability of the complement is \(\dfrac{2}{6}\). The probability of at most 4 would be

    \(P(\text { at most } 4)=1-P(\text { more than } 4)=1-\dfrac{2}{6}=\dfrac{4}{6}\)

    Notice you get the same answer, but the event space was easier to write out. In this example the complement probably wasn't that useful, but in the future there will be events where it is much easier to use the complement.

    b. The sample space for this experiment is

    SS = {2S, 3S, 4S, 5S, 6S, 7S, 8S, 9S, 10S, JS, QS, KS, AS, 2C, 3C, 4C, 5C, 6C, 7C, 8C, 9C, 10C, JC, QC, KC, AC, 2D, 3D, 4D, 5D, 6D, 7D, 8D, 9D, 10D, JD, QD, KD, AD, 2H, 3H, 4H, 5H, 6H, 7H, 8H, 9H, 10H, JH, QH, KH, AH}

    Pulling a card that is at least a 5 would involve listing all of the cards that are a 5 or more. It would be much easier to list the outcomes that make up the complement. The complement of at least a 5 is less than a 5. That would be the event of 4 or less. The event space for the complement would be {2S, 3S, 4S, 2C, 3C, 4C, 2D, 3D, 4D, 2H, 3H, 4H}. The probability of the complement would be \(\dfrac{12}{52}\). The probability of at least a 5 would be

    \(P(\text { at least a } 5)=1-P(4 \text { or less })=1-\dfrac{12}{52}=\dfrac{40}{52}\)
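
    Here is a short Python sketch (an illustration, not part of the original example; the variable names are ours) that checks this complement calculation by counting the cards that are 4 or less.

```python
from fractions import Fraction

ranks = ['2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K', 'A']
suits = ['S', 'C', 'D', 'H']
deck = [(r, s) for s in suits for r in ranks]          # 52 equally likely outcomes

# The complement of "at least a 5" is "4 or less" (ranks 2, 3, and 4)
four_or_less = [c for c in deck if c[0] in ('2', '3', '4')]
p_complement = Fraction(len(four_or_less), len(deck))  # 12/52
p_at_least_5 = 1 - p_complement                        # 40/52

print(p_at_least_5)   # 10/13, i.e. 40/52 in lowest terms
```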

    Another concept was shown in Example \(\PageIndex{3}\) parts g and i. The problems were looking for the probability of one event or another. In part g, it was looking for the probability of getting a Spade or an Ace. That was equal to \(\dfrac{16}{52}\). In part i, it was looking for the probability of getting a Jack or an Ace. That was equal to \(\dfrac{8}{52}\). If you look back at parts b, c, and d, you might notice the following result:

    \(P(\text { Jack })+P(\text { Ace })=P(\text { Jack or Ace }) \text { but } P(\text { Spade })+P(\text { Ace }) \neq P(\text { Spade or } \text { Ace })\)

    Why does adding two individual probabilities together work in one situation to give the probability of one or another event and not give the correct probability in the other?

    The reason this is true in the case of the Jack and the Ace is that these two events cannot happen together. There is no overlap between the two events, and in fact \(P(\text { Jack and Ace })=0\). However, in the case of the Spade and Ace, they can happen together. There is overlap, namely the ace of spades, and \(P(\text { Spade and Ace }) \neq 0\).

    When two events cannot happen at the same time, they are called mutually exclusive. In the above situation, the events Jack and Ace are mutually exclusive, while the events Spade and Ace are not mutually exclusive.

    Addition Rules:

    If two events A and B are mutually exclusive, then

    \(P(A \text { or } B)=P(A)+P(B) \text { and } P(A \text { and } B)=0\)

    If two events A and B are not mutually exclusive, then

    \(P(A \text { or } B)=P(A)+P(B)-P(A \text { and } B)\)
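
    The following Python sketch (an illustration, not part of the original text; the event names are ours) checks the second addition rule on a single die roll by comparing the probability of the union with the right-hand side of the formula.

```python
from fractions import Fraction

sample_space = set(range(1, 7))        # one roll of a fair die
A = {1, 2, 3}                          # rolling 3 or less
B = {2, 4, 6}                          # rolling an even number (not mutually exclusive with A)

def prob(event):
    return Fraction(len(event), len(sample_space))

lhs = prob(A | B)                      # P(A or B), counted directly from the union
rhs = prob(A) + prob(B) - prob(A & B)  # addition rule for events that overlap

print(lhs, rhs, lhs == rhs)            # 5/6 5/6 True
```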

    Example \(\PageIndex{7}\) using addition rules

    Suppose your experiment is to roll two fair dice.

    1. What is the sample space?
    2. What is the probability of getting a sum of 5?
    3. What is the probability of getting the first die a 2?
    4. What is the probability of getting a sum of 7?
    5. What is the probability of getting a sum of 5 and the first die a 2?
    6. What is the probability of getting a sum of 5 or the first die a 2?
    7. What is the probability of getting a sum of 5 and sum of 7?
    8. What is the probability of getting a sum of 5 or sum of 7?

    Solution

    a. As with the other examples, you need to come up with a sample space that has equally likely outcomes. One sample space is to list the sums possible on each roll: SS = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}. However, there are more ways to get a sum of 7 than there are to get a sum of 2, so these outcomes are not equally likely. Another thought is to list the possibilities on each die. For example, the first die could be a 1, and the second die could be any number between 1 and 6; if the second die is also a 1, the outcome is written (1,1). Similarly, you could get (1,2), (1,3), (1,4), (1,5), or (1,6). The first die could instead be a 2, 3, 4, 5, or 6. Putting this all together, you get the sample space:

    \(\begin{array}{r}{\mathrm{SS}=\{(1,1),(1,2),(1,3),(1,4),(1,5),(1,6)} \\ {(2,1),(2,2),(2,3),(2,4),(2,5),(2,6)} \\ {(3,1),(3,2),(3,3),(3,4),(3,5),(3,6)} \\ {(4,1),(4,2),(4,3),(4,4),(4,5),(4,6)} \\ {(5,1),(5,2),(5,3),(5,4),(5,5),(5,6)} \\ {(6,1),(6,2),(6,3),(6,4),(6,5),(6,6) \}}\end{array}\)

    Notice that (2,3) is different from (3,2), since the order in which the dice land matters and you can tell the difference between these two outcomes. You don't need any of the doubles twice, since these are not distinguishable from each other in either order. This will always be the sample space for rolling two dice.

    b. Let A = getting a sum of 5 = {(4,1), (3,2), (2,3), (1,4)} so

    \(P(A)=\dfrac{4}{36}\)

    c. Let B = getting first die a 2 = {(2,1), (2,2), (2,3), (2,4), (2,5), (2,6)} so

    \(P(B)=\dfrac{6}{36}\)

    d. Let C = getting a sum of 7 = {(6,1), (5,2), (4,3), (3,4), (2,5), (1,6)} so

    \(P(C)=\dfrac{6}{36}\)

    e. This is the event A and B, which contains only the outcome (2,3), so

    \(P(A \text { and } B)=\dfrac{1}{36}\)

    f. Notice from part e, that these two events are not mutually exclusive, so

    \(P(A \text { or } B)=P(A)+P(B)-P(A \text { and } B)\)

    \(=\dfrac{4}{36}+\dfrac{6}{36}-\dfrac{1}{36}\)

    \(=\dfrac{9}{36}\)

    g. These are the events A and C, which have no outcomes in common. Thus A and C = { } so

    \(P(A \text { and } C)=0\)

    h. From part g, these two events are mutually exclusive, so

    \(P(A \text { or } C)=P(A)+P(C)\)

    \(=\dfrac{4}{36}+\dfrac{6}{36}\)

    \(=\dfrac{10}{36}\)
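
    As a check on parts b through f, the Python sketch below (an illustration, not part of the original example; the event names are ours) enumerates all 36 ordered pairs and applies the addition rule.

```python
from itertools import product
from fractions import Fraction

dice = list(product(range(1, 7), repeat=2))   # 36 equally likely ordered pairs

def prob(event):
    return Fraction(len(event), len(dice))

A = [d for d in dice if sum(d) == 5]                      # sum of 5
B = [d for d in dice if d[0] == 2]                        # first die is a 2
A_and_B = [d for d in dice if sum(d) == 5 and d[0] == 2]  # {(2,3)}

print(prob(A), prob(B), prob(A_and_B))      # 1/9 1/6 1/36
print(prob(A) + prob(B) - prob(A_and_B))    # 1/4, i.e. 9/36, which is P(A or B)
```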

    Odds

    Many people like to talk about the odds of something happening or not happening. Mathematicians, statisticians, and scientists prefer to deal with probabilities since odds are difficult to work with, but gamblers prefer to work in odds for figuring out how much they are paid if they win.

    Definition \(\PageIndex{3}\)

    The actual odds against event A occurring are the ratio \(P\left(A^{c}\right) / P(A)\), usually expressed in the form a:b or a to b, where a and b are integers with no common factors.

    Definition \(\PageIndex{4}\)

    The actual odds in favor of event A occurring are the ratio \(P(A) / P\left(A^{c}\right)\), which is the reciprocal of the odds against. If the odds against event A are a:b, then the odds in favor of event A are b:a.

    Definition \(\PageIndex{5}\)

    The payoff odds against event A occurring are the ratio of the net profit (if you win) to the amount bet.

    payoff odds against event A = (net profit) : (amount bet)

    Example \(\PageIndex{8}\) odds against and payoff odds

    In the game of Craps, if a shooter has a come-out roll of a 7 or an 11, it is called a natural and the pass line wins. The payoff odds are given by a casino as 1:1.

    1. Find the probability of a natural.
    2. Find the actual odds for a natural.
    3. Find the actual odds against a natural.
    4. If the casino pays 1:1, how much profit does the casino make on a $10 bet?

    Solution

    a. A natural is a 7 or 11. The sample space is

    \(\begin{array}{r}{\mathrm{SS}=\{(1,1),(1,2),(1,3),(1,4),(1,5),(1,6)} \\ {(2,1),(2,2),(2,3),(2,4),(2,5),(2,6)} \\ {(3,1),(3,2),(3,3),(3,4),(3,5),(3,6)} \\ {(4,1),(4,2),(4,3),(4,4),(4,5),(4,6)} \\ {(5,1),(5,2),(5,3),(5,4),(5,5),(5,6)} \\ {(6,1),(6,2),(6,3),(6,4),(6,5),(6,6) \}}\end{array}\)

    The event space is {(1,6), (2,5), (3,4), (4,3), (5,2), (6,1), (5,6), (6,5)}

    So \(P(7 \text { or } 11)=\dfrac{8}{36}\)

    b.

    odds for a natural \(=\dfrac{P(7 \text { or } 11)}{P(\text { not } 7 \text { or } 11)}\)

    \(=\dfrac{8 / 36}{1-8 / 36}\)

    \(=\dfrac{8 / 36}{28 / 36}\)

    \(=\dfrac{8}{28}\)

    \(=\dfrac{2}{7}\)

    c.

    odds against a natural \(=\dfrac{P(\text { not } 7 \text { or } 11)}{P(7 \text { or } 11)}=\dfrac{28}{8}=\dfrac{7}{2}=\dfrac{3.5}{1}\)

    d. The actual odds are 3.5 to 1 while the payoff odds are 1 to 1. The casino pays you $10 for your $10 bet. If the casino paid you the actual odds, they would pay $3.50 on every $1 bet, and on $10, they pay \(3.5 * \$ 10=\$ 35\). Their profit is \(\$ 35-\$ 10=\$ 25\).
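
    The arithmetic above can be reproduced with the short Python sketch below (an illustration, not part of the original example; the variable names are ours), which counts the come-out rolls that give a natural and converts the probability to odds.

```python
from itertools import product
from fractions import Fraction

rolls = list(product(range(1, 7), repeat=2))   # 36 come-out roll outcomes
p_natural = Fraction(sum(1 for r in rolls if sum(r) in (7, 11)), len(rolls))   # 8/36

odds_for = p_natural / (1 - p_natural)       # 2/7, the actual odds for a natural
odds_against = (1 - p_natural) / p_natural   # 7/2, the actual odds against (3.5 to 1)

bet = 10
print(odds_for, odds_against)                # 2/7 7/2
print(float(odds_against) * bet)             # 35.0, payout at actual odds on a $10 bet
```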

    Homework

    Exercise \(\PageIndex{1}\)

    1. Table \(\PageIndex{1}\) contains the number of M&M’s of each color that were found in a case (Madison, 2013).
      Blue Brown Green Orange Red Yellow Total
      481 371 483 544 372 369 2620

      Table \(\PageIndex{1}\): M&M Distribution
      a. Find the probability of choosing a green or red M&M.
      b. Find the probability of choosing a blue, red, or yellow M&M.
      c. Find the probability of not choosing a brown M&M.
      d. Find the probability of not choosing a green M&M.

    2. Eyeglassomatic manufactures eyeglasses for different retailers. They test to see how many defective lenses they made in a time period. Table \(\PageIndex{2}\) gives the defect type and the number of defects.
      Defect type Number of defects
      Scratch 5865
      Right shape - small 4613
      Flaked 1992
      Wrong axis 1838
      Chamfer wrong 1596
      Crazing, cracks 1546
      Wrong shape 1485
      Wrong PD 1398
      Spots and bubbles 1371
      Wrong height 1130
      Right shape - big 1105
      Lost in lab 976
      Spots/bubble 976

      Table \(\PageIndex{2}\): Number of Defective Lenses
      a. Find the probability of picking a lens that is scratched or flaked.
      b. Find the probability of picking a lens that is the wrong PD or was lost in lab.
      c. Find the probability of picking a lens that is not scratched.
      d. Find the probability of picking a lens that is not the wrong shape.

    3. An experiment is to flip a fair coin three times.
      1. State the sample space.
      2. Find the probability of getting exactly two heads. Make sure you state the event space.
      3. Find the probability of getting at least two heads. Make sure you state the event space.
      4. Find the probability of getting an odd number of heads. Make sure you state the event space.
      5. Find the probability of getting all heads or all tails. Make sure you state the event space.
      6. Find the probability of getting exactly two heads or exactly two tails.
      7. Find the probability of not getting an odd number of heads.
    4. An experiment is rolling a fair die and then flipping a fair coin.
      1. State the sample space.
      2. Find the probability of getting a head. Make sure you state the event space.
      3. Find the probability of getting a 6. Make sure you state the event space.
      4. Find the probability of getting a 6 or a head.
      5. Find the probability of getting a 3 and a tail.
    5. An experiment is rolling two fair dice.
      1. State the sample space.
      2. Find the probability of getting a sum of 3. Make sure you state the event space.
      3. Find the probability of getting the first die is a 4. Make sure you state the event space.
      4. Find the probability of getting a sum of 8. Make sure you state the event space.
      5. Find the probability of getting a sum of 3 or sum of 8.
      6. Find the probability of getting a sum of 3 or the first die is a 4.
      7. Find the probability of getting a sum of 8 or the first die is a 4.
      8. Find the probability of not getting a sum of 8.
    6. An experiment is pulling one card from a fair deck.
      1. State the sample space.
      2. Find the probability of getting a Ten. Make sure you state the event space.
      3. Find the probability of getting a Diamond. Make sure you state the event space.
      4. Find the probability of getting a Club. Make sure you state the event space.
      5. Find the probability of getting a Diamond or a Club.
      6. Find the probability of getting a Ten or a Diamond.
    7. An experiment is pulling a ball from an urn that contains 3 blue balls and 5 red balls.
      1. Find the probability of getting a red ball.
      2. Find the probability of getting a blue ball.
      3. Find the odds for getting a red ball.
      4. Find the odds for getting a blue ball.
    8. In the game of roulette, there is a wheel with spaces marked 0 through 36 and a space marked 00.
      1. Find the probability of winning if you pick the number 7 and it comes up on the wheel.
      2. Find the odds against winning if you pick the number 7.
      3. The casino will pay you $20 for every dollar you bet if your number comes up. How much profit is the casino making on the bet?
    Answer

    1. a. P(green or red) = 0.326, b. P(blue, red, or yellow) = 0.466, c. P(not brown) = 0.858, d. P(not green) = 0.816

    3. a. See solutions, b. P(2 heads) = 0.375, c. P(at least 2 heads) = 0.50, d. P(odd number of heads) = 0.50, e. P(all heads or all tails) = 0.25, f. P(two heads or two tails) = 0.75, g. P(not an odd number of heads) = 0.50

    5. a. See solutions, b. P(sum of 3) = 0.056, c. P(1st die a 4) = 0.167, d. P(sum of 8) = 0.139, e. P(sum of 3 or sum of 8) = 0.194, f. P(sum of 3 or 1st die a 4) = 0.222, g. P(sum of 8 or 1st die a 4) = 0.278, h. P(not getting a sum of 8) = 0.861

    7. a. P(red ball) = 0.625, b. P(blue ball) = 0.375, c. 5 to 3, d. 3 to 5


    This page titled 4.2: Theoretical Probability is shared under a CC BY-SA 4.0 license and was authored, remixed, and/or curated by Kathryn Kozak via source content that was edited to the style and standards of the LibreTexts platform.