It is not always feasible to conduct an experiment over and over again, so it would be better to be able to find the probabilities without conducting the experiment. These probabilities are called theoretical probabilities.
To find theoretical probabilities, there is one assumption you need to make: all of the outcomes in the sample space must be equally likely. This means that every outcome of the experiment has the same chance of happening.
If the outcomes are not equally likely, then you must use experimental probabilities. If the outcomes are equally likely, then you can use theoretical probabilities.
Example \(\PageIndex{2}\) calculating theoretical probabilities
Suppose you conduct an experiment where you flip a fair coin twice.
- What is the sample space?
- What is the probability of getting exactly one head?
- What is the probability of getting at least one head?
- What is the probability of getting a head and a tail?
- What is the probability of getting a head or a tail?
- What is the probability of getting a foot?
- What is the probability of each outcome? What is the sum of these probabilities?
Solution
a. There are several different sample spaces you could use. One is SS = {0, 1, 2}, where you count the number of heads. However, these outcomes are not equally likely, since there are two ways to get exactly one head (a head on the first flip and a tail on the second, or a tail on the first flip and a head on the second) but only one way to get each of the other outcomes. Instead, it is better to list what can happen on each flip. Let H = head and T = tail.
SS={HH, HT, TH, TT}
b. Let A = getting exactly one head. The event space is A = {HT, TH}. So
\(P(A)=\dfrac{2}{4} \text { or } \dfrac{1}{2}\)
It may not be advantageous to reduce the fractions to lowest terms, since it is easier to compare fractions if they have the same denominator.
c. Let B = getting at least one head. At least one head means get one or more. The event space is B = {HT, TH, HH} and
\(P(B)=\dfrac{3}{4}\)
Since P(B) is greater than P(A), event B is more likely to happen than event A.
d. Let C = getting a head and a tail = {HT, TH} and
\(P(C)=\dfrac{2}{4}\)
This is the same event space as event A, but it is a different event. Sometimes two different events can give the same event space.
e. Let D = getting a head or a tail. Since or means one or the other or both and it doesn’t specify the number of heads or tails, then D = {HH, HT, TH, TT} and
\(P(D)=\dfrac{4}{4}=1\)
f. Let E = getting a foot. Since you can’t get a foot, E = {} or the empty set and
\(P(E)=\dfrac{0}{4}=0\)
g. \(P(H H)=P(H T)=P(T H)=P(T T)=\dfrac{1}{4}\). If you add all of these probabilities together you get 1.
This example illustrated some important concepts, which are summarized below:
Probability Properties
- \(0 \leq P(\text { event }) \leq 1\)
- If P(event) = 1, then the event is certain to happen and is called the certain event.
- If P(event) = 0, then the event cannot happen and is called the impossible event.
- \(\sum P(\text { outcome })=1\)
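If you would like to check the counting in Example \(\PageIndex{2}\) and the sum-to-one property above by computer, here is a minimal Python sketch; the names `sample_space` and `prob` are illustrative choices, not standard terminology.

```python
from fractions import Fraction
from itertools import product

# Sample space for two flips of a fair coin: HH, HT, TH, TT
sample_space = ["".join(flips) for flips in product("HT", repeat=2)]

def prob(event):
    """Theoretical probability: favorable outcomes over total outcomes."""
    return Fraction(sum(1 for outcome in sample_space if event(outcome)),
                    len(sample_space))

print(prob(lambda o: o.count("H") == 1))      # exactly one head -> 1/2
print(prob(lambda o: o.count("H") >= 1))      # at least one head -> 3/4
print(prob(lambda o: "H" in o and "T" in o))  # a head and a tail -> 1/2
print(prob(lambda o: "H" in o or "T" in o))   # a head or a tail -> 1
print(prob(lambda o: False))                  # a foot (impossible) -> 0
# The outcome probabilities sum to 1, as the properties above require.
print(sum(prob(lambda o, x=x: o == x) for x in sample_space))  # 1
```

Note that `Fraction` reduces 2/4 to 1/2 automatically, whereas the text keeps the common denominator 4 to make the fractions easier to compare.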
Example \(\PageIndex{3}\) calculating theoretical probabilities
Suppose you conduct an experiment where you pull a card from a standard deck.
- What is the sample space?
- What is the probability of getting a Spade?
- What is the probability of getting a Jack?
- What is the probability of getting an Ace?
- What is the probability of not getting an Ace?
- What is the probability of getting a Spade and an Ace?
- What is the probability of getting a Spade or an Ace?
- What is the probability of getting a Jack and an Ace?
- What is the probability of getting a Jack or an Ace?
Solution
a. SS = {2S, 3S, 4S, 5S, 6S, 7S, 8S, 9S, 10S, JS, QS, KS, AS, 2C, 3C, 4C, 5C, 6C, 7C, 8C, 9C, 10C, JC, QC, KC, AC, 2D, 3D, 4D, 5D, 6D, 7D, 8D, 9D, 10D, JD, QD, KD, AD, 2H, 3H, 4H, 5H, 6H, 7H, 8H, 9H, 10H, JH, QH, KH, AH}
b. Let A = getting a spade = {2S, 3S, 4S, 5S, 6S, 7S, 8S, 9S, 10S, JS, QS, KS, AS} so
\(P(A)=\dfrac{13}{52}\)
c. Let B = getting a Jack = {JS, JC, JH, JD} so
\(P(B)=\dfrac{4}{52}\)
d. Let C = getting an Ace = {AS, AC, AH, AD} so
\(P(C)=\dfrac{4}{52}\)
e. Let D = not getting an Ace = {2S, 3S, 4S, 5S, 6S, 7S, 8S, 9S, 10S, JS, QS, KS, 2C, 3C, 4C, 5C, 6C, 7C, 8C, 9C, 10C, JC, QC, KC, 2D, 3D, 4D, 5D, 6D, 7D, 8D, 9D, 10D, JD, QD, KD, 2H, 3H, 4H, 5H, 6H, 7H, 8H, 9H, 10H, JH, QH, KH} so
\(P(D)=\dfrac{48}{52}\)
Notice, \(P(D)+P(C)=\dfrac{48}{52}+\dfrac{4}{52}=1\), so you could have found the probability of D by computing 1 minus the probability of C: \(P(D)=1-P(C)=1-\dfrac{4}{52}=\dfrac{48}{52}\).
f. Let E = getting a Spade and an Ace = {AS} so
\(P(E)=\dfrac{1}{52}\)
g. Let F = getting a Spade or an Ace = {2S, 3S, 4S, 5S, 6S, 7S, 8S, 9S, 10S, JS, QS, KS, AS, AC, AD, AH} so
\(P(F)=\dfrac{16}{52}\)
h. Let G = getting a Jack and an Ace = { } since you can’t do that with one card. So
\(P(G)=0\)
i. Let H = getting a Jack or an Ace = {JS, JC, JD, JH, AS, AC, AD, AH} so
\(P(H)=\dfrac{8}{52}\)
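The card probabilities above can be checked the same way by building the deck in code. This is just a sketch; the rank and suit labels mirror the sample space listed in part a.

```python
from fractions import Fraction
from itertools import product

ranks = ["2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K", "A"]
suits = ["S", "C", "D", "H"]
deck = [rank + suit for rank, suit in product(ranks, suits)]   # 52 cards

def prob(event):
    return Fraction(sum(1 for card in deck if event(card)), len(deck))

spade = lambda c: c.endswith("S")
jack = lambda c: c.startswith("J")
ace = lambda c: c.startswith("A")

print(prob(spade))                          # 13/52, printed reduced as 1/4
print(prob(jack))                           # 4/52,  printed as 1/13
print(prob(lambda c: not ace(c)))           # 48/52, printed as 12/13
print(prob(lambda c: spade(c) and ace(c)))  # 1/52
print(prob(lambda c: spade(c) or ace(c)))   # 16/52, printed as 4/13
print(prob(lambda c: jack(c) or ace(c)))    # 8/52,  printed as 2/13
```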
Example \(\PageIndex{4}\) calculating theoretical probabilities
Suppose you have an iPod Shuffle with the following songs on it: 5 Rolling Stones songs, 7 Beatles songs, 9 Bob Dylan songs, 4 Faith Hill songs, 2 Taylor Swift songs, 7 U2 songs, 4 Mariah Carey songs, 7 Bob Marley songs, 6 Bunny Wailer songs, 7 Elton John songs, 5 Led Zeppelin songs, and 4 Dave Mathews Band songs. The different genres that you have are rock from the 60s, which includes the Rolling Stones, Beatles, and Bob Dylan; country, which includes Faith Hill and Taylor Swift; rock from the 90s, which includes U2 and Mariah Carey; Reggae, which includes Bob Marley and Bunny Wailer; rock from the 70s, which includes Elton John and Led Zeppelin; and bluegrass/rock, which includes the Dave Mathews Band.
An iPod Shuffle randomly picks the next song, so you have no idea what the next song will be. Now you would like to calculate the probability that you will hear the type of music or the artist that you are interested in. The sample space is too large to write out, but you can find the probabilities by looking at the number of songs in each group and the total number of songs, which is 67.
- What is the probability that you will hear a Faith Hill song?
- What is the probability that you will hear a Bunny Wailer song?
- What is the probability that you will hear a song from the 60s?
- What is the probability that you will hear a Reggae song?
- What is the probability that you will hear a song from the 90s or a bluegrass/rock song?
- What is the probability that you will hear an Elton John or a Taylor Swift song?
- What is the probability that you will hear a country song or a U2 song?
Solution
a. There are 4 Faith Hill songs out of the 67 songs, so
\(P(\text { Faith Hill song })=\dfrac{4}{67}\)
b. There are 6 Bunny Wailer songs, so
\(P(\text { Bunny Wailer })=\dfrac{6}{67}\)
c. There are 5, 7, and 9 songs that are classified as rock from the 60s, which is 21 total, so
\(P(\text { rock from the } 60 \mathrm{s})=\dfrac{21}{67}\)
d. There are 6 and 7 songs that are classified as Reggae, which is 13 total, so
\(P(\text { Reggae })=\dfrac{13}{67}\)
e. There are 7 and 4 songs classified as rock from the 90s and 4 songs classified as bluegrass/rock, for a total of 15, so
\(P(\text { rock from the } 90 \text { s or bluegrass/rock })=\dfrac{15}{67}\)
f. There are 7 Elton John songs and 2 Taylor Swift songs, for a total of 9, so
\(P(\text { Elton John or Taylor Swift song })=\dfrac{9}{67}\)
g. There are 6 country songs and 7 U2 songs, for a total of 13, so
\(P(\text { country or } \mathrm{U} 2 \text { song })=\dfrac{13}{67}\)
Of course you can do any other combinations you would like.
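Since the 67 songs form the sample space, a dictionary of counts is enough to reproduce these answers. The groupings below are copied from the problem statement, and the function name `prob` is an illustrative choice.

```python
from fractions import Fraction

songs = {"Rolling Stones": 5, "Beatles": 7, "Bob Dylan": 9, "Faith Hill": 4,
         "Taylor Swift": 2, "U2": 7, "Mariah Carey": 4, "Bob Marley": 7,
         "Bunny Wailer": 6, "Elton John": 7, "Led Zeppelin": 5,
         "Dave Mathews Band": 4}
genres = {"rock from the 60s": ["Rolling Stones", "Beatles", "Bob Dylan"],
          "country": ["Faith Hill", "Taylor Swift"],
          "rock from the 90s": ["U2", "Mariah Carey"],
          "Reggae": ["Bob Marley", "Bunny Wailer"],
          "rock from the 70s": ["Elton John", "Led Zeppelin"],
          "bluegrass/rock": ["Dave Mathews Band"]}
total = sum(songs.values())   # 67 songs in all

def prob(artists):
    """Probability that the next song is by one of the listed artists."""
    return Fraction(sum(songs[a] for a in artists), total)

print(prob(["Faith Hill"]))                                          # 4/67
print(prob(["Bunny Wailer"]))                                        # 6/67
print(prob(genres["rock from the 60s"]))                             # 21/67
print(prob(genres["Reggae"]))                                        # 13/67
print(prob(genres["rock from the 90s"] + genres["bluegrass/rock"]))  # 15/67
print(prob(["Elton John", "Taylor Swift"]))                          # 9/67
print(prob(genres["country"] + ["U2"]))                              # 13/67
```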
Notice in Example \(\PageIndex{3}\) part e, it was mentioned that the probability of event D plus the probability of event C was 1. This is because these two events have no outcomes in common, and together they make up the entire sample space. Events that have this property are called complementary events.
Definition \(\PageIndex{2}\): complementary events
If two events are complementary events, then to find the probability of one, subtract the probability of the other from one. The notation used for the complement of A is "not A" or \(A^{c}\).
\(P(A)+P\left(A^{c}\right)=1, \text { or } P(A)=1-P\left(A^{c}\right)\)
Example \(\PageIndex{5}\) complementary events
- Suppose you know that the probability of it raining today is 0.45. What is the probability of it not raining?
- Suppose you know the probability of not getting the flu is 0.24. What is the probability of getting the flu?
- In an experiment of picking a card from a deck, what is the probability of not getting a card that is a Queen?
Solution
a. Since not raining is the complement of raining, then
\(P(\text { not raining })=1-P(\text { raining })=1-0.45=0.55\)
b. Since getting the flu is the complement of not getting the flu, then
\(P(\text { getting the flu })=1-P(\text { not getting the flu })=1-0.24=0.76\)
c. You could do this problem by listing all the ways to not get a Queen, but that set is fairly large. One advantage of the complement is that it reduces the workload: in this case it is easier to list all the ways to get a Queen, find that probability, and then subtract it from one. Queen = {QS, QC, QD, QH} so
\(P(\text { Queen })=\dfrac{4}{52}\) and
\(P(\text { not Queen })=1-P(\text { Queen })=1-\dfrac{4}{52}=\dfrac{48}{52}\)
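A minimal sketch of the complement rule applied to the three parts of this example; the fraction for the Queen comes from counting the four Queens in the deck.

```python
from fractions import Fraction

def complement(p):
    """P(not A) = 1 - P(A)."""
    return 1 - p

print(complement(0.45))              # P(not raining) = 0.55
print(complement(0.24))              # P(getting the flu) = 0.76
print(complement(Fraction(4, 52)))   # P(not Queen) = 48/52, printed as 12/13
```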
The complement is useful when you are trying to find the probability of an event that involves the words at least or at most. As an example of an at least event, suppose you want to find the probability of making at least $50,000 when you graduate from college; that means you want the probability of your salary being greater than or equal to $50,000. As an example of an at most event, suppose you want to find the probability of rolling a die and getting at most a 4; that means you want to get less than or equal to a 4 on the die. The reason to use the complement is that sometimes it is easier to find the probability of the complement and then subtract from 1. Example \(\PageIndex{6}\) demonstrates how to do this.
Example \(\PageIndex{6}\) using the complement to find probabilities
- In an experiment of rolling a fair die one time, find the probability of rolling at most a 4 on the die.
- In an experiment of pulling a card from a standard deck, find the probability of pulling at least a 5 (ace is a high card in this example).
Solution
a. The sample space for this experiment is {1, 2, 3, 4, 5, 6}. You want the event of getting at most a 4, which is the same as thinking of getting 4 or less. The event space is {1, 2, 3, 4}. The probability is
\(P(\text { at most } 4)=\dfrac{4}{6}\)
Or you could have used the complement. The complement of rolling at most a 4 is rolling a number bigger than 4. The event space for the complement is {5, 6}. The probability of the complement is \(\dfrac{2}{6}\). The probability of at most 4 would be
\(P(\text { at most } 4)=1-P(\text { more than } 4)=1-\dfrac{2}{6}=\dfrac{4}{6}\)
Notice you get the same answer, but the complement's event space was easier to write out. In this example the complement probably wasn't that useful, but in the future there will be events where it is much easier to use the complement.
b. The sample space for this experiment is
SS = {2S, 3S, 4S, 5S, 6S, 7S, 8S, 9S, 10S, JS, QS, KS, AS, 2C, 3C, 4C, 5C, 6C, 7C, 8C, 9C, 10C, JC, QC, KC, AC, 2D, 3D, 4D, 5D, 6D, 7D, 8D, 9D, 10D, JD, QD, KD, AD, 2H, 3H, 4H, 5H, 6H, 7H, 8H, 9H, 10H, JH, QH, KH, AH}
Pulling a card that is at least a 5 would involve listing all of the cards that are a 5 or more. It would be much easier to list the outcomes that make up the complement. The complement of at least a 5 is less than a 5. That would be the event of 4 or less. The event space for the complement would be {2S, 3S, 4S, 2C, 3C, 4C, 2D, 3D, 4D, 2H, 3H, 4H}. The probability of the complement would be \(\dfrac{12}{52}\). The probability of at least a 5 would be
\(P(\text { at least a } 5)=1-P(4 \text { or less })=1-\dfrac{12}{52}=\dfrac{40}{52}\)
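Both parts of this example can be checked by counting the complement in code, as in the following sketch; the tuple `low` is just a convenient way to mark the ranks 4 or less.

```python
from fractions import Fraction
from itertools import product

# Part a: rolling one fair die, at most a 4
p_more_than_4 = Fraction(sum(1 for roll in range(1, 7) if roll > 4), 6)
print(1 - p_more_than_4)           # P(at most 4) = 4/6, printed as 2/3

# Part b: drawing one card, at least a 5 (ace high)
ranks = ["2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K", "A"]
deck = [r + s for r, s in product(ranks, "SCDH")]
low = ("2", "3", "4")              # the complement: cards that are 4 or less
p_less_than_5 = Fraction(sum(1 for c in deck if c.startswith(low)), len(deck))
print(1 - p_less_than_5)           # P(at least a 5) = 40/52, printed as 10/13
```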
Another concept was shown in Example \(\PageIndex{3}\) parts g and i. The problems were looking for the probability of one event or another. In part g, it was the probability of getting a Spade or an Ace, which was \(\dfrac{16}{52}\). In part i, it was the probability of getting a Jack or an Ace, which was \(\dfrac{8}{52}\). If you look back at parts b, c, and d, you might notice the following result:
\(P(\text { Jack })+P(\text { Ace })=P(\text { Jack or Ace }) \text { but } P(\text { Spade })+P(\text { Ace }) \neq P(\text { Spade or } \text { Ace })\)
Why does adding two individual probabilities together work in one situation to give the probability of one or another event and not give the correct probability in the other?
The reason this is true in the case of the Jack and the Ace is that these two events cannot happen together. There is no overlap between the two events, and in fact \(P(\text { Jack and Ace })=0\). However, in the case of the Spade and the Ace, they can happen together. There is overlap, namely the ace of spades, and \(P(\text { Spade and Ace }) \neq 0\).
When two events cannot happen at the same time, they are called mutually exclusive. In the above situation, the events Jack and Ace are mutually exclusive, while the events Spade and Ace are not mutually exclusive.
Addition Rules:
If two events A and B are mutually exclusive, then
\(P(A \text { or } B)=P(A)+P(B) \text { and } P(A \text { and } B)=0\)
If two events A and B are not mutually exclusive, then
\(P(A \text { or } B)=P(A)+P(B)-P(A \text { and } B)\)
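The addition rules translate directly into a small function; the argument `p_a_and_b` defaults to 0, which is the mutually exclusive case. The numbers in the usage lines come from Example \(\PageIndex{3}\). This is a sketch of the formula, not a required implementation.

```python
from fractions import Fraction

def prob_a_or_b(p_a, p_b, p_a_and_b=0):
    """General addition rule: P(A or B) = P(A) + P(B) - P(A and B)."""
    return p_a + p_b - p_a_and_b

# Mutually exclusive: Jack or Ace, where P(Jack and Ace) = 0
print(prob_a_or_b(Fraction(4, 52), Fraction(4, 52)))        # 8/52, prints 2/13
# Not mutually exclusive: Spade or Ace overlap in the ace of spades
print(prob_a_or_b(Fraction(13, 52), Fraction(4, 52),
                  Fraction(1, 52)))                         # 16/52, prints 4/13
```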
Example \(\PageIndex{7}\) using addition rules
Suppose your experiment is to roll two fair dice.
- What is the sample space?
- What is the probability of getting a sum of 5?
- What is the probability of getting the first die a 2?
- What is the probability of getting a sum of 7?
- What is the probability of getting a sum of 5 and the first die a 2?
- What is the probability of getting a sum of 5 or the first die a 2?
- What is the probability of getting a sum of 5 and sum of 7?
- What is the probability of getting a sum of 5 or sum of 7?
Solution
a. As with the other examples, you need to come up with a sample space that has equally likely outcomes. One possibility is to list the sums you can roll: SS = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}. However, there are more ways to get a sum of 7 than there are to get a sum of 2, so these outcomes are not equally likely. Another thought is to list the possibilities on each die. For example, on the first die you could get a 1; the other die could be any number between 1 and 6, but say it is a 1 also, so this outcome would look like (1,1). Similarly, you could get (1,2), (1,3), (1,4), (1,5), or (1,6). You could also get a 2, 3, 4, 5, or 6 on the first die instead. Putting this all together, you get the sample space:
\(\begin{array}{r}{\mathrm{SS}=\{(1,1),(1,2),(1,3),(1,4),(1,5),(1,6)} \\ {(2,1),(2,2),(2,3),(2,4),(2,5),(2,6)} \\ {(3,1),(3,2),(3,3),(3,4),(3,5),(3,6)} \\ {(4,1),(4,2),(4,3),(4,4),(4,5),(4,6)} \\ {(5,1),(5,2),(5,3),(5,4),(5,5),(5,6)} \\ {(6,1),(6,2),(6,3),(6,4),(6,5),(6,6) \}}\end{array}\)
Notice that (2,3) is different from (3,2), since the order in which you roll the dice matters and you can tell these two outcomes apart. You only list each double, such as (1,1), once, since swapping the order gives the same outcome. This will always be the sample space for rolling two dice.
b. Let A = getting a sum of 5 = {(4,1), (3,2), (2,3), (1,4)} so
\(P(A)=\dfrac{4}{36}\)
c. Let B = getting first die a 2 = {(2,1), (2,2), (2,3), (2,4), (2,5), (2,6)} so
\(P(B)=\dfrac{6}{36}\)
d. Let C = getting a sum of 7 = {(6,1), (5,2), (4,3), (3,4), (2,5), (1,6)} so
\(P(C)=\dfrac{6}{36}\)
e. This is the event A and B, which contains the outcome {(2,3)}, so
\(P(A \text { and } B)=\dfrac{1}{36}\)
f. Notice from part e, that these two events are not mutually exclusive, so
\(P(A \text { or } B)=P(A)+P(B)-P(A \text { and } B)\)
\(=\dfrac{4}{36}+\dfrac{6}{36}-\dfrac{1}{36}\)
\(=\dfrac{9}{36}\)
g. These are the events A and C, which have no outcomes in common. Thus A and C = { } so
\(P(A \text { and } C)=0\)
h. From part g, these two events are mutually exclusive, so
\(P(A \text { or } C)=P(A)+P(C)\)
\(=\dfrac{4}{36}+\dfrac{6}{36}\)
\(=\dfrac{10}{36}\)
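The counts in this example can be verified by listing all 36 ordered pairs in code; this sketch also checks the addition rule numerically.

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))   # 36 equally likely ordered pairs

def prob(event):
    return Fraction(sum(1 for r in rolls if event(r)), len(rolls))

sum_is_5 = lambda r: r[0] + r[1] == 5
first_is_2 = lambda r: r[0] == 2
sum_is_7 = lambda r: r[0] + r[1] == 7

print(prob(sum_is_5))                                 # 4/36, prints 1/9
print(prob(first_is_2))                               # 6/36, prints 1/6
print(prob(lambda r: sum_is_5(r) and first_is_2(r)))  # 1/36
print(prob(lambda r: sum_is_5(r) or first_is_2(r)))   # 9/36, prints 1/4
print(prob(lambda r: sum_is_5(r) or sum_is_7(r)))     # 10/36, prints 5/18
```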
Odds
Many people like to talk about the odds of something happening or not happening. Mathematicians, statisticians, and scientists prefer to deal with probabilities since odds are difficult to work with, but gamblers prefer to work in odds for figuring out how much they are paid if they win.
Definition \(\PageIndex{3}\)
The actual odds against event A occurring are the ratio \(P\left(A^{c}\right) / P(A)\), usually expressed in the form a:b or a to b, where a and b are integers with no common factors.
Definition \(\PageIndex{4}\)
The actual odds in favor of event A occurring are the ratio \(P(A) / P\left(A^{c}\right)\), which is the reciprocal of the odds against. If the odds against event A are a:b, then the odds in favor of event A are b:a.
Definition \(\PageIndex{5}\)
The payoff odds against event A occurring are the ratio of the net profit (if you win) to the amount bet.
payoff odds against event A = (net profit) : (amount bet)
Example \(\PageIndex{8}\) odds against and payoff odds
In the game of Craps, if a shooter has a come-out roll of a 7 or an 11, it is called a natural and the pass line wins. The payoff odds are given by a casino as 1:1.
- Find the probability of a natural.
- Find the actual odds for a natural.
- Find the actual odds against a natural.
- If the casino pays 1:1, how much profit does the casino make on a $10 bet?
Solution
a. A natural is a 7 or 11. The sample space is
\(\begin{array}{r}{\mathrm{SS}=\{(1,1),(1,2),(1,3),(1,4),(1,5),(1,6)} \\ {(2,1),(2,2),(2,3),(2,4),(2,5),(2,6)} \\ {(3,1),(3,2),(3,3),(3,4),(3,5),(3,6)} \\ {(4,1),(4,2),(4,3),(4,4),(4,5),(4,6)} \\ {(5,1),(5,2),(5,3),(5,4),(5,5),(5,6)} \\ {(6,1),(6,2),(6,3),(6,4),(6,5),(6,6) \}}\end{array}\)
The event space is {(1,6), (2,5), (3,4), (4,3), (5,2), (6,1), (5,6), (6,5)}
So \(P(7 \text { or } 11)=\dfrac{8}{36}\)
b.
odds for a natural \(=\dfrac{P(7 \text { or } 11)}{P(\text { not } 7 \text { or } 11)}\)
\(=\dfrac{8 / 36}{1-8 / 36}\)
\(=\dfrac{8 / 36}{28 / 36}\)
\(=\dfrac{8}{28}\)
\(=\dfrac{2}{7}\)
c.
odds against a natural \(=\dfrac{P(\text { not } 7 \text { or } 11)}{P(7 \text { or } 11)}=\dfrac{28}{8}=\dfrac{7}{2}=\dfrac{3.5}{1}\)
d. The actual odds are 3.5 to 1 while the payoff odds are 1 to 1. The casino pays you $10 for your winning $10 bet. If the casino paid the actual odds, it would pay $3.50 on every $1 bet, so on a $10 bet it would pay \(3.5 * \$ 10=\$ 35\). The difference, \(\$ 35-\$ 10=\$ 25\), is the casino's profit on the bet.
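Here is a short sketch of the craps calculation in this example, assuming a natural is a come-out roll of 7 or 11 and a $10 pass-line bet:

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))   # 36 equally likely ordered pairs
p_natural = Fraction(sum(1 for a, b in rolls if a + b in (7, 11)), len(rolls))
print(p_natural)                               # 8/36, prints 2/9

odds_for = p_natural / (1 - p_natural)         # 2/7, i.e. 2:7 in favor
odds_against = (1 - p_natural) / p_natural     # 7/2, i.e. 3.5 to 1 against
print(odds_for, odds_against)

bet = 10
print(float(odds_against) * bet)               # payout at actual odds: 35.0
print(1 * bet)                                 # payout at 1:1 payoff odds: 10
```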