# 2.1: Equally Likely Outcomes and Counting Techniques (Combinatorics)


In this chapter, we consider the problem of assigning specific probabilities to outcomes in a sample space. As we saw in Section 1.2, the axiomatic definition of probability (Definition 1.2.1) does not tell us how to compute probabilities. So in this section we consider the commonly encountered scenario referred to as *equally likely outcomes* and develop methods for computing probabilities in this special case.

## Finite Sample Spaces

Before focusing on equally likely outcomes, we consider the more general case of *finite* sample spaces. In other words, suppose that a sample space \(S\) has a finite number of outcomes, which we denote by \(N\). In this case, we can represent the outcomes in \(S\) as follows:

$$S = \{s_1, s_2, \ldots, s_N\}.\notag$$

Suppose further that we denote the probability assigned to each outcome in \(S\) by \(P(s_i) = p_i\), for \(i=1, \ldots, N\). Then the probability of any event \(A\) in \(S\) is given by adding the probabilities corresponding to the outcomes contained in \(A\), and we can write

$$P(A) = \sum_{s_i \in A} p_i. \label{finitess}$$

This follows from the third axiom of probability (Definition 1.2.1), since we can write any event as a disjoint union of the outcomes contained in the event. For example, if event \(A\) contains three outcomes, then we can write \(A = \{s_1, s_2, s_3\} = \{s_1\} \cup \{s_2\} \cup \{s_3\}\). So the probability of \(A\) is given by simply summing up the probabilities assigned to \(s_1, s_2, s_3\). This fact will be useful in the special case of equally likely outcomes, which we consider next.
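The summation in Equation \ref{finitess} translates directly into code. The following is a minimal sketch (not from the text): the sample space and the probabilities of a hypothetical loaded four-sided die are illustrative assumptions, chosen only so the sum is easy to check by hand.

```python
from fractions import Fraction  # exact arithmetic avoids floating-point noise

# Hypothetical loaded four-sided die: P(s_i) = p_i for each outcome s_i.
probs = {1: Fraction(1, 10), 2: Fraction(2, 10),
         3: Fraction(3, 10), 4: Fraction(4, 10)}

def prob(event, probs):
    """P(A) = sum of p_i over the outcomes s_i contained in A."""
    return sum(probs[s] for s in event)

A = {2, 4}  # the event "roll an even number"
print(prob(A, probs))  # 2/10 + 4/10 = 3/5
```

Note that the probabilities over the whole sample space sum to 1, as the first axiom requires.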

## Equally Likely Outcomes

First, let's state a formal definition of what it means for the outcomes in a sample space to be *equally likely*.

### Definition \(\PageIndex{1}\)

The outcomes in a sample space \(S\) are *equally likely* if each outcome has the *same probability* of occurring.

In general, if outcomes in a sample space \(S\) are equally likely, then computing the probability of a single outcome or an event is very straightforward, as the following exercise demonstrates. You are encouraged to first try to answer the questions for yourself, and then click "Answer" to see the solution.

### Exercise \(\PageIndex{1}\)

Suppose that there are \(N\) outcomes in the sample space \(S\) and that the outcomes are equally likely.

- What is the probability of a single outcome in \(S\)?
- What is the probability of an event \(A\) in \(S\)?

**Answer**
First, note that we can represent the outcomes in \(S\) as follows:

$$S = \{s_1, s_2, \ldots, s_N\}.\notag$$

For each outcome in \(S\), note that we can denote its probability as

$$P(s_i) = c,\ \text{for}\ i=1, 2, \ldots, N,\notag$$

where \(c\) is some constant. This follows from the fact that the outcomes of \(S\) are equally likely and so have the same probability of occurring. With this set-up and using the axioms of probability (Definition 1.2.1), we have the following:

\begin{align*}
1 = P(S) & = P(\{s_1\}\cup\cdots\cup\{s_N\}) \notag \\
& = P(s_1) + \cdots + P(s_N) \notag \\
& = c + \cdots + c \notag \\
& = N\times c \notag \\
\Rightarrow c & = \frac{1}{N} \notag .
\end{align*}

Thus, the probability of a single outcome is given by \(1\) divided by the number of outcomes in \(S\).

Now, for an event \(A\) in \(S\), suppose it has \(n\) outcomes, where \(n\) is an integer such that \(0\leq n\leq N\). We can represent the outcomes in \(A\) as follows:

$$A = \{a_1, \ldots, a_n\}.\notag$$

Using Equation \ref{finitess}, we compute the probability of \(A\) as follows:

\begin{align*}
P(A) & = \sum^n_{i=1} P(a_i) = \sum^n_{i=1}\frac{1}{N} \notag \\
& = \frac{1}{N} + \cdots + \frac{1}{N} \notag \\
& = n\left(\frac{1}{N}\right) \notag \\
& = \frac{n}{N} \notag .
\end{align*}

Thus, the probability of an event in \(S\) is equal to the number of outcomes in the event divided by the total number of outcomes in \(S\).

We have already seen an example of a sample space with equally likely outcomes in Example 1.2.1. You are encouraged to revisit that example and connect it to the results of Exercise 2.1.1.

In general, Exercise 2.1.1 shows that if a sample space \(S\) has equally likely outcomes, then the probability of an event \(A\) in the sample space is given by

$$\boxed{P(A) = \frac{\text{number of outcomes in}\ A}{\text{number of outcomes in}\ S}.}\label{eqlik}$$

From this result, we see that in the context of equally likely outcomes, calculating probabilities of events reduces to simply *counting* the number of outcomes in the event and in the sample space. So we take a break from our discussion of probability and briefly introduce some counting techniques.
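As a quick numerical sketch of Equation \ref{eqlik} (this code is illustrative, not part of the text), we can enumerate the two-coin-toss sample space of Example 1.2.1 and count outcomes:

```python
from itertools import product

# Sample space for two coin tosses: all ordered pairs of 'h' and 't'.
S = list(product("ht", repeat=2))   # [('h','h'), ('h','t'), ('t','h'), ('t','t')]

# Event A: at least one head. With equally likely outcomes, P(A) = |A| / |S|.
A = [s for s in S if "h" in s]
print(len(A) / len(S))  # 3/4 = 0.75
```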

## Counting Techniques

First, let's consider the general context of performing multi-step experiments. The following tells us how to count the number of outcomes in such scenarios.

### Multiplication Principle

If one probability experiment has \(m\) outcomes and another has \(n\) outcomes, then there are \(m \times n\) total outcomes for the two experiments.

More generally, if there are \(k\) probability experiments, with the first experiment having \(n_1\) outcomes, the second \(n_2\), and so on, then there are \(n_1 \times n_2 \times \cdots \times n_k\) total outcomes for the \(k\) experiments.

### Example \(\PageIndex{1}\)

To demonstrate the Multiplication Principle, consider again the example of tossing a coin twice (see Example 1.2.1). Each toss is a probability experiment, and on each toss there are two possible outcomes: \(h\) or \(t\). Thus, for two tosses, there are \(2 \times 2 = 4\) total outcomes.

If we toss the coin a third time, there are \(2\times 2 \times 2 = 8\) total outcomes.
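The Multiplication Principle can be verified by brute force: `itertools.product` enumerates every combined outcome of the repeated experiments, so counting the tuples it yields should match \(2 \times 2 = 4\) for two tosses and \(2 \times 2 \times 2 = 8\) for three. A minimal sketch:

```python
from itertools import product

# Enumerate all combined outcomes of repeated coin tosses.
coin = "ht"
two_tosses = list(product(coin, repeat=2))
three_tosses = list(product(coin, repeat=3))
print(len(two_tosses), len(three_tosses))  # 4 8
```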

Next we define two commonly encountered situations, *permutations* and *combinations*, and consider how to count the number of ways in which they can occur.

### Definition \(\PageIndex{2}\)

A *permutation* is an *ordered* arrangement of objects. For example, "MATH" is a permutation of four letters from the alphabet.

A *combination* is an *unordered* collection of \(r\) objects from \(n\) total objects. For example, a group of three students chosen from a class of 10 students is a combination.

In order to count the number of possible permutations in a given setting, the Multiplication Principle is applied. For example, if we want to know the number of possible permutations of the four letters in "MATH", we compute

$$4\times3\times2\times1 = 4! = 24,\notag$$

since there are four letters to select for the first position, three letters for the second, two for the third, leaving only one letter for the last. In other words, we treat each letter selection as an experiment in a multi-step process.

### Counting Permutations

The number of permutations of \(n\) distinct objects is given by the following:

$$n\times(n-1)\times\cdots\times 2\times1 = n!\notag$$
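The formula \(n!\) can be checked against brute-force enumeration (this sketch is illustrative, not part of the text): `itertools.permutations` generates every ordered arrangement of the letters in "MATH", and the count should equal \(4! = 24\).

```python
from itertools import permutations
from math import factorial

# All ordered arrangements of four distinct letters.
letters = "MATH"
perms = set(permutations(letters))
print(len(perms), factorial(len(letters)))  # 24 24
```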

Counting combinations is a little more complicated, since we are not interested in the order in which objects are selected, and so the Multiplication Principle does not directly apply. Consider the example of choosing a group of three students from a class of 10. The group is the same regardless of the order in which the three students are selected. This implies that if we want to count the number of possible combinations, we need to be careful not to include permutations, i.e., rearrangements, of a certain selection. This leads to the following result: the number of possible combinations of size \(r\) selected from a total of \(n\) objects is given by a binomial coefficient.

### Counting Combinations

The number of combinations of \(r\) objects selected without replacement from \(n\) distinct objects is given by

$$\binom{n}{r} = \frac{n!}{r!\times(n-r)!}.\notag$$

Note that \(\binom{n}{r}\), read as "\(n\) *choose* \(r\)", is also referred to as a * binomial coefficient*, since it appears in the Binomial Theorem.

Using the above, we can compute the number of possible ways to select three students from a class of 10:

$$\binom{10}{3} = \frac{10!}{3!\times7!} = 120. \notag$$
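As a quick check (illustrative code, not part of the text), `math.comb` computes the binomial coefficient directly, and `itertools.combinations` confirms it by listing every 3-student group from a class of 10:

```python
from itertools import combinations
from math import comb

# Enumerate all unordered 3-student groups from a class of 10.
students = range(10)
groups = list(combinations(students, 3))
print(len(groups), comb(10, 3))  # 120 120
```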

### Example \(\PageIndex{2}\)

Consider the example of tossing a coin three times. Note that an outcome is a sequence of heads and tails. Suppose that we are interested in the number of outcomes with exactly two heads, not in the actual sequences themselves. To find the number of outcomes with exactly two heads, we need to determine the number of ways to select positions in the sequence for the heads; the remaining position will then be tails. If we toss the coin three times, there are three positions to select from, and we want to select two. Since the order in which we select the placements does not matter, we are counting the number of combinations of 2 positions from a total of 3 positions, i.e.,

$$\binom{3}{2} = \frac{3!}{2!\times1!} = 3.\notag$$

Of course, this example is small enough that we could have arrived at the answer of 3 using brute force by just listing the possibilities. However, if we toss the coin a higher number of times, say 50, then the brute force approach becomes infeasible and we need to make use of binomial coefficients.
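For this small case, the brute-force listing and the binomial coefficient can be compared directly (an illustrative sketch, not part of the text):

```python
from itertools import product
from math import comb

# Among all sequences of three tosses, count those with exactly two heads
# and compare with C(3, 2).
outcomes = [s for s in product("ht", repeat=3) if s.count("h") == 2]
print(len(outcomes), comb(3, 2))  # 3 3
```

For 50 tosses the enumeration would involve \(2^{50}\) sequences, while `comb(50, 2)` is instant, which is exactly why the binomial coefficient is needed.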

## Strategies for Analyzing a Counting Problem

Before we return to our discussion of probability, the following outlines an approach for tackling problems in which we need to count the number of possible outcomes.

**Are there cases?** To describe all possible outcomes, must one consider specific ways the desired event can occur? If so, make a note of each case and add the number of possibilities for each case.

**Example:** How many ways can a hand of 5 cards have at least 3 hearts?

*Solution*: To examine the ways this can happen, we observe that there are 3 specific ways a hand could have at least 3 hearts:

(i) a hand could have exactly 3 hearts and 2 other cards (that are not hearts), OR

(ii) a hand could have exactly 4 hearts and 1 other card (that is not a heart), OR

(iii) a hand could have exactly 5 hearts.

So there are 3 cases. The total number of ways a hand of 5 cards has at least 3 hearts \(=\) the number of ways to have exactly 3 hearts and 2 other cards \(+\) the number of ways to have exactly 4 hearts and 1 other card \(+\) the number of ways to have exactly 5 hearts.
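The case-by-case count above can be carried out numerically for a standard 52-card deck (13 hearts, 39 non-hearts); the deck composition is standard, but the code itself is an illustrative sketch, not from the text:

```python
from math import comb

# Sum the three mutually exclusive cases for "at least 3 hearts" in a
# 5-card hand from a standard 52-card deck (13 hearts, 39 non-hearts).
at_least_3_hearts = (
    comb(13, 3) * comb(39, 2)    # exactly 3 hearts, 2 other cards
    + comb(13, 4) * comb(39, 1)  # exactly 4 hearts, 1 other card
    + comb(13, 5)                # exactly 5 hearts
)
print(at_least_3_hearts)  # 241098
```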

**Non-Example:** How many groups of 10 students have 4 members from Indiana and 6 from Michigan? This event is clearly described and already very specific: 4 from one state and 6 from the other. There are no other options, possibilities, or cases.

**Are repetitions allowed?** If the answer is yes, then use the Multiplication Rule.

**Are there steps?** If so, put a slot for each step and a dot (multiplication symbol) between the slots (Multiplication Principle).

**Example:** How many ways can a hand of 5 cards have exactly 3 hearts?

*Solution*: There are two steps:

1. get 3 hearts, AND
2. get 2 other cards (that are not hearts).

Then the number of hands \(=\) (# of ways to get 3 hearts) \(\cdot\) (# of ways to get 2 other cards that are not hearts).

**Non-Example:** How many different groups of 4 books can be selected from a shelf that has 12 books? There is only one step: grab the 4 books.

**Does order matter?** If so, use permutations or the Multiplication Rule.

**Example:** There are 4 distinct cleaning jobs: wash the windows, vacuum, dust, wash the kitchen floor. In how many ways could the jobs be assigned to 4 people from a set of 6?

*Solution*: Order matters, since each person is assigned a particular job.

**Non-Example:** See the next point.

**If order doesn't matter and the end result is a clump or group, then you are counting combinations.**

**Example:** How many ways can one choose 5 people from a set of 9 to help with some chores?

*Solution*: Order doesn't matter. We are merely choosing the lucky folks who will help us get some work done. We have NOT assigned particular tasks.
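The contrast between the last two examples can be checked numerically (an illustrative sketch, not part of the text): assigning 4 distinct jobs to 4 of 6 people is an ordered selection, counted by \(P(6,4) = 6!/(6-4)! = 360\), while choosing 5 helpers from a set of 9 is an unordered one, counted by \(\binom{9}{5} = 126\).

```python
from itertools import permutations
from math import comb, perm

# Ordered selection: each of the 4 distinct jobs goes to a different person.
assignments = list(permutations(range(6), 4))
print(len(assignments), perm(6, 4))  # 360 360

# Unordered selection: just a group of 5 helpers, no tasks assigned.
print(comb(9, 5))  # 126
```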

**Example:**There are 4 distinct cleaning jobs: wash the windows, vacuum, dust, wash the kitchen floor. In how many ways could the jobs be assigned to 4 people from a set of 6?*Solution*: Order matters. A person is assigned a particular job.**Non-Example:**See the next point.*If order doesn't matter and the end result is a clump or group, then you are counting combinations.***Example:**How many ways can one choose 5 people from a set of 9 to help with some chores?*Solution*: Order doesn't matter. We are merely choosing the lucky folks who will help us get some work done. We have NOT assigned particular tasks.