
1.3: Equally Likely Outcomes and Counting Techniques (Combinatorics)


    In this section, we consider the problem of assigning specific probabilities to outcomes in a sample space. As we saw in Section 1.2, the axiomatic definition of probability (Definition 1.2.1) does not tell us how to compute probabilities. So in this section we consider the commonly encountered scenario referred to as equally likely outcomes and develop methods for computing probabilities in this special case.

    Finite Sample Spaces

    Before focusing on equally likely outcomes, we consider the more general case of finite sample spaces. In other words, suppose that a sample space \(\Omega\) has a finite number of outcomes, which we can denote as \(N\). In this case, we can represent the outcomes in \(\Omega\) as follows:

    $$\Omega = \{\omega_1, \omega_2, \ldots, \omega_N\}.\notag$$

Suppose further that we denote the probability assigned to each outcome in \(\Omega\) as \(P(\omega_i) = p_i\), for \(i=1, \ldots, N\). Then the probability of any event \(A\) in \(\Omega\) is given by adding the probabilities corresponding to the outcomes contained in \(A\), and we can write

    $$P(A) = \sum_{\omega_i \in A} p_i. \label{finitess}$$

    This follows from the third axiom of probability (Definition 1.2.1), since we can write any event as a disjoint union of the outcomes contained in the event. For example, if event \(A\) contains three outcomes, then we can write \(A = \{\omega_1, \omega_2, \omega_3\} = \{\omega_1\} \cup \{\omega_2\} \cup \{\omega_3\}\). So the probability of \(A\) is given by simply summing up the probabilities assigned to \(\omega_1, \omega_2, \omega_3\). This fact will be useful in the special case of equally likely outcomes, which we consider next.
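To make Equation \ref{finitess} concrete, here is a minimal Python sketch; the sample space, its outcome probabilities, and the event \(A\) are all hypothetical choices for illustration.

```python
from fractions import Fraction

# Hypothetical probabilities for a finite sample space with N = 4 outcomes;
# exact fractions avoid floating-point rounding, and the values sum to 1.
outcome_probs = {
    "w1": Fraction(1, 10),
    "w2": Fraction(2, 10),
    "w3": Fraction(3, 10),
    "w4": Fraction(4, 10),
}

def prob(event, probs):
    """Return P(event) by summing the probabilities of its outcomes."""
    return sum(probs[w] for w in event)

A = {"w1", "w2", "w3"}          # an event containing three outcomes
print(prob(A, outcome_probs))   # 3/5, i.e., 0.1 + 0.2 + 0.3 = 0.6
```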

    Equally Likely Outcomes

    First, let's state a formal definition of what it means for the outcomes in a sample space to be equally likely.

    Definition \(\PageIndex{1}\)

    The outcomes in a sample space \(\Omega\) are equally likely if each outcome has the same probability of occurring.

In general, if the outcomes in a sample space \(\Omega\) are equally likely, then computing the probability of a single outcome or an event is very straightforward, as the following exercise demonstrates. You are encouraged to first try to answer the questions yourself, and then read the solution given under "Answer".

    Exercise \(\PageIndex{1}\)

    Suppose that there are \(N\) outcomes in the sample space \(\Omega\) and that the outcomes are equally likely.

    1. What is the probability of a single outcome in \(\Omega\)?
    2. What is the probability of an event \(A\) in \(\Omega\)?
    Answer

    First, note that we can represent the outcomes in \(\Omega\) as follows:

    $$\Omega = \{\omega_1, \omega_2, \ldots, \omega_N\}.\notag$$

    For each outcome in \(\Omega\), note that we can denote its probability as 

    $$P(\omega_i) = c,\ \text{for}\ i=1, 2, \ldots, N,\notag$$

    where \(c\) is some constant.  This follows from the fact that the outcomes of \(\Omega\) are equally likely and so have the same probability of occurring.  With this set-up and using the axioms of probability (Definition 1.2.1), we have the following:

    \begin{align*}
    1 = P(\Omega) & = P(\{\omega_1\}\cup\cdots\cup\{\omega_N\}) \notag \\
    & = P(\omega_1) + \cdots + P(\omega_N) \notag \\
    & = c + \cdots + c \notag \\
    & = N\times c \notag  \\
    \Rightarrow c & = \frac{1}{N} \notag . \end{align*}

    Thus, the probability of a single outcome is given by \(1\) divided by the number of outcomes in \(\Omega\).

    Now, for an event \(A\) in \(\Omega\), suppose it has \(n\) outcomes, where \(n\) is an integer such that \(0\leq n\leq N\).  We can represent the outcomes in \(A\) as follows:

    $$A = \{a_1, \ldots, a_n\}.\notag$$

    Using Equation \ref{finitess}, we compute the probability of \(A\) as follows:

    \begin{align*}
    P(A) & = \sum^n_{i=1} P(a_i) = \sum^n_{i=1}\frac{1}{N} \notag \\
    & = \frac{1}{N} + \cdots + \frac{1}{N} \notag \\
    & = n\left(\frac{1}{N}\right) \notag \\
    & = \frac{n}{N} \notag .
    \end{align*}

    Thus, the probability of an event in \(\Omega\) is equal to the number of outcomes in the event divided by the total number of outcomes in \(\Omega\).

    We have already seen an example of a sample space with equally likely outcomes in Example 1.2.1. You are encouraged to revisit that example and connect it to the results of Exercise 1.3.1.

    In general, Exercise 1.3.1 shows that if a sample space \(\Omega\) has equally likely outcomes, then the probability of an event \(A\) in the sample space is given by

    $$\boxed{P(A) = \frac{\text{number of outcomes in}\ A}{\text{number of outcomes in}\ \Omega}.}\label{eqlik}$$

From this result, we see that in the context of equally likely outcomes, calculating probabilities of events reduces to counting the number of outcomes in the event and in the sample space. So, we take a break from our discussion of probability and briefly introduce some counting techniques.
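As a quick illustration in code, the following Python sketch applies Equation \ref{eqlik} by counting outcomes. The sample space is that of Example 1.2.1 (two tosses of a fair coin), and the event "at least one head" is an illustrative choice.

```python
from fractions import Fraction

# Sample space for two tosses of a fair coin: four equally likely outcomes.
omega = {"hh", "ht", "th", "tt"}

# Event A: at least one head appears.
A = {w for w in omega if "h" in w}

# P(A) = (number of outcomes in A) / (number of outcomes in omega)
print(Fraction(len(A), len(omega)))  # 3/4
```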

    Counting Techniques

    First, let's consider the general context of performing multi-step experiments.  The following tells us how to count the number of outcomes in such scenarios.

    Multiplication Principle

If one probability experiment has \(m\) outcomes and a second has \(n\) outcomes, then the two experiments performed together have \(m \times n\) total outcomes.

More generally, if \(k\) probability experiments are performed, with the first having \(n_1\) outcomes, the second \(n_2\), and so on, then there are \(n_1 \times n_2 \times \cdots \times n_k\) total outcomes for the \(k\) experiments.

    Example \(\PageIndex{1}\)

To demonstrate the Multiplication Principle, consider again the example of tossing a coin twice (see Example 1.2.1). Each toss is a probability experiment with two possible outcomes: \(h\) or \(t\). Thus, for two tosses, there are \(2 \times 2 = 4\) total outcomes.

    If we toss the coin a third time, there are \(2\times 2 \times 2 = 8\) total outcomes.
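One way to see the Multiplication Principle at work is to enumerate the outcomes directly. The following sketch uses Python's standard itertools module to list all outcomes of three tosses; the labels \(h\) and \(t\) follow the example above.

```python
from itertools import product

# Each toss has 2 outcomes, so 3 tosses give 2 * 2 * 2 = 8 outcomes in total.
outcomes = list(product("ht", repeat=3))
print(len(outcomes))  # 8
print(outcomes[:3])   # [('h', 'h', 'h'), ('h', 'h', 't'), ('h', 't', 'h')]
```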


    Next we define two commonly encountered situations, permutations and combinations, and consider how to count the number of ways in which they can occur.

    Definition \(\PageIndex{2}\)

A permutation is an ordered arrangement of objects. For example, "MATH" is a permutation of four letters from the alphabet.

A combination is an unordered collection of \(r\) objects selected from \(n\) total objects. For example, a group of three students chosen from a class of 10 students is a combination.

To count the number of possible permutations in a given setting, we apply the Multiplication Principle. For example, the number of possible permutations of the four letters in "MATH" is
$$4\times3\times2\times1 = 4! = 24,\notag$$
since there are four letters to choose from for the first position, three for the second, two for the third, and only one remaining for the last. In other words, we treat each letter selection as a step in a multi-step experiment.

    Counting Permutations

    The number of permutations of \(n\) distinct objects is given by the following:

    $$n\times(n-1)\times\cdots\times 2\times1 = n!\notag$$
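For small \(n\), this count is easy to verify by enumeration. The sketch below compares \(4!\) against an explicit listing of the permutations of the letters in "MATH", using Python's standard library.

```python
from itertools import permutations
from math import factorial

letters = "MATH"
perms = list(permutations(letters))  # all ordered arrangements of the 4 letters
print(len(perms))                    # 24
print(factorial(len(letters)))       # 24, agreeing with 4! = 24
```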

Counting combinations is a little more complicated, since we are not interested in the order in which objects are selected, and so the Multiplication Principle does not apply directly. Consider again the example in which a group of three students is chosen from a class of 10. The group is the same regardless of the order in which the three students are selected. So, to count the number of possible combinations, we must be careful not to count permutations, i.e., rearrangements, of the same selection as distinct. Counting ordered selections with the Multiplication Principle and then dividing by the number of rearrangements of each selection leads to the following result: the number of possible combinations of size \(r\) selected from \(n\) total objects is given by a binomial coefficient.

    Counting Combinations

    The number of combinations of \(r\) objects selected without replacement from \(n\) distinct objects is given by
    $$\binom{n}{r} = \frac{n!}{r!\times(n-r)!}.\notag$$
    Note that \(\binom{n}{r}\), read as "\(n\) choose \(r\)", is also referred to as a binomial coefficient, since it appears in the Binomial Theorem.

    Using the above, we can compute the number of possible ways to select three students from a class of 10:

    $$\binom{10}{3} = \frac{10!}{3!\times7!} = 120 \notag$$
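In Python, binomial coefficients are available directly as math.comb; here is a short sketch verifying the class-of-10 computation both ways:

```python
from math import comb, factorial

# "n choose r", computed via the built-in and via the factorial formula.
n, r = 10, 3
print(comb(n, r))                                         # 120
print(factorial(n) // (factorial(r) * factorial(n - r)))  # 120
```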

    Example \(\PageIndex{2}\)

Consider the example of tossing a coin three times, where an outcome is a sequence of heads and tails. Suppose that we are interested in the number of outcomes with exactly two heads, not in the actual sequences themselves. To find this number, we determine the number of ways to select positions in the sequence for the heads; the remaining position will then show tails. If we toss the coin three times, there are three positions to select from, and we want to select two. Since the order in which we select the positions does not matter, we are counting the number of combinations of 2 positions from a total of 3 positions, i.e.,
$$\binom{3}{2} = \frac{3!}{2!\times1!} = 3.\notag$$
Of course, this example is small enough that we could have arrived at the answer of 3 by brute force, simply listing the possibilities. However, if we toss the coin a larger number of times, say 50, the brute-force approach becomes infeasible, and we need to make use of binomial coefficients.
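The brute-force check mentioned above is easy to automate for small cases. This sketch enumerates the \(2^3 = 8\) sequences of three tosses, counts those with exactly two heads, and compares the count with \(\binom{3}{2}\):

```python
from itertools import product
from math import comb

# Count sequences of three tosses with exactly two heads.
count = sum(1 for seq in product("ht", repeat=3) if seq.count("h") == 2)
print(count)       # 3
print(comb(3, 2))  # 3, matching the binomial-coefficient calculation
```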

     


    This page titled 1.3: Equally Likely Outcomes and Counting Techniques (Combinatorics) is shared under a not declared license and was authored, remixed, and/or curated by Kristin Kuter.
