
4.3: The Binomial Distribution


    Recall that many probability experiments consist of trials with only two outcomes: for example, asking a group of individuals whether they will vote yes on a proposition, or randomly guessing on a multiple-choice test. When we conduct a sequence of independent trials with only two outcomes per trial, we are conducting a binomial experiment. In the last lesson, you learned how to calculate the probability that success is achieved on the xth trial when the trials are independent and each trial has only two outcomes.

    Now we turn to computing the probability of x successes in n independent trials, where each trial has only two possible outcomes. We will examine this distribution by conducting an experiment: we will look at the possible outcomes from flipping four fair coins. Tossing a coin is an example of a trial with only two outcomes, heads and tails.

    1. Go to this website to simulate flipping coins (or if you want, use four actual coins for the experiment). We will define landing on heads as success, and landing on tails as failure. We define the random variable x to be the number of successes (heads) out of four trials (4 coin tosses).
      1. There are many possible outcomes when flipping four coins. For example, the first coin could land on heads, the second on tails, the third on heads, and the fourth on tails. Let’s denote this outcome as HTHT. In this outcome, there are 2 successes and 2 failures. Write two other possible outcomes from flipping four coins and the corresponding number of successes and failures.
      2. What is the maximum number of successes possible (this is the largest value of x)?
      3. What is the minimum number of successes possible (this is the smallest value of x)?
      4. Time to flip the coins! Conduct the experiment 20 times. Each time you toss the four coins, use tally marks to keep track of the number of successes (heads) in the table below. When you have made 20 tally marks, you have finished.

        | Number of Heads, x | Tally |
        |--------------------|-------|
        | 0                  |       |
        | 1                  |       |
        | 2                  |       |
        | 3                  |       |
        | 4                  |       |

      5. Here are the results from the experiment repeated 200 times. Complete the table with the relative frequency.

        | Number of Heads, x | Frequency | Relative Frequency |
        |--------------------|-----------|--------------------|
        | 0                  | 14        |                    |
        | 1                  | 43        |                    |
        | 2                  | 85        |                    |
        | 3                  | 42        |                    |
        | 4                  | 16        |                    |
        | Total              | 200       |                    |

      6. Use the data from part 5 to create a relative frequency histogram. The relative frequencies are approximations of the probabilities of each outcome, P(x).

        [Histogram: relative frequency of the number of heads, x = 0 to 4]

      7. Describe the center, shape, and spread of the relative frequency histogram.
      8. Which value of the random variable occurred most often?
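    The experiment above can also be run in software. Here is a minimal sketch in Python that tosses four fair coins 200 times and tallies the number of heads; the seed value is an arbitrary choice for reproducibility, not part of the original activity.

    ```python
    import random

    random.seed(1)  # arbitrary seed so the run is reproducible

    def toss_four_coins():
        """Toss four fair coins; return the number of heads (successes)."""
        return sum(random.random() < 0.5 for _ in range(4))

    # Repeat the experiment 200 times, tallying the number of heads each time.
    tallies = {x: 0 for x in range(5)}
    for _ in range(200):
        tallies[toss_four_coins()] += 1

    # Relative frequencies approximate the probabilities P(x).
    rel_freq = {x: count / 200 for x, count in tallies.items()}
    print(rel_freq)
    ```

    Your tallies will differ from the table above (and from run to run without a fixed seed), but the relative frequencies should cluster around the same shape: highest at x = 2, tapering off toward 0 and 4.
    
    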

    The Binomial Distribution


    A binomial experiment is a probability experiment with the following characteristics:

    • The experiment consists of n independent trials.
    • Each trial has exactly two possible outcomes which are labeled success and failure.
    • The probability of success is the same for each trial. We denote the probability of success as p and the probability of failure as q=1-p.

    For the random variable x representing the number of successes in a fixed number of trials (n trials), the distribution of probabilities is called the binomial distribution.


    1. In the binomial experiment above, how many trials were there?

      n= ______

    2. In the binomial experiment above, what is the probability of success? What is the probability of failure?

      p= ______

      q=1-p= ______

    3. Next, we will compute the actual probabilities from a binomial distribution, applying our knowledge of probability laws. To keep the computation manageable, we will lower the number of trials to 2, and we will use unfair coins. An unfair coin is unbalanced and therefore lands on one side more often than the other. Let’s assume we have a collection of biased coins for which the probability of landing on heads is 60%.
      1. Recall, if events A and B are independent, then P(A and B)= _________________. This is the multiplication rule for independent events.
      2. If events A and B are mutually exclusive, then P(A or B)= _________________. This is the addition rule for mutually exclusive events.
      3. If you flip 2 coins, are the 2 coin flips independent? Explain.
      4. If 2 coins are tossed, there are 4 possible outcomes. One possible outcome is the first coin lands on heads, and the second coin lands on heads (H and H). In this outcome, there are 2 successes so the random variable takes on the value 2. List the other 3 possible outcomes from tossing 2 coins.
      5. Compute the probability of both coins landing on heads.

        \(P(H \cap H)=P(2)=\underline{\ \ \ \ \ \ \ \ \ \ }\)

      6. Compute the probability of both coins landing on tails. \(P(T \cap T)=P(0)=\underline{\ \ \ \ \ \ \ \ \ \ }\)
      7. Compute the probability of exactly one coin landing on heads. Careful, there is more than one way this can happen! \(P((H \text { and } T) \text { or }(T \text { and } H))=P(1)=\underline{\ \ \ \ \ \ \ \ \ \ }\)
      8. Summarize the distribution in the table:

        | Heads, x | P(x) |
        |----------|------|
        | 0        |      |
        | 1        |      |
        | 2        |      |
        | Total    |      |

      9. Graph the binomial probability distribution as a histogram.

        [Histogram of the binomial probability distribution P(x) for x = 0, 1, 2]

      10. Estimate the mean of the distribution.
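    The two-coin distribution above can be checked directly with the multiplication and addition rules. A short sketch, using the 60% biased coin from the setup:

    ```python
    # Exact distribution for two biased coins with P(heads) = 0.6.
    p, q = 0.6, 0.4

    # Multiplication rule for independent flips; addition rule for the
    # two mutually exclusive ways to get exactly one head.
    P = {
        0: q * q,          # TT
        1: p * q + q * p,  # HT or TH
        2: p * p,          # HH
    }

    # Mean of the distribution: sum of x * P(x).
    mean = sum(x * prob for x, prob in P.items())
    print(P, mean)
    ```

    The probabilities sum to 1, as they must for any probability distribution, and the mean lands between 1 and 2, closer to 1, matching the shape of the histogram.
    
    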

    Next, we construct the binomial distribution for three biased coins where the probability of a coin landing on heads is 60%.

    \(\begin{aligned}
    & P(0)=P(T \text { and } T \text { and } T)=P(T) \cdot P(T) \cdot P(T)=(0.4)(0.4)(0.4)=1 \cdot(0.4)^3=0.064 \\
    & P(T \text { and } T \text { and } H)=P(T) \cdot P(T) \cdot P(H)=(0.4)(0.4)(0.6)=(0.4)^2(0.6)^1=0.096 \\
    & P(T \text { and } H \text { and } T)=P(T) \cdot P(H) \cdot P(T)=(0.4)(0.6)(0.4)=(0.4)^2(0.6)^1=0.096 \\
    & P(H \text { and } T \text { and } T)=P(H) \cdot P(T) \cdot P(T)=(0.6)(0.4)(0.4)=(0.4)^2(0.6)^1=0.096 \\
    & P(1)=P(T T H \text { or } T H T \text { or } H T T)=3 \cdot(0.4)^2(0.6)^1=3(0.096)=0.288 \\
    & P(T \text { and } H \text { and } H)=P(T) \cdot P(H) \cdot P(H)=(0.4)(0.6)(0.6)=(0.4)^1(0.6)^2=0.144 \\
    & P(H \text { and } T \text { and } H)=P(H) \cdot P(T) \cdot P(H)=(0.6)(0.4)(0.6)=(0.4)^1(0.6)^2=0.144 \\
    & P(H \text { and } H \text { and } T)=P(H) \cdot P(H) \cdot P(T)=(0.6)(0.6)(0.4)=(0.4)^1(0.6)^2=0.144 \\
    & P(2)=P(T H H \text { or } H T H \text { or } H H T)=3 \cdot(0.4)^1(0.6)^2=3(0.144)=0.432 \\
    & P(3)=P(H \text { and } H \text { and } H)=P(H) \cdot P(H) \cdot P(H)=(0.6)(0.6)(0.6)=1 \cdot(0.6)^3=0.216
    \end{aligned}\)
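    The three-coin derivation can be cross-checked by brute force: enumerate all \(2^3\) outcomes, multiply the per-flip probabilities, and group by the number of heads. A minimal sketch:

    ```python
    from itertools import product

    # Per-flip probabilities for the biased coin: P(heads) = 0.6.
    side_prob = {"H": 0.6, "T": 0.4}

    # Enumerate all 2**3 outcomes of three independent flips and
    # accumulate each outcome's probability under its head count.
    dist = {x: 0.0 for x in range(4)}
    for outcome in product("HT", repeat=3):
        prob = 1.0
        for side in outcome:
            prob *= side_prob[side]  # multiplication rule (independence)
        dist[outcome.count("H")] += prob  # addition rule (disjoint outcomes)

    print(dist)
    ```

    The result matches the hand computation: P(0) = 0.064, P(1) = 0.288, P(2) = 0.432, P(3) = 0.216.
    
    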

    1. Let’s notice some patterns: what do you notice about the probabilities when the number of trials is 3, the probability of success is 0.6, and the probability of failure is 0.4?

      \(\begin{aligned}
      & P(0)=1 \cdot(0.6)^0(0.4)^3 \\
      & P(1)=3 \cdot(0.6)^1(0.4)^2 \\
      & P(2)=3 \cdot(0.6)^2(0.4)^1 \\
      & P(3)=1 \cdot(0.6)^3(0.4)^0
      \end{aligned}\)

    2. Below are the probabilities when flipping 4 biased coins where the probability of success (landing on heads) is 0.6 and the probability of failure (landing on tails) is 0.4. Fill in the blanks.

      \[\begin{aligned}
      P(0)&=1 \cdot(0.6)^0(\underline{\ \ \ \ \ \ })^4 \\
      P(1)&=4 \cdot(\underline{\ \ \ \ \ \ })^{\underline{\ \ \ }}(0.4)^3 \\
      P(2)&=6 \cdot(0.6)^{\underline{\ \ \ }}(0.4)^{\underline{\ \ \ }} \\
      P(3)&=4 \cdot(\underline{\ \ \ \ \ \ })^3(\underline{\ \ \ \ \ \ })^1 \\
      P(\underline{\ \ \ \ \ \ })&= \underline{\ \ \ \ \ \ }\cdot(0.6)^4(0.4)^0
      \end{aligned}\]

    The Combination Function

    Notice that there are 6 combinations of 4 coin flips that result in exactly 2 heads: {HHTT, HTHT, HTTH, THTH, TTHH, THHT}. Listing out all combinations of x successes in n trials is complicated and time-consuming. Luckily, mathematicians have derived a formula for counting them. The combination (or choose) function is denoted \({ }_n C_x\) (read "n choose x"). The number of ways to get x successes in n trials is

    \[{ }_n C_x=\frac{n!}{x!(n-x)!}\nonumber\]

    This formula uses the factorial: n! (read "n factorial") means multiply all the whole numbers from n down to 1. For example, \(5!=5 \cdot 4 \cdot 3 \cdot 2 \cdot 1=120\). By convention, 0!=1.

    Let’s calculate \({ }_4 C_2\).

    \[{ }_4 C_2=\frac{4!}{2!\cdot 2!}=\frac{4 \cdot 3 \cdot 2 \cdot 1}{(2 \cdot 1)(2 \cdot 1)}=\frac{4 \cdot 3}{2 \cdot 1}=6\nonumber\]

    1. Rather than doing this calculation by hand, let’s use https://www.desmos.com/calculator to do it for us.
      1. Type in nCr(4,2). You will see that Desmos gives the answer 6, which matches our calculation above.
      2. Compute \({ }_{12} C_9\) by typing nCr(12,9) on line 2.
      3. Compute \({ }_{52} C_{49}\) by typing in nCr(52,49) on line 3.
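    If you prefer to check these in Python rather than Desmos, the standard library's `math.comb` computes the same combination function. A quick sketch:

    ```python
    from math import comb, factorial

    # comb(n, x) computes n! / (x! * (n - x)!), the number of ways
    # to choose which x of the n trials are successes.
    print(comb(4, 2))    # 6
    print(comb(12, 9))   # 220
    print(comb(52, 49))  # 22100

    # Sanity check against the factorial definition for 4 choose 2.
    assert comb(4, 2) == factorial(4) // (factorial(2) * factorial(2))
    ```
    
    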

    Computing Binomial Probability

    Given below is the computation of the probability of flipping exactly two heads in four tosses of the biased coin (probability of heads 0.6).

    \[P(2)={ }_4 C_2(0.6)^2(0.4)^2=6(0.36)(0.16)=0.3456\nonumber\]

    We write P(2) because we are looking for the probability of 2 successes (heads) in 4 trials. The coefficient is 6 because there are 6 combinations in which 2 heads occur in four coin flips. This is multiplied by the probability of success, 0.6, raised to the power 2 (the number of successes), and by the probability of failure, 0.4, raised to the power 2 (the number of failures).

    We now generalize this formula. In a binomial experiment with n independent trials, where p is the probability of success and q=1-p is the probability of failure, the probability of exactly x successes in n trials is

    \[P(x)={ }_n C_x \, p^x q^{n-x}\nonumber\]
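    This general formula translates directly into a short function. A minimal sketch (the function name `binomial_pmf` is our own choice for illustration):

    ```python
    from math import comb

    def binomial_pmf(x, n, p):
        """P(exactly x successes in n independent trials with success probability p)."""
        return comb(n, x) * p**x * (1 - p)**(n - x)

    # Two heads in four flips of the biased coin (p = 0.6):
    print(binomial_pmf(2, 4, 0.6))  # 6 * 0.36 * 0.16 = 0.3456
    ```

    As a check, summing the function over all possible values x = 0, 1, ..., n gives 1, as it must for any probability distribution.
    
    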


    This page titled 4.3: The Binomial Distribution is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by Hannah Seidler-Wright.
