
13.4: Problems on Transform Methods

  • Contributed by Paul Pfeiffer, Professor emeritus (Computational and Applied Mathematics) at Rice University

    Exercise \(\PageIndex{1}\)

    Calculate directly the generating function \(g_X(s)\) for the geometric \((p)\) distribution.

    Answer

    \(g_X (s) = E[s^X] = \sum_{k = 0}^{\infty} p_k s^k = p \sum_{k = 0}^{\infty} q^k s^k = \dfrac{p}{1 - qs}\) (geometric series, since \(p_k = pq^k\))
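A quick numerical sanity check of the closed form (a Python sketch; the values \(p = 0.4\), \(s = 0.6\) are illustrative, not from the text):

```python
# Compare a truncated version of the series sum p*q^k*s^k with p/(1 - q*s).
p, s = 0.4, 0.6
q = 1 - p
series = sum(p * q**k * s**k for k in range(200))  # truncated geometric series
closed = p / (1 - q * s)
assert abs(series - closed) < 1e-12
```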

    Exercise \(\PageIndex{2}\)

    Calculate directly the generating function \(g_X(s)\) for the Poisson \((\mu)\) distribution.

    Answer

    \(g_X (s) = E[s^X] = \sum_{k = 0}^{\infty} p_k s^k = e^{-\mu} \sum_{k = 0}^{\infty} \dfrac{\mu^k s^k}{k!} = e^{-\mu} e^{\mu s} = e^{\mu (s - 1)}\)
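The same kind of check works for the Poisson generating function (Python sketch; \(\mu = 2.5\), \(s = 0.7\) are illustrative):

```python
import math

# Compare a truncated version of the series sum e^{-mu} mu^k s^k / k!
# with the closed form e^{mu(s-1)}.
mu, s = 2.5, 0.7
series = sum(math.exp(-mu) * mu**k * s**k / math.factorial(k) for k in range(80))
closed = math.exp(mu * (s - 1))
assert abs(series - closed) < 1e-12
```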

    Exercise \(\PageIndex{3}\)

    A projection bulb has life (in hours) represented by \(X\) ~ exponential (1/50). The unit will be replaced immediately upon failure or at 60 hours, whichever comes first. Determine the moment generating function for the time \(Y\) to replacement.

    Answer

    \(Y = I_{[0, a]} (X) X + I_{(a, \infty)} (X) a\) \(e^{sY} = I_{[0, a]} (X) e^{sX} + I_{(a, \infty)} (X) e^{as}\)

    \(M_Y (s) = \int_{0}^{a} e^{st} \lambda e^{-\lambda t}\ dt + e^{sa} \int_{a}^{\infty} \lambda e^{-\lambda t}\ dt\), with \(a = 60\), \(\lambda = 1/50\)

    \(= \dfrac{\lambda}{\lambda - s} [1 - e^{-(\lambda - s) a}] + e^{-(\lambda - s) a}\)
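The closed form can be checked against direct numerical integration of \(E[e^{sY}]\) (a Python sketch; \(s = 0.01\) is an illustrative value, while \(a = 60\) and \(\lambda = 1/50\) come from the problem):

```python
import math

lam, a, s = 1/50, 60.0, 0.01
# Midpoint-rule approximation of the integral over [0, a],
# plus e^{sa} * P(X > a) for the replacement-at-a branch.
n = 100000
h = a / n
integral = sum(math.exp(s*(i + 0.5)*h) * lam * math.exp(-lam*(i + 0.5)*h)
               for i in range(n)) * h
numeric = integral + math.exp(s*a) * math.exp(-lam*a)
closed = lam/(lam - s) * (1 - math.exp(-(lam - s)*a)) + math.exp(-(lam - s)*a)
assert abs(numeric - closed) < 1e-8
```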

    Exercise \(\PageIndex{4}\)

    Simple random variable \(X\) has distribution

    \(X =\) [-3 -2 0 1 4] \(PX =\) [0.15 0.20 0.30 0.25 0.10]

    a. Determine the moment generating function for \(X\)
    b. Show by direct calculation that \(M_X' (0) = E[X]\) and \(M_X'' (0) = E[X^2]\).

    Answer

    \(M_X (s) = 0.15 e^{-3s} + 0.20 e^{-2s} + 0.30 + 0.25 e^s + 0.10 e^{4s}\)

    \(M_X' (s) = -3 \cdot 0.15 e^{-3s} - 2 \cdot 0.20 e^{-2s} + 0 + 0.25 e^{s} + 4 \cdot 0.10 e^{4s}\)

    \(M_X''(s) = (-3)^2 \cdot 0.15 e^{-3s} + (-2)^2 \cdot 0.20 e^{-2s} + 0 + 0.25 e^{s} + 4^2 \cdot 0.10 e^{4s}\)

    Setting \(s = 0\) and using \(e^0 = 1\) give the desired results.
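The moment identities can also be confirmed numerically by finite differences of \(M_X\) at \(s = 0\) (a Python sketch of the check, not part of the book's MATLAB toolkit):

```python
import math

X  = [-3, -2, 0, 1, 4]
PX = [0.15, 0.20, 0.30, 0.25, 0.10]
EX  = sum(x * p for x, p in zip(X, PX))        # -0.20
EX2 = sum(x * x * p for x, p in zip(X, PX))    #  4.00

M = lambda s: sum(p * math.exp(s * x) for x, p in zip(X, PX))
h = 1e-4
M1 = (M(h) - M(-h)) / (2 * h)              # central difference for M'(0)
M2 = (M(h) - 2 * M(0) + M(-h)) / h**2      # central difference for M''(0)
assert abs(M1 - EX) < 1e-6 and abs(M2 - EX2) < 1e-4
```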

    Exercise \(\PageIndex{5}\)

    Use the moment generating function to obtain the variances for the following distributions:

    a. Exponential \((\lambda)\)
    b. Gamma \((\alpha, \lambda)\)
    c. Normal \((\mu, \sigma^2)\)

    Answer

    a. Exponential:

    \(M_X(s) = \dfrac{\lambda}{\lambda - s}\) \(M_X'(s) = \dfrac{\lambda}{(\lambda - s)^2}\) \(M_X''(s) = \dfrac{2\lambda}{(\lambda - s)^3}\)

    \(E[X] = \dfrac{\lambda}{\lambda^2} = \dfrac{1}{\lambda}\) \(E[X^2] = \dfrac{2\lambda}{\lambda^3} = \dfrac{2}{\lambda^2}\) \(\text{Var}[X] = \dfrac{2}{\lambda^2} - (\dfrac{1}{\lambda})^2= \dfrac{1}{\lambda^2}\)

    b. Gamma (\(\alpha, \lambda\)):

    \(M_X (s) = (\dfrac{\lambda}{\lambda - s})^{\alpha}\) \(M_X' (s) = \alpha (\dfrac{\lambda}{\lambda - s})^{\alpha - 1}\) \(\dfrac{\lambda}{(\lambda - s)^2} = \alpha (\dfrac{\lambda}{\lambda - s})^{\alpha} \dfrac{1}{\lambda - s}\)

    \(M_X'' (s) = \alpha^2 (\dfrac{\lambda}{\lambda - s})^{\alpha}\dfrac{1}{\lambda - s} \dfrac{1}{\lambda - s} + \alpha (\dfrac{\lambda}{\lambda - s})^{\alpha} \dfrac{1}{(\lambda - s)^2}\)

    \(E[X] =\dfrac{\alpha}{\lambda}\) \(E[X^2] =\dfrac{\alpha^2 + \alpha}{\lambda^2}\) \(\text{Var} [X] = \dfrac{\alpha}{\lambda^2}\)

    c. Normal \((\mu, \sigma^2)\):

    \(M_X (s) = \text{exp} (\dfrac{\sigma^2 s^2}{2} + \mu s)\) \(M_X'(s) = M_X (s) \cdot (\sigma^2 s + \mu)\)

    \(M_X''(s) = M_X (s) \cdot (\sigma^2 s + \mu)^2 + M_X (s) \sigma^2\)

    \(E[X] = \mu\) \(E[X^2] = \mu^2 + \sigma^2\) \(\text{Var} [X] = \sigma^2\)
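All three variance results can be verified by finite differences of the closed-form mgfs (a Python sketch; the parameter values \(\lambda = 2\), \(\alpha = 3\), \(\mu = 1.5\), \(\sigma^2 = 4\) are illustrative):

```python
import math

def mean_var(M, h=1e-4):
    """Approximate E[X] and Var[X] from central differences of an mgf at s = 0."""
    m1 = (M(h) - M(-h)) / (2 * h)
    m2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2
    return m1, m2 - m1**2

lam, alpha, mu, sig2 = 2.0, 3.0, 1.5, 4.0
m, v = mean_var(lambda s: lam / (lam - s))                 # exponential
assert abs(m - 1/lam) < 1e-6 and abs(v - 1/lam**2) < 1e-4
m, v = mean_var(lambda s: (lam / (lam - s))**alpha)        # gamma
assert abs(m - alpha/lam) < 1e-6 and abs(v - alpha/lam**2) < 1e-4
m, v = mean_var(lambda s: math.exp(sig2*s*s/2 + mu*s))     # normal
assert abs(m - mu) < 1e-6 and abs(v - sig2) < 1e-4
```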

    Exercise \(\PageIndex{6}\)

    The pair \(\{X, Y\}\) is iid with common moment generating function \(\dfrac{\lambda^3}{(\lambda - s)^3}\). Determine the moment generating function for \(Z = 2X - 4Y + 3\).

    Answer

    \(M_Z(s) = e^{3s} (\dfrac{\lambda}{\lambda - 2s})^3 (\dfrac{\lambda}{\lambda + 4s})^3\)
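Since the common mgf is that of a gamma \((3, \lambda)\) distribution, \(E[Z] = 2 E[X] - 4 E[Y] + 3 = -6/\lambda + 3\), which a finite-difference check of \(M_Z\) should reproduce. A Python sketch (\(\lambda = 2\) is illustrative, giving \(E[Z] = 0\)):

```python
import math

lam = 2.0
# M_Z(s) = e^{3s} M_X(2s) M_Y(-4s) with M(s) = (lam/(lam - s))^3
MZ = lambda s: math.exp(3*s) * (lam/(lam - 2*s))**3 * (lam/(lam + 4*s))**3
h = 1e-5
EZ = (MZ(h) - MZ(-h)) / (2 * h)            # central difference for M_Z'(0)
assert abs(EZ - (2*3/lam - 4*3/lam + 3)) < 1e-6
```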

    Exercise \(\PageIndex{7}\)

    The pair \(\{X, Y\}\) is iid with common moment generating function \(M_X (s) = (0.6 + 0.4e^s)\). Determine the moment generating function for \(Z = 5X + 2Y\).

    Answer

    \(M_Z (s) = (0.6 + 0.4e^{5s})(0.6 + 0.4e^{2s})\)

    Exercise \(\PageIndex{8}\)

    Use the moment generating function for the symmetric triangular distribution on \((-c, c)\) as derived in the section "Three Basic Transforms".

    a. Obtain an expression for the symmetric triangular distribution on \((a, b)\) for any \(a < b\).
    b. Use the result of part (a) to show that the sum of two independent random variables uniform on \((a, b)\) has symmetric triangular distribution on \((2a, 2b)\).
    Answer

    Let \(m = (a + b)/2\) and \(c = (b - a)/2\). If \(Y\) ~ symmetric triangular on \((-c, c)\), then \(X = Y + m\) is symmetric triangular on \((m - c, m + c) = (a, b)\) and

    \(M_X (s) = e^{ms} M_Y (s) = \dfrac{e^{cs} + e^{-cs} - 2}{c^2s^2} e^{ms} = \dfrac{e^{(m + c)s} + e^{(m - c)s} - 2e^{ms}}{c^2s^2} = \dfrac{e^{bs} + e^{as} - 2e^{\dfrac{a+b}{2}s}}{(\dfrac{b - a}{2})^2s^2}\)

    \(M_{X + Y} (s) = [\dfrac{e^{sb} - e^{sa}}{s(b - a)}]^2 = \dfrac{e^{s2b}+ e^{s2a} - 2e^{s(b + a)}}{s^2 (b - a)^2}\)
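The two expressions agree numerically, confirming that the squared uniform mgf matches the part (a) formula on \((2a, 2b)\). A Python sketch (\(a = 1\), \(b = 4\), \(s = 0.3\) are illustrative):

```python
import math

a, b, s = 1.0, 4.0, 0.3
# Square of the uniform-(a, b) mgf
unif2 = ((math.exp(s*b) - math.exp(s*a)) / (s * (b - a)))**2
# Part (a) formula for the symmetric triangular distribution on (2a, 2b)
a2, b2 = 2*a, 2*b
c, m = (b2 - a2) / 2, (a2 + b2) / 2
tri = (math.exp(b2*s) + math.exp(a2*s) - 2*math.exp(m*s)) / (c*c*s*s)
assert abs(unif2 - tri) < 1e-10
```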

    Exercise \(\PageIndex{9}\)

    Random variable \(X\) has moment generating function \(\dfrac{p^2}{(1 - qe^s)^2}\).

    a. Use derivatives to determine \(E[X]\) and \(\text{Var} [X]\).

    b. Recognize the distribution from the form and compare \(E[X]\) and \(\text{Var} [X]\) with the result of part (a).

    Answer

    \([p^2 (1 - qe^s)^{-2}]' = \dfrac{2p^2qe^s}{(1 - qe^s)^3}\) so that \(E[X] = 2q/p\)

    \([p^2 (1 - qe^s)^{-2}]'' = \dfrac{6p^2 q^2 e^{2s}}{(1 - qe^s)^4} + \dfrac{2p^2qe^s}{(1 - qe^s)^3}\) so that \(E[X^2] = \dfrac{6q^2}{p^2} + \dfrac{2q}{p}\)

    \(\text{Var} [X] = \dfrac{2q^2}{p^2} + \dfrac{2q}{p} = \dfrac{2(q^2 + pq)}{p^2} = \dfrac{2q}{p^2}\)

    \(X\) ~ negative binomial \((2, p)\), which has \(E[X] = 2q/p\) and \(\text{Var} [X] = 2q/p^2\).

    Exercise \(\PageIndex{10}\)

    The pair \(\{X, Y\}\) is independent. \(X\) ~ Poisson (4) and \(Y\) ~ geometric (0.3). Determine the generating function \(g_Z\) for \(Z = 3X + 2Y\).

    Answer

    \(g_Z (s) = g_X (s^3) g_Y (s^2) = e^{4(s^3-1)} \cdot \dfrac{0.3}{1 - 0.7s^2}\)
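The rule \(g_{aX+bY}(s) = g_X(s^a) g_Y(s^b)\) can be checked by summing \(E[s^{3X + 2Y}]\) directly over the joint distribution (a Python sketch; \(s = 0.6\) is illustrative):

```python
import math

mu, p, s = 4.0, 0.3, 0.6
q = 1 - p
# Truncated double series: sum over P(X = k) P(Y = j) s^{3k + 2j}
direct = sum(math.exp(-mu) * mu**k / math.factorial(k) * p * q**j * s**(3*k + 2*j)
             for k in range(60) for j in range(200))
closed = math.exp(mu * (s**3 - 1)) * p / (1 - q * s**2)
assert abs(direct - closed) < 1e-10
```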

    Exercise \(\PageIndex{11}\)

    Random variable \(X\) has moment generating function

    \(M_X (s) = \dfrac{1}{1 - 3s} \cdot \text{exp} (16s^2/2 + 3s)\)

    By recognizing forms and using rules of combinations, determine \(E[X]\) and \(\text{Var} [X]\).

    Answer

    \(X = X_1 + X_2\) with \(X_1\) ~ exponential (1/3) \(X_2\) ~ \(N\)(3, 16)

    \(E[X] = 3 + 3 = 6\) \(\text{Var} [X] = 9 + 16 = 25\)

    Exercise \(\PageIndex{12}\)

    Random variable \(X\) has moment generating function

    \(M_X (s) = \dfrac{\text{exp} (3(e^s - 1))}{1 - 5s} \cdot \text{exp} (16s^2/2 + 3s)\)

    By recognizing forms and using rules of combinations, determine \(E[X]\) and \(\text{Var} [X]\).

    Answer

    \(X = X_1 + X_2 + X_3\), with \(X_1\) ~ Poisson (3), \(X_2\) ~ exponential (1/5), \(X_3\) ~ \(N\) (3, 16)

    \(E[X] = 3 + 5 + 3 = 11\) \(\text{Var} [X] = 3 + 25 + 16 = 44\)

    Exercise \(\PageIndex{13}\)

    Suppose the class \(\{A, B, C\}\) of events is independent, with respective probabilities 0.3, 0.5, 0.2. Consider

    \(X = -3I_A + 2I_B + 4I_C\)

    a. Determine the moment generating functions for \(-3I_A\), \(2I_B\), and \(4I_C\), and use properties of moment generating functions to determine the moment generating function for \(X\).
    b. Use the moment generating function to determine the distribution for \(X\).
    c. Use canonic to determine the distribution. Compare with result (b).
    d. Use distributions for the separate terms; determine the distribution for the sum with mgsum3. Compare with result (b).

    Answer

    \(M_X (s) = (0.7 + 0.3 e^{-3s})(0.5 + 0.5 e^{2s}) (0.8 + 0.2 e^{4s}) =\)

    \(0.12 e^{-3s} + 0.12 e^{-s} + 0.28 + 0.03 e^{s} + 0.28 e^{2s} + 0.03 e^{3s} + 0.07 e^{4s} + 0.07 e^{6s}\)

    The distribution is

    \(X = \) [-3 -1 0 1 2 3 4 6] \(PX =\) [0.12 0.12 0.28 0.03 0.28 0.03 0.07 0.07]

    c = [-3 2 4 0];
    P = 0.1*[3 5 2];
    canonic
     Enter row vector of coefficients  c
     Enter row vector of minterm probabilities  minprob(P)
    Use row matrices X and PX for calculations
    Call for XDBN to view the distribution
    P1 = [0.7 0.3];
    P2 = [0.5 0.5];
    P3 = [0.8 0.2];
    X1 = [0 -3];
    X2 = [0 2];
    X3 = [0 4];
    [x,px] = mgsum3(X1,X2,X3,P1,P2,P3);
    disp([X;PX;x;px]')
       -3.0000    0.1200   -3.0000    0.1200
       -1.0000    0.1200   -1.0000    0.1200
             0    0.2800         0    0.2800
        1.0000    0.0300    1.0000    0.0300
        2.0000    0.2800    2.0000    0.2800
        3.0000    0.0300    3.0000    0.0300
        4.0000    0.0700    4.0000    0.0700
        6.0000    0.0700    6.0000    0.0700 
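The same distribution can be reproduced outside MATLAB. The sketch below is a Python analogue of what mgsum3 computes (total enumeration over the three independent indicator terms; mgsum3 itself is the book's MATLAB function):

```python
from itertools import product

# Each term contributes 0 or its coefficient, with the stated probabilities.
terms = [([0, -3], [0.7, 0.3]),   # -3*I_A
         ([0,  2], [0.5, 0.5]),   #  2*I_B
         ([0,  4], [0.8, 0.2])]   #  4*I_C
dist = {}
for combo in product(*[list(zip(v, p)) for v, p in terms]):
    z = sum(x for x, _ in combo)
    pr = 1.0
    for _, pp in combo:
        pr *= pp
    dist[z] = dist.get(z, 0.0) + pr

expected = {-3: 0.12, -1: 0.12, 0: 0.28, 1: 0.03,
            2: 0.28, 3: 0.03, 4: 0.07, 6: 0.07}
assert all(abs(dist[z] - p) < 1e-12 for z, p in expected.items())
```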

    Exercise \(\PageIndex{14}\)

    Suppose the pair \(\{X, Y\}\) is independent, with both \(X\) and \(Y\) binomial. Use generating functions to show under what condition, if any, \(X + Y\) is binomial.

    Answer

    Binomial iff both have same \(p\), as shown below.

    \(g_{X + Y} (s) = (q_1 + p_1 s)^n (q_2 + p_2s)^m = (q + ps)^{n + m}\) iff \(p_1 = p_2\)
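The common-\(p\) condition can be verified numerically: convolving two binomial pmfs with the same \(p\) reproduces the binomial \((n + m, p)\) pmf. A Python sketch (\(n = 4\), \(m = 6\), \(p = 0.3\) are illustrative):

```python
from math import comb

def binom_pmf(n, p, k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, m, p = 4, 6, 0.3
# Convolution of binomial(n, p) and binomial(m, p)
conv = [sum(binom_pmf(n, p, j) * binom_pmf(m, p, k - j)
            for j in range(max(0, k - m), min(n, k) + 1))
        for k in range(n + m + 1)]
assert all(abs(conv[k] - binom_pmf(n + m, p, k)) < 1e-12
           for k in range(n + m + 1))
```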

    Exercise \(\PageIndex{15}\)

    Suppose the pair \(\{X, Y\}\) is independent, with both \(X\) and \(Y\) Poisson.

    a. Use generating functions to show under what condition \(X + Y\) is Poisson.
    b. What about \(X - Y\)? Justify your answer.

    Answer

    Always Poisson, as the argument below shows.

    \(g_{X + Y} (s) = e^{\mu(s - 1)} e^{v(s - 1)} = e^{(\mu + v) (s - 1)}\)

    However, \(X - Y\) could have negative values, so it cannot be Poisson.

    Exercise \(\PageIndex{16}\)

    Suppose the pair \(\{X, Y\}\) is independent, \(Y\) is nonnegative integer-valued, \(X\) is Poisson and \(X + Y\) is Poisson. Use the generating functions to show that \(Y\) is Poisson.

    Answer

    \(E[X+Y] = \mu + v\), where \(v = E[Y] > 0\), \(g_X (s) = e^{\mu(s - 1)}\) and \(g_{X + Y} (s) = g_X (s) g_Y (s) = e^{(\mu + v) (s - 1)}\). Division by \(g_X (s)\) gives \(g_Y (s) = e^{v(s - 1)}\).

    Exercise \(\PageIndex{17}\)

    Suppose the pair \(\{X, Y\}\) is iid, binomial (6, 0.51). By the result of Exercise 13.4.14, \(X + Y\) is binomial. Use mgsum to obtain the distribution for \(Z = 2X + 4Y\). Does \(Z\) have the binomial distribution? Is the result surprising? Examine the first few possible values for \(Z\). Write the generating function for \(Z\); does it have the form for the binomial distribution?

    Answer
    x  = 0:6;
    px = ibinom(6,0.51,x);
    [Z,PZ] = mgsum(2*x,4*x,px,px);
    disp([Z(1:5);PZ(1:5)]')
             0    0.0002       % Cannot be binomial, since odd values missing
        2.0000    0.0012
        4.0000    0.0043
        6.0000    0.0118
        8.0000    0.0259
        - - - - - - - -

    \(g_X (s) = g_Y (s) = (0.49 + 0.51s)^6\) \(g_Z (s) = (0.49 + 0.51s^2)^6 (0.49 + 0.51s^4)^6\)

    Exercise \(\PageIndex{18}\)

    Suppose the pair \(\{X, Y\}\) is independent, with \(X\) ~ binomial (5, 0.33) and \(Y\) ~ binomial (7, 0.47).

    Let \(G = g(X) = 3X^2 - 2X\) and \(H = h(Y) = 2Y^2 + Y + 3\).

    a. Use mgsum to obtain the distribution for \(G + H\).
    b. Use icalc and csort to obtain the distribution for \(G + H\) and compare with the result of part (a).

    Answer
    X = 0:5;
    Y = 0:7;
    PX = ibinom(5,0.33,X);
    PY = ibinom(7,0.47,Y);
    G = 3*X.^2 - 2*X;
    H = 2*Y.^2 + Y + 3;
    [Z,PZ] = mgsum(G,H,PX,PY);
     
     
    icalc
    Enter row matrix of X-values  X
    Enter row matrix of Y-values  Y
    Enter X probabilities  PX
    Enter Y probabilities  PY
     Use array operations on matrices X, Y, PX, PY, t, u, and P
    M = 3*t.^2 - 2*t + 2*u.^2 + u + 3;
    [z,pz] = csort(M,P);
    e = max(abs(pz - PZ))  % Comparison of p values
    e =  0

    Exercise \(\PageIndex{19}\)

    Suppose the pair \(\{X, Y\}\) is independent, with \(X\) ~ binomial (8, 0.39) and \(Y\) ~ uniform on {-1.3, -0.5, 1.3, 2.2, 3.5}. Let

    \(U = 3X^2 - 2X + 1\) and \(V = Y^3 + 2Y - 3\)

    a. Use mgsum to obtain the distribution for \(U + V\).
    b. Use icalc and csort to obtain the distribution for \(U + V\) and compare with the result of part (a).

    Answer
    X = 0:8;
    Y = [-1.3 -0.5 1.3 2.2 3.5];
    PX = ibinom(8,0.39,X);
    PY = (1/5)*ones(1,5);
    U  = 3*X.^2 - 2*X + 1;
    V  = Y.^3 + 2*Y - 3;
    [Z,PZ] = mgsum(U,V,PX,PY);
    icalc
    Enter row matrix of X-values  X
    Enter row matrix of Y-values  Y
    Enter X probabilities  PX
    Enter Y probabilities  PY
     Use array operations on matrices X, Y, PX, PY, t, u, and P
    M = 3*t.^2 - 2*t + 1 + u.^3 + 2*u - 3;
    [z,pz] = csort(M,P);
    e = max(abs(pz - PZ))
    e = 0

    Exercise \(\PageIndex{20}\)

    If \(X\) is a nonnegative integer-valued random variable, express the generating function as a power series.

    a. Show that the \(k\)th derivative at \(s = 1\) is

    \(g_X^{(k)} (1) = E[X(X - 1)(X - 2) \cdots (X - k + 1)]\)

    b. Use this to show that \(\text{Var} [X] = g_X''(1) + g_X'(1) - [g_X'(1)]^2\).

    Answer

    Since power series may be differentiated term by term

    \(g_X^{(n)} (s) = \sum_{k = 0}^{\infty} k (k - 1) \cdots (k - n + 1) p_k s^{k - n}\) so that

    \(g_X^{(n)} (1) = \sum_{k = 0}^{\infty} k(k - 1) \cdots (k - n + 1) p_k = E[X(X - 1) \cdots (X - n + 1)]\)

    \(\text{Var} [X] = E[X^2] - E^2[X] = E[X(X - 1)] + E[X] - E^2[X] = g_X''(1) + g_X' (1) - [g_X'(1)]^2\)
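For a concrete instance, take \(X\) ~ Poisson \((\mu)\), where \(g_X(s) = e^{\mu(s-1)}\), so the formula should give \(\text{Var}[X] = \mu\). A Python check using central differences at \(s = 1\) (\(\mu = 3\) is illustrative):

```python
import math

mu = 3.0
g = lambda s: math.exp(mu * (s - 1))   # Poisson generating function
h = 1e-4
g1 = (g(1 + h) - g(1 - h)) / (2 * h)           # g'(1)  = mu
g2 = (g(1 + h) - 2 * g(1) + g(1 - h)) / h**2   # g''(1) = mu^2
var = g2 + g1 - g1**2
assert abs(g1 - mu) < 1e-4 and abs(var - mu) < 1e-3
```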

    Exercise \(\PageIndex{21}\)

    Let \(M_X (\cdot)\) be the moment generating function for \(X\).

    a. Show that \(\text{Var}[X]\) is the second derivative of \(e^{-s\mu} M_X(s)\) evaluated at \(s = 0\).
    b. Use this fact to show that if \(X\) ~ \(N(\mu, \sigma^2)\), then \(\text{Var} [X] = \sigma^2\).

    Answer

    \(f(s) = e^{-s \mu} M_X (s)\) \(f''(s) = e^{-s\mu} [-\mu M_X' (s) + \mu^2 M_X (s) + M_X''(s) - \mu M_X'(s)]\)

    Setting \(s = 0\) and using the result on moments gives

    \(f''(0) = -\mu^2 + \mu^2 + E[X^2] - \mu^2 = \text{Var} [X]\)

    Exercise \(\PageIndex{22}\)

    Use derivatives of the moment generating function \(M_X (s)\) to obtain the mean and variance of the negative binomial (\(m, p\)) distribution.

    Answer

    To simplify writing, use \(f(s)\) for \(M_X (s)\).

    \(f(s) = \dfrac{p^m}{(1 - qe^s)^m}\) \(f'(s) = \dfrac{mp^mqe^s}{(1 - qe^s)^{m + 1}}\) \(f''(s) = \dfrac{mp^m qe^s}{(1 - qe^s)^{m + 1}} + \dfrac{m(m+1) p^m q^2 e^{2s}}{(1 - qe^s)^{m + 2}}\)

    \(E[X] = \dfrac{mp^m q}{(1 - q)^{m + 1}} = \dfrac{mq}{p}\) \(E[X^2] = \dfrac{mq}{p} + \dfrac{m(m+1)p^mq^2}{(1-q)^{m + 2}} = \dfrac{mq}{p} + \dfrac{m(m+1)q^2}{p^2}\)

    \(\text{Var} [X] = \dfrac{mq}{p} + \dfrac{m(m + 1) q^2}{p^2} - \dfrac{m^2 q^2}{p^2} = \dfrac{mq}{p^2}\)

    Exercise \(\PageIndex{23}\)

    Use moment generating functions to show that variances add for the sum or difference of independent random variables.

    Answer

    To simplify writing, set \(f(s) = M_X (s)\), \(g(s) = M_Y (s)\), and \(h(s) = M_X (s) M_Y(s)\)

    \(h'(s) = f'(s) g(s) + f(s) g'(s)\) \(h''(s) = f''(s) g(s) + f'(s) g'(s) + f'(s) g'(s) + f(s) g''(s)\)

    Setting \(s = 0\) yields

    \(E[X + Y] = E[X] + E[Y]\) \(E[(X + Y)^2] = E[X^2] + 2E[X]E[Y] + E[Y^2]\) \(E^2 [X + Y] = E^2[X] + 2E[X] E[Y] + E^2[Y]\)

    Taking the difference gives \(\text{Var}[X + Y] = \text{Var} [X] + \text{Var} [Y]\). A similar treatment with \(g(s)\) replaced by \(g(-s)\) shows \(\text{Var} [X - Y] = \text{Var} [X] + \text{Var} [Y]\).

    Exercise \(\PageIndex{24}\)

    The pair \(\{X, Y\}\) is iid \(N\)(3, 5). Use the moment generating function to show that \(Z = 3X - 2Y + 3\) is normal (see Example 3 from "Transform Methods" for the general result).

    Answer

    \(M_{3X} (s) = M_X (3s) = \text{exp} (\dfrac{9 \cdot 5s^2}{2} + 3 \cdot 3s)\) \(M_{-2Y} (s) = M_Y(-2s) = \text{exp} (\dfrac{4 \cdot 5s^2}{2} - 2 \cdot 3s)\)

    \(M_Z (s) = e^{3s} \text{exp} (\dfrac{(45 + 20)s^2}{2} + (9 - 6) s) = \text{exp} (\dfrac{65s^2}{2} + 6s)\)

    Exercise \(\PageIndex{25}\)

    Use the central limit theorem to show that for large enough sample size (usually 20 or more), the sample average

    \(A_n = \dfrac{1}{n} \sum_{i = 1}^{n} X_i\)

    is approximately \(N(\mu, \sigma^2/n)\) for any reasonable population distribution having mean value \(\mu\) and variance \(\sigma^2\).

    Answer

    \(E[A_n] = \dfrac{1}{n} \sum_{i = 1}^{n} \mu = \mu\) \(\text{Var} [A_n] = \dfrac{1}{n^2} \sum_{i = 1}^{n} \sigma^2 = \dfrac{\sigma^2}{n}\)

    By the central limit theorem, \(A_n\) is approximately normal, with the mean and variance above.

    Exercise \(\PageIndex{26}\)

    A population has standard deviation approximately three. It is desired to determine the sample size n needed to ensure that with probability 0.95 the sample average will be within 0.5 of the mean value.

    a. Use the Chebyshev inequality to estimate the needed sample size.
    b. Use the normal approximation to estimate \(n\) (see Example 1 from "Simple Random Samples and Statistics").
    Answer

    Chebyshev inequality:

    \(P(\dfrac{|A_n - \mu|}{\sigma/\sqrt{n}} \ge \dfrac{0.5 \sqrt{n}}{3}) \le \dfrac{3^2}{0.5^2 n} \le 0.05\) implies \(n \ge 720\)

    Normal approximation: Use of the table in Example 1 from "Simple Random Samples and Statistics" shows

    \(n \ge (3/0.5)^2 \cdot 3.84 = 138.24\), so take \(n = 139\)
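Both bounds can be reproduced directly (a Python sketch; \(z = 1.96\) is the approximate 0.975 quantile of the standard normal):

```python
import math

sigma, eps, alpha = 3.0, 0.5, 0.05
# Chebyshev: P(|A_n - mu| >= eps) <= sigma^2 / (eps^2 n) <= alpha
n_cheb = math.ceil(sigma**2 / (eps**2 * alpha))
# Normal approximation: need z * sigma / sqrt(n) <= eps
z = 1.96
n_norm = math.ceil((z * sigma / eps)**2)
assert n_cheb == 720
assert n_norm == 139
```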
