14 Problems Solved
Congratulations on reaching the final tutorial! This sheet covers advanced topics like Functions of Random Variables, Moment Generating Functions (MGFs), and Inequalities. These tools are the backbone of statistical inference and reliability engineering.
Find the PDF of \( e^X \) in terms of the PDF of \( X \). Specialize for \( X \sim U(0,1) \).
Use the change of variables formula: \( f_Y(y) = f_X(g^{-1}(y)) \left| \frac{d}{dy} g^{-1}(y) \right| \).
Let \( Y = e^X \), so \( X = \ln Y \).
The derivative is \( \frac{dx}{dy} = \frac{1}{y} \).
\[ f_Y(y) = f_X(\ln y) \cdot \frac{1}{y} \]
Special Case: \( X \sim U(0,1) \)
\( f_X(x) = 1 \) for \( 0 < x < 1 \).
As \( x \) ranges from 0 to 1, \( y = e^x \) ranges from 1 to \( e \). Therefore \( f_Y(y) = \frac{1}{y} \) for \( 1 < y < e \), and 0 otherwise.
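The special case can be sanity-checked numerically. The sketch below (assuming NumPy; the seed and sample size are arbitrary) compares the empirical CDF of \( e^X \) with \( F_Y(y) = \ln y \), the integral of \( 1/t \) from 1 to \( y \):

```python
import numpy as np

# If f_Y(y) = 1/y on (1, e), the CDF is F_Y(y) = ln(y).
# Compare that against the empirical CDF of e^X for X ~ U(0, 1).
rng = np.random.default_rng(0)
y_samples = np.exp(rng.uniform(0.0, 1.0, size=200_000))

for y in (1.5, 2.0, 2.5):
    empirical = np.mean(y_samples <= y)   # empirical CDF at y
    theoretical = np.log(y)               # ∫ 1/t dt from 1 to y
    print(f"F_Y({y}): empirical={empirical:.4f} theoretical={theoretical:.4f}")
```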
If \( X \sim U(-1,1) \), find the PDF of \( Y = -\ln|X| \).
First find the distribution of \( W = |X| \). If \( X \) is uniform on \([-1,1]\), what is \( |X| \) uniform on?
1. Let \( W = |X| \). Since \( X \) is uniform on \([-1,1]\), its absolute value \( W \) is uniform on \([0,1]\). So \( f_W(w) = 1 \).
2. Now \( Y = -\ln W \), so \( W = e^{-Y} \). The absolute derivative is \( \left| \frac{dw}{dy} \right| = |-e^{-y}| = e^{-y} \).
3. Transformation formula: \( f_Y(y) = f_W(e^{-y}) \cdot e^{-y} = 1 \cdot e^{-y} = e^{-y} \) for \( y > 0 \); that is, \( Y \sim \text{Exp}(1) \).
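A quick Monte Carlo check (a sketch assuming NumPy; the seed and sample size are arbitrary): an Exp(1) variable has mean 1 and variance 1, which \( -\ln|X| \) should reproduce.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, size=200_000)   # X ~ U(-1, 1)
y = -np.log(np.abs(x))                     # Y = -ln|X|

# Exp(1) has mean 1 and variance 1.
print(np.mean(y), np.var(y))
```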
Find the PDF of \( Z = X+Y, Z=X-Y, Z=XY, Z=X/Y \) in terms of PDFs of independent \( X, Y \).
Use convolution for the sum; the difference, product, and ratio follow from analogous change-of-variables integrals.
Independent random variables allow the joint PDF to be expressed as the product of marginals: \( f(x,y) = f_X(x)f_Y(y) \).
Sum \( Z = X+Y \):
\[ f_Z(z) = \int_{-\infty}^{\infty} f_X(x)f_Y(z-x) \, dx \]
Difference \( Z = X-Y \):
\[ f_Z(z) = \int_{-\infty}^{\infty} f_X(x)f_Y(x-z) \, dx \]
Product \( Z = XY \):
\[ f_Z(z) = \int_{-\infty}^{\infty} f_X(x)f_Y(z/x) \frac{1}{|x|} \, dx \]
Ratio \( Z = X/Y \):
\[ f_Z(z) = \int_{-\infty}^{\infty} f_X(zy)f_Y(y) |y| \, dy \]
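These formulas can be exercised numerically. The sketch below (Python; the grid size is arbitrary) evaluates the sum formula for \( X, Y \sim U(0,1) \) by a midpoint Riemann sum, recovering the triangular density on \( (0, 2) \):

```python
import numpy as np

def f_uniform(t):
    # Density of U(0, 1): 1 on the open interval, 0 elsewhere.
    return ((0.0 < t) & (t < 1.0)).astype(float)

def f_sum(z, n=100_000):
    # f_Z(z) = ∫ f_X(x) f_Y(z - x) dx, approximated on a midpoint grid of (0, 1)
    x = (np.arange(n) + 0.5) / n
    return np.mean(f_uniform(x) * f_uniform(z - x))   # f_X = 1 on (0,1), dx = 1/n

# Triangular density: rises to 1 at z = 1, falls back to 0 at z = 2.
print(f_sum(0.5), f_sum(1.0), f_sum(1.5))   # ≈ 0.5, 1.0, 0.5
```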
Prove: (a) \( M_{aX}(t) = M_X(at) \), (b) \( M_{X+a}(t) = e^{at}M_X(t) \).
Start from the definition \( M_X(t) = E[e^{tX}] \) and apply properties of expectation.
Proof (a):
\[ M_{aX}(t) = E[e^{t(aX)}] = E[e^{(at)X}] = M_X(at) \]
Proof (b):
\[ M_{X+a}(t) = E[e^{t(X+a)}] = E[e^{tX} \cdot e^{at}] \]
Since \( e^{at} \) is constant with respect to the expectation:
\[ = e^{at} E[e^{tX}] = e^{at} M_X(t) \]
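Both identities can be spot-checked against a distribution whose MGF is known in closed form; the standard normal, with \( M_X(t) = e^{t^2/2} \), is used below purely for illustration (the values of \( a \) and \( t \) are arbitrary):

```python
import math

# For X ~ N(0, 1), M_X(t) = exp(t^2 / 2).
def M_X(t):
    return math.exp(t * t / 2.0)

a, t = 3.0, 0.7
# (a) aX ~ N(0, a^2), whose MGF is exp(a^2 t^2 / 2); this should equal M_X(at).
assert math.isclose(math.exp(a * a * t * t / 2.0), M_X(a * t))
# (b) X + a ~ N(a, 1), whose MGF is exp(at + t^2/2); this should equal e^{at} M_X(t).
assert math.isclose(math.exp(a * t + t * t / 2.0), math.exp(a * t) * M_X(t))
print("MGF identities verified")
```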
Find \( Cov(X,Y) \) for \( f(x,y) = x+y, 0 < x,y < 1 \).
Calculate \( E[X], E[Y], E[XY] \) using double integration.
1. \( E[X] = \int_0^1 \int_0^1 x(x+y) \, dx \, dy = \int_0^1 \left( \tfrac{1}{3} + \tfrac{y}{2} \right) dy = 7/12 \). By symmetry, \( E[Y] = 7/12 \).
2. \( E[XY] = \int_0^1 \int_0^1 xy(x+y) \, dx \, dy = \int_0^1 (y/3 + y^2/2) \, dy = 1/3 \).
3. \( Cov(X,Y) = 1/3 - (7/12)^2 = 48/144 - 49/144 = -1/144 \).
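The three integrals can be confirmed by a simple midpoint-rule quadrature over the unit square (a sketch in Python with NumPy; the grid size \( n = 400 \) is arbitrary):

```python
import numpy as np

n = 400
x = (np.arange(n) + 0.5) / n   # midpoints of an n-cell grid on (0, 1)
X, Y = np.meshgrid(x, x)
f = X + Y                      # joint density f(x, y) = x + y
dA = 1.0 / n**2                # area of one grid cell

EX  = np.sum(X * f) * dA       # expect 7/12
EXY = np.sum(X * Y * f) * dA   # expect 1/3
cov = EXY - EX * EX            # E[Y] = E[X] by symmetry; expect -1/144
print(cov)
```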
X and Y are independent with \( g(x) = 8/x^3, x > 2 \) and \( h(y) = 2y, 0 < y < 1 \). Find \( E[XY] \).
For independent variables, \( E[XY] = E[X]E[Y] \).
By definition, the expected value of the product is \( E[XY] = \iint xy f(x,y) \, dx \, dy \), which factors under independence.
\[ E[X] = \int_2^\infty x \cdot \frac{8}{x^3} \, dx = [-8/x]_2^\infty = 4 \]
\[ E[Y] = \int_0^1 y \cdot 2y \, dy = [2y^3/3]_0^1 = 2/3 \]
Therefore \( E[XY] = E[X]E[Y] = 4 \cdot \frac{2}{3} = \frac{8}{3} \).
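Both integrals can be checked numerically; the improper integral for \( E[X] \) is first mapped onto \( (0,1) \) by the substitution \( x = 2/\sqrt{u} \), so that \( dx = -u^{-3/2}\,du \) and the integrand \( x \cdot 8/x^3 = 8/x^2 \) becomes \( 2u \). A sketch in Python with NumPy (grid size arbitrary):

```python
import numpy as np

n = 1_000_000
u = (np.arange(n) + 0.5) / n   # midpoint grid on (0, 1)

# E[X] = ∫_2^∞ x · 8/x³ dx = ∫_0^1 2u · u^{-3/2} du = ∫_0^1 2/√u du
EX = np.mean(2.0 / np.sqrt(u))
# E[Y] = ∫_0^1 y · 2y dy
EY = np.mean(2.0 * u**2)

print(EX, EY, EX * EY)         # ≈ 4, 2/3, 8/3
```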
Red die (X) and Green die (Y) are tossed. Find \( E[X+Y], E[X-Y], E[XY] \).
Each die has \( E[X] = E[Y] = 3.5 \). By linearity, \( E[X+Y] = 3.5 + 3.5 = 7 \) and \( E[X-Y] = 3.5 - 3.5 = 0 \). Since the dice are independent, \( E[XY] = E[X]E[Y] = (3.5)^2 = 12.25 \).
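Since the sample space is only 36 equally likely outcomes, the expectations can be computed exactly by enumeration (a Python sketch):

```python
from itertools import product

# All 36 equally likely (red, green) outcomes.
outcomes = list(product(range(1, 7), repeat=2))
E = lambda g: sum(g(x, y) for x, y in outcomes) / 36

print(E(lambda x, y: x + y))   # 7.0
print(E(lambda x, y: x - y))   # 0.0
print(E(lambda x, y: x * y))   # 12.25
```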
Find the Moment Generating Function of \( X \) uniform on \( \{1, 2, \dots, k\} \), i.e., \( P(X = x) = 1/k \).
This is a finite geometric series with common ratio \( e^t \).
\[ M_X(t) = \frac{1}{k} \sum_{x=1}^k e^{tx} \]
Using the geometric series formula \( \sum_{n=1}^{N} ar^{n-1} = \frac{a(1-r^N)}{1-r} \) with \( a = r = e^t \):
\[ M_X(t) = \frac{e^t(1 - e^{kt})}{k(1 - e^t)}, \quad t \neq 0, \]
with \( M_X(0) = 1 \).
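The closed form can be checked against the direct sum; \( k = 6 \) and \( t = 0.3 \) below are arbitrary illustration values:

```python
import math

k, t = 6, 0.3
# Direct sum: (1/k) Σ e^{tx} over x = 1..k.
direct = sum(math.exp(t * x) for x in range(1, k + 1)) / k
# Closed form from the geometric series.
closed = math.exp(t) * (1 - math.exp(k * t)) / (k * (1 - math.exp(t)))
print(direct, closed)
```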
Find the MGF of the geometric distribution \( P(X=x) = pq^{x-1}, \; x = 1, 2, \dots \), where \( q = 1-p \). Use it to find the mean and variance.
Infinite geometric series sum: \( S = \frac{a}{1-r} \).
\[ M_X(t) = \sum_{x=1}^\infty e^{tx} pq^{x-1} = p e^t \sum_{x=1}^\infty (qe^t)^{x-1} = \frac{pe^t}{1-qe^t} \]
Differentiating at \( t=0 \):
\[ M_X'(t) = \frac{pe^t}{(1-qe^t)^2} \implies E[X] = M_X'(0) = \frac{p}{(1-q)^2} = \frac{1}{p} \]
\[ M_X''(0) = \frac{1+q}{p^2} \implies \text{Var}(X) = E[X^2] - (E[X])^2 = \frac{1+q}{p^2} - \frac{1}{p^2} = \frac{q}{p^2} \]
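A Monte Carlo check of the mean and variance (assuming NumPy; \( p = 0.3 \), the seed, and the sample size are arbitrary). NumPy's geometric sampler uses the same support \( \{1, 2, \dots\} \) as the pmf above:

```python
import numpy as np

rng = np.random.default_rng(2)
p = 0.3
x = rng.geometric(p, size=500_000)   # pmf p q^{x-1} on {1, 2, ...}

print(np.mean(x), 1 / p)             # both ≈ 3.33
print(np.var(x), (1 - p) / p**2)     # both ≈ 7.78
```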
Find the MGF and calculate the mean/variance for the Poisson distribution \( P(x) = \frac{e^{-\lambda}\lambda^x}{x!} \).
Use the series definition: \( e^u = \sum \frac{u^x}{x!} \).
\[ M_X(t) = \sum e^{tx} \frac{e^{-\lambda}\lambda^x}{x!} = e^{-\lambda} \sum \frac{(\lambda e^t)^x}{x!} = e^{\lambda(e^t - 1)} \]
Differentiating:
\[ M_X'(t) = \lambda e^t e^{\lambda(e^t - 1)} \implies E[X] = M_X'(0) = \lambda \]
\[ M_X''(0) = \lambda + \lambda^2 \implies \text{Var}(X) = \lambda + \lambda^2 - \lambda^2 = \lambda \]
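The derivative values can be verified by central finite differences of the MGF at \( t = 0 \) (a Python sketch; \( \lambda = 2 \) and the step size are arbitrary):

```python
import math

lam = 2.0
M = lambda t: math.exp(lam * (math.exp(t) - 1.0))   # Poisson MGF

h = 1e-5
first = (M(h) - M(-h)) / (2 * h)              # ≈ M'(0) = λ = 2
second = (M(h) - 2 * M(0.0) + M(-h)) / h**2   # ≈ M''(0) = λ + λ² = 6
print(first, second)
```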
If average life span is 75 years, find the upper bound on the probability of living to 110.
Markov's inequality only requires the mean.
\[ P(X \ge 110) \le \frac{75}{110} = \frac{15}{22} \approx 0.68 \]
Previous problem, but with SD = 5 years. Find the Chebyshev bound for living to 110.
Calculate how many standard deviations (\( k \)) 110 is away from the mean 75.
Difference \( \epsilon = 110 - 75 = 35 \).
\( k = \epsilon/\sigma = 35/5 = 7 \).
\[ P(X \ge 110) \le P(|X-75| \ge 35) \le \frac{1}{k^2} = \frac{1}{49} \approx 0.0204 \]
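A short computation highlights how much the extra information (the standard deviation) tightens the bound compared with Markov's inequality from the previous problem:

```python
# Bounds on P(X ≥ 110) for mean 75, SD 5.
mean, sd, threshold = 75.0, 5.0, 110.0

markov = mean / threshold             # 15/22 ≈ 0.682 (uses only the mean)
k = (threshold - mean) / sd           # 7 standard deviations
chebyshev = 1.0 / k**2                # 1/49 ≈ 0.020 (uses mean and SD)

print(markov, chebyshev)
```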
Relate central moments \( c_n \) and raw moments \( \mu_k \).
Use the binomial expansion of \( (X - \mu)^n \).
\[ c_n = E[(X-\mu)^n] = E\left[ \sum_{k=0}^n \binom{n}{k} X^k (-\mu)^{n-k} \right] \]
By linearity of expectation:
\[ c_n = \sum_{k=0}^n \binom{n}{k} (-\mu)^{n-k} E[X^k] = \sum_{k=0}^n \binom{n}{k} (-\mu)^{n-k} \mu_k \]
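Because moments taken over an empirical distribution obey the same identity, it can be verified exactly on a small sample (Python with NumPy; the data values are arbitrary):

```python
import numpy as np
from math import comb

x = np.array([1.0, 2.0, 2.0, 5.0, 7.0])
mu = x.mean()

for n in (2, 3, 4):
    # Central moment computed directly...
    direct = np.mean((x - mu) ** n)
    # ...and via the binomial expansion in terms of raw moments E[X^k].
    via_raw = sum(comb(n, k) * (-mu) ** (n - k) * np.mean(x ** k)
                  for k in range(n + 1))
    print(n, direct, via_raw)
```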
\( W = I^2 R \), where \( I \) and \( R \) are independent with \( f(i) = 6i(1-i), \; 0 < i < 1 \), and \( g(r) = 2r, \; 0 < r < 1 \). Find the distribution of the power \( W \).
Step 1: Find PDF of \( U = I^2 \). Step 2: Use product formula for \( W = UR \).
1. PDF of \( U=I^2 \): \( f_U(u) = 3(1-\sqrt{u}) \) for \( 0 < u < 1 \).
2. Joint Product Distribution integral:
\[ f_W(w) = \int_w^1 \frac{6w(1-\sqrt{u})}{u^2} \, du \]
3. Carrying out the integration:
\[ f_W(w) = 6w \left[ -\frac{1}{u} + \frac{2}{\sqrt{u}} \right]_w^1 = 6w \left( 1 + \frac{1}{w} - \frac{2}{\sqrt{w}} \right) = 6(1 - \sqrt{w})^2, \quad 0 < w < 1. \]
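A Monte Carlo sanity check (Python with NumPy; the seed and sample size are arbitrary): \( f(i) = 6i(1-i) \) is the Beta(2,2) density, and \( R = \sqrt{U} \) with \( U \sim U(0,1) \) has density \( 2r \). By independence, \( E[W] = E[I^2]E[R] = \tfrac{3}{10} \cdot \tfrac{2}{3} = \tfrac{1}{5} \), which the simulation should reproduce:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 500_000
I = rng.beta(2, 2, size=N)            # density 6i(1-i) on (0, 1)
R = np.sqrt(rng.uniform(0, 1, N))     # density 2r on (0, 1)
W = I**2 * R

print(np.mean(W))                     # ≈ 0.2
```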