
Tutorial Sheet 6

Functions of Random Variables

14 Problems Solved

👨‍🏫

Professor's Note

Congratulations on reaching the final tutorial! This sheet covers advanced topics like Functions of Random Variables, Moment Generating Functions (MGFs), and Inequalities. These tools are the backbone of statistical inference and reliability engineering.

Q1

Transformation: Y = e^X

Find the PDF of \( e^X \) in terms of the PDF of \( X \). Specialize for \( X \sim U(0,1) \).

💡 Hints

Use the change of variables formula: \( f_Y(y) = f_X(g^{-1}(y)) \left| \frac{d}{dy} g^{-1}(y) \right| \).

📚 Concepts Used

\[ f_Y(y) = f_X(x) \left| \frac{dx}{dy} \right| \]

✍️ Full Solution

Let \( Y = e^X \), so \( X = \ln Y \).

The derivative is \( \frac{dx}{dy} = \frac{1}{y} \).

\[ f_Y(y) = f_X(\ln y) \cdot \frac{1}{y} \]

Special Case: \( X \sim U(0,1) \)

\( f_X(x) = 1 \) for \( 0 < x < 1 \).

As \( x \) ranges from 0 to 1, \( y = e^x \) ranges from 1 to \( e \).

\( f_Y(y) = 1/y \) for \( 1 < y < e \)
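The special case can be sanity-checked by simulation (a sketch using Python's standard library, not part of the sheet): the density \( f_Y(y) = 1/y \) implies the CDF \( F_Y(y) = \ln y \) on \( (1, e) \), which the empirical CDF of simulated \( e^X \) should match.

```python
import math
import random

random.seed(0)

# Sample X ~ U(0,1), set Y = e^X, and compare the empirical CDF of Y
# with the analytic CDF F_Y(y) = ln(y) implied by f_Y(y) = 1/y on (1, e).
n = 200_000
ys = [math.exp(random.random()) for _ in range(n)]

for y in (1.2, 1.8, 2.5):
    empirical = sum(v <= y for v in ys) / n
    assert abs(empirical - math.log(y)) < 0.01, (y, empirical)
```

With 200,000 samples the Monte Carlo error is well below the 0.01 tolerance.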
Q2

Standard Uniform Transformations

If \( X \sim U(-1,1) \), find the PDF of \( Y = -\ln|X| \).

💡 Hints

First find the distribution of \( W = |X| \). If \( X \) is uniform on \([-1,1]\), what is \( |X| \) uniform on?

✍️ Full Solution

1. Let \( W = |X| \). Since \( X \) is uniform on \([-1,1]\), its absolute value \( W \) is uniform on \([0,1]\). So \( f_W(w) = 1 \).

2. Now \( Y = -\ln W \), so \( W = e^{-Y} \). The Jacobian factor is \( \left| \frac{dw}{dy} \right| = |-e^{-y}| = e^{-y} \).

3. Transformation formula: \( f_Y(y) = f_W(e^{-y}) \cdot e^{-y} = 1 \cdot e^{-y} \).

\( f_Y(y) = e^{-y} \) for \( y > 0 \). (Exponential distribution with \(\lambda=1\)).
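A quick simulation check (our own sketch, assuming Python's standard library): if the derivation is right, \( Y = -\ln|X| \) should have mean 1 and tail \( P(Y > 1) = e^{-1} \), matching Exp(1).

```python
import math
import random

random.seed(1)

# Sample X ~ U(-1, 1) and check that Y = -ln|X| behaves like Exp(1):
# mean 1 and P(Y > 1) = e^{-1}.
n = 200_000
ys = [-math.log(abs(random.uniform(-1, 1))) for _ in range(n)]

mean = sum(ys) / n
tail = sum(v > 1 for v in ys) / n

assert abs(mean - 1.0) < 0.02
assert abs(tail - math.exp(-1)) < 0.01
```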
Q3

General Combination Proofs

Find the PDFs of \( Z = X+Y \), \( X-Y \), \( XY \), and \( X/Y \) in terms of the PDFs of independent \( X \) and \( Y \).

💡 Hints

Use convolution for the sum and difference; for the product and ratio, change variables and include a Jacobian factor.

📚 Concepts Used

Independent random variables allow the joint PDF to be expressed as the product of marginals: \( f(x,y) = f_X(x)f_Y(y) \).

1. Sum: \( Z = X + Y \)

\[ f_Z(z) = \int_{-\infty}^{\infty} f_X(x)f_Y(z-x) \, dx \]

2. Difference: \( Z = X - Y \)

\[ f_Z(z) = \int_{-\infty}^{\infty} f_X(x)f_Y(x-z) \, dx \]

3. Product: \( Z = XY \)

\[ f_Z(z) = \int_{-\infty}^{\infty} f_X(x)f_Y(z/x) \frac{1}{|x|} \, dx \]

4. Ratio: \( Z = X/Y \)

\[ f_Z(z) = \int_{-\infty}^{\infty} f_X(zy)f_Y(y) |y| \, dy \]
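The sum formula can be exercised numerically. A small sketch (our own example, not from the sheet) discretizes the convolution integral for two independent \( U(0,1) \) variables, whose sum is known to have the triangular density \( f_Z(z) = z \) on \( (0,1) \) and \( 2-z \) on \( (1,2) \).

```python
# Discretize f_Z(z) = ∫ f_X(x) f_Y(z - x) dx for X, Y ~ U(0,1);
# the result should match the triangular density of the sum.

def f_uniform(x):
    # density of U(0,1)
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def f_sum(z, steps=10_000):
    # midpoint-rule approximation of the convolution integral
    h = 1.0 / steps
    return sum(f_uniform(x) * f_uniform(z - x) * h
               for x in (h * (i + 0.5) for i in range(steps)))

assert abs(f_sum(0.5) - 0.5) < 1e-3   # triangular density: f_Z(0.5) = 0.5
assert abs(f_sum(1.5) - 0.5) < 1e-3   # f_Z(1.5) = 2 - 1.5 = 0.5
```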

Q4

MGF Linear Properties

Prove: (a) \( M_{aX}(t) = M_X(at) \), (b) \( M_{X+a}(t) = e^{at}M_X(t) \).

💡 Hints

Start from the definition \( M_X(t) = E[e^{tX}] \) and apply properties of expectation.

Proof (a):

\[ M_{aX}(t) = E[e^{t(aX)}] = E[e^{(at)X}] = M_X(at) \]

Proof (b):

\[ M_{X+a}(t) = E[e^{t(X+a)}] = E[e^{tX} \cdot e^{at}] \]

Since \( e^{at} \) is a constant with respect to the expectation:

\[ = e^{at} E[e^{tX}] = e^{at} M_X(t) \]
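Both identities can be checked empirically. The sketch below (our own illustration; the choice of \( X \sim \text{Exp}(1) \), whose MGF is \( 1/(1-t) \) for \( t < 1 \), is not from the sheet) compares sample averages of \( e^{t(aX)} \) and \( e^{t(X+a)} \) against the predicted closed forms.

```python
import math
import random

random.seed(2)

# Empirical check of (a) M_{aX}(t) = M_X(at) and (b) M_{X+a}(t) = e^{at} M_X(t),
# using X ~ Exp(1) with MGF M_X(t) = 1/(1 - t) for t < 1.
n = 200_000
xs = [random.expovariate(1.0) for _ in range(n)]

a, t = 2.0, 0.2

m_aX = sum(math.exp(t * a * x) for x in xs) / n       # E[e^{t(aX)}]
m_shift = sum(math.exp(t * (x + a)) for x in xs) / n  # E[e^{t(X+a)}]

assert abs(m_aX - 1 / (1 - a * t)) < 0.05             # M_X(at)
assert abs(m_shift - math.exp(a * t) / (1 - t)) < 0.05  # e^{at} M_X(t)
```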

Q5

Covariance for Sum PDF

Find \( Cov(X,Y) \) for \( f(x,y) = x+y, 0 < x,y < 1 \).

💡 Hints

Calculate \( E[X], E[Y], E[XY] \) using double integration.

✍️ Full Solution

1. \( E[X] = \int_0^1 \int_0^1 x(x+y) \, dx \, dy = 7/12 \). By symmetry, \( E[Y] = 7/12 \).

2. \( E[XY] = \int_0^1 \int_0^1 xy(x+y) \, dx \, dy = \int_0^1 (y/3 + y^2/2) \, dy = 1/3 \).

3. \( Cov(X,Y) = 1/3 - (7/12)^2 = 48/144 - 49/144 = -1/144 \).

Answer: -1/144 \(\approx -0.0069\)
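All three integrals can be double-checked with a midpoint-rule sum over the unit square (a numerical sketch, assuming Python):

```python
# Midpoint-rule double integration over (0,1)² to confirm
# E[X] = 7/12, E[XY] = 1/3, and Cov(X,Y) = -1/144 for f(x,y) = x + y.
steps = 400
h = 1.0 / steps
pts = [h * (i + 0.5) for i in range(steps)]

ex = sum(x * (x + y) * h * h for x in pts for y in pts)
exy = sum(x * y * (x + y) * h * h for x in pts for y in pts)
cov = exy - ex * ex          # E[Y] = E[X] by symmetry of f(x,y)

assert abs(ex - 7 / 12) < 1e-4
assert abs(exy - 1 / 3) < 1e-4
assert abs(cov + 1 / 144) < 1e-3
```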
Q6

Expectation of Product

X and Y are independent with \( g(x) = 8/x^3, x > 2 \) and \( h(y) = 2y, 0 < y < 1 \). Find \( E[XY] \).

💡 Hints

For independent variables, \( E[XY] = E[X]E[Y] \).

📚 Concepts Used

Expected value of a product: \( E[XY] = \iint xy f(x,y) \, dx \, dy \).

✍️ Full Solution

\[ E[X] = \int_2^\infty x \cdot \frac{8}{x^3} \, dx = [-8/x]_2^\infty = 4 \]

\[ E[Y] = \int_0^1 y \cdot 2y \, dy = [2y^3/3]_0^1 = 2/3 \]

\( E[XY] = 4 \times 2/3 = 8/3 \)
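A numerical sketch of the two expectations (midpoint rule, with the improper integral truncated at a large cutoff; the cutoff value is our own choice):

```python
# E[X] = ∫_2^∞ x · 8/x³ dx = ∫_2^∞ 8/x² dx, truncated at x = 10⁴
# (the discarded tail is exactly 8/10⁴), and E[Y] = ∫_0^1 2y² dy.
# Independence then gives E[XY] = E[X] E[Y].
def midpoint(f, a, b, steps=200_000):
    h = (b - a) / steps
    return sum(f(a + h * (i + 0.5)) for i in range(steps)) * h

ex = midpoint(lambda x: 8.0 / x**2, 2.0, 1e4)
ey = midpoint(lambda y: 2.0 * y * y, 0.0, 1.0)

assert abs(ex - 4.0) < 5e-3
assert abs(ey - 2.0 / 3.0) < 1e-6
assert abs(ex * ey - 8.0 / 3.0) < 5e-3
```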
Q7

Red and Green Dice

Red die (X) and Green die (Y) are tossed. Find \( E[X+Y], E[X-Y], E[XY] \).

💡 Hints

Each die has \( E[X] = 3.5 \).

  • \( E[X+Y] = 3.5 + 3.5 = 7.0 \)
  • \( E[X-Y] = 3.5 - 3.5 = 0.0 \)
  • \( E[XY] = E[X]E[Y] = 3.5 \times 3.5 = 12.25 \) (since the dice are independent)
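Because the sample space is tiny, all three expectations can be confirmed by exact enumeration over the 36 equally likely outcomes (a sketch in Python, using exact rational arithmetic):

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely (red, green) outcomes and average.
outcomes = list(product(range(1, 7), repeat=2))
p = Fraction(1, 36)

e_sum = sum(p * (x + y) for x, y in outcomes)
e_diff = sum(p * (x - y) for x, y in outcomes)
e_prod = sum(p * (x * y) for x, y in outcomes)

assert e_sum == 7
assert e_diff == 0
assert e_prod == Fraction(49, 4)   # 12.25
```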
Q8

MGF of Discrete Uniform

Find the Moment Generating Function of the discrete uniform distribution on \( \{1, 2, \dots, k\} \), where \( P(X = x) = 1/k \).

💡 Hints

This is a finite geometric series with common ratio \( e^t \).

\[ M_X(t) = \frac{1}{k} \sum_{x=1}^k e^{tx} \]

Using the geometric series formula \( \sum ar^{n-1} = \frac{a(1-r^n)}{1-r} \):

\( M_X(t) = \frac{e^t(1 - e^{kt})}{k(1 - e^t)} \) for \( t \neq 0 \); \( M_X(0) = 1 \).
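The closed form is easy to validate against the defining sum (a sketch with \( k = 6 \); the test points are arbitrary):

```python
import math

# Compare the closed form e^t (1 - e^{kt}) / (k (1 - e^t)) with the
# defining sum (1/k) Σ_{x=1}^k e^{tx}, for k = 6 and a few values of t.
k = 6
for t in (-0.5, 0.1, 0.7):
    direct = sum(math.exp(t * x) for x in range(1, k + 1)) / k
    closed = math.exp(t) * (1 - math.exp(k * t)) / (k * (1 - math.exp(t)))
    assert math.isclose(direct, closed, rel_tol=1e-12), (t, direct, closed)
```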
Q9

MGF of Geometric Distribution

Find MGF of geometric distribution \( pq^{x-1} \). Use it to find mean and variance.

💡 Hints

Infinite geometric series sum: \( S = \frac{a}{1-r} \).

\[ M_X(t) = \sum_{x=1}^\infty e^{tx} pq^{x-1} = p e^t \sum_{x=1}^\infty (qe^t)^{x-1} = \frac{pe^t}{1-qe^t}, \quad qe^t < 1 \]

Differentiating at \( t=0 \):

  • \( E[X] = M'(0) = 1/p \)
  • \( E[X^2] = M''(0) = \frac{1+q}{p^2} \)
  • \( Var(X) = q/p^2 \)

🎯 Exam Perspective

Show the infinite series step clearly. Leaving the MGF as a fraction is standard.
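The derivative values at \( t = 0 \) can be checked with central finite differences (a numerical sketch; the choice \( p = 0.3 \) is ours):

```python
import math

# Differentiate M(t) = p e^t / (1 - q e^t) numerically at t = 0 and
# compare with E[X] = 1/p, E[X²] = (1 + q)/p², and Var(X) = q/p².
p = 0.3
q = 1 - p

def M(t):
    return p * math.exp(t) / (1 - q * math.exp(t))

h = 1e-5
m1 = (M(h) - M(-h)) / (2 * h)              # central first derivative
m2 = (M(h) - 2 * M(0) + M(-h)) / (h * h)   # central second derivative

assert abs(m1 - 1 / p) < 1e-6
assert abs(m2 - (1 + q) / (p * p)) < 1e-3
assert abs((m2 - m1 * m1) - q / (p * p)) < 1e-3   # variance
```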
Q10

MGF of Poisson Distribution

Find MGF and calculate mean/variance for \( P(x) = \frac{e^{-\lambda}\lambda^x}{x!} \).

💡 Hints

Use the series definition: \( e^u = \sum \frac{u^x}{x!} \).

\[ M_X(t) = \sum e^{tx} \frac{e^{-\lambda}\lambda^x}{x!} = e^{-\lambda} \sum \frac{(\lambda e^t)^x}{x!} = e^{\lambda(e^t - 1)} \]

Derivative properties:

  • \( M'(t) = \lambda e^t e^{\lambda(e^t-1)} \implies E[X] = \lambda \)
  • \( M''(0) = \lambda + \lambda^2 \implies Var(X) = \lambda \)
Q11

Markov Inequality Bound

If average life span is 75 years, find the upper bound on the probability of living to 110.

💡 Hints

Markov's inequality only requires the mean.

📚 Concepts Used

\[ P(X \ge a) \le \frac{E[X]}{a} \]

\[ P(X \ge 110) \le \frac{75}{110} = \frac{15}{22} \approx 0.68 \]

Note: Markov bounds are often quite loose, but they give an absolute upper bound using nothing more than the mean.
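To see how loose the bound can be, compare it with the exact tail of a concrete lifetime model with the same mean (the exponential model here is purely illustrative, not part of the problem):

```python
import math

# Markov bound vs. the exact tail of X ~ Exp(1/75) (mean 75).
mean = 75.0
a = 110.0

markov_bound = mean / a              # 15/22 ≈ 0.6818
exact_exp_tail = math.exp(-a / mean) # P(X ≥ 110) for this model ≈ 0.23

assert abs(markov_bound - 15 / 22) < 1e-12
assert markov_bound > exact_exp_tail   # the bound holds, but loosely
```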
Q12

Chebyshev Inequality Bound

Previous problem, but with SD = 5 years. Find the Chebyshev bound for living to 110.

💡 Hints

Calculate how many standard deviations (\( k \)) 110 is away from the mean 75.

Difference \( \epsilon = 110 - 75 = 35 \).

\( k = \epsilon/\sigma = 35/5 = 7 \).

\[ P(|X-75| \ge 35) \le \frac{1}{k^2} = \frac{1}{49} \approx 0.0204 \]

Answer: 0.02 (2%), since \( P(X \ge 110) \le P(|X - 75| \ge 35) \). Much tighter than Markov!
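The arithmetic of the comparison, spelled out (a sketch):

```python
# Chebyshev vs. Markov for the same question: mean 75, SD 5, threshold 110.
mean, sd, a = 75.0, 5.0, 110.0

markov = mean / a                 # ≈ 0.6818
k = (a - mean) / sd               # 110 is 7 standard deviations above 75
chebyshev = 1 / k**2              # bound on P(|X - 75| ≥ 35)

assert k == 7.0
assert abs(chebyshev - 1 / 49) < 1e-15
assert chebyshev < markov         # Chebyshev is far tighter here
```

The extra information (the standard deviation) is what buys the tighter bound.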
Q13

Moments Relationship

Relate the central moments \( c_n \) to the raw moments \( \mu'_k \).

💡 Hints

Use the binomial expansion of \( (X - \mu)^n \).

\[ c_n = E[(X-\mu)^n] = E\left[ \sum_{k=0}^n \binom{n}{k} X^k (-\mu)^{n-k} \right] \]

By linearity of expectation:

\[ c_n = \sum_{k=0}^n \binom{n}{k} \mu'_k (-\mu)^{n-k} \]
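The identity can be verified exactly on a small discrete distribution (a sketch using a fair die and rational arithmetic):

```python
from fractions import Fraction
from math import comb

# Check c_n = Σ_k C(n,k) μ'_k (-μ)^{n-k} on a fair die, for n = 2 and 3.
vals = range(1, 7)
p = Fraction(1, 6)

def raw(k):                       # μ'_k = E[X^k]
    return sum(p * x**k for x in vals)

mu = raw(1)                       # 7/2

for n in (2, 3):
    direct = sum(p * (x - mu)**n for x in vals)   # E[(X - μ)^n]
    via_raw = sum(comb(n, k) * raw(k) * (-mu)**(n - k) for k in range(n + 1))
    assert direct == via_raw

assert sum(p * (x - mu)**2 for x in vals) == Fraction(35, 12)  # die variance
```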
Q14

Power Dissipation Distribution

\( W = I^2 R \). \( f(i) = 6i(1-i) \), \( g(r) = 2r \). Find the distribution for Power W.

💡 Hints

Step 1: Find PDF of \( U = I^2 \). Step 2: Use product formula for \( W = UR \).

1. PDF of \( U = I^2 \): \( f_U(u) = f_I(\sqrt{u}) \cdot \frac{1}{2\sqrt{u}} = \frac{6\sqrt{u}(1-\sqrt{u})}{2\sqrt{u}} = 3(1-\sqrt{u}) \) for \( 0 < u < 1 \).

2. Product formula for \( W = UR \): \( f_W(w) = \int f_U(u)\, g(w/u) \frac{1}{u} \, du \). The constraint \( w/u < 1 \) gives the limits \( u \in (w, 1) \):

\[ f_W(w) = \int_w^1 3(1-\sqrt{u}) \cdot \frac{2w}{u} \cdot \frac{1}{u} \, du = \int_w^1 \frac{6w(1-\sqrt{u})}{u^2} \, du \]

3. Integration leads to:

\( f_W(w) = 6w + 6 - 12\sqrt{w} \) for \( 0 < w < 1 \).
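A final simulation check (our own sketch, assuming Python's standard library): \( f(i) = 6i(1-i) \) is the Beta(2,2) density, and \( g(r) = 2r \) can be sampled as \( R = \sqrt{U} \). The answer integrates to the CDF \( F_W(w) = 3w^2 + 6w - 8w^{3/2} \), which should match the empirical CDF of \( W = I^2 R \).

```python
import math
import random

random.seed(4)

# Sample I ~ Beta(2,2) (density 6i(1-i)) and R with density 2r (R = √U),
# set W = I²R, and compare the empirical CDF of W with
# F_W(w) = ∫_0^w (6t + 6 - 12√t) dt = 3w² + 6w - 8w^{3/2}.
n = 200_000
ws = [random.betavariate(2, 2)**2 * math.sqrt(random.random())
      for _ in range(n)]

def F(w):
    return 3 * w**2 + 6 * w - 8 * w**1.5

for w in (0.1, 0.3, 0.6):
    empirical = sum(v <= w for v in ws) / n
    assert abs(empirical - F(w)) < 0.01, (w, empirical, F(w))
```

Note \( F(1) = 3 + 6 - 8 = 1 \), confirming the density integrates to 1 on \( (0,1) \).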