### Discrete Probability Distributions

This section covers problems on discrete random variables, including probability mass functions (PMF), cumulative distribution functions (CDF), mean, and variance.

**Key Concepts:**

- **Probability Mass Function (PMF):** $P(X=x)$ for discrete RVs, with $\sum_x P(X=x) = 1$.
- **Cumulative Distribution Function (CDF):** $F(x) = P(X \le x) = \sum_{t \le x} P(X=t)$.
- **Expected Value (Mean):** $E[X] = \sum_x x\,P(X=x)$.
- **Variance:** $Var(X) = E[X^2] - (E[X])^2 = \sum_x x^2 P(X=x) - (E[X])^2$.

**Problems:**

1. **Coin Toss:** A coin is tossed three times; $X$ is the number of heads. Construct the probability distribution table for $X$.
    * **Hint:** Possible outcomes: HHH, HHT, HTH, THH, HTT, THT, TTH, TTT. Count the heads in each.
2. **Battery Testing:** A set of five batteries contains two depleted ones. Batteries are tested one by one without replacement; $X$ is the number of batteries tested until both depleted batteries are found. Find the PMF of $X$.
    * **Hint:** Consider the possible testing sequences in which the two depleted batteries are found.
3. **PMF Validation & Probabilities:** For $f(x) = c(1/4)^x$, $x = 0, 1, 2, \ldots$ to be a PMF:
    * (a) Find $c$.
    * (b) Find $P(X > 1)$ and $P(3 \le X < \dots)$.

### Continuous Probability Distributions

This section focuses on continuous random variables: probability density functions (PDF), cumulative distribution functions (CDF), mean, and variance.

**Key Concepts:**

- **Probability Density Function (PDF):** $f(x)$ for continuous RVs, with $\int_{-\infty}^{\infty} f(x)\,dx = 1$ and $P(a \le X \le b) = \int_a^b f(x)\,dx$.
- **Cumulative Distribution Function (CDF):** $F(x) = P(X \le x) = \int_{-\infty}^{x} f(t)\,dt$, and $f(x) = F'(x)$.
- **Expected Value (Mean):** $E[X] = \int_{-\infty}^{\infty} x f(x)\,dx$.
- **Variance:** $Var(X) = E[X^2] - (E[X])^2 = \int_{-\infty}^{\infty} x^2 f(x)\,dx - (E[X])^2$.

**Problems:**

6. **PDF Analysis 1:** Given the PDF $f(x) = \begin{cases} k(x+1) & 2 \le x \le 4 \\ 0 & \text{otherwise} \end{cases}$
    * (i) Find $k$. (ii) Obtain $F(x)$. (iii) Using $F(x)$, compute $P(X \ge 3)$. (iv) Find the mean and variance of $X$. (v) Using $F(x)$, find $P(2.5 < X < \dots)$. (A symbolic check of (i)-(iv) appears after this problem list.)

**CDF Analysis:** $X$ has a piecewise CDF whose upper branch is $F(x) = 1$ for $x > 1$. (Note: the CDF provided in the problem has a typo; assume the corrected form of a valid CDF.)
    * (i) Find the PDF $f(x)$. (ii) Find $P(-0.5 < X < \dots)$ and $P(X > 1)$.

12. **CDF to PDF:** Given the CDF $F(x) = \begin{cases} 1 - (1+x)e^{-x} & x \ge 0 \\ 0 & \text{otherwise} \end{cases}$, find the corresponding density function of $X$.
    * **Hint:** $f(x) = F'(x)$.
13. **Exponential Distribution:** Given the PDF $f(x) = kxe^{-kx}$ for $x \ge 0$, and $0$ otherwise:
    * Determine (i) $k$, (ii) the mean, and (iii) the variance.
    * **Hint:** Require $\int_0^\infty kxe^{-kx}\,dx = 1$. Use integration by parts for the mean and $E[X^2]$. (A symbolic check also appears after this list.)
14. **Moment Generating Function (MGF):** Obtain the MGF of a random variable $X$ having PDF $f(x) = \begin{cases} x & 0 \le x \le 1 \\ 2-x & 1 \le x \le 2 \\ 0 & \text{otherwise} \end{cases}$
    * **Hint:** $M_X(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx$.
15. **MGF for a Discrete RV:** Let $X$ be a discrete random variable with PMF $f(x) = ab^x$, where $a$ and $b$ are positive with $a + b = 1$, and $X$ takes values $0, 1, 2, \ldots$. Find the MGF of $X$.
    * **Hint:** $M_X(t) = \sum_{x=0}^{\infty} e^{tx} a b^x$, a geometric series.
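The integrals in Problems 6-15 are routine enough to check by machine. As a sketch (assuming `sympy` is available; it is not part of the problem set), the following mirrors the hints for Problem 6: normalize to find $k$, build $F(x)$, then read off $P(X \ge 3)$, the mean, and the variance.

```python
# Symbolic check of Problem 6 with sympy (assumed available; not part of the problem set).
import sympy as sp

x, t, k = sp.symbols("x t k", positive=True)

# (i) Normalisation: the integral of k(x+1) over [2, 4] must equal 1.
k_val = sp.solve(sp.Eq(sp.integrate(k * (x + 1), (x, 2, 4)), 1), k)[0]
f = k_val * (x + 1)                      # PDF on [2, 4], 0 elsewhere
print("k =", k_val)                      # 1/8

# (ii) CDF on the support: F(x) = integral of f(t) from 2 to x.
F = sp.integrate(f.subs(x, t), (t, 2, x))
print("F(x) =", sp.simplify(F))

# (iii) P(X >= 3) = 1 - F(3)
print("P(X >= 3) =", 1 - F.subs(x, 3))   # 9/16

# (iv) Mean and variance via E[X] and E[X^2] - E[X]^2.
mean = sp.integrate(x * f, (x, 2, 4))
var = sp.integrate(x**2 * f, (x, 2, 4)) - mean**2
print("E[X] =", mean, " Var(X) =", sp.simplify(var))   # 37/12 and 47/144
```

The same pattern (solve the normalization equation, then integrate for the moments) applies to Problems 12 and 14 as well.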
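Problem 13 can be handled the same way, again assuming `sympy`. Since $\int_0^\infty kxe^{-kx}\,dx = 1/k$, the normalization forces $k = 1$, and the mean and variance follow from gamma-type integrals.

```python
# Problem 13: f(x) = k*x*exp(-k*x) on [0, inf).  sympy assumed, as above.
import sympy as sp

x, k = sp.symbols("x k", positive=True)

# Normalisation gives 1/k = 1, hence k = 1.
k_val = sp.solve(sp.Eq(sp.integrate(k * x * sp.exp(-k * x), (x, 0, sp.oo)), 1), k)[0]
f = k_val * x * sp.exp(-k_val * x)

mean = sp.integrate(x * f, (x, 0, sp.oo))
var = sp.integrate(x**2 * f, (x, 0, sp.oo)) - mean**2
print("k =", k_val, " E[X] =", mean, " Var(X) =", var)   # k = 1, E[X] = 2, Var(X) = 2
```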
### Chebyshev's Inequality

Chebyshev's inequality provides a lower bound on the probability that a random variable falls within a given distance of its mean, or equivalently an upper bound on the probability that it falls outside that distance.

**Key Concepts:**

- **Chebyshev's Inequality:** $P(|X - \mu| \ge k\sigma) \le \frac{1}{k^2}$, or equivalently $P(|X - \mu| \ge \epsilon) \le \frac{Var(X)}{\epsilon^2}$.
- Also, $P(|X - \mu| < k\sigma) \ge 1 - \frac{1}{k^2}$.

### Binomial and Poisson Distributions

This section deals with specific discrete probability distributions commonly used for counts of events.

**Key Concepts:**

- **Binomial Distribution:** $X \sim B(n, p)$. $P(X=k) = \binom{n}{k} p^k (1-p)^{n-k}$, $E[X] = np$, $Var(X) = np(1-p)$.
- **Poisson Distribution:** $X \sim P(\lambda)$. $P(X=k) = \frac{e^{-\lambda} \lambda^k}{k!}$, $E[X] = \lambda$, $Var(X) = \lambda$.

**Problems:**

19. **Binomial Parameters:** The mean and variance of a Binomial variate $X$ with parameters $n$ and $p$ are $16$ and $0.8$, respectively.
    * Find (i) $P(X = 0)$, (ii) $P(X \ge 2)$.
    * **Hint:** Use $np = 16$ and $np(1-p) = 0.8$ to find $n$ and $p$.
20. **Binomial Probability:** Find $p$ for the Binomial distribution if $n = 6$ and $P(X = 4) = P(X = 2)$.
    * **Hint:** Set up the equation $\binom{6}{4} p^4 (1-p)^2 = \binom{6}{2} p^2 (1-p)^4$.
21. **Poisson Probabilities:** $X$ has a Poisson distribution with parameter $\lambda > 0$.
    * Find (i) $P(X \text{ is even})$, (ii) $P(X \text{ is odd})$.
    * **Hint:** Use the series expansions of $e^{\lambda}$ and $e^{-\lambda}$ and relate them to the sums of even- and odd-indexed terms.
23. **Poisson Mean & Variance:** $X$ has a Poisson distribution such that $P(X=2) = \frac{5}{9} P(X=4)$.
    * Find the mean and variance of $X$. Also find $P(X \le 1)$.
    * **Hint:** Substitute the PMF formula for $P(X=2)$ and $P(X=4)$ to find $\lambda$.
25. **Bernoulli Trials (Binomial):** The probability of success in one Bernoulli trial is $0.01$. How many independent trials are necessary for the probability of at least one success to exceed $\frac{1}{2}$?
    * **Hint:** $P(\text{at least one success}) = 1 - P(\text{no successes})$.
26. **Multiple Dice (Binomial):** Six dice are thrown $729$ times. How many times do you expect at least $3$ dice to show a $5$ or $6$?
    * **Hint:** For each throw, define success as a die showing a 5 or 6 (probability $p = 2/6 = 1/3$), so the number of successes per throw follows $B(6, 1/3)$. Calculate $P(X \ge 3)$ and multiply by $729$. (A numerical check appears after this problem list.)
27. **Poisson Approximation to Binomial:** In sampling, the mean number of defectives in a sample of $20$ is $2$. Out of $2000$ such samples, how many would be expected to contain at least $3$ defectives?
    * **Hint:** Use the Poisson approximation with $\lambda = 2$: $P(X \ge 3) = 1 - P(X \le 2)$. (A sketch below also checks this one.)
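A quick numerical check of Problem 26: the number of dice showing a 5 or 6 in one throw is $B(6, 1/3)$, so the expected number of throws (out of $729$) with at least three such dice is $729\,P(X \ge 3)$. The sketch below assumes `scipy` is available; the same tail sum can be done by hand with binomial coefficients.

```python
# Problem 26: in one throw of six dice, success = a die showing 5 or 6 (p = 1/3),
# so the number of successes per throw is Binomial(6, 1/3).
# scipy is an assumption here, not something the problem set requires.
from scipy.stats import binom

n, p, throws = 6, 1/3, 729
p_at_least_3 = binom.sf(2, n, p)          # P(X >= 3) = 1 - P(X <= 2) = 233/729
expected_throws = throws * p_at_least_3

print(f"P(X >= 3) = {p_at_least_3:.6f}")                                # ~0.319616
print(f"Expected throws with >= 3 fives/sixes: {expected_throws:.1f}")  # ~233
```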
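Problem 27 works the same way with the Poisson tail (again assuming `scipy`): with $\lambda = 2$, $P(X \ge 3) = 1 - 5e^{-2} \approx 0.3233$, so roughly $647$ of the $2000$ samples are expected to contain at least three defectives.

```python
# Problem 27: the number of defectives per sample of 20 is approximated by Poisson(lambda = 2).
# scipy is assumed, as above; math.exp would do equally well for this small sum.
from scipy.stats import poisson

lam, samples = 2, 2000
p_at_least_3 = poisson.sf(2, lam)               # P(X >= 3) = 1 - e^{-2}(1 + 2 + 2)
print(f"P(X >= 3) = {p_at_least_3:.6f}")        # ~0.323324
print(f"Expected samples with >= 3 defectives: {samples * p_at_least_3:.0f}")  # ~647
```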
### Special Distributions and Properties

This section covers properties and calculations for various common probability distributions.

**Key Concepts:**

- **Geometric Distribution:** $P(X=x) = (1-p)^{x-1}p$ for $x = 1, 2, \ldots$, with $E[X] = 1/p$ and $Var(X) = (1-p)/p^2$.
- **Uniform Distribution:** PDF $f(x) = \frac{1}{b-a}$ for $a \le x \le b$, with $E[X] = \frac{a+b}{2}$ and $Var(X) = \frac{(b-a)^2}{12}$.
- **Exponential Distribution:** PDF $f(x) = \lambda e^{-\lambda x}$ for $x \ge 0$, with $E[X] = 1/\lambda$ and $Var(X) = 1/\lambda^2$. Memoryless property: $P(X > s+t \mid X > s) = P(X > t)$.
- **Weibull Distribution:** PDF $f(x) = \alpha \beta x^{\beta-1} e^{-\alpha x^\beta}$ for $x \ge 0$.
- **Normal Distribution:** $X \sim N(\mu, \sigma^2)$. Use the Z-score $Z = (X-\mu)/\sigma$ and standard normal tables.
- **Variance Properties:** $Var(aX+b) = a^2 Var(X)$.

**Problems:**

35. **Geometric MGF:** Obtain the MGF of the Geometric distribution with PMF $f(x) = pq^{x-1}$ for $x = 1, 2, \ldots$. Hence find the mean and variance.
    * **Hint:** $M_X(t) = E[e^{tX}] = \sum_{x=1}^{\infty} e^{tx} pq^{x-1}$.
36. **Uniform MGF:** Find the MGF of the Uniform distribution with parameters $a$ and $\beta$, where $a < \beta$.

**Exponential Memoryless Property:** $X$ has an exponential distribution.
    * (i) Show that $P(X > x + a \mid X > a) = P(X > x)$, where $a > 0$ (memoryless property).
    * (ii) Obtain the mean and variance of $X$.
    * **Hint:** For (i), use the conditional probability formula and the CDF of the exponential distribution.

39. **Uniform Distribution Probabilities:** The error in determining a density is a Uniform random variable with $a = -0.015$ and $\beta = 0.015$. Find the probability that:
    * (i) the error is between $-0.01$ and $0.02$;
    * (ii) the error exceeds $0.005$.
    * **Hint:** The PDF is constant over the interval $[-0.015, 0.015]$.
40. **Normal Distribution Probabilities:** $X$ is a Normal random variable with mean $105$ and variance $25$.
    * Find (i) $P(X \le 112.5)$, (ii) $P(X > 100)$, (iii) $P(110.5 < X < \dots)$.
    * **Hint:** Standardize with $Z = (X - 105)/5$ and use standard normal tables.

**Uniform Distribution from Mean and Variance:** $X$ is uniformly distributed on $[a, b]$ with $E[X] = 5$ and $Var(X) = 3$. Find $P(X > 6)$.
    * **Hint:** Use $E[X] = \frac{a+b}{2} = 5$ and $Var(X) = \frac{(b-a)^2}{12} = 3$ to solve for $a$ and $b$.

47. **Weibull Distribution:** The random variable $X$, the time to failure (in thousands of miles) of a signal light on an automobile, has a Weibull distribution with $\alpha = 0.04$, $\beta = 2$.
    * What is the probability that the light will fail during the first $3000$ miles driven? Also find the mean and variance of $X$.
    * **Hint:** The Weibull CDF is $F(x) = 1 - e^{-\alpha x^\beta}$. Since $X$ is measured in thousands of miles, the first $3000$ miles corresponds to $P(X \le 3)$.

### Joint Distributions

This section focuses on the behavior of multiple random variables together, including joint PMF/PDF, marginal distributions, and independence.

**Key Concepts:**

- **Joint PMF/PDF:** $f(x,y)$ gives $P(X=x, Y=y)$ in the discrete case, or the joint density of $(X, Y)$ in the continuous case.
- **Marginal PMF/PDF:** $f_X(x) = \sum_y f(x,y)$ (discrete) or $f_X(x) = \int f(x,y)\,dy$ (continuous).
- **Independence:** $X$ and $Y$ are independent if $f(x,y) = f_X(x) f_Y(y)$ for all $x, y$.
- **Conditional Distribution:** $f_{Y|X}(y|x) = f(x,y)/f_X(x)$.

**Problems:**

48. **Joint PDF & Marginals:** Given the joint PDF $f(x,y) = x+y$ for $0 \le x \le 1$, $0 \le y \le 1$, and $0$ otherwise:
    * (i) Find the marginal density functions of $X$ and $Y$.
    * (ii) Find the mean of $X$.
    * (iii) Find $P(X \ge \frac{1}{2}, Y \ge \frac{1}{2})$.
    * **Hint:** For the marginal, $f_X(x) = \int_0^1 (x+y)\,dy$. (A symbolic check appears after this problem list.)
49. **Joint PDF & Probabilities:** The joint density function of $(X, Y)$, the fractions of time that two telephone lines are busy, is $f(x,y) = \frac{3}{2}(x^2+y^2)$ for $0 \le x \le 1$, $0 \le y \le 1$, and $0$ otherwise.
    * (i) Compute the probability that neither line is busy more than half of the time, i.e., $P(X \le 0.5, Y \le 0.5)$.
    * (ii) Find the probability that the first line is busy more than $75\%$ of the time, i.e., $P(X > 0.75)$.
    * **Hint:** Integrate the joint PDF over the specified regions. For (ii), you will need the marginal PDF of $X$.
50. **Joint PDF, Independence, & Probabilities:** Given the joint density function $f(x,y) = xye^{-(x+y)}$ for $x \ge 0$, $y \ge 0$, and $0$ otherwise:
    * (i) Examine whether $X$ and $Y$ are independent. (A sketch after this list checks this part.)
    * (ii) Find $P(X < \dots)$.

**Joint PMF Table:** $(X, Y)$ is a pair of discrete random variables with a tabulated joint PMF.
    * (c) Determine whether $X$ and $Y$ are independent.
    * **Hint:** For the marginals, sum across rows/columns. For independence, check whether $P(X=x, Y=y) = P(X=x)P(Y=y)$ for all pairs.
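As a symbolic check of Problem 48 (assuming `sympy`, as in the earlier sketches): integrating out one variable gives the marginals, and the probability over $[\tfrac12, 1] \times [\tfrac12, 1]$ comes from a double integral of the joint density.

```python
# Problem 48: joint PDF f(x, y) = x + y on the unit square.
# sympy (assumed available) mirrors the integrals from the hints.
import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)
f = x + y

# (i) Marginal densities: integrate out the other variable over [0, 1].
f_X = sp.integrate(f, (y, 0, 1))         # x + 1/2
f_Y = sp.integrate(f, (x, 0, 1))         # y + 1/2
print("f_X(x) =", f_X, "  f_Y(y) =", f_Y)

# (ii) Mean of X from its marginal.
print("E[X] =", sp.integrate(x * f_X, (x, 0, 1)))           # 7/12

# (iii) P(X >= 1/2, Y >= 1/2): integrate the joint PDF over [1/2, 1] x [1/2, 1].
prob = sp.integrate(f, (y, sp.Rational(1, 2), 1), (x, sp.Rational(1, 2), 1))
print("P(X >= 1/2, Y >= 1/2) =", prob)                      # 3/8
```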
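For Problem 50(i), independence can be confirmed the same way: the marginals of $f(x,y) = xye^{-(x+y)}$ are $xe^{-x}$ and $ye^{-y}$, and their product recovers the joint density. A `sympy` sketch:

```python
# Problem 50(i): independence check for f(x, y) = x*y*exp(-(x+y)), x, y >= 0.
# sympy assumed, as in the previous sketch.
import sympy as sp

x, y = sp.symbols("x y", positive=True)
f = x * y * sp.exp(-(x + y))

f_X = sp.integrate(f, (y, 0, sp.oo))     # x*exp(-x)
f_Y = sp.integrate(f, (x, 0, sp.oo))     # y*exp(-y)

# X and Y are independent iff the joint density factors into the marginals.
print("f_X(x) =", f_X, "  f_Y(y) =", f_Y)
print("Independent?", (f_X * f_Y).equals(f))                # True
```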