https://mids-w203.github.io/practice_problems/
Several of these problems are adapted from Durrett's Elementary Probability for Applications.
This repo contains pointers to practice problems that run alongside the weekly concepts covered in MIDS w203.
Concept | Example | Page Number |
---|---|---|
Probability Spaces | 1.1: Dice Rolling | 1 |
Probability Spaces | 1.2: Dice Rolling | 2 |
Probability Spaces | 1.3: Letters in Word | 4 |
Set Theory Probability | 1.1.2: Basic Properties | 5 |
Set Theory Probability | 1.4: Car Stereos | 5 |
Set Theory Probability | 1.5: World Series | 6-8 |
Independent Events | 1.7: Draw 2 cards | 10 |
Independent Events | 1.8: Roll 2 dice | 11 |
Independent Events | 1.9: Birthdays | 11 |
Independent Events | 1.10: Roll 3 dice | 12 |
Conditional Probability | 3.1: Dice Rolling | 81 |
Conditional Probability | 3.2: Betting Games | 82 |
Conditional Probability | 3.4: Cards | 83 |
Conditional Probability | 3.5: Hockey Playoff | 83 |
Conditional Probability | 3.6: Graduating | 84 |
Conditional Probability | 3.8: Monty Hall Problem | 84 |
Conditional Probability | 3.9: Cognitive Dissonance | 85 |
Concept | Questions | Page Number |
---|---|---|
Probability Spaces | 1-2, 4-6, 8-18 | 26-27 |
Statistical Independence | All | 27-28 |
Concept | Example | Page Number |
---|---|---|
Discrete R.V.s | 1.11: Roll 2 dice | 13 |
Discrete R.V.s | 1.12: Geometric Distr. | 13 |
Discrete R.V.s | 1.13: Birthday problem II | 13 |
Discrete R.V.s: Joint Distributions | 3.25: Two 4 sided die | 100 |
Discrete R.V.s: Joint Distributions | 3.26: Draw 2 balls | 100 |
Discrete R.V.s: Joint Distributions | 3.27: Calculus Grades | 101 |
Discrete R.V.s: Independence | 3.28: Poisson Distr. | 103 |
Discrete R.V.s: Conditional Distributions | 3.29: AP Calculus | 104 |
Discrete R.V.s: Conditional Distributions | 3.30: Simpson's Paradox | 105 |
Continuous R.V.s | 5.1: Uniform Distr. | 162 |
Continuous R.V.s | 5.2: Exponential Distr. | 162 |
Continuous R.V.s: Distribution Functions | 5.7: Uniform Distr. | 167 |
Continuous R.V.s: Distribution Functions | 5.8: Exponential Distr. | 167 |
Continuous R.V.s: Distribution Functions | 5.10: Binomial Distr. | 169 |
Continuous R.V.s: Functions of R.V.s | 5.16: Functions of R.V.s | 173 |
Continuous R.V.s: Joint Distributions | 5.21: Uniform Distr. Ball | 178 |
Continuous R.V.s: Joint Distributions | 5.23: Uniform Distr. Square | 180 |
Continuous R.V.s: Marginal Density | 5.24: Marginal Density | 182 |
Continuous R.V.s: Independence | 5.25: Independence | 182 |
Continuous R.V.s: Conditional Distributions | 5.28: Conditional Distr. I | 184 |
Continuous R.V.s: Conditional Distributions | 5.29: Conditional Distr. II | 185 |
Concept | Questions | Page Number |
---|---|---|
Discrete Distributions | 35-39 | 29 |
Discrete Joint Distributions | 60-66 | 112 |
Density Functions | 1-2 | 185 |
Continuous Distribution Functions | 6-14 | 186 |
Functions of Random Variables | 18-21 | 187 |
Continuous Joint Distributions | All | 187 |
You are excited about a concert featuring your favorite a cappella group: the Pitch Estimators. Tickets go on sale at noon, but before you can buy a ticket, you have to wait for your turn in an online waiting room. Because tickets are in high demand, you enlist two of your friends to help you. All three of you enter the waiting room at noon, and as soon as any one of you gets a ticket, you are done and can all sign off.
Suppose your waiting time in minutes is a continuous random variable
Suppose these random variables have probability density functions given by:
These are examples of what we call exponential random variables.
a. Please sketch the probability density functions for all three random variables on one graph.
b. For a particular time
c. Let
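Since the densities themselves are not reproduced above, here is a minimal simulation sketch of the setup with assumed exponential rates (the values in `rates` are illustrative placeholders, not the ones from the original problem). It shows that the time until the first of the three of you gets a ticket is the minimum of the three waiting times.

```python
# A simulation sketch with assumed rates; the actual densities are given
# in the original problem statement, not here.
import numpy as np

rng = np.random.default_rng(203)
rates = [0.1, 0.2, 0.3]   # assumed ticket rates (per minute), one per person
n_sims = 100_000

# Waiting times: one column per person; numpy parameterizes the
# exponential by its scale, which is 1 / rate.
waits = np.column_stack([rng.exponential(1 / r, n_sims) for r in rates])
first_ticket = waits.min(axis=1)   # everyone signs off at the first success

print("simulated mean time until first ticket:", first_ticket.mean())
# For independent exponentials, the minimum is exponential with the
# summed rate, so this should be close to 1 / sum(rates).
print("1 / sum(rates):", 1 / sum(rates))
```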
Suppose that the time of an event is a random variable
The hazard rate at time
- Say that the time that a server breaks down is a random variable $B$, which is uniformly distributed on $[0,2]$. Compute the hazard rate of $B$.
- Prove that if $X$ is a random variable with hazard rate $h_X$ and $Y$ is a random variable with hazard rate $h_Y$, then the hazard rate of $min(X,Y)$ is $h_X + h_Y$. (Hint: write the hazard rate in terms of just the cdf, with no pdf. Then remember what the cdf of a minimum of random variables looks like from the previous problem.) A numerical check of this identity is sketched below.
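A numerical sanity check of the identity in the second bullet, under assumed distributions: it uses two independent exponential variables (so the hazard rates are constant), but any distributions with known pdfs and cdfs would work.

```python
# A numerical check that the hazard rate of min(X, Y) is h_X + h_Y,
# using two assumed exponential distributions (independence assumed).
from scipy import stats

X = stats.expon(scale=1 / 0.5)   # hazard rate 0.5
Y = stats.expon(scale=1 / 1.5)   # hazard rate 1.5

def hazard(dist, t):
    """h(t) = pdf(t) / (1 - cdf(t)); sf(t) is scipy's 1 - cdf(t)."""
    return dist.pdf(t) / dist.sf(t)

def hazard_of_min(dist_x, dist_y, t):
    """Hazard of min(X, Y) for independent X, Y: the survival function of
    the minimum is the product of the two survival functions."""
    sf_min = dist_x.sf(t) * dist_y.sf(t)
    pdf_min = dist_x.pdf(t) * dist_y.sf(t) + dist_y.pdf(t) * dist_x.sf(t)
    return pdf_min / sf_min

for t in [0.1, 1.0, 2.5]:
    print(t, hazard_of_min(X, Y, t), hazard(X, t) + hazard(Y, t))
```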
The following statements are either true or false. Prove each one or provide a counterexample (a small numerical checker for exploring candidate examples is sketched after the list):
- If X, Y, and Z are random variables such that X and Y are independent and Y and Z are independent, then X and Z must be independent.
- If X, Y, and Z are random variables such that X and Y are not independent and Y and Z are not independent, then X and Z cannot be independent.
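One way to explore these statements before attempting a proof is to check pairwise independence numerically on a small joint pmf. The pmf below (three independent fair coins) is only an assumed placeholder; swap in your own candidate counterexamples and re-run the checks.

```python
# A small checker for exploring the statements numerically. The joint pmf
# below (three independent fair coins) is only a placeholder; substitute
# your own candidate counterexamples and re-run the pairwise checks.
from itertools import product
from collections import defaultdict

# Joint pmf as a dict mapping (x, y, z) -> probability.
joint = {(x, y, z): 1 / 8 for x, y, z in product([0, 1], repeat=3)}

def marginal(joint, idxs):
    """Marginal pmf over the coordinates listed in idxs."""
    out = defaultdict(float)
    for outcome, p in joint.items():
        out[tuple(outcome[i] for i in idxs)] += p
    return out

def independent(joint, i, j, tol=1e-12):
    """True if coordinates i and j are independent under the joint pmf."""
    m_ij = marginal(joint, (i, j))
    m_i, m_j = marginal(joint, (i,)), marginal(joint, (j,))
    return all(abs(m_ij[(a, b)] - m_i[(a,)] * m_j[(b,)]) < tol
               for (a, b) in m_ij)

print("X,Y independent:", independent(joint, 0, 1))
print("Y,Z independent:", independent(joint, 1, 2))
print("X,Z independent:", independent(joint, 0, 2))
```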
Concept | Examples | Page Number |
---|---|---|
Expected Value | 1.18, 1.20, 1.21, 1.22 | 17-19 |
Expected Value of Sums | 6.8, 6.9, 6.10, 6.11, 6.12 | 197-200 |
Variance and Covariance | 1.26, 1.27, 1.29 | 23-25 |
Concept | Questions | Page Number |
---|---|---|
Expected Value | 40-49 | 29-30 |
Expected Value of Sums | 1-9 | 223 |
Variance and Covariance | (i) 50-56; (ii) 10-16 | (i) 30-31; (ii) 223-224 |
In
- Compute the conditional expectation function of Y given X.
- Compute the conditional expectation function of X given Y. (A computational sketch follows this list.)
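Because the joint distribution this problem refers to is not reproduced here, the following is only a sketch on an assumed joint pmf; it tabulates $\E[Y|X=x]$ and $\E[X|Y=y]$ directly from the definition of conditional expectation.

```python
# A sketch of tabulating a conditional expectation function from a joint
# pmf. The pmf here is an assumed stand-in for the omitted distribution.
from collections import defaultdict

joint = {  # assumed joint pmf over (x, y)
    (0, 0): 0.2, (0, 1): 0.1,
    (1, 0): 0.3, (1, 1): 0.4,
}

def cef(joint, given_x=True):
    """Return E[Y|X=x] for each x (or E[X|Y=y] for each y)."""
    num, den = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        cond, target = (x, y) if given_x else (y, x)
        num[cond] += target * p     # accumulates E[target * 1{cond}]
        den[cond] += p              # accumulates P(cond)
    return {c: num[c] / den[c] for c in den}

print("E[Y|X=x]:", cef(joint, given_x=True))
print("E[X|Y=y]:", cef(joint, given_x=False))
```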
Suppose random variables X and Y have joint density given by,
This is a long (but revealing) way to compute $var[Y]$ (a numerical sketch of the same steps follows the list):
- Compute the marginal distribution of X.
- Compute $var[Y|X]$. This will be a function of X.
- Compute $\E[var[Y|X]]$. This will be a number.
- Compute $\E[Y|X]$. This will be a function of X.
- Compute $var[\E[Y|X]]$. This will be a number.
- Using the law of total variance, compute $var[Y]$.
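A minimal numerical version of these steps, using an assumed discrete joint pmf in place of the omitted density:

```python
# Law of total variance, step by step, on an assumed discrete joint pmf.
joint = {  # assumed joint pmf over (x, y)
    (0, 0): 0.1, (0, 1): 0.2, (0, 2): 0.1,
    (1, 0): 0.2, (1, 1): 0.1, (1, 2): 0.3,
}

xs = sorted({x for x, _ in joint})
p_x = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in xs}  # marginal of X

def cond_moment(x, k):
    """E[Y^k | X = x]."""
    return sum((y ** k) * p for (xx, y), p in joint.items() if xx == x) / p_x[x]

e_y_given_x = {x: cond_moment(x, 1) for x in xs}                              # E[Y|X], a function of x
var_y_given_x = {x: cond_moment(x, 2) - cond_moment(x, 1) ** 2 for x in xs}   # var[Y|X]

e_var = sum(var_y_given_x[x] * p_x[x] for x in xs)                  # E[var[Y|X]], a number
mean_cef = sum(e_y_given_x[x] * p_x[x] for x in xs)                 # E[E[Y|X]] = E[Y]
var_e = sum((e_y_given_x[x] - mean_cef) ** 2 * p_x[x] for x in xs)  # var[E[Y|X]], a number

# Compare the law of total variance with a direct computation of var[Y].
e_y = sum(y * p for (_, y), p in joint.items())
e_y2 = sum(y ** 2 * p for (_, y), p in joint.items())
print("E[var[Y|X]] + var[E[Y|X]]:", e_var + var_e)
print("var[Y] computed directly: ", e_y2 - e_y ** 2)
```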
Suppose X and Y have joint density given by,
- Prove that the linear predictor $g(x) = 0.5$ fulfills the first population moment condition.
- Prove that the linear predictor $g(x) = 0.5$ fulfills the second population moment condition.
You conclude that $g(x) = 0.5$ is the best linear predictor of Y. (The two moment conditions are restated below for reference.)
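For reference, with prediction error $\epsilon = Y - g(X)$, the two population moment conditions referred to above are

$$
\E[\epsilon] = 0 \qquad \text{and} \qquad \E[X\epsilon] = 0 .
$$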
Suppose X and Y have joint mass function given by,
Consider a linear predictor $g(X) = \beta_0 + \beta_1 X$, and let $\epsilon = Y - g(X)$ denote the prediction error.
- Compute $\E[\epsilon]$ in terms of $\beta_0$ and $\beta_1$.

Now set this expression equal to zero and solve for $\beta_0$ in terms of $\beta_1$.

- Compute $\E[X \epsilon]$ in terms of $\beta_1$.

Now set this expression equal to zero and solve for $\beta_1$.

- What is the BLP? (A numerical sketch of these steps follows below.)
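A numerical sketch of these steps on an assumed joint pmf (the actual mass function is not reproduced above): it minimizes the mean squared error over $(\beta_0, \beta_1)$ with `scipy.optimize.minimize` and then confirms that both moment conditions hold at the optimum.

```python
# Minimize the mean squared error over (beta_0, beta_1) on an assumed
# joint pmf, then confirm both moment conditions hold at the optimum.
from scipy.optimize import minimize

joint = {  # assumed joint pmf over (x, y)
    (0, 0): 0.3, (0, 1): 0.2,
    (1, 0): 0.1, (1, 1): 0.4,
}

def mse(beta):
    """E[(Y - beta_0 - beta_1 X)^2] under the assumed pmf."""
    b0, b1 = beta
    return sum(p * (y - (b0 + b1 * x)) ** 2 for (x, y), p in joint.items())

b0, b1 = minimize(mse, x0=[0.0, 0.0]).x

eps = {(x, y): y - (b0 + b1 * x) for (x, y) in joint}      # prediction errors
e_eps = sum(p * eps[k] for k, p in joint.items())          # E[eps]
e_xeps = sum(p * k[0] * eps[k] for k, p in joint.items())  # E[X * eps]

print("beta_0, beta_1:", b0, b1)
print("E[eps]:", e_eps, " E[X * eps]:", e_xeps)
```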
Consider random variables X and Y, and linear predictors of the form $g(X) = \beta_1 X$. In other words, linear predictors that pass through the origin. Given such a predictor, define the prediction error $\epsilon = Y - \beta_1 X$.
Examine the proof on page 77 of Agnostic Statistics and consider how it would be different for regression through the origin.
- Prove that $\E[\epsilon X] = 0$ as before.
- Is it still true that $cov[\epsilon, X] = 0$? Prove it or give a counterexample. (An exploratory check is sketched after this list.)
- Compute an expression for $\beta_1$.
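An exploratory sketch for the second bullet, again on an assumed joint pmf: it fits the best through-origin predictor numerically and then compares $\E[\epsilon X]$ with $cov[\epsilon, X]$; whether these must coincide in general is exactly what the bullet asks you to settle.

```python
# Fit the best through-origin predictor on an assumed joint pmf, then
# compare E[eps * X] with cov[eps, X].
from scipy.optimize import minimize_scalar

joint = {  # assumed joint pmf over (x, y)
    (1, 1): 0.3, (1, 2): 0.2,
    (2, 1): 0.3, (2, 2): 0.2,
}

def mse(b1):
    """E[(Y - b1 * X)^2] under the assumed pmf."""
    return sum(p * (y - b1 * x) ** 2 for (x, y), p in joint.items())

b1 = minimize_scalar(mse).x
eps = {(x, y): y - b1 * x for (x, y) in joint}             # prediction errors

e_eps = sum(p * eps[k] for k, p in joint.items())          # E[eps]
e_x = sum(p * x for (x, _), p in joint.items())            # E[X]
e_xeps = sum(p * k[0] * eps[k] for k, p in joint.items())  # E[eps * X]
cov_eps_x = e_xeps - e_eps * e_x                           # cov[eps, X]

print("beta_1:", b1)
print("E[eps * X]:", e_xeps, " cov[eps, X]:", cov_eps_x)
```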