16  Discrete Random Variables: Probability Mass Functions

Example 16.1 Roll a fair four-sided die twice and let \(X\) be the sum and \(Y\) the larger of the two rolls.

  1. Verify that the pmf of \(X\) is \[ p_X(x) = \frac{4 - |x - 5|}{16}, \qquad x = 2, 3, \ldots, 8 \]




  2. Is the following the pmf of \(Y\)? Discuss. \[ p_Y(u) = \frac{2u - 1}{16} \]




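One way to check your answers to both parts is brute-force enumeration: the 16 equally likely pairs of rolls determine the pmfs of the sum and the maximum exactly. The following Python sketch (an illustration, not part of the exercise) tabulates both.

```python
from itertools import product
from fractions import Fraction

# All 16 equally likely outcomes of two rolls of a fair four-sided die
outcomes = list(product(range(1, 5), repeat=2))

# pmf of X = sum of the two rolls; compare with (4 - |x - 5|)/16
pmf_X = {x: Fraction(sum(1 for a, b in outcomes if a + b == x), 16)
         for x in range(2, 9)}
assert all(pmf_X[x] == Fraction(4 - abs(x - 5), 16) for x in range(2, 9))

# pmf of Y = larger of the two rolls; compare with (2u - 1)/16
pmf_Y = {y: Fraction(sum(1 for a, b in outcomes if max(a, b) == y), 16)
         for y in range(1, 5)}
print(pmf_Y)
```

Counting outcomes this way also makes it easy to see why each formula has the shape it does.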
Example 16.2 Recall the “lookaway challenge” of Example 8.4.

The game consists of possibly multiple rounds. In the first round, you point in one of four directions: up, down, left or right. At the exact same time, your friend also looks in one of those four directions. If your friend looks in the same direction you’re pointing, you win! Otherwise, you switch roles and the game continues to the next round — now your friend points in a direction and you try to look away. As long as no one wins, you keep switching off who points and who looks. The game ends, and the current “pointer” wins, whenever the “looker” looks in the same direction as the pointer.

Suppose that each player is equally likely to point/look in each of the four directions, independently from round to round.

Let \(X\) be the number of rounds until the game ends.

  1. Can the rounds be considered Bernoulli trials?




  2. What are the possible values that \(X\) can take? Is \(X\) discrete or continuous?




  3. Compute and interpret \(\text{P}(X=1)\).




  4. Compute and interpret \(\text{P}(X=2)\).




  5. Compute and interpret \(\text{P}(X=3)\).




  6. Find the probability mass function of \(X\).




  7. Construct a table, plot, and spinner representing the distribution of \(X\).




  8. How can you use the distribution of \(X\) to compute the probability that the player who starts as the pointer wins the game? (In Example 8.4 we computed this to be 4/7 using the law of total probability.)




  9. Compute and interpret \(\text{P}(X>3)\). Can you think of a way to do this without summing several terms?




  10. Compute and interpret \(\text{P}(X > 5 \mid X > 2)\). Compare to the previous part. What do you notice? Does this make sense?




  11. What seems like a reasonable shortcut formula for \(\text{E}(X)\) in terms of \(p\), the probability that the game ends in any given round? Consider the case \(p = 0.25\) first and then general \(p\).




  12. Compute \(\text{E}(X)\) using the pmf of \(X\). Did the shortcut work?




  13. Interpret \(\text{E}(X)\) in context.




  14. Compute \(\text{Var}(X)\).




  15. Would \(\text{Var}(X)\) be bigger or smaller if \(p=0.9\)? If \(p=0.1\)?




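After working the parts above by hand, you can check several of them numerically. The sketch below (illustrative only; the variable names are my own) uses the pmf \(p_X(x) = (3/4)^{x-1}(1/4)\), the tail probability \(\text{P}(X > n) = (3/4)^n\), and truncated sums for \(\text{E}(X)\) and \(\text{Var}(X)\).

```python
from fractions import Fraction

p = Fraction(1, 4)  # probability the looker matches the pointer in a round

def pmf(x):
    # P(X = x): x - 1 non-matching rounds followed by a match
    return (1 - p) ** (x - 1) * p

def tail(n):
    # P(X > n) without summing: the first n rounds must all miss
    return (1 - p) ** n

# memorylessness: P(X > 5 | X > 2) = P(X > 3)
assert tail(5) / tail(2) == tail(3)

# E(X) and Var(X) by truncating the infinite sums (floats, approximate)
EX = sum(x * float(pmf(x)) for x in range(1, 1000))
VarX = sum(x**2 * float(pmf(x)) for x in range(1, 1000)) - EX**2

# the starting pointer wins exactly when X is odd; compare with 4/7
P_win = sum(float(pmf(x)) for x in range(1, 1000, 2))

print(EX, VarX, P_win)
```

The truncation at 1000 terms is harmless here because \((3/4)^{999}\) is astronomically small, so the printed values agree with the shortcut formulas \(1/p\) and \((1-p)/p^2\) to many decimal places.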
Example 16.3 Continuing Example 16.2.

  1. Specify how you could simulate a value of \(X\) using the “simulate from the probability space” method.




  2. Specify how you could simulate a value of \(X\) using the “simulate from the distribution” method.




  3. Explain how you could use simulation to approximate the distribution of \(X\) and its expected value.




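The two simulation methods can be sketched in plain Python (the course may use different simulation tools; this is just one possible implementation, with function names of my own choosing). The first plays the game round by round on the probability space; the second spins the spinner for the pmf directly.

```python
import random

random.seed(1)

def sim_from_probability_space():
    # Play the game itself: each round the pointer and looker independently
    # pick one of four directions; the game ends when they match.
    rounds = 1
    while random.randrange(4) != random.randrange(4):
        rounds += 1
    return rounds

def sim_from_distribution():
    # Spin the spinner for the pmf p(x) = (3/4)**(x - 1) * (1/4):
    # draw a Uniform(0, 1) value and see which pmf interval it lands in.
    u = random.random()
    x, cdf = 1, 0.25
    while u > cdf:
        x += 1
        cdf += 0.75 ** (x - 1) * 0.25
    return x

sims = [sim_from_probability_space() for _ in range(100_000)]
print(sims.count(1) / len(sims))  # approximates P(X = 1) = 0.25
print(sum(sims) / len(sims))      # approximates E(X) = 4
```

Tabulating the relative frequency of each simulated value approximates the distribution of \(X\), and the running average of the simulated values approximates \(\text{E}(X)\).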
Example 16.4 Donny Dont is thoroughly confused about the distinction between a random variable and its distribution. Help him understand by providing a simple concrete example of two different random variables \(X\) and \(Y\) that have the same distribution. Can you think of \(X\) and \(Y\) for which \(\text{P}(X = Y) = 0\)?
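One concrete answer (this is just one of many possible examples) can be verified by enumeration: flip a single fair coin, let \(X = 1\) if it lands heads (0 otherwise), and let \(Y = 1 - X\). Both are Bernoulli(1/2), yet they are never equal.

```python
from fractions import Fraction

outcomes = ["H", "T"]               # one flip of a fair coin
X = {"H": 1, "T": 0}                # X = 1 if heads
Y = {w: 1 - X[w] for w in outcomes} # Y = 1 if tails

pmf_X = {v: sum(Fraction(1, 2) for w in outcomes if X[w] == v) for v in (0, 1)}
pmf_Y = {v: sum(Fraction(1, 2) for w in outcomes if Y[w] == v) for v in (0, 1)}

assert pmf_X == pmf_Y                       # same distribution
assert all(X[w] != Y[w] for w in outcomes)  # but P(X = Y) = 0
```

The point for Donny: the distribution records only the long-run pattern of values, while the random variable is a function on the probability space, so two variables can share a distribution without ever agreeing on a single outcome.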