Overview
On this page, we discuss the joint probability distribution of two discrete random variables, the definition of independence for random variables, and what independence implies when calculating the expected value of a product or the variance of a sum of random variables.
Basic learning objectives
These are the tasks you should be able to perform with reasonable fluency when you arrive at your next class meeting. Important new vocabulary words are indicated in italics.
- Understand the concept of the *joint probability distribution* of two discrete random variables.
- Know the precise definition of when two random variables are *independent* and be able to verify this in practice (see the sketch after this list).
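To make these two objectives concrete, here is a minimal sketch in Python with a made-up joint pmf (all probability values are hypothetical): it recovers the marginal pmfs by summing the joint table and then checks independence, i.e. whether \(p(x,y)=p_X(x)\,p_Y(y)\) for every pair.

```python
from math import isclose

# Hypothetical joint pmf p(x, y) of two discrete random variables,
# stored as a dict keyed by (x, y); the four values sum to 1.
joint = {
    (0, 0): 0.10, (0, 1): 0.30,
    (1, 0): 0.15, (1, 1): 0.45,
}

# Marginal pmfs: sum the joint pmf over the other variable.
p_x, p_y = {}, {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# X and Y are independent exactly when p(x, y) = pX(x) * pY(y)
# holds for every pair (x, y).
independent = all(isclose(p, p_x[x] * p_y[y]) for (x, y), p in joint.items())

print(p_x)          # marginal pmf of X: {0: 0.4, 1: 0.6} up to float rounding
print(p_y)          # marginal pmf of Y: {0: 0.25, 1: 0.75}
print(independent)  # True for this particular table
```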
Advanced learning objectives
In addition to mastering the basic objectives, here are the tasks you should be able to perform after class, with practice:
- Know that \(E(XY)=E(X)\cdot E(Y)\) holds when \(X\) and \(Y\) are independent random variables (the identity can fail without independence).
- Know that \(\text{Var}(X + Y) = \text{Var}(X) + \text{Var}(Y)\) holds when \(X\) and \(Y\) are independent random variables.
- Know the properties of expectation and variance which are always true, even when the random variables are not independent (a numeric check of all three facts follows this list).
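As a concrete check of all three bullets, the sketch below reuses the hypothetical independent joint pmf from the earlier example and computes each quantity directly from the joint table: the product identity and the variance identity come out equal because this particular \(X\) and \(Y\) are independent, while the linearity of expectation, \(E(X+Y)=E(X)+E(Y)\), would hold for any joint pmf.

```python
# Hypothetical independent joint pmf from the earlier sketch.
joint = {
    (0, 0): 0.10, (0, 1): 0.30,
    (1, 0): 0.15, (1, 1): 0.45,
}

def expect(f):
    """E[f(X, Y)] computed directly from the joint pmf."""
    return sum(p * f(x, y) for (x, y), p in joint.items())

e_x, e_y = expect(lambda x, y: x), expect(lambda x, y: y)
e_xy = expect(lambda x, y: x * y)

var_x = expect(lambda x, y: (x - e_x) ** 2)
var_y = expect(lambda x, y: (y - e_y) ** 2)
var_sum = expect(lambda x, y: (x + y - e_x - e_y) ** 2)

print(e_xy, e_x * e_y)         # both 0.45: equal because X, Y are independent
print(var_sum, var_x + var_y)  # both 0.4275: equal because X, Y are independent
print(expect(lambda x, y: x + y), e_x + e_y)  # both 1.35: equal for ANY joint pmf
```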
To prepare for class
- Watch the first 6 minutes of the following video (by MIT OpenCourseWare), which introduces the concept of the joint probability distribution of two discrete random variables (you can ignore the second half for now; it discusses joint distributions of more than two random variables):
- Watch the following video (by MIT OpenCourseWare), which defines when two discrete random variables are independent:
After class
- Watch the following two videos (by MIT OpenCourseWare), which show special properties of expected value and variance when dealing with independent random variables (in contrast with properties which always hold):