Events \(A\) and \(B\) are independent if \[P(A \cap B) = P(A) \cdot P(B)\]
Suppose we flip a fair coin twice. Let \(A\) be the event that the first flip is a head. Let \(B\) be the event that the second flip is a head. Let \(C\) be the event that the second flip is a tail.
The sample space is the set of all possible outcomes: \[\Omega = \{HH, HT, TH, TT\}\] The measurable events are all subsets of \(\Omega,\) that is, the power set \(\mathcal{P}(\Omega).\) The probability measure is defined by assigning weight \(\frac{1}{4}\) to each point of \(\Omega.\)
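As an illustrative sketch (the names omega, prob, and P below are ours, not from the original text), this finite probability space can be modeled directly in Python using exact fractions:

```python
from fractions import Fraction

# Sample space: all outcomes of two coin flips.
omega = {"HH", "HT", "TH", "TT"}

# Probability measure: weight 1/4 on each point of omega.
prob = {outcome: Fraction(1, 4) for outcome in omega}

def P(event):
    """Probability of an event, i.e. a subset of omega."""
    return sum(prob[outcome] for outcome in event)

print(P({"HH", "HT"}))  # the event "first flip is a head": 1/2
```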
We can write the events as subsets of \(\Omega\) and compute their probabilities. \begin{align} & A = \{HH, HT\} \\ & B = \{HH, TH\} \\ & C = \{HT, TT\} \end{align} To compute the probability of \(A,\) we use the fact that \(A = \{HH\} \cup \{HT\}\) and that \(\{HH\} \cap \{HT\} = \emptyset,\) so the probabilities of the two points add. \begin{align} P(A) & = P(\{HH, HT\}) \\ & = P(\{HH\}) + P(\{HT\}) \\ & = \frac{1}{4} + \frac{1}{4} \\ & = \frac{1}{2} \end{align} Similarly, \(P(B) = \frac{1}{2}\) and \(P(C) = \frac{1}{2}.\)
Now we can use these probabilities to determine which of the events are independent. \begin{align} & P(A) \cdot P(B) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4} \\ & P(A \cap B) = P(\{HH, HT\} \cap \{HH, TH\}) = P(\{HH\}) = \frac{1}{4} \end{align} So, \(A\) and \(B\) are independent events. \begin{align} & P(A) \cdot P(C) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4} \\ & P(A \cap C) = P(\{HH, HT\} \cap \{HT, TT\}) = P(\{HT\}) = \frac{1}{4} \end{align} So, \(A\) and \(C\) are independent events. \begin{align} & P(B) \cdot P(C) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4} \\ & P(B \cap C) = P(\{HH, TH\} \cap \{HT, TT\}) = P(\emptyset) = 0 \end{align} Since \(0 \neq \frac{1}{4},\) \(B\) and \(C\) are not independent events. In fact, \(B\) and \(C\) are disjoint, and disjoint events with positive probability are never independent.
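These three checks are easy to verify mechanically. Here is a small self-contained Python sketch (our own construction) that compares \(P(X \cap Y)\) with \(P(X) \cdot P(Y)\) for each pair:

```python
from fractions import Fraction
from itertools import combinations

omega = {"HH", "HT", "TH", "TT"}
prob = {outcome: Fraction(1, 4) for outcome in omega}

def P(event):
    return sum(prob[outcome] for outcome in event)

events = {
    "A": {"HH", "HT"},  # first flip is a head
    "B": {"HH", "TH"},  # second flip is a head
    "C": {"HT", "TT"},  # second flip is a tail
}

# Compare P(X & Y) with P(X) * P(Y) for each pair of events.
for (x, X), (y, Y) in combinations(events.items(), 2):
    status = "independent" if P(X & Y) == P(X) * P(Y) else "not independent"
    print(x, "and", y, "are", status)
# A and B are independent
# A and C are independent
# B and C are not independent
```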
There are often misconceptions about independent events, so we give a visualization here. In the figure, the red set \(A\) is \(1/3\) of \(\Omega,\) and \(A\) also takes up \(1/3\) of the blue set \(B;\) likewise, \(B\) is \(1/2\) of \(\Omega,\) and \(B\) takes up \(1/2\) of the red set \(A.\) This is the picture of independence: each event occupies the same proportion of the other event as it does of the whole space.
If \(A\) and \(B\) are independent events and \(P(B) > 0,\) then \(P(A|B) = P(A).\)
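This follows directly from the definition of conditional probability: \begin{align} P(A|B) & = \frac{P(A \cap B)}{P(B)} \\ & = \frac{P(A) \cdot P(B)}{P(B)} \\ & = P(A) \end{align}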
Claim: If \(A\) and \(B\) are independent, then \(A\) and \(B^C\) are independent.
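To verify the claim, split \(A\) into the disjoint pieces \(A \cap B\) and \(A \cap B^C:\) \begin{align} P(A \cap B^C) & = P(A) - P(A \cap B) \\ & = P(A) - P(A) \cdot P(B) \\ & = P(A) \cdot (1 - P(B)) \\ & = P(A) \cdot P(B^C) \end{align}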
By the claim (and the symmetry of the definition), if \(A\) and \(B\) are independent, then so are \(A^C\) and \(B,\) and \(A^C\) and \(B^C.\)
A collection of events \((A_i : i \in I),\) for some index set \(I,\) is said to be a collection of independent events if for every finite subset \(J \subseteq I,\) \[P\left(\bigcap_{j \in J}A_j\right) = \prod_{j \in J} P(A_j)\]
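For a finite collection, this condition can be checked exhaustively. The sketch below is our own construction (mutually_independent is a hypothetical helper, not a standard library function); it tests every subcollection of size at least two, reusing the coin-flip example:

```python
from fractions import Fraction
from functools import reduce
from itertools import combinations
from operator import and_

def mutually_independent(events, prob):
    """Check P(intersection) == product of probabilities for every
    subcollection of at least two events."""
    def P(event):
        return sum(prob[outcome] for outcome in event)
    names = list(events)
    for k in range(2, len(names) + 1):
        for subset in combinations(names, k):
            # Intersect the chosen events and compare with the product rule.
            intersection = reduce(and_, (events[n] for n in subset))
            product = Fraction(1)
            for n in subset:
                product *= P(events[n])
            if P(intersection) != product:
                return False
    return True

omega = {"HH", "HT", "TH", "TT"}
prob = {outcome: Fraction(1, 4) for outcome in omega}
A, B, C = {"HH", "HT"}, {"HH", "TH"}, {"HT", "TT"}

print(mutually_independent({"A": A, "B": B}, prob))          # True
print(mutually_independent({"A": A, "B": B, "C": C}, prob))  # False: B and C are disjoint
```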
Check your understanding:
1. Events \(A\) and \(B\) are independent. If \(P(A) = 0.4\) and \(P(B)=0.3,\) find \(P(A \cap B).\)
2. Events \(A\) and \(B\) are independent. If \(P(A) = 0.6\) and \(P(B)=0.5,\) find \(P(A|B).\)
3. Events \(A\) and \(B\) are independent. If \(P(A) = 0.5\) and \(P(B)=0.7,\) find \(P(A \cup B).\)
4. A die is rolled twice in a row. Find the probability that the first roll is 3 or higher, and the second roll is 5 or lower.