# 2.1: What are Tree Diagrams?

*At Grade* · Created by: CK-12

**Learning Objectives**

- Know the definition of conditional probability.
- Use conditional probability to solve for probabilities in finite sample spaces.

In the last chapter, we studied independent and dependent events, as well as mutually inclusive and mutually exclusive events. We used the Addition Rule for mutually inclusive and mutually exclusive events, and the Multiplication Rule for independent events. The Addition Rule, or Addition Principle, is used to find \begin{align*}P(A \ \text{or} \ B)\end{align*}, while the Multiplication Rule is used to find \begin{align*}P(A \ \text{and} \ B)\end{align*} for independent events.

**Addition Rule** – For 2 events, \begin{align*}A\end{align*} and \begin{align*}B\end{align*}, the probability of selecting one event or another is given by: \begin{align*}P(A \ \text{or} \ B) = P(A) + P(B) - P(A \ \text{and} \ B)\end{align*}.

**Multiplication Rule** – For 2 independent events, \begin{align*}A\end{align*} and \begin{align*}B\end{align*}, where the outcome of \begin{align*}A\end{align*} does not change the probability of \begin{align*}B\end{align*}, the probability of \begin{align*}A\end{align*} and \begin{align*}B\end{align*} is given by: \begin{align*}P(A \ \text{and} \ B) = P(A) \times P(B)\end{align*}.
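Both rules can be checked numerically. The short Python sketch below uses exact fractions; the card-drawing scenario for the Addition Rule is our own illustration, not part of the examples in this section.

```python
from fractions import Fraction

# Addition Rule illustration (our own example): draw one card from a
# standard 52-card deck, where A = "heart" and B = "face card".
p_a = Fraction(13, 52)        # 13 hearts
p_b = Fraction(12, 52)        # 12 face cards (J, Q, K in each of 4 suits)
p_a_and_b = Fraction(3, 52)   # 3 cards are both hearts and face cards
p_a_or_b = p_a + p_b - p_a_and_b
print(p_a_or_b)               # 11/26

# Multiplication Rule: two independent coin flips,
# A = "heads on flip 1", B = "heads on flip 2".
p_two_heads = Fraction(1, 2) * Fraction(1, 2)
print(p_two_heads)            # 1/4
```

Using `Fraction` instead of floating-point division keeps the answers in the same reduced-fraction form the text uses.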

**Tree diagrams** are another way to show the outcomes of simple probability events. In a tree diagram, each outcome is represented as a branch on a tree.

Let’s say you were going to toss a coin 2 times and wanted to find the probability of getting 2 heads. This is an example of independent events, because the outcome of one event does not affect the outcome of the second event. What does this mean? Well, when you flip the coin once, you have an equal chance of getting a head (H) or a tail (T). On the second flip, you also have an equal chance of getting a head or a tail. In other words, whether the first flip was heads or tails, the second flip could just as likely be heads as tails. You can represent the outcomes of these events on a tree diagram.

From the tree diagram, you can see that the probability of getting a head on the first flip is \begin{align*}\frac{1}{2}\end{align*}. Starting with heads, the probability of getting a second head will again be \begin{align*}\frac{1}{2}\end{align*}. But how do we calculate the probability of getting *2* heads? These are independent events, since the outcome of tossing the first coin in no way affects the outcome of tossing the second coin. Therefore, we can calculate the probability as follows:

\begin{align*}P(A \ \text{and} \ B) &= \frac{1}{2} \times \frac{1}{2}\\ P(A \ \text{and} \ B) &= \frac{1}{4}\end{align*}

Therefore, we can conclude that the probability of getting 2 heads when tossing a coin twice is \begin{align*}\frac{1}{4}\end{align*}, or 25%. Let’s try an example that is a little more challenging.
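A tree diagram is just an organized listing of every possible path through the experiment, so we can confirm this answer by enumerating all four branches directly:

```python
from itertools import product

# Each element of `outcomes` is one branch of the tree diagram
# for two coin flips: HH, HT, TH, TT.
outcomes = list(product("HT", repeat=2))
two_heads = [o for o in outcomes if o == ("H", "H")]

# 1 favorable branch out of 4 equally likely branches.
print(len(two_heads) / len(outcomes))   # 0.25
```

One branch out of four equally likely branches gives \begin{align*}\frac{1}{4}\end{align*}, matching the Multiplication Rule.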

*Example 1*

Irvin opens up his sock drawer to get a pair of socks to wear to school. He looks in the sock drawer and sees 4 red socks, 6 white socks, and 8 brown socks. Irvin reaches in the drawer and pulls out a red sock. He is wearing blue shorts, so he replaces it. He then draws out a white sock. What is the probability that Irvin pulls out a red sock, replaces it, and then pulls out a white sock?

*Solution:*

First let’s draw a tree diagram.

There are 18 socks in Irvin’s sock drawer. The probability of getting a red sock when he pulls out the first sock is:

\begin{align*}P(\text{red}) &= \frac{4}{18}\\ P(\text{red}) &= \frac{2}{9}\end{align*}

Irvin puts the sock back in the drawer and pulls out the second sock. The probability of getting a white sock on the second draw is:

\begin{align*}P(\text{white}) &= \frac{6}{18}\\ P(\text{white}) &= \frac{1}{3}\end{align*}

Therefore, the probability of getting a red sock and then a white sock when the first sock is *replaced* is:

\begin{align*}P(\text{red and white}) &= \frac{2}{9} \times \frac{1}{3}\\ P(\text{red and white}) &= \frac{2}{27}\end{align*}

One important feature of problems like this is that order does not matter.

Let’s say Irvin picked out a white sock, replaced it, and then picked out a red sock. Calculate this probability.

\begin{align*}P(\text{white and red}) &= \frac{1}{3} \times \frac{2}{9}\\ P(\text{white and red}) &= \frac{2}{27}\end{align*}

So regardless of the order in which he takes the socks out, the probability is the same. In other words, \begin{align*}P(\text{red and white}) = P(\text{white and red})\end{align*}.

*Example 2*

In Example 1, what happens if the first sock is *not replaced*?

*Solution:*

The probability that the first sock is red is:

\begin{align*}P(\text{red}) &= \frac{4}{18}\\ P(\text{red}) &= \frac{2}{9}\end{align*}

The probability of picking a white sock on the second pick is now:

\begin{align*}P(\text{white}) &= \frac{6}{17}\end{align*}

So now, the probability of selecting a red sock and then a white sock, without replacement, is:

\begin{align*}P(\text{red and white}) &= \frac{2}{9} \times \frac{6}{17}\\ P(\text{red and white}) &= \frac{12}{153}\\ P(\text{red and white}) &= \frac{4}{51}\end{align*}

If the first sock is white, will \begin{align*}P(\text{red and white}) = P(\text{white and red})\end{align*} as we found in Example 1? Let's find out.

\begin{align*}P(\text{white}) &= \frac{6}{18}\\ P(\text{white}) &= \frac{1}{3}\end{align*}

The probability of picking a red sock on the second pick is now:

\begin{align*}P(\text{red}) &= \frac{4}{17}\end{align*}

So the probability of selecting a white sock and then a red sock, without replacement, is:

\begin{align*}P(\text{white and red}) &= \frac{1}{3} \times \frac{4}{17}\\ P(\text{white and red}) &= \frac{4}{51}\end{align*}

As in Example 1, \begin{align*}P(\text{red and white}) = P(\text{white and red})\end{align*}. So when does order *really* matter?
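The without-replacement case differs from Example 1 only in the denominator of the second draw, since one sock is now missing from the drawer. A quick sketch using the same counts:

```python
from fractions import Fraction

RED, WHITE, BROWN = 4, 6, 8
total = RED + WHITE + BROWN                       # 18 socks

# Without replacement, only 17 socks remain for the second draw.
p_red_then_white = Fraction(RED, total) * Fraction(WHITE, total - 1)
p_white_then_red = Fraction(WHITE, total) * Fraction(RED, total - 1)

print(p_red_then_white)                           # 4/51
print(p_white_then_red)                           # 4/51
```

Either order yields \begin{align*}\frac{4}{51}\end{align*}: the numerators 4 and 6 are multiplied in both cases, and the denominators 18 and 17 appear in both cases, so the two products are equal.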