What are Tree Diagrams?

Learning Objectives

  • Know the definition of conditional probability.
  • Use conditional probability to solve for probabilities in finite sample spaces.

In the last chapter, we studied independent and dependent events, as well as mutually inclusive and mutually exclusive events. The Addition Rule, or Addition Principle, is used to find P(A \text{ or } B), while the Multiplication Rule is used to find P(A \text{ and } B) for independent events.

Addition Rule – For 2 events, A and B, the probability of selecting one event or another is given by: P(A \text{ or } B) = P(A) + P(B) - P(A \text{ and } B).

Multiplication Rule – For 2 independent events, A and B, where the outcome of A does not change the probability of B, the probability of A and B is given by: P(A \text{ and } B) = P(A) \times P(B).
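
To make these two rules concrete, here is a short Python sketch, not part of the original lesson, that checks each rule on a small assumed example: the Addition Rule on a single die roll, and the Multiplication Rule on two independent coin flips. The specific events chosen (an even roll, a roll greater than 3, heads on each flip) are illustrative assumptions.

```python
from fractions import Fraction
from itertools import product

def p(event, space):
    """Probability of an event in a finite, equally likely sample space."""
    return Fraction(len(event & space), len(space))

# Addition Rule check (assumed example): roll one fair die.
# A = "roll is even", B = "roll is greater than 3".
die = set(range(1, 7))
A = {2, 4, 6}
B = {4, 5, 6}
lhs = p(A | B, die)                                  # P(A or B)
rhs = p(A, die) + p(B, die) - p(A & B, die)          # P(A) + P(B) - P(A and B)
print(lhs, rhs, lhs == rhs)                          # 2/3 2/3 True

# Multiplication Rule check (assumed example): two independent coin flips.
# C = "first flip is heads", D = "second flip is heads".
flips = set(product("HT", repeat=2))                 # {('H','H'), ('H','T'), ...}
C = {o for o in flips if o[0] == "H"}
D = {o for o in flips if o[1] == "H"}
print(p(C & D, flips) == p(C, flips) * p(D, flips))  # True: 1/4 == 1/2 * 1/2
```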

Tree diagrams are another way to show the outcomes of simple probability events. In a tree diagram, each outcome is represented as a branch on a tree.

Let’s say you were going to toss a coin 2 times and wanted to find the probability of getting 2 heads. This is an example of independent events, because the outcome of one event does not affect the outcome of the second event. What does this mean? Well, when you flip the coin once, you have an equal chance of getting a head (H) or a tail (T). On the second flip, you also have an equal chance of getting a head or a tail. In other words, whether the first flip was heads or tails, the second flip could just as likely be heads as tails. You can represent the outcomes of these events on a tree diagram.

From the tree diagram, you can see that the probability of getting a head on the first flip is \frac{1}{2}. Starting with heads, the probability of getting a second head will again be \frac{1}{2}. But how do we calculate the probability of getting 2 heads? These are independent events, since the outcome of tossing the first coin in no way affects the outcome of tossing the second coin. Therefore, we can calculate the probability as follows:

P(A \text{ and } B) = \frac{1}{2} \times \frac{1}{2}

P(A \text{ and } B) = \frac{1}{4}

Therefore, we can conclude that the probability of getting 2 heads when tossing a coin twice is \frac{1}{4}, or 25%. Let’s try an example that is a little more challenging.
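If you want to check this by brute force, the short Python sketch below (an illustration added here, not from the original lesson) lists every branch of the two-flip tree and counts the ones that land on two heads.

```python
from fractions import Fraction
from itertools import product

# Every branch of the tree is one outcome path, e.g. ('H', 'H').
branches = list(product("HT", repeat=2))
print(branches)                      # [('H','H'), ('H','T'), ('T','H'), ('T','T')]

# All four branches are equally likely, so P(two heads) is a simple count.
two_heads = [b for b in branches if b == ("H", "H")]
print(Fraction(len(two_heads), len(branches)))   # 1/4
```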

Example 1

Irvin opens up his sock drawer to get a pair of socks to wear to school. He looks in the sock drawer and sees 4 red socks, 6 white socks, and 8 brown socks. Irvin reaches in the drawer and pulls out a red sock. He is wearing blue shorts, so he replaces it. He then draws out a white sock. What is the probability that Irvin pulls out a red sock, replaces it, and then pulls out a white sock?

Solution:

First let’s draw a tree diagram.

There are 18 socks in Irvin’s sock drawer. The probability of getting a red sock when he pulls out the first sock is:

P(\text{red}) = \frac{4}{18}

P(\text{red}) = \frac{2}{9}

Irvin puts the sock back in the drawer and pulls out the second sock. The probability of getting a white sock on the second draw is:

P(\text{white}) = \frac{6}{18}

P(\text{white}) = \frac{1}{3}

Therefore, the probability of getting a red sock and then a white sock when the first sock is replaced is:

P(\text{red and white}) = \frac{2}{9} \times \frac{1}{3}

P(\text{red and white}) = \frac{2}{27}

One important feature of these problems is that order does not matter.

Let’s say Irvin picked out a white sock, replaced it, and then picked out a red sock. Calculate this probability.

P(\text{white and red}) = \frac{1}{3} \times \frac{2}{9}

P(\text{white and red}) = \frac{2}{27}

So regardless of the order in which he takes the socks out, the probability is the same. In other words, P(\text{red and white}) = P(\text{white and red}).
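
The sketch below is an added illustration, using the drawer contents from Example 1, that computes both orderings with replacement and confirms they agree.

```python
from fractions import Fraction

# Irvin's drawer: 4 red, 6 white, 8 brown socks (18 total).
red, white, total = 4, 6, 18

# With replacement, the second draw still comes from all 18 socks.
p_red_then_white = Fraction(red, total) * Fraction(white, total)
p_white_then_red = Fraction(white, total) * Fraction(red, total)

print(p_red_then_white)                      # 2/27
print(p_white_then_red)                      # 2/27
print(p_red_then_white == p_white_then_red)  # True
```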

Example 2

In Example 1, what happens if the first sock is not replaced?

Solution:

The probability that the first sock is red is:

P(\text{red}) = \frac{4}{18}

P(\text{red}) = \frac{2}{9}

The probability of picking a white sock on the second pick is now:

P(\text{white}) = \frac{6}{17}

since only 17 socks remain in the drawer after the red sock is set aside.

So now, the probability of selecting a red sock and then a white sock, without replacement, is:

P(\text{red and white}) = \frac{2}{9} \times \frac{6}{17}

P(\text{red and white}) = \frac{12}{153}

P(\text{red and white}) = \frac{4}{51}

If the first sock is white, will P(\text{red and white}) = P(\text{white and red}) as we found in Example 1? Let's find out.

P(\text{white}) = \frac{6}{18}

P(\text{white}) = \frac{1}{3}

The probability of picking a red sock on the second pick is now:

P(\text{red}) = \frac{4}{17}

Therefore, the probability of selecting a white sock and then a red sock, without replacement, is:

P(\text{white and red}) = \frac{1}{3} \times \frac{4}{17}

P(\text{white and red}) = \frac{4}{51}

As with the last example, P(\text{red and white}) = P(\text{white and red}). So when does order really matter?
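
Here is a matching sketch (added, using the same drawer counts) for the without-replacement case. The second-draw fractions change because only 17 socks remain, yet the probabilities of the two orderings are still equal.

```python
from fractions import Fraction

# Irvin's drawer: 4 red, 6 white, 8 brown socks (18 total).
red, white, total = 4, 6, 18

# Without replacement, only 17 socks remain for the second draw.
p_red_then_white = Fraction(red, total) * Fraction(white, total - 1)
p_white_then_red = Fraction(white, total) * Fraction(red, total - 1)

print(p_red_then_white)                      # 4/51
print(p_white_then_red)                      # 4/51
print(p_red_then_white == p_white_then_red)  # True
```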
