Mutual Independence

ALBERT MEYER: We've looked at independence for two events. What about when we have a bunch of events? Well, in that case we want to look at the idea of mutual independence. So let's check that out.

If I have n different events, I'll say that they're mutually independent, intuitively, if the probability that one of them occurs is unchanged by which of the other ones happen to have occurred. Expressed in conditional probability, which is the way to make it precise, what we're really saying is that events A1 through An are mutually independent when the probability of Ai is equal to the probability of Ai given the intersection of any of the other As, as long as Ai is not one of them. So take A1, A2, or A1, A2, A3, and so on. And A5 is going to be independent of all of those intersections.
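
In symbols, the condition just stated reads (it applies whenever the conditioning event has positive probability):

$$\Pr[A_i] = \Pr\Big[\,A_i \,\Big|\, \bigcap_{j \in J} A_j\,\Big] \quad \text{for every nonempty } J \subseteq \{1, \dots, n\} \setminus \{i\}.$$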

If we shift over to the other definition of independence that we used for two events, in terms of products, you could say that n events are mutually independent when the probability of the intersection of any bunch of them is equal to the product of the individual probabilities of the events in the intersection.
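
In symbols, the product form of the definition is:

$$\Pr\Big[\,\bigcap_{j \in S} A_j\,\Big] = \prod_{j \in S} \Pr[A_j] \quad \text{for every } S \subseteq \{1, \dots, n\} \text{ with } |S| \ge 2.$$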

Let's look at an example of mutual independence. Maybe the simplest one is independent coin flips, which are independent by definition. So the idea is that I will flip a coin a bunch of times, and I will let Hi be the event that on the ith flip I get a head. If you think about what's going on, what happens on the fifth flip has nothing to do with what happens on the first, fourth, or seventh flip. There's no causal relationship between the flips before or after flip five. Flip five is an isolated event by itself. And the fact that there were a bunch of heads before or there will be a bunch of heads afterward doesn't have any impact on the probability that the fifth flip comes up with a head. At least that's what we believe, and that's the way we would model them.

So what that means, for example, is that the probability of a head on the fifth toss is equal to the probability of a head on the fifth toss given that the first toss was a head and the fourth toss was a head and the seventh toss was not a head. This is the complement of H7. So that would just be an example of one of the many different conditional equations that hold when you have mutual independence.
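
Written out, that example equation is:

$$\Pr[H_5] = \Pr\big[\,H_5 \mid H_1 \cap H_4 \cap \overline{H_7}\,\big].$$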

Let's look at an example. Suppose that I flip a fair coin twice. The previous definition didn't require fairness at all in the coin flipping, but now I'm going to need it; it means that heads and tails are equally likely. Let H1 be, as before, the event that a head comes up on the first flip, and H2 the event that a head comes up on the second flip. And let O be the event that there were an odd number of heads in the two flips.

Now, I claim that O is independent of whether or not there's a head on the first flip. That may seem a little weird because O depends on both the first flip and the second flip. It's whether or not there are an odd number of heads there, but nevertheless, I claim that whether or not there are an odd number of heads is independent of whether or not the first toss was a head. Let's just check it using the official definition.

First of all, O is the event {HT, TH}. If I write out a pair of Hs and Ts for the results of the first and second flips, you get an odd number of heads exactly when there's first a head and then a tail, or first a tail and then a head. That means the probability of O is exactly a half, because the other two outcomes are TT and HH, which is when you have an even number of heads.

Now, O intersection H1 is saying that you have an odd number of heads and the first toss is a head. The only outcome that fits that description is HT; O intersection H1 is just a peculiar way of saying you got a head and then you got a tail. The probability of HT is a quarter, so the probability of O intersection H1 is a quarter. And of course, that's equal to the probability of O, which we decided was a half, times the probability of H1, which is also a half, because we said the coin was fair.

So I've verified the condition for the independence of O and H1, and therefore, I'm done. But the weird thing to notice now is that if you look at O, H1, and H2, the three of them, they are not mutually independent. Because in fact, if you know any two of them you can figure out what the third one was. But just explicitly in terms of conditional probabilities, the probability of there being an odd number of heads, given that the first toss was a head and the second toss was a head, is 0, because once you know H1 and H2 you know exactly how many heads there were. There were two. And that's not odd.
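
To make that concrete, here's a minimal Python sketch of the check (not from the lecture; the helper prob and the event predicates are names of my own), enumerating the four equally likely outcomes:

```python
from itertools import product

# The four equally likely outcomes of two fair coin flips.
outcomes = list(product("HT", repeat=2))

def prob(event):
    """Probability of an event (a predicate on outcomes) under the uniform measure."""
    return sum(1 for w in outcomes if event(w)) / len(outcomes)

H1 = lambda w: w[0] == "H"               # head on the first flip
H2 = lambda w: w[1] == "H"               # head on the second flip
O  = lambda w: w.count("H") % 2 == 1     # odd number of heads

# Pairwise independence holds: Pr[X and Y] == Pr[X] * Pr[Y] for each pair.
assert prob(lambda w: O(w) and H1(w)) == prob(O) * prob(H1)   # 1/4 == 1/2 * 1/2
assert prob(lambda w: O(w) and H2(w)) == prob(O) * prob(H2)
assert prob(lambda w: H1(w) and H2(w)) == prob(H1) * prob(H2)

# But mutual independence fails: Pr[O and H1 and H2] is 0, not 1/8.
assert prob(lambda w: O(w) and H1(w) and H2(w)) == 0.0
print("O, H1, H2 are pairwise independent but not mutually independent")
```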

So the probability of odd given H1 intersection H2 is 0, which is not equal to the probability of odd by itself, which was a half. So the three of them are not mutually independent, even though any two of them are: O and H1 are, obviously O and H2 are by symmetry, and H1 and H2 are separate coin tosses, so they're independent.

So that leads us to the general idea of k-way independence. An example would be if you flip a fair coin k times: let Hi be the event that there's a head on the ith flip, and let O, again, be the event that there are an odd number of heads. By the same argument, you can verify that any k of these k plus 1 events are mutually independent.

But if you give me all k plus 1 of them, then they are not mutually independent. In fact, any k of them determine the k plus first one, even though any k among themselves are mutually independent. So that's why this notion of just how independent a bunch of events are comes up, and this is how we quantify it.
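
As a sketch under the same assumptions, the enumeration extends to any small k. The checker below (its names are my own, not the lecture's) tests the product-form condition over every subset of size at least two:

```python
from itertools import combinations, product

def mutually_independent(events, outcomes):
    """Product-form test: for every subset of the events of size >= 2,
    Pr[intersection] must equal the product of the individual probabilities."""
    def prob(evs):
        return sum(1 for w in outcomes if all(e(w) for e in evs)) / len(outcomes)
    for size in range(2, len(events) + 1):
        for subset in combinations(events, size):
            lhs = prob(subset)
            rhs = 1.0
            for e in subset:
                rhs *= prob([e])
            if abs(lhs - rhs) > 1e-9:
                return False
    return True

k = 4                                                # any small k works here
outcomes = list(product("HT", repeat=k))             # 2**k equally likely outcomes
H = [lambda w, i=i: w[i] == "H" for i in range(k)]   # H1 .. Hk
O = lambda w: w.count("H") % 2 == 1                  # odd number of heads
events = H + [O]                                     # k + 1 events in all

# Any k of the k+1 events are mutually independent ...
assert all(mutually_independent(s, outcomes) for s in combinations(events, k))
# ... but all k+1 of them together are not.
assert not mutually_independent(events, outcomes)
print("k-way independent, but not (k+1)-way independent")
```

Notice that the checker has to run through every subset of the events, which is exactly the counting issue discussed below.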

So in general, events A1 through An are k-way independent if any k of them are mutually independent. Pairwise independence, then, is just the case of 2-way independence. And what we saw was the example that with k coin flips, the event O of an odd number of heads and the events H1 through Hk are k-way independent, but not (k plus 1)-way independent.

By the way, now that we understand what k-way independence is, mutual independence of n events is simply n-way independence. But I just wanted to close with the remark that checking whether n events are mutually independent means that you actually have to check that every intersection has probability equal to the product of the probabilities of the individual events in it. There are 2 to the n possible subsets of A1 through An, and you have to check, for each of them, that the probability of the intersection of the ones you chose is equal to the product of their probabilities.

But of course, you don't need to check the empty selection, and you don't need to check the singletons, so you just have to check the equations corresponding to the subsets of size more than one. That's 2 to the n minus (n plus 1) equations to check. So in general, it's not going to be easy to verify mutual independence by doing this kind of calculation. You usually arrive at it by assumption most of the time.
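
For instance, with $n = 3$ events that count is $2^3 - (3 + 1) = 4$ equations: one for each of the three pairs, plus one for the triple intersection.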
