## Game Theory (Part 8)

Last time we learned some rules for calculating probabilities. But we need a few more rules to get very far.

For example:

We say a coin is fair if it has probability 1/2 of landing heads up and probability 1/2 of landing tails up. What is the probability that if we flip two fair coins, both will land heads up?

Since each coin could land heads up or tails up, there are 4 events to consider here:

$(H,H), (H,T),$
$(T,H), (T,T)$

It seems plausible that each should be equally likely. If so, each has probability 1/4. So then the answer to our question would be 1/4.

But this is plausible only because we’re assuming that what one coin does doesn’t affect what the other one does! In other words, we’re assuming the two coin flips are ‘independent’.

If the coins were connected in some sneaky way, maybe each time one landed heads up, the other would land tails up. Then the answer to our question would be zero. Of course this seems silly. But it’s good to be very clear about this issue… because sometimes one event does affect another!

For example, suppose there’s a 5% probability of rain each day in the winter in Riverside. What’s the probability that it rains two days in a row? Remember that 5% is 0.05. So, you might guess the answer is

$0.05 \times 0.05 = 0.0025$

But this is wrong, because if it rains one day, that increases the probability that it will rain the next day. In other words, these events aren’t independent.

But if two events are independent, there’s an easy way to figure out the probability that they both happen: just multiply their probabilities! For example, if the chance that it will rain today in Riverside is 5% and the chance that it will rain tomorrow in Singapore is 60%, the chance that both these things will happen is

$0.05 \times 0.6 = 0.03$

or 3%, if these events are independent. I could try to persuade you that this is a good rule, and maybe I will… but for now let’s just state it in a general way.

### Independence

So, let’s make a precise definition out of all this! Suppose we have two sets of events, $X$ and $Y.$ Remember that $X \times Y$, the Cartesian product of the sets $X$ and $Y$, is the set of all ordered pairs $(i,j)$ where $i \in X$ and $j \in Y$:

$X \times Y = \{ (i,j) : \; i \in X, j \in Y \}$

So, an event in $X \times Y$ consists of an event in $X$ and an event in $Y$. For example, if

$X = \{ \textrm{rain today}, \textrm{no rain today} \}$

and

$Y = \{ \textrm{rain tomorrow}, \textrm{no rain tomorrow} \}$

then

$X \times Y = \begin{array}{l} \{ \textrm{(rain today, rain tomorrow)}, \\ \textrm{(no rain today, rain tomorrow)}, \\ \textrm{(rain today, no rain tomorrow)}, \\ \textrm{(no rain today, no rain tomorrow)} \} \end{array}$

Now we can define ‘independence’. It’s a rule for getting a probability distribution on $X \times Y$ from probability distributions on $X$ and $Y$:

Definition. Suppose $p$ is a probability distribution on a set of events $X,$ and $q$ is a probability distribution on a set of events $Y.$ If these events are independent, we use the probability distribution $r$ on $X \times Y$ given by

$r_{(i,j)} = p_i q_j$

People often call this probability distribution $p \times q$ instead of $r$.
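For readers who like to compute, here is a minimal sketch of this definition in Python (my own illustration, not from the post; representing a distribution as a dict mapping events to probabilities is an assumption of the sketch):

```python
# Build the product distribution r = p × q, with r[(i, j)] = p[i] * q[j],
# for two independent sets of events.

def product_distribution(p, q):
    """Return the distribution on X × Y assuming independence."""
    return {(i, j): p[i] * q[j] for i in p for j in q}

# The rain example from the text: 5% rain today in Riverside,
# 60% rain tomorrow in Singapore.
p = {"rain today": 0.05, "no rain today": 0.95}
q = {"rain tomorrow": 0.6, "no rain tomorrow": 0.4}
r = product_distribution(p, q)

print(r[("rain today", "rain tomorrow")])   # ≈ 0.03, as in the text
print(abs(sum(r.values()) - 1.0) < 1e-12)   # the r_{(i,j)} sum to 1
```

Note that the products of the probabilities automatically sum to 1, so $r$ really is a probability distribution.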

### Examples

Example 1. Suppose we have a fair coin. This means we have a set of events

$X = \{H, T \}$

and a probability distribution $p$ with

$\displaystyle{ p_H = p_T = \frac{1}{2} }$

Now suppose we flip it twice. We get a set of four events:

$X \times X = \{(H,H), (H,T), (T,H), (T,T)\}$

Suppose the two coin flips are independent. Then we describe the pair of coin flips using the probability measure $r = p \times p$ on $X \times X,$ with

$\displaystyle{ r_{(H,H)} = p_H p_H = \frac{1}{4} }$

$\displaystyle{ r_{(H,T)} = p_H p_T = \frac{1}{4} }$

$\displaystyle{ r_{(T,H)} = p_T p_H = \frac{1}{4} }$

$\displaystyle{ r_{(T,T)} = p_T p_T = \frac{1}{4} }$

So, each of the four events—“heads, heads” and so on—has probability 1/4. This is fairly boring: you should have known this already!

But now we can do a harder example:

Example 2. Suppose we have an unfair coin that has a 60% chance of landing heads up and a 40% chance of landing tails up. Now we have a new probability distribution on $X,$ say $q$:

$\displaystyle{ q_H = .6, \quad q_T = .4 }$

Now say we flip this coin twice. What are the probabilities of the four different events that can happen? Let’s assume the two coin flips are independent. This means we should describe the pair of coin flips with a probability measure $s = q \times q$ on $X \times X.$ This tells us the answer to our question. We can work it out:

$\displaystyle{ s_{(H,H)} = q_H q_H = 0.6 \times 0.6 = 0.36 }$

$\displaystyle{ s_{(H,T)} = q_H q_T = 0.6 \times 0.4 = 0.24 }$

$\displaystyle{ s_{(T,H)} = q_T q_H = 0.4 \times 0.6 = 0.24 }$

$\displaystyle{ s_{(T,T)} = q_T q_T = 0.4 \times 0.4 = 0.16 }$
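The arithmetic in Example 2 is easy to check by machine. Here is a quick sketch (again my own illustration, using the dict representation from above rather than anything in the post):

```python
# Example 2: the biased coin with q_H = 0.6, q_T = 0.4, flipped twice
# independently, giving s = q × q on X × X.

q = {"H": 0.6, "T": 0.4}
s = {(i, j): q[i] * q[j] for i in q for j in q}

print(s[("H", "H")])                 # ≈ 0.36
print(s[("H", "T")], s[("T", "H")])  # ≈ 0.24 each
print(s[("T", "T")])                 # ≈ 0.16
print(abs(sum(s.values()) - 1.0) < 1e-12)  # still sums to 1
```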

Puzzle 1. In this situation what is the probability that when we flip the coin twice it comes up heads exactly once?

Puzzle 2. In this situation what is the probability that when we flip the coin twice it comes up heads at least once?

For these puzzles you need to use what I told you in the section on ‘Probabilities of subsets’ near the end of Part 7.

Puzzle 3. Now suppose we have one fair coin and one coin that has a 60% chance of landing heads up. The first one is described by the probability distribution $p,$ while the second is described by $q.$ How likely is it that the first lands heads up and the second lands tails up? We can answer questions like this if the coin flips are independent. We do this by multiplying $p$ and $q$ to get a probability measure $t = p \times q$ on $X \times X.$ Remember the rule for how to do this:

$t_{(i,j)} = p_i q_j$

where each of $i$ and $j$ can be either $H$ or $T.$

What are these probabilities:

$\displaystyle{ t_{(H,H)} = ? }$

$\displaystyle{ t_{(H,T)} = ? }$

$\displaystyle{ t_{(T,H)} = ? }$

$\displaystyle{ t_{(T,T)} = ? }$

Puzzle 4. In this situation what is the probability that exactly one coin lands heads up?

Puzzle 5. In this situation what is the probability that at least one coin lands heads up?

Next time we’ll go a lot further…

### 8 Responses to Game Theory (Part 8)

1. evanberkowitz says:

It should be pointed out that if you believe that a coin is weighted but that each flip is independent, you can still use that coin to make a 50/50 decision by flipping it twice. The point is that the probability of heads-tails and tails-heads are (in the language of the post) $q_Hq_T$ and $q_Tq_H$ respectively, but multiplication (of numbers, anyway!) is commutative, and so each of these possibilities is equally likely. If it comes up heads twice or tails twice, try again.
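This trick (often credited to von Neumann) is easy to simulate. Here is a sketch, assuming a 60%-heads coin as in the post’s Example 2 and using Python’s `random` module to stand in for the physical coin:

```python
# Simulate a fair 50/50 decision from a biased but independent coin:
# flip twice; HT means "H", TH means "T"; HH or TT means try again.

import random

def fair_flip(q_heads=0.6):
    """Return "H" or "T" with probability 1/2 each, using a biased coin."""
    while True:
        first = random.random() < q_heads    # one biased flip
        second = random.random() < q_heads   # an independent biased flip
        if first != second:                  # HT or TH, each with prob q_H q_T
            return "H" if first else "T"

flips = [fair_flip() for _ in range(100_000)]
print(flips.count("H") / len(flips))  # close to 0.5
```

The loop terminates quickly in practice: each pair of flips succeeds with probability $2 q_H q_T$, which is 0.48 for the 60/40 coin.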

• John Baez says:

Nice! That would make a good puzzle!

• Robert says:

You can go further, and use the weighted independent coin to make a decision with any rational probability, p/q.

Let n be the smallest integer such that 2^n>q. Use the above procedure to generate n binary digits, forming a number, r, which will be equally distributed over the range 1 to 2^n. If r>q discard the digits and start again. Otherwise, you’ve now got a number between 1 and q inclusive, equidistributed on that range, which will be between 1 and p inclusive with probability p/q.
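Robert’s procedure can be sketched in code as follows (my own illustration; for simplicity the fair bits come straight from `random.randint`, whereas in the comment’s setting each bit would be produced by the heads-tails/tails-heads trick above):

```python
# Decide an event with probability p/q (integers 0 <= p <= q) using fair bits.

import random

def bernoulli(p, q):
    """Return True with probability exactly p/q."""
    n = 1
    while 2 ** n <= q:                    # smallest n with 2^n > q
        n += 1
    while True:
        bits = [random.randint(0, 1) for _ in range(n)]
        # r is equidistributed on 1 .. 2^n
        r = 1 + sum(b << k for k, b in enumerate(bits))
        if r <= q:                        # keep r in 1..q, else discard and retry
            return r <= p                 # True with probability p/q

trials = [bernoulli(1, 3) for _ in range(100_000)]
print(sum(trials) / len(trials))  # close to 1/3
```

Each round succeeds with probability $q/2^n > 1/2$, so the expected number of rounds is less than 2, matching the comment’s point that the procedure takes only finitely many steps on average.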

Off-hand, I don’t see any way arbitrary irrational probabilities can be generated using a procedure which, like the ones above, on average takes only a finite number of steps, using an independent possibly weighted coin. If the chance of the coin coming up heads happens to be $\frac{1}{\sqrt{2}}$ you can generate some set of irrational probabilities (which set?) but not, I think, $\frac{1}{\sqrt{3}}.$

• John Baez says:

You might be interested in the work of Penrose where he uses quantum spins to get probabilities without ‘fine-tuning the angle of measurement’… but only rational probabilities:

• Roger Penrose, Angular momentum: an approach to combinatorial space-time, in Quantum Theory and Beyond, ed. T. Bastin, Cambridge University Press, Cambridge, 1971. Available in PDF and Postscript.

2. Derek Wise says:

typo: the definition of the set Y should be about “tomorrow”, not “today”.

• John Baez says:

Thanks! There were no clever ideas or jokes in this part, I’m afraid…

3. Last time we talked about independence of a pair of events, but we can easily go on and talk about independence of a longer sequence of events. For example, suppose we have three coins. […]

4. Now let’s work out the expected value of the payoff to each player. To do this, we’ll assume:

1) Player A uses mixed strategy $p.$
2) Player B uses mixed strategy $q.$
3) Player A and player B’s choices are independent.

If you forget what ‘independent’ means, look at Part 8. […]