## Game Theory (Part 4)

Last time we talked about Nash equilibria for a 2-player normal form game. We saw that sometimes a Nash equilibrium doesn’t exist! Sometimes there’s more than one! But suppose there is at least one Nash equilibrium. How do we find one? Sometimes it’s hard.
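Before we get to tricks, note that for a small game we can simply check every pair of choices directly: a pure-strategy pair $(i,j)$ is a Nash equilibrium precisely when $A_{ij}$ is maximal in its column of $A$ and $B_{ij}$ is maximal in its row of $B$. Here is a minimal brute-force sketch in Python (the payoffs are just an illustrative Prisoner's-Dilemma-style example, not a game from the text; rows and columns are 0-indexed):

```python
def pure_nash_equilibria(A, B):
    """Return all pure-strategy Nash equilibria (i, j) of a 2-player
    normal-form game with m x n payoff matrices A (player A) and B
    (player B), given as lists of lists."""
    m, n = len(A), len(A[0])
    return [(i, j)
            for i in range(m) for j in range(n)
            # A can't do better by switching rows, and
            # B can't do better by switching columns:
            if all(A[i][j] >= A[k][j] for k in range(m))
            and all(B[i][j] >= B[i][l] for l in range(n))]

# Illustrative payoffs: mutual defection (indices (1, 1)) is the
# unique pure Nash equilibrium here.
A = [[-1, -3], [0, -2]]
B = [[-1, 0], [-3, -2]]
print(pure_nash_equilibria(A, B))  # -> [(1, 1)]
```

This exhaustive search always works, but it takes $m \times n$ checks; the trick below often lets us rule out rows and columns much faster.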

### Strict domination

But there’s a simple trick to rule out some possibilities. Sometimes player A will have a choice $i$ that gives them a bigger payoff than some choice $i'$ no matter what choice player B makes. In this case we say choice $i$ strictly dominates choice $i'.$ And in this case, there’s no way that $i'$ could be A’s choice in a Nash equilibrium, because player A can always improve their payoff by switching to choice $i.$

For example, look at this game:

|       | **1**  | **2**  |
|-------|--------|--------|
| **1** | (0,0)  | (-1,1) |
| **2** | (2,-1) | (0,0)  |
| **3** | (-2,1) | (1,-1) |
| **4** | (0,1)  | (2,0)  |

For player A, choice 4 strictly dominates choice 3. No matter what player B does, player A gets a bigger payoff if they make choice 4 than if they make choice 3. Stare at the first number in each pair in rows 3 and 4 until you see this.

You can also see it using the payoff matrix for player A:

$A = \left( \begin{array}{rr} 0 & -1 \\ 2 & 0 \\ -2 & 1 \\ 0 & 2 \end{array} \right)$

Each number in row 4 here is bigger than the number directly above it in row 3. In other words:

$0 = A_{41} > A_{31} = -2$

and

$2 = A_{42} > A_{32} = 1$
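You can also verify this mechanically. A quick Python check, with the payoff matrix $A$ transcribed from above (rows are 0-indexed in the code, so choice 4 is row 3 and choice 3 is row 2):

```python
# Player A's payoff matrix from the example game.
A = [[0, -1],
     [2,  0],
     [-2, 1],
     [0,  2]]

# Choice 4 strictly dominates choice 3: every entry of row 4
# exceeds the entry directly above it in row 3.
print(all(A[3][j] > A[2][j] for j in range(2)))  # -> True
```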

Puzzle. Are there any other examples of a choice for player A that strictly dominates another choice?

And of course the same sort of thing works for player B. If player B has a choice $j$ that gives them a better payoff than some choice $j'$ no matter what player A does, then $j'$ can’t show up as B’s choice in a Nash equilibrium.

We can make our remarks more precise:

Definition. A choice $i$ for player A strictly dominates a choice $i'$ if

$A_{ij} > A_{i'j}$

for all $j.$ Similarly, a choice $j$ for player B strictly dominates a choice $j'$ if

$B_{ij} > B_{ij'}$

for all $i.$
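The definition translates directly into code. Here is a sketch for player A (player B's version is the same test applied to the columns of $B$ instead of the rows of $A$; indices are 0-based):

```python
def strictly_dominates(A, i, i2):
    """True if row i of payoff matrix A strictly dominates row i2,
    i.e. A[i][j] > A[i2][j] for every column j."""
    return all(a > b for a, b in zip(A[i], A[i2]))

# The example game from above: choice 4 (row 3) strictly
# dominates choice 3 (row 2), but not conversely.
A = [[0, -1], [2, 0], [-2, 1], [0, 2]]
print(strictly_dominates(A, 3, 2))  # -> True
print(strictly_dominates(A, 2, 3))  # -> False
```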

Theorem 1. If a choice $i$ for player A strictly dominates a choice $i',$ then no choice $j$ for player B gives a Nash equilibrium $(i',j).$ Similarly, if a choice $j$ for player B strictly dominates a choice $j',$ then no choice $i$ for player A gives a Nash equilibrium $(i,j').$

Proof. We’ll only prove the first statement since the second one works the exact same way. Suppose $i$ strictly dominates $i'.$ This means that

$A_{i j} > A_{i' j}$

for all $j.$ In this case, there is no way that the choice $i'$ can be part of a Nash equilibrium. After all, if $(i',j)$ is a Nash equilibrium we have

$A_{i' j} \ge A_{i'' j}$

for all $i''$. But this contradicts the previous inequality—just take $i'' = i.$ █

### Strict dominance

Definition. We say a choice is strictly dominant if it strictly dominates all other choices for that player.

Let me remind you again that a pure strategy is a way for one player to play a game where they always make the same choice. So, there is one pure strategy for each choice, and one choice for each pure strategy. This means we can be a bit relaxed about the difference between these words. So, what we’re calling strictly dominant choices, most people call strictly dominant pure strategies. You can read more about these ideas here:

Strategic dominance, Wikipedia.

We’ve seen that if one choice strictly dominates another, that other choice can’t be part of a Nash equilibrium. So if one choice strictly dominates all others, that choice is the only one that can be part of a Nash equilibrium! In other words:

Theorem 2. If a choice $i$ for player A is strictly dominant, then any Nash equilibrium $(i',j')$ must have $i' = i$. Similarly, if a choice $j$ for player B is strictly dominant, then any Nash equilibrium $(i',j')$ must have $j' = j$.

Proof. We’ll only prove the first statement since the second one works the exact same way. Theorem 1 says that if a choice $i$ for player A strictly dominates a choice $i',$ then no choice $j$ for player B gives a Nash equilibrium $(i',j).$ So, if $i$ strictly dominates all other choices for player A, it is impossible to have a Nash equilibrium $(i',j)$ unless $i' = i.$ █

We can go even further:

Theorem 3. If choice $i$ for player A is strictly dominant and choice $j$ for player B is strictly dominant, then $(i, j)$ is a Nash equilibrium, and there is no other Nash equilibrium.

Proof. Suppose choice $i$ for player A is strictly dominant and choice $j$ for player B is strictly dominant. Then by Theorem 2 any Nash equilibrium $(i',j')$ must have $i' = i$ and $j' = j$. So, there is certainly no Nash equilibrium other than $(i, j).$

But we still need to check that $(i, j)$ is a Nash equilibrium! This means we need to check that:

1) For all $1 \le i' \le m,$ $A_{i'j} \le A_{ij}.$

2) For all $1 \le j' \le n,$ $B_{ij'} \le B_{ij}.$

Part 1) is true because choice $i$ is strictly dominant. Part 2) is true because choice $j$ is strictly dominant. █

We can restate Theorem 3 in a less precise but rather pretty way:

Corollary. If both players have a strictly dominant pure strategy, there exists a unique Nash equilibrium.

Here I’m saying ‘pure strategy’ instead of ‘choice’ just to prepare you for people who talk that way!

### Domination

I realize that the terminology here has a kind of S&M flavor to it, with all this talk of ‘strict domination’ and the like. But there’s nothing I can do about that—it’s standard!

Anyway, there’s also a less strict kind of dominance:

Definition. A choice $i$ for player A dominates a choice $i'$ if

$A_{ij} \ge A_{i'j}$

for all $j.$ Similarly, a choice $j$ for player B dominates a choice $j'$ if

$B_{ij} \ge B_{ij'}$

for all $i.$
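In code, the only change from the strict version is replacing $>$ with $\ge$. A sketch, which also hints at why this weaker notion is less useful: with weak dominance, two distinct choices can dominate each other.

```python
def dominates(A, i, i2):
    """True if row i of A (weakly) dominates row i2:
    A[i][j] >= A[i2][j] for every column j."""
    return all(a >= b for a, b in zip(A[i], A[i2]))

# Two choices can weakly dominate each other, e.g. when the
# payoffs are identical:
A = [[0, 0], [0, 0]]
print(dominates(A, 0, 1) and dominates(A, 1, 0))  # -> True
```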

But there is less we can do with this definition. Why? Here’s why:

Puzzle. Find a normal-form two player game with 2 choices for each player, where $(i,j)$ is a Nash equilibrium even though $i$ is dominated by the other choice for player A and $j$ is dominated by the other choice for player B.

### 14 Responses to Game Theory (Part 4)

1. Small typo note: At the end of the “Strict Dominance” section, last sentence, last word should be “way”.

2. NeoMaths says:

(10,10) (-10,10)
(10,10) (-10,10)

Thus there is no unique Nash equilibrium when dominance is not strict.

• John Baez says:

Great! You clobbered it!

Here’s a simpler example:

|       | **1** | **2** |
|-------|-------|-------|
| **1** | (0,0) | (0,0) |
| **2** | (0,0) | (0,0) |

But of course your example is vastly more interesting, because the players get different payoffs in the different Nash equilibria!

If you want to eliminate stupid examples like mine, you can eliminate duplicate rows or columns in the table for a game. Every Nash equilibrium in the original game will map to a Nash equilibrium in the new simplified game, and this map is onto.

3. Blake Stacey says:

Typo: “just to prepare you for people who talk that what!”

4. Ramandeep Kaur says:

In your “dominates” definition, for player B, you wrote for all $i'$‘s; shouldn’t it be for all $i$?

• John Baez says:

Yes, it should be for all $i.$ Thanks! I’ll fix this now.

5. konformistliberal says:

Is

$0 = A_{41} > A_{31} = -1$

and

$2 = A_{42} > A_{32} = 0$

part correct?

• Toussaint says:

I think there is a typo:

$0 = A_{41} > A_{31} = -2$

The second one is correct, I think.

• John Baez says:

Thanks, there was typo in the first inequality. It should be:

$0 = A_{41} > A_{31} = -2$

I’ve fixed it now. I’m sorry for not noticing konformistliberal’s correction earlier!

6. Randy Rodriguez says:

Puzzle: There is another example of a choice for player A that strictly dominates another choice:

$2 = A_{21} > A_{11} = 0$

and

$0 = A_{22} > A_{12} = -1$

7. Javier Salazar says:

I am going to say this because I missed it in the midterm exam: Shouldn’t the definition for dominance for player A include for all i’s also (apart from all j’s)? And if so why?

• John Baez says:

I suspect you may be mixing up two ideas. The words ‘dominate’ and ‘dominant’ sound very similar, but they mean very different things:

Definition. A choice $i$ for player A dominates a choice $i'$ if

$A_{ij} \ge A_{i'j}$

for all $j.$

Definition. We say a choice is dominant if it dominates all other choices for that player. In particular, choice $i$ for player A is dominant if

$A_{ij} \ge A_{i'j}$

for all $j$ and all $i'.$