Last time we started looking at 2-player normal form games. The idea is that player A has $m$ different choices of which move to make, while player B has $n$ choices, and we have two $m \times n$ matrices of payoffs, $A$ and $B$: if player A makes choice $i$ and player B makes choice $j$, then the payoff to player A is $A_{ij}$, while the payoff to player B is $B_{ij}$.
What should you do if you’re playing a game like this? As we saw when playing rock-paper-scissors, sometimes the best thing is to adopt a strategy with some built-in randomness. In this game, if your opponent knows what you’re going to do, he can exploit that and beat you!
A strategy involving randomness is called a mixed strategy. They’re very important, and we’ll talk about them a lot. But today we’ll only consider pure strategies, where we definitely choose one of the choices available, with no randomness.
Notice: there are as many pure strategies as there are choices! For each choice, there’s a pure strategy where you always make that choice when playing the game. So, a lot of people say ‘pure strategy’ for what I call a ‘choice’. Other people use the word ‘action’.
Anyway: let’s try to figure out the best pure strategy for each player, if we can.
Sometimes this is impossible. For example, rock-paper-scissors has no best pure strategy. Could playing rock be best? No, because paper beats rock. Could playing paper be best? No, because scissors beats paper. Could playing scissors be best? No, because rock beats scissors.
But sometimes we can find a best pure strategy, and the ideas we meet will help us later when we study mixed strategies. So, let’s dive in!
Nash equilibria
Suppose player A makes choice $i$ and player B makes choice $j$. Then we say $(i,j)$ is a Nash equilibrium if neither player can improve their payoff by changing their choice.
For example, look at this game:
|   | 1 | 2 |
|---|---|---|
| 1 | (10,10) | (-1,9) |
| 2 | (-1,1) | (-1,-1) |
Here we are writing the matrices $A$ and $B$ together in one table, like last time, with $A$ in black and $B$ in red. But we could also separate them out:

$$A = \begin{pmatrix} 10 & -1 \\ -1 & -1 \end{pmatrix} \qquad B = \begin{pmatrix} 10 & 9 \\ 1 & -1 \end{pmatrix}$$

It’s just a different way of conveying the same information.
Either way, it’s pretty clear what the players should do in this game! Both players should make choice 1. That way, they both win 10 points.
Furthermore, this pair of choices is certainly a Nash equilibrium. In other words: neither player can improve their payoff by unilaterally changing their choice! If player A switches to choice 2, their payoff drops to -1 points. If player B switches to choice 2, their payoff drops to 9 points.
Let’s give an official definition of Nash equilibrium and then look at some trickier examples. Remember, player A’s choices are $1, \dots, m$, while player B’s choices are $1, \dots, n$.
Definition. Given a 2-player normal form game, a pair of choices $(i,j)$ is a Nash equilibrium if:

1) For all $1 \le i' \le m$, $A_{i'j} \le A_{ij}$.

2) For all $1 \le j' \le n$, $B_{ij'} \le B_{ij}$.
Condition 1) says that player A can’t improve their payoff by switching their choice from $i$ to any other choice $i'$. Condition 2) says that player B can’t improve their payoff by switching their choice from $j$ to any other choice $j'$.
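For concreteness, the two conditions above can be checked by brute force over all pairs of choices. Here’s a small Python sketch (my own illustration, not part of the original post), using the first example’s payoff matrices:

```python
def pure_nash_equilibria(A, B):
    """Return all pairs (i, j) such that neither player can improve
    their payoff by unilaterally switching, per conditions 1) and 2)."""
    m, n = len(A), len(A[0])
    equilibria = []
    for i in range(m):
        for j in range(n):
            # Condition 1): A[i][j] is the best payoff in column j for player A.
            best_for_A = all(A[ip][j] <= A[i][j] for ip in range(m))
            # Condition 2): B[i][j] is the best payoff in row i for player B.
            best_for_B = all(B[i][jp] <= B[i][j] for jp in range(n))
            if best_for_A and best_for_B:
                equilibria.append((i + 1, j + 1))  # 1-based, matching the text
    return equilibria

# The first example game: A in black, B in red.
A = [[10, -1],
     [-1, -1]]
B = [[10, 9],
     [1, -1]]
print(pure_nash_equilibria(A, B))  # [(1, 1)] -- the equilibrium we found above
```

This is just the definition turned into loops: for each cell we ask whether either player has a strictly better unilateral deviation.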
Examples
Let’s look at more examples of Nash equilibria, to see some funny things that can happen. First let’s modify the last game a little:
|   | 1 | 2 |
|---|---|---|
| 1 | (10,10) | (-1,10) |
| 2 | (10,-1) | (-1,-1) |
Is there a Nash equilibrium? Yes—and just as before, it happens when both players pick pure strategy 1. Now player A’s payoff doesn’t get any worse when they switch to pure strategy 2. But it doesn’t improve, either! It stays equal to 10. Similarly, player B’s payoff doesn’t get any worse when they switch to pure strategy 2… but it doesn’t improve. So, neither player is motivated to change their strategy.
I said a Nash equilibrium is a situation where neither player can improve their payoff by changing their choice. But it might be clearer to say: neither player can improve their payoff by unilaterally changing their choice.
What do I mean by ‘unilaterally’ changing their choice? I mean that one player changes their choice while the other player does not change their choice.
But if both players change their choices simultaneously, sometimes they can improve both their payoffs! Let’s see an example of that:
|   | 1 | 2 |
|---|---|---|
| 1 | (10,10) | (9,-1) |
| 2 | (-1,8) | (20,20) |
Now it looks like both players will be happiest if they pick pure strategy 2. And it’s true! Moreover, this is a Nash equilibrium. Check and see.
But what if both players pick choice 1? This is also a Nash equilibrium! Shocking but true. If they both use pure strategy 1, neither player can improve their payoff by unilaterally changing their choice. If player A changes her choice, her payoff drops to -1. And if player B changes his choice, his payoff drops to -1.
Of course, they can improve their payoff if they both simultaneously change their choice to 2. But that’s not what the concept of Nash equilibrium is about.
This raises the question of whether Nash equilibrium is a good definition of ‘the best’ choice for each player. The big problem is figuring out what ‘best’ means—we’ll have to talk about that more. But we’re also seeing some problems with the word ‘the’. There may be more than one Nash equilibrium, so we can’t always talk about ‘the’ Nash equilibrium.
In other words, our example shows there isn’t always a unique Nash equilibrium.
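To double-check that this game really has two Nash equilibria, here’s a short Python sketch (my own illustration, not from the post) that tests every cell for profitable unilateral deviations:

```python
# Third example: payoffs (A_ij, B_ij); rows = A's choice, columns = B's choice.
payoffs = {(1, 1): (10, 10), (1, 2): (9, -1),
           (2, 1): (-1, 8),  (2, 2): (20, 20)}

def is_nash(i, j):
    a, b = payoffs[(i, j)]
    # A may only switch rows; B may only switch columns.
    no_better_row = all(payoffs[(ip, j)][0] <= a for ip in (1, 2))
    no_better_col = all(payoffs[(i, jp)][1] <= b for jp in (1, 2))
    return no_better_row and no_better_col

print([cell for cell in payoffs if is_nash(*cell)])  # [(1, 1), (2, 2)]
```

Both (1,1) and (2,2) pass the test, so this game has two Nash equilibria, just as the text says.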
Furthermore, a Nash equilibrium doesn’t always exist! Check out the game of rock-paper-scissors:
|   | rock | paper | scissors |
|---|---|---|---|
| rock | (0,0) | (-1,1) | (1,-1) |
| paper | (1,-1) | (0,0) | (-1,1) |
| scissors | (-1,1) | (1,-1) | (0,0) |
It’s easy to see that there’s no Nash equilibrium. No matter what both players choose, at least one of them can always improve their payoff by switching to a different choice. If one of them wins the game, the loser can improve their payoff by switching to a different choice. If it’s a tie, either player can improve their payoff by switching to a different choice.
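The same brute-force idea confirms this: checking all nine cells of rock-paper-scissors turns up no cell where both players are simultaneously best-responding. A quick Python sketch (my own illustration):

```python
# Rock-paper-scissors payoffs for player A; the game is zero-sum, so B's are the negatives.
moves = ["rock", "paper", "scissors"]
A = [[0, -1, 1],
     [1, 0, -1],
     [-1, 1, 0]]

nash_cells = []
for i in range(3):
    for j in range(3):
        # A can't do better in column j, and B can't do better in row i.
        a_best = all(A[ip][j] <= A[i][j] for ip in range(3))
        b_best = all(-A[i][jp] <= -A[i][j] for jp in range(3))
        if a_best and b_best:
            nash_cells.append((moves[i], moves[j]))

print(nash_cells)  # [] -- no pure-strategy Nash equilibrium
```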
So: Nash equilibria are interesting, but they don’t always exist— at least not when we consider only pure strategies, as we’ve been doing. And even when they do, they’re not always unique! This is a bit frustrating. We’d like game theory to tell us how to play games well.
We can improve the situation by allowing mixed strategies, where a player picks different choices with different probabilities. If we do this, there is always at least one Nash equilibrium! This result was proved by—you guessed it!—John Nash. He did it in 1950, in this astoundingly short paper:
• John F. Nash, Jr., Equilibrium points in n-person games, Proceedings of the National Academy of Sciences 36 (1950), 48–49.
He eventually won the Nobel prize in economics for this work.
In fact, Nash was not the first to think about Nash equilibria: this idea goes back to the French economist Antoine Cournot, who wrote about it way back in 1838! However, Nash was the first to consider Nash equilibria for mixed strategies in general simultaneous multi-player games, and to prove they always exist. (As we’ll see later, von Neumann and Morgenstern had already done this for zero-sum 2-player games.)
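As a preview of mixed strategies: in rock-paper-scissors, the equilibrium that Nash’s theorem guarantees is for each player to pick each move with probability 1/3. Here’s a quick numerical sketch (my own illustration, not from the post) of why no pure reply beats the uniform strategy:

```python
# Player A's payoffs in rock-paper-scissors; B's are the negatives (zero-sum).
A = [[0, -1, 1],
     [1, 0, -1],
     [-1, 1, 0]]
p = [1/3, 1/3, 1/3]  # opponent plays each move with equal probability

# Expected payoff of each pure reply (rock, paper, scissors) against the uniform mix.
expected = [sum(pj * A[i][j] for j, pj in enumerate(p)) for i in range(3)]
print(expected)  # every entry is 0: all replies do equally well,
                 # so no deviation improves on playing uniformly yourself
```

Since every pure reply has the same expected payoff, no mixture of them can do better either, which is why uniform-versus-uniform is a Nash equilibrium in mixed strategies.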
John Nash
If you haven’t seen the movie about Nash called A Beautiful Mind, you should! Here’s the scene where Nash figures out something about multiplayer games:
It’s not very clear, mathematically speaking… but oh well. It’s a good movie and it gives you some sense of how Nash battled with mental illness for much of his life. He has now largely recovered and spends his time at Princeton University.
Here’s the young Nash—click for more details:
A ‘practical’ question
A student in my class asked what’s the best book or website on how to play blackjack well. I asked this on Google+ and got over 50 replies, which you can read here. If you know more, please tell me!
Last time we talked about Nash equilibria for a 2-player normal form game. We saw that sometimes a Nash equilibrium doesn’t exist! Sometimes there’s more than one! But suppose there is at least one Nash equilibrium. How do we find one? [...]
Possible typo: in the definition of Nash equilibrium, for the second condition, might it be 1<=j'<=n (instead of m) ? I recall that the second player can choose between n strategies, not m.
Thanks for these nice and useful entries.
Thanks! I fixed that mistake.
Nice post—thanks.
In the second example, is there more than one Nash equilibrium? E.g., cell (1, 2) with payoffs $(-1, 10)$: if A switches from 1 to 2, then A’s payoff is still $-1$, and if B switches from 2 to 1, then B’s payoff is still $10$. So neither can improve with a unilateral move. Is that right? (I suspect I’m confused about something….)
Nice question! I’ll let someone else answer, so they can get a bit of practice. Either a student in the class, or someone reading this blog…
It looks like you are right. Good observation!
We’ll have to look at examples to understand this stuff better, but let me charge ahead and define ‘Nash equilibria’ for mixed strategies. The idea is similar to the idea we’ve already seen. A pair of mixed strategies, one for A and one for B, is a Nash equilibrium if neither player can improve the expected value of their payoff by unilaterally changing their mixed strategy. [...]
There might be a couple of mix-ups:
Shortly before the definition of Nash equilibrium, regarding the first example…
“If player A switches to choice 2, their payoff drops to 9 points. If player B switches to choice 2, their payoff drops to 1 point.”
It looks like player A’s payoff would drop to -1 by switching to choice 2; on the other hand, player B’s payoff would drop to 9 by switching to choice 2.
Similarly in the third example…
“If player A changes her choice, her payoff drops to 9. And if player B changes his choice, his payoff drops to 8.”
It looks like if player A changes to choice 2, her payoff drops to -1; if player B changes to choice 2, his payoff drops to -1.
Thanks! Fixed, I hope! I really appreciate all these corrections.