Game Theory (Part 7)

We need to learn a little probability theory to go further in our work on game theory.

We’ll start with some finite set X of ‘events’. The idea is that these are things that can happen—for example, choices you could make while playing a game. A ‘probability distribution’ on this set assigns to each event a number called a ‘probability’—which says, roughly speaking, how likely that event is. If we’ve got some event i, we’ll call its probability p_i.

For example, suppose we’re interested in whether it will rain today or not. Then we might look at a set of two events:

X = \{\textrm{rain}, \textrm{no rain} \}

If the weatherman says the chance of rain is 20%, then

p_{\textrm{rain} } = 0.2

since 20% is just a fancy way of saying 0.2. The chance of no rain will then be 80%, or 0.8, since the probabilities should add up to 1:

p_{\textrm{no rain}} = 0.8

Let’s make this precise with an official definition:

Definition. Given a finite set X of events, a probability distribution p assigns a real number p_i called a probability to each event i \in X, such that:

1) 0 \le p_i \le 1

and

2) \displaystyle{ \sum_{i \in X} p_i = 1}
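If you like to think in code, here's a minimal sketch (just one way of encoding the definition, nothing official) that represents a probability distribution on a finite set as a Python dictionary and checks conditions 1) and 2):

def is_probability_distribution(p, tol=1e-9):
    """Check conditions 1) and 2): each p_i lies in [0, 1], and the p_i sum to 1."""
    in_range = all(0 <= p_i <= 1 for p_i in p.values())
    sums_to_one = abs(sum(p.values()) - 1) < tol
    return in_range and sums_to_one

weather = {'rain': 0.2, 'no rain': 0.8}
print(is_probability_distribution(weather))   # True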

Note that this official definition doesn’t say what an event really is, and it doesn’t say what probabilities really mean. But that’s how it should be! As usual with math definitions, the words in boldface could be replaced by any other words and the definition would still do its main job, which is to let us prove theorems involving these words. If we wanted, we could call an event a doohickey, and call a probability a schnoofus. All our theorems would still be true.

Of course we hope our theorems will be useful in real world applications. And in these applications, the probabilities p_i will be some way of measuring ‘how likely’ events are. But it’s actually quite hard to say precisely what probabilities really mean! People have been arguing about this for centuries. So it’s good that we separate this hard task from our definition above, which is quite simple and 100% precise.

Why is it hard to say what probabilities really are? Well, what does it mean to say “the probability of rain is 20%”? Suppose you see a weather report and read this. What does it mean?

A student suggests: “it means that if you looked at a lot of similar days, it would rain on 20% of them.”

Yes, that’s pretty good. But what counts as a “similar day”? How similar does it have to be? Does everyone have to wear the same clothes? No, that probably doesn’t matter, because presumably it doesn’t affect the weather. But what does affect the weather? A lot of things! Do all those things have to be exactly the same for it to count as a similar day?

And what counts as a “lot” of days? How many do we need?

And it won’t rain on exactly 20% of those days. How close do we need to get?

Imagine I have a coin and I claim it lands heads up 50% of the time. Say I flip it 10 times and it lands heads up every time. Does that mean I was wrong? Not necessarily. It’s possible that the coin will do this. It’s just not very probable.

But look: now we’re using the word ‘probable’, which is the word we’re trying to understand! It’s getting sort of circular: we’re saying a coin has a 50% probability of landing heads up if, when you flip it a lot of times, it probably lands heads up close to 50% of the time. That’s not very helpful if you don’t already have some idea what ‘probability’ means.

For all these reasons, and many more, it’s tricky to say exactly what probabilities really mean. People have made a lot of progress on this question, but we will sidestep it and focus on learning to calculate with probabilities.

If you want to dig in a bit deeper, try this:

Probability interpretations, Wikipedia.

Equally likely events

As I’ve tried to convince you, it can be hard to figure out the probabilities of events. But it’s easy if we assume all the events are equally likely.

Suppose we have a set X consisting of n events. And suppose that all the probabilities p_i are equal: say for some constant c we have

p_i = c

for all i \in X. Then by rule 2) above,

\displaystyle{ 1 = \sum_{i \in X} p_i = \sum_{i \in X} c = n c }

since we’re just adding the number c to itself n times. So,

\displaystyle{  c = \frac{1}{n} }

and thus

\displaystyle{ p_i = \frac{1}{n} }

for all i \in X.

I made this look harder than it really is. I was just trying to show you that it follows from the definitions, not any intuition. But it’s obvious: if you have n events that are equally likely, each one has probability 1/n.
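Here's the same fact as a one-line recipe in Python, a small sketch that builds the distribution where each of the n events in a finite set gets probability 1/n:

def uniform_distribution(X):
    """Assign probability 1/n to each of the n events in the finite set X."""
    n = len(X)
    return {i: 1 / n for i in X}

print(uniform_distribution({'H', 'T'}))   # {'H': 0.5, 'T': 0.5} (in some order)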

Example 1. Suppose we have a coin that can land either heads up or tails up—let’s ignore the possibility that it lands on its edge! Then

X = \{ H, T\}

If we assume these two events are equally probable, we must have

\displaystyle{ p_H = p_T =  \frac{1}{2} }

Note I said “if we assume” these two events are equally probable. I didn’t say they actually are! Are they? Suppose we take a penny and flip it a zillion times. Will it land heads up almost exactly half a zillion times?

Probably not! The treasury isn’t interested in making pennies that do this. They’re interested in making the head look like Lincoln, and the tail look like the Lincoln Memorial.

Or at least they used to. Since the two sides are different, there’s no reason they should have the exact same probability of landing on top.

In fact nobody seems to have measured the difference in probability between heads and tails for flipped pennies. For hand-flipped pennies, it seems whatever side starts on top has roughly a 51% chance of landing on top! But if you spin a penny, it’s much more likely to land tails up:

The coin flip: a fundamentally unfair proposition?, Coding the Wheel.
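If you want to experiment, here's a small simulation sketch; the 0.51 is just the rough figure quoted above for hand-flipped coins, not a precise measurement:

import random

def fraction_of_heads(p_heads, n):
    """Flip a coin with probability p_heads of heads, n times, and return the observed fraction of heads."""
    heads = sum(random.random() < p_heads for _ in range(n))
    return heads / n

for n in (10, 1000, 100000):
    print(n, fraction_of_heads(0.51, n))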

Example 2. Suppose we have a standard deck of cards, well-shuffled, and assume that when I draw a card from this deck, each card is equally likely to be chosen. What is the probability that I draw the ace of spades?

If there’s no joker in the deck, there are 52 cards, so the answer is 1/52.

Let me remind you how a deck of cards works: I wouldn’t want someone to fail the course because they never played cards! A standard deck has 52 cards.

They come in 4 kinds, called suits. The suits are:

• clubs: ♣

• spades: ♠

• diamonds: ♦

• hearts: ♥

Two suits are black and two are red. Each suit has 13 cards in it, for a total of 4 × 13 = 52. The cards in each suit run from 1 to 13, except that four of them go by names instead of numbers. They go like this:

A, 2, 3, 4, 5, 6, 7, 8, 9, 10, J, Q, K

A stands for ‘ace’, J for ‘jack’, Q for ‘queen’ and K for ‘king’.

Probabilities of subsets

If we know a probability distribution on a finite set X, we can define the probability that an event in some subset S \subseteq X will occur. We define this to be

\displaystyle{p(S) = \sum_{i \in S} p_i }

For example, I usually have one of three things for breakfast:

X = \{ \textrm{oatmeal}, \textrm{waffles}, \textrm{eggs} \}

I have an 86% chance of eating oatmeal for breakfast, a 10% chance of eating waffles, and a 4% chance of eating eggs. What’s the probability that I will eat oatmeal or waffles? These choices form the subset

S = \{ \textrm{oatmeal}, \textrm{waffles} \}

and the probability for this subset is

p(S) = p_{\textrm{oatmeal}} + p_{\textrm{waffles}} = 0.86 + 0.1 = 0.96
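Here's that calculation as a quick Python sketch, with the probability of a subset defined exactly as above:

def prob(p, S):
    """Probability of the subset S: the sum of p_i over all events i in S."""
    return sum(p[i] for i in S)

breakfast = {'oatmeal': 0.86, 'waffles': 0.10, 'eggs': 0.04}
print(prob(breakfast, {'oatmeal', 'waffles'}))   # 0.96 (up to rounding)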

Here’s an example from cards:

Example 3. Suppose we have a standard deck of cards, well-shuffled, and assume that when I draw a card from this deck, each card is equally likely to be chosen. What is the probability that I draw a card in the suit of hearts?

Since there are 13 cards in the suit of hearts, each with probability 1/52, we add up their probabilities and get

\displaystyle{ 13 \times \frac{1}{52} = \frac{1}{4} }

This should make sense, since there are 4 suits, each with the same number of cards.
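Here's the same example as a sketch in Python, building the whole deck, putting the uniform distribution on it, and adding up the probabilities of the hearts:

suits = ['clubs', 'spades', 'diamonds', 'hearts']
ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
deck = [(rank, suit) for suit in suits for rank in ranks]

p = {card: 1 / len(deck) for card in deck}                 # each card gets probability 1/52
hearts = [card for card in deck if card[1] == 'hearts']    # the 13 cards in the suit of hearts
print(sum(p[card] for card in hearts))                     # 0.25 (up to rounding)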

Card tricks

This is just a fun digression. The deck of cards involves some weird numerology. For starters, it has 52 cards. That’s a strange number! Where else have you seen this number?

A student says: “It’s the number of weeks in a year.”

Right! And these 52 cards are grouped in 4 suits. What does the year have 4 of?

A student says: “Seasons!”

Right! And we have 52 = 4 × 13. So what are there 13 of?

A student says: “Weeks in a season!”

Right! I have no idea if this is a coincidence or not. And have you ever added up the values of all the cards in a suit, where we count the ace as 1, and so on? We get

1 + 2 + 3 + 4 + 5 + 6 + 7 + 8 + 9 + 10 + 11 + 12 + 13

And what’s that equal to?

After a long pause, a student says “91.”

Yes, that’s a really strange number. But let’s say we total up the values of all the cards in the deck, not just one suit. What do we get?

A student says “We get 4 × 91… or 364.”

Right. Three hundred and sixty-four. Almost the number of days in a year.

“So add one more: the joker! Then you get 365!”

Right, maybe that’s why they put an extra card called the joker in the deck:

One extra card for one extra day, joker-day… April Fool’s Day! That brings the total up to 365.
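By the way, you don’t need to add those card values up by hand: the well-known formula for the sum of the first n numbers gives

\displaystyle{ 1 + 2 + \cdots + 13 = \frac{13 \times 14}{2} = 91 }

and then 4 × 91 = 364, plus the joker makes 365.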

Again, I have no idea if this is a coincidence or not. But the people who invented the Tarot deck were pretty weird—they packed it with symbolism—so maybe the ordinary cards were designed this way on purpose too.

Puzzle. What are the prime factors of the number 91? You should know by now… and you should know what they have to do with the calendar!

11 Responses to Game Theory (Part 7)

  1. Hi John,

    The prime factors of 91 are 7 and 13. These numbers are representative of 7 days in a week, and 13 weeks in a season (and a quarter of the year, which could also be applied to a fiscal quarter for the business cycle, perhaps).

    Thank you for your very engaging and interesting topics that you cover on your blog, John!

    -Christian Luca

    • John Baez says:

      Right! 7 days in a week, 13 weeks in a season makes 91 days in a season. As Harald Hanche-Olsen pointed out:

      Another special property of 91: Of all two digit composite numbers, it is the one looking the most like it’s a prime.

      This is because it’s easy to test small numbers for divisibility by 2, 3, 5, and 11… and we all know the perfect squares, which rules out 49 and 81… so we can tell what we need to do to get a good fake prime less than 100: multiply 7 and 13, to get 91.

    • davidtweed says:

      This is probably implicit in the answer above, but another way of putting it is that you can write 91 as 13*(14/2). So it’s a puzzle why we’ve ended up with a week that is half of “number of weeks in a season + 1”… (I imagine 4 seasons is strongly “suggested by the weather”, but the choice of week length seems less externally suggested.)

      Incidentally, another area to look if you’re trying to win real-world games is how well cards are shuffled: what often happens is that the structure of a game will cause certain cards to be played together, then the collected cards are inadequately shuffled, and then the dealing process redistributes the cards in a predictable way. So in a game like whist, where all the cards are dealt out, knowing how the previous hand played out and which cards you’ve been dealt gives probabilities on what the other players have ended up with.

      • Zeus says:

        It’s plausible that the number of days in the week comes from sun, moon, Mercury, Venus, Mars, Jupiter, Saturn — the 7 non-fixed heavenly bodies visible to the ancients. The names of the days strongly suggest this, wouldn’t you agree?

        • John Baez says:

          Yes, and—especially with a name like yours—you might be interested to hear the best theory about why the days are named in the curious order they are: Sun, Moon, Mercury, Venus, Mars, Jupiter, Saturn.

          Astrologers like to list these seven heavenly bodies in order of decreasing orbital period, counting the Sun as having period 365 days, and the Moon as having period 29.5 days:

          Saturn (29 years)
          Jupiter (12 years)
          Mars (687 days)
          Sun (365 days)
          Venus (224 days)
          Mercury (88 days)
          Moon (29.5 days)

          For the purposes of astrology they wanted to assign a planet to each hour of each day of the week. They did this in a reasonable way: they assigned Saturn to the first hour of the first day, Jupiter to the second hour of the first day, and so on, cycling through the list of planets over and over, until each of the 7 × 24 = 168 hours was assigned a planet. Each day was then named after the first hour in that day. Since 24 mod 7 equals 3, this amounts to taking the above list and reading every third planet in it (mod 7), getting:

          Saturn (Saturday)
          Sun (Sunday)
          Moon (Monday)
          Mars (Tuesday)
          Mercury (Wednesday)
          Jupiter (Thursday)
          Venus (Friday)
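          Here’s a tiny Python sketch of that bookkeeping: list the planets in decreasing order of period, step forward 24 hours (that is, 3 planets mod 7) for each new day, and read off the ruler of the first hour of each day:

          planets = ['Saturn', 'Jupiter', 'Mars', 'Sun', 'Venus', 'Mercury', 'Moon']

          # The first hour of day k comes 24*k hours after the first hour of day 0,
          # so its planet sits 24*k places further along the repeating list.
          first_hours = [planets[(24 * day) % 7] for day in range(7)]
          print(first_hours)
          # ['Saturn', 'Sun', 'Moon', 'Mars', 'Mercury', 'Jupiter', 'Venus']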

          I don’t think anyone is sure that this is how the days got the names they did; the earliest reference for this scheme is the Roman historian Dion Cassius (AD 150-235), who came long after the days were named. However, Dion says the scheme goes back to Egypt. In the Moralia of Plutarch (AD 46-120) there was an essay entitled “Why are the days named after the planets reckoned in a different order from the actual order?” Unfortunately this essay has been lost and only the title is known!

  2. John Baez says:

    And here is a followup:

    Puzzle. For how many seasons did Scheherazade tell stories to the king of Persia?

    • Walter Blackstock says:

      11 (1001/91)?

    • John Baez says:

      Yes! Her Tale of a Thousand and One Nights features a lovely number, not just because 1001 is a palindrome in base 10, but because 1001 = 7 × 11 × 13. This gives a good way to test large numbers for divisibility by these primes by hand! Thanks to its nice look in base 10, you can easily subtract multiples of 1001 from a large number and then test the remainder for divisibility by 7, 11, or 13.
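      For example, to test 8638: subtracting 8 × 1001 = 8008 leaves 630 = 7 × 90, so 8638 is divisible by 7 (indeed 8638 = 7 × 1234); and since 630 isn’t divisible by 11 or 13, neither is 8638.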

      I’d known this for a long time, but only while teaching this class did I notice that if a ‘season’ has 91 = 7 × 13 days, or 13 weeks, then 1001 nights is exactly 11 seasons.

  3. Nathan Reed says:

    This reminds me of how, when I was a kid, I came up with an idea for a calendar with 13 months of 28 days each. That gives 364 days, and the remaining day (or two, in leap years) would just be a special day that wasn’t part of any month. This would solve the problem of hard-to-remember month lengths.

    Additionally, we could declare the special day(s) not to be part of the week, either. That way the day-of-week wouldn’t shift against the day-of-month from one year to the next – in this calendar, Thanksgiving and Easter would fall on the same dates every year.

    • John Baez says:

      I agree, this would be a great system. You might also like the Icelandic calendar, which had 52 weeks of 7 days each, for a total of 364 days, and an occasional leap year with an extra whole week! But they had 12 months of 30 days each plus 4 extra days.

