Monday 9 April 2012

Game theory 2: Repeated Games


Remember my last blog post? I know, it was a while ago, but the main point of it was that the guy decided to 'split' the pot and the smug woman walked away with all the money. I asked at the end why the guy would split when I had shown that it was never the best tactic. Here I'll attempt to make him feel a bit better, but he's still stupid.
The reason he's stupid is that Goldenballs is a one-off (sadly there's more than one episode with Jasper Carrott, but the players are always new). He can, however, feel a bit better - a lot of people make the same mistake as him.

An economist called Güth has done work on a game called the Ultimatum game: a two-player game in which player one has a pot of money and must propose how to split it with player two. Player one does not have to offer a 50:50 split, and player two decides whether to accept the offer or reject it. If the offer is accepted then the deal is done; if player two decides the deal is 'unfair' and rejects it, then neither player receives anything. Just like Goldenballs, the offer is one-off - there's no negotiating. What does game theory tell us to do?


Suppose we're player two, and there's a pot of £100. Obviously the best thing for us would be for player one to offer us all the money, but that's not going to happen. What would we settle for - £50? £40 maybe? Or even £30? Game theory says we should accept anything. Suppose we are offered 1p, and player one takes £99.99. Whilst it's not exactly fair, it's still better than nothing, so we should accept. If we reject the offer, we're worse off. It would be cutting off our nose to spite our face.
Suppose we're player one. We should put ourselves in player two's shoes, realise that their best strategy is to accept any offer, and offer them 1p. 
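For the code-minded, here's a minimal sketch of that backwards reasoning in Python - the £100 pot and the 1p minimum are just the figures from the example above:

POT = 100.00        # £100 pot, as in the example above
STEP = 0.01         # offers move in 1p steps

def responder_accepts(offer):
    # Rejecting pays £0, so a rational responder takes any positive offer.
    return offer > 0

def proposer_best_offer():
    # The proposer keeps POT - offer, so they want the smallest offer
    # the responder will still accept.
    offer = STEP
    while not responder_accepts(offer):
        offer += STEP
    return offer

offer = proposer_best_offer()
print(f"Offer £{offer:.2f}, keep £{POT - offer:.2f}")   # Offer £0.01, keep £99.99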


What's interesting - and what might make our Goldenballs chap feel better - is that most people don't act like this in the game. Quite why not is unclear - some suggest that we have a sense of fairness and cannot bring ourselves to offer someone just 1p; others (normally economists) label the players as stupid and irrational! There's also the fact that if we think player two is not going to be rational, we play more cautiously and offer a fairer deal. Game theory strategies only work if both players act rationally.


I subscribe to the 'irrational player' hypothesis, but am hesitant to label players as stupid. I think that people forget that the game is a one-off. I think the same is true of our Goldenballs case study, and here I'll show when and why splitting can be the right tactic.


Remember the prisoner's dilemma. Imagine the same scenario, but that it's not a one-off. If you snitch, your friend will get payback by sending his friends round to beat you up. Let's have a look at the original payoff matrix, and at what happens if your friend can send people round to break your kneecaps.

(John's sentence is given first, then Ron's, in years of prison)

                    Ron confesses     Ron stays quiet
John confesses      5, 5              0, 10
John stays quiet    10, 0             1, 1

In the original, John's best tactic was to confess (5 years in prison is better than 10, 0 years in prison is better than 1). Ron's best tactic was also to confess, since the problem is symmetrical.
But what if being beaten up were as bad as six years in prison (think of the most gruesome punishment!)? Now the payoff matrix looks different.

(the beating, worth six years, is added for whoever confesses)

                    Ron confesses     Ron stays quiet
John confesses      11, 11            6, 10
John stays quiet    10, 6             1, 1

The payoffs are now a bit different: if a player confesses, they get the original payoff plus the equivalent of an extra six years. So, for example, if both players confess they each get five years in prison and a beating (worth six years in prison) - a total of 11. What's the best strategy now?


Let's be John. If Ron confesses we have the choice of either confessing (and getting the equivalent of 11 years) or staying quiet (and getting the equivalent of 10 years). So we'd choose the latter and keep quiet.
If Ron stays quiet we choose between confessing (and getting the equivalent of six years, all from being beaten up) and staying quiet (with just one year in prison). So we'd choose to stay quiet in this scenario too.
The game has changed - the dominant strategy is now to stay quiet whereas before we were best off if we snitched. "Repeated games", as they're called in the literature, are massively important.
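If you'd rather let a computer do the reasoning, here's a minimal sketch in Python of the same best-response check. The payoffs are the ones in the two matrices above; nothing else is assumed:

# Payoffs are (John, Ron) in years of prison - lower is better.
original = {
    ("confess", "confess"): (5, 5),
    ("confess", "quiet"):   (0, 10),
    ("quiet",   "confess"): (10, 0),
    ("quiet",   "quiet"):   (1, 1),
}

# In the new game, whoever confesses also takes a beating worth six years.
with_beating = {
    (john, ron): (j + (6 if john == "confess" else 0),
                  r + (6 if ron == "confess" else 0))
    for (john, ron), (j, r) in original.items()
}

def johns_best_reply(payoffs, ron_choice):
    # John's best reply to a fixed choice by Ron: the fewest years.
    return min(("confess", "quiet"),
               key=lambda john: payoffs[(john, ron_choice)][0])

for name, payoffs in (("original", original), ("with beating", with_beating)):
    replies = {ron: johns_best_reply(payoffs, ron) for ron in ("confess", "quiet")}
    print(name, replies)

# original      {'confess': 'confess', 'quiet': 'confess'}  -> confess is dominant
# with beating  {'confess': 'quiet', 'quiet': 'quiet'}      -> staying quiet is dominant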


What about Goldenballs? Well, imagine that the guy's friends are all kind-hearted people. If he stole the pot of money, he'd have played perfectly in economic terms but maybe his friends would disown him because he's nasty. His friends are worth £60 (just to put a value on them). Now what would the payoff matrix look like?


The original is first, then the new one.

(player one's winnings first, then player two's, assuming a £100 pot)

Original:
                    Player two splits     Player two steals
Player one splits   £50, £50              £0, £100
Player one steals   £100, £0              £0, £0

New (stealing now costs player one his friends, worth £60):
                    Player two splits     Player two steals
Player one splits   £50, £50              £0, £100
Player one steals   £40, £0               -£60, £0

Now we can see that the dominant strategy for player one is to split. If player two splits, we can either split and get £50, or steal and get £40 (we'd win £100, but lose our friends - worth £60). If player two steals, we can split and get nothing or steal and get nothing but lose our £60 friends.
Goldenballs is a one-off game. But life isn't - we factor in what happens after the game. The more valuable the player's friends are, the more likely they are to split in the new game. 
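To put a number on that last claim, here's a minimal sketch (again in Python) that asks how valuable the friends need to be before splitting takes over. The £100 pot is from the example above, and pricing friendship in pounds is purely the illustration we've been using:

# How valuable do the friends have to be before splitting becomes the
# better tactic for player one? Stealing costs him friends worth `friends` pounds.

POT = 100

def player_one_payoff(me, them, friends):
    if me == "split":
        money = POT / 2 if them == "split" else 0
    else:  # me == "steal"
        money = POT if them == "split" else 0
    return money - (friends if me == "steal" else 0)

for friends in (0, 40, 60, 80):
    vs_split = {me: player_one_payoff(me, "split", friends) for me in ("split", "steal")}
    vs_steal = {me: player_one_payoff(me, "steal", friends) for me in ("split", "steal")}
    print(f"friends worth £{friends}: if they split {vs_split}, if they steal {vs_steal}")

# Once the friends are worth more than £50 (half the pot), splitting is the
# better choice whatever player two does - exactly the switch in the new matrix.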


Repeated games are used in loads of situations. OPEC, the Organization of the Petroleum Exporting Countries, punishes members if they play the wrong strategy (produce more oil than they should). Next time, I'll look at a special case where repeated games don't work.

Monday 19 March 2012

Life's a game


If you haven't watched ITV's Goldenballs before, don't worry. All we need is the end bit, which is shown in the video above. (This saves about an hour of watching Jasper Carrott.)

In the final part of the game, the contestants have amassed a prize pool, and can either choose to "split" or "steal". If both contestants choose split, they split the prize money equally. If they both "steal" they get nothing. If one steals, and the other splits, the stealer takes all.

What's the best tactic?

The answer is very similar to the "Prisoner's dilemma", which goes like this:

John and Ron are big-time criminals, and they've both been arrested and kept in separate cells. The police don't have enough evidence to convict them for the big bank job they just pulled off, but they've got enough other crimes to put them in prison for a year each. The police have a cunning plan. They go to John and offer him a deal: if he grasses on Ron, they'll ignore the lesser charges. John gets to walk free, but they can charge Ron for the bank job and put him in prison for ten years. They offer the same deal to Ron. There's one catch though: if they both confess then they'll both get five years in prison.
We can summarise the offers in terms of the number of years in prison:

                    Ron confesses     Ron stays quiet
John confesses      5, 5              0, 10
John stays quiet    10, 0             1, 1
The pay-offs in the table show John's pay-off first, then Ron's. So if John confesses and Ron stays quiet (top right), then John gets freedom and Ron gets ten years in prison.
Confessing is the dominant strategy for each player, and both players confessing is the game's Nash equilibrium. This is easy to explain.

Suppose you're John. If Ron stays quiet, then you can either "confess" and get freedom or "stay quiet" and get a one-year sentence. Freedom is better, so the best choice here is to confess.
If Ron confesses, then you can either "confess" and get a five-year sentence, or "stay quiet" and get a ten-year sentence. Five years in prison is better than ten, so the best choice is confess.
The same logic can be applied to Ron (since the case is absolutely symmetrical).
That is the basic principle underlying game theory: work out your best response to each thing the other player might do.


Returning to Goldenballs, there are four scenarios in the game, and they are laid out below.
                    Player two splits            Player two steals
Player one splits   they split the pot 50:50     player two takes the lot
Player one steals   player one takes the lot     both get nothing
Suppose the final prize money is £100, and that you're player one. Like John in the prisoner's dilemma, you look at the options facing you. If player two splits, the best option is to steal and take all £100 rather than settle for £50. If player two steals, you have no choice - you will always leave empty-handed. So you may as well choose steal.
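If you like to see the sums written out, here's a minimal sketch in Python of player one's options - the £100 pot is the figure above, and the payoffs just follow the rules of the game:

# A quick check that "steal" never does worse than "split" for player one.
POT = 100

def player_one_winnings(me, them):
    if me == "split" and them == "split":
        return POT / 2        # £50 each
    if me == "steal" and them == "split":
        return POT            # I take the lot
    return 0                  # the other player steals: I get nothing

for them in ("split", "steal"):
    split_pay = player_one_winnings("split", them)
    steal_pay = player_one_winnings("steal", them)
    print(f"If player two {them}s: split -> £{split_pay:.0f}, steal -> £{steal_pay:.0f}")

# If player two splits: split -> £50, steal -> £100
# If player two steals: split -> £0, steal -> £0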

Theory tells us that steal is the only option to go for, so why did the guy choose to split? We'll leave that for another time.

Wednesday 14 March 2012

The economics of humble pie


You can't have football without pie.

At the start of the season, I made a bet with my brother that Manchester City would finish above their local rivals, Manchester United. £10, winner takes all. Not a huge amount of money, and it never was about the money. I was the older, wiser brother - this was about pride.

28 games down the line, and it's United who sit at the top of the league. There are only 10 more games to go, and City's fixtures are looking harder. And there's a crunch derby game in late April which could effectively decide our bet. United are suddenly favourites.

Here's the conundrum: the bookies are offering odds of 8/13 on Manchester United winning the league.
The question: to bet or not to bet?

If I don't bet with the bookies, I'll either win £10 or lose £10. And that's losing £10 to my brother.
I could, instead, hedge my bets by betting on United. This would offset the potential loss if United win, but would mean less money if City manage to beat their rivals.

For example, if I bet £5 on United I have the following two scenarios:

Scenario 1 - United win: I pay my brother the £10 owed, but make a profit of £3.08 from the bookies (£5 at 8/13), so overall I'm down £6.92.
Scenario 2 - City win: I get £10 from my brother, but lose my £5 stake with the bookies. Overall, up £5.

Some other bets and returns in each scenario are summed up in the following table:

Bet on United    Scenario 1 (United win)    Scenario 2 (City win)
£0               -£10.00                    +£10.00
£5               -£6.92                     +£5.00
£10              -£3.85                     £0.00
What to do hinges on what I think the two associated probabilities are. Expected returns are given as

E(£) = p1S1 + p2S2

where p1 and p2 are the probabilities of the two scenarios, S1 is the return in scenario 1 (a United win) and S2 the return in scenario 2 (a City win). For any hedge small enough that S2 is positive, S1 will be negative.

We can plug in some probabilities and see what the expected return is.

If the probabilities are 50/50 (so it's equally likely for City or United to win) then the expected return of the original bet with my brother is £0, since

E(£) = (0.5 x -£10) + (0.5 x £10) = 0

The table below has some other probabilities and their associated expected returns.
Probability United win    No hedge    £5 hedge    £10 hedge
50%                        £0.00      -£0.96      -£1.92
60%                       -£2.00      -£2.15      -£2.31
70%                       -£4.00      -£3.35      -£2.69
80%                       -£6.00      -£4.54      -£3.08
So if I think the probability of United winning is 80%, a bet of £10 reduces the expected loss from £6 to £3.08.
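For anyone who wants to check the arithmetic, here's a minimal sketch in Python. The 8/13 odds, the £10 bet with my brother and the 80/20 probability are the figures from this post; the rest is just the expected-return formula above:

# Net position and expected return for a given hedge bet on United,
# at the bookies' odds of 8/13 (a winning £13 stake returns £8 profit).

ODDS = 8 / 13
BROTHER_BET = 10.0   # the £10 riding on the bet with my brother

def returns(hedge):
    united_win = -BROTHER_BET + hedge * ODDS   # pay my brother, collect from the bookies
    city_win = BROTHER_BET - hedge             # collect from my brother, lose the stake
    return united_win, city_win

def expected_return(hedge, p_united):
    s1, s2 = returns(hedge)
    return p_united * s1 + (1 - p_united) * s2

for hedge in (0, 5, 7.65, 10):
    s1, s2 = returns(hedge)
    print(f"hedge £{hedge:.2f}: United win {s1:+.2f}, City win {s2:+.2f}, "
          f"expected at 80/20 {expected_return(hedge, 0.8):+.2f}")

# hedge £0.00:  United win -10.00, City win +10.00, expected at 80/20 -6.00
# hedge £10.00: United win -3.85,  City win +0.00,  expected at 80/20 -3.08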

What have I gone for, and what do I think the probability is?

I think the probability is around 80/20, so I've bet £7.65. This might look odd, since I could cut my expected loss to £3.08 by betting more. But I've put another condition in: I don't want to end up out of pocket if City win. A £7.65 hedge leaves me £2.35 up if City pull it off (and £5.29 down if United win), so I can keep my expected losses down whilst still being able to celebrate a City win.

Remember, you can't have football without pie. It's my turn to eat humble pie, accept that maybe my brother could be right, and hedge my bets.