The Joy of Game Theory: An Introduction to Strategic Thinking
Presh Talwalkar
This book is a selection of the best articles from Game Theory Tuesdays, a column from the blog Mind Your Decisions. Articles from Game Theory Tuesdays have been referenced in The Freakonomics Blog, Yahoo Finance, and CNN.com. Game theory is the study of interactive decision making, that is, of situations where each person's action affects the outcome for the whole group. Game theory is a beautiful subject, and this book will teach you how to understand the theory and practically implement solutions through a series of stories and the aid of over 30 illustrations. This book has two primary objectives. (1) To help you recognize strategic games, like the Prisoner's Dilemma, Bertrand Duopoly, Hotelling's Game, the Game of Chicken, and Mutually Assured Destruction. (2) To show you how to make better decisions and change the game, a powerful concept that can transform no-win situations into mutually beneficial outcomes. You'll learn how to negotiate better by making your threats credible, sometimes limiting options or burning bridges, and thinking about new ways to create better outcomes. As these goals indicate, game theory is about more than board games and gambling. It all seems so simple, and yet that definition belies the complexity of game theory. While it may only take seconds to get a sense of game theory, it takes a lifetime to appreciate and master it. This book will get you started.
Year: 2014
Publisher: CreateSpace Independent Publishing Platform
Language: English
Pages: 154
ISBN 10: 1500497444
ISBN 13: 9781500497446
File: EPUB, 784 KB
3 comments
Dharan
It was one of the best books about game theory, explaining everything with basic examples. Presh Talwalkar is a YouTuber; I love watching his channel (Mind Your Decisions).
27 April 2019 (21:22)
Rodrick
Mannnn, I feel so sorry for pirating his book :( I really love his channel. And this book is really cheap... once I get money I will buy 3 copies of the physical one.
05 February 2021 (09:02)
Rodrick
Everyone who loves mathematics needs to know his channel; it is indispensable. The name is Mind Your Decisions.
05 February 2021 (09:04)
About The Author

Presh Talwalkar studied Economics and Mathematics at Stanford University. His site Mind Your Decisions has blog posts and original videos about math that have been viewed millions of times.

More Books By Presh Talwalkar

Math Puzzles Volume 1: Classic Riddles And Brain Teasers In Counting, Geometry, Probability, And Game Theory. This book contains 70 interesting brainteasers.

Math Puzzles Volume 2: More Riddles And Brain Teasers In Counting, Geometry, Probability, And Game Theory. This is a follow-up puzzle book with more delightful problems.

Math Puzzles Volume 3: Even More Riddles And Brain Teasers In Geometry, Logic, Number Theory, And Probability. This is the third in the series with 70 more problems.

But I only got the soup! This fun book discusses the mathematics of splitting the bill fairly.

40 Paradoxes in Logic, Probability, and Game Theory. Is it ever logically correct to ask "May I disturb you?" How can a football team be ranked 6th or worse in several polls, but end up as 5th overall when the polls are averaged? These are a few of the thought-provoking paradoxes covered in the book.

Multiply By Lines. It is possible to multiply large numbers simply by drawing lines and counting intersections. Some people call it "how the Japanese multiply" or "Chinese stick multiplication." This book is a reference guide for how to do the method and why it works.

The Best Mental Math Tricks. Can you multiply 97 by 96 in your head? Or can you figure out the day of the week when you are given a date? This book is a collection of methods that will help you solve math problems in your head and make you look like a genius.

Table of Contents

Why Learn Game Theory?
Section I: Introducing Strategic Games
Why Are Gas Stations Often Located Next To Each Other?
Could Price Match Guarantees Help Businesses Instead Of You?
Dominated Strategies
Winning A "Beauty Contest"
Dominant Strategies
Focal Points (Schelling Points)
The Game Of Bicycle Collisions
Splitting A Soda Evenly
How Game Theory Solved A Religious Mystery
Coordination Failures
Nash Equilibrium
Should You Buy Used Products?
The Prisoner's Dilemma
An Ancient Indian Proverb
Visiting The Doctor
Prisoner's Dilemma At The Casino
JC Penney Loses $163 Million
Unbranding The Competitive Edge
Southwest Airlines Makes $144 Million
Golden Balls TV Show
Beating The Prisoner's Dilemma
Section II: Changing the Game
The Office Phone Problem
4 Tips For Winning A Game Of Chicken
Credible Threats
The Ultimatum Game
Gossip Wars And Mutually Assured Destruction (MAD)
Honesty Isn't The Best Policy
Threatening Many People At Once
The Strategy Of Limiting Options
Limiting Options In Salary Negotiation
The Braess Paradox
Burning Bridges
The Leader's Dilemma
Market Unraveling
A Smart Pill
A Voting Paradox
My Teacher's Clever Tissue Scheme
The Dollar Auction Game
Waiting In Line, High Heels, And Studying For Finals
Jealousy
Conclusion
More from Presh Talwalkar

Why Learn Game Theory?

Game theory is a beautiful subject, and this book will teach you how to understand the theory and practically implement solutions. This book has two primary objectives. (1) To help you recognize strategic situations. (2) To show you how to make better decisions and change the game, a powerful concept that can transform no-win situations into mutually beneficial outcomes. As these goals indicate, game theory is about more than board games and gambling. Specifically, game theory is a branch of economics that considers how players of a mathematically formalized game can optimize their decisions. In these games, what each person does has an effect on other people in the group. Game theory studies how to make the best choice in situations of interdependent decision-making.
Game theory offers practical insights that have been applied to various fields, including political science, business, evolutionary biology, computer science, and philosophy. The Nobel Prize in Economics was awarded for work in game theory in 1994 (John C. Harsanyi, John F. Nash Jr., and Reinhard Selten), 2005 (Robert J. Aumann and Thomas C. Schelling), 2007 (Leonid Hurwicz, Eric S. Maskin, and Roger B. Myerson), and 2012 (Alvin E. Roth and Lloyd S. Shapley).

Before proceeding, it is worth providing a functional definition. In this book, a game will mean a situation characterized by three components: (1) a set of people involved, called players, (2) a set of allowable moves that each player can make, known as strategies, and (3) a description of how each player feels about the possible outcomes, mathematically described by a payoff or utility function. It all seems so simple, and yet that definition belies the complexity of game theory. While it may only take a few seconds to define what a game is, it takes a lifetime to appreciate and master game theory. This book will get you started.

Section I: Introducing Strategic Games

This section opens with several examples of game theory and defines basic concepts, such as a dominant strategy and a Nash equilibrium. The goal is to get familiar with how games are described and solved, and how game theory is relevant for everyday situations. Unlike a textbook approach where one learns definitions and theories first, this book is based on understanding strategy through real-life situations and stories. We will jump right in with a puzzle about why gas stations tend to cluster in location.

Why Are Gas Stations Often Located Next To Each Other?

There are hundreds of gas stations around San Francisco in the California Bay Area. One might think that gas stations would spread out to serve local neighborhoods. But this idea is contradicted by a common observation.
Whenever you visit a gas station, there is almost always another in the vicinity, often just across the street. In general, gas stations are highly clustered. The phenomenon is partly due to population clustering. Gas stations will be more common where demand is high, like in a city, rather than in sparsely populated areas like cornfields. But why do stations locate right across the street from each other? Why don't they spread out?

There are many factors at play. Locating a gas station is an optimization problem involving demand, real estate prices, estimates of population growth, and supply considerations such as the ease of refueling. As the problem is complex, any simplified explanation will have its shortcomings. Nevertheless, there is a simple game about location competition that provides valuable insight. While the game involves only a few rules, it illustrates how businesses might compete on location and end up clustering together. The game also has an application to political science in the strategy for campaigning in elections. The following game is based on a model described in the 1929 paper "Stability in Competition" by the mathematician Harold Hotelling.

Hotelling's Game

There are two players in this game. In this exposition, imagine each player is a hot dog stand on a beach, competing for customers. The beach is a straight shoreline along which customers are uniformly spread out. The beach is represented by a number line ranging from -1 at one endpoint to 1 at the other. The hot dog stands compete purely on location and sell identical products. Each stand picks a location, which is represented by a number between -1 and 1. Conditional on where the stands locate, customers will simply choose the stand closer to them. If the stands are in the same spot, customers will split up, and both stands end up with an equal number of customers.
For instance, if a customer is at point 0.5, and the stands are located at -1 and 1, the customer will be closer to, and will choose, the stand located at point 1. The following figure is a graphical representation of the game, with the labels S1 and S2 for the locations of hot dog stands 1 and 2, respectively. Note the endpoints of the shore and the placeholders for each stand's location. If the two hot dog stands compete for the most customers, where will each hot dog stand end up? (For reference, the solution to the game is referred to as a Nash equilibrium, which will be explained in subsequent chapters.)

Finding The Solution (Intuitive)

One way to approach the game is to ignore the competition. Assume you are the only hot dog stand on the beach. Where might you want to locate? The answer is easy: any place you want. You are a monopolist, so customers will have to walk to you regardless. If you choose to locate all the way at one endpoint, -1, customers even from the other side of the beach at endpoint 1 will have to walk all the way over. For you, it is nice to be a monopoly. But you are a paranoid monopoly, and common sense would push you closer to the center, labeled point 0. The problem is that if you locate all the way on the far left, or all the way on the far right, a competitor might choose a more central location and cut you off. If you favor the left side, for instance, an entrant could locate slightly to your right, closer to the center, and capture the majority of the market. See the following figure, where the market share of the monopoly (solid line segment) is overtaken by a new entrant (dashed line segment). Such a problem does not happen if you locate in the center. A new entrant to either your left or right side would gain less than half the market. This logic shows why the center point is favorable.
Furthermore, note that if either hot dog stand chooses the center point, the other will want to copy, since it is better to split the market than end up on one side that yields less than half the market.

The above logic is correct but not mathematically precise. To develop a fuller appreciation for game theory, it is worth wading through a mathematical argument. This is not necessary for reading the book and understanding game theory, but it is presented here as a taste of a mathematical proof. As such, I highly encourage you to at least skim through this section. If you do choose to skip it, continue reading at the section labeled "The Social Optimum."

Finding The Solution (Mathematical)

Each hot dog stand is simultaneously picking a location, a number between -1 and 1. Each stand needs to take into account where the other might locate. That is the key factor in game theory: decisions are interdependent. I will break the problem down into two steps. This is a process you can use to solve other games.

Step 1: Think About Payoffs

Imagine the two stands locate at points 0.2 and 0.4. How much of the beach would each stand capture? You can see it in the following figure. It indicates the segments of the beach where the customers are closer to the first stand (solid line segment) or to the second stand (dashed line segment). Here is how I came up with that picture. The first stand clearly gets any customer located at a point less than 0.2 (the left), and the second hot dog stand gets any customer standing at a point higher than 0.4 (the right). Halfway between the two stands, at the point 0.3, is where customers are equally happy. Therefore, anyone standing at a numerical value larger than the halfway point goes to the stand at 0.4, and anyone standing at a numerical value less than the halfway point goes to the stand at 0.2. The lengths of the solid and dashed lines represent the market shares of each hot dog stand.
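The payoff calculation in Step 1 can be sketched in a few lines of Python. This is my own illustrative helper, not code from the book; it uses the midpoint rule described above (customers go to the nearer stand, and a tie splits the beach evenly).

```python
def market_share(s1, s2):
    """Fraction of the beach [-1, 1] captured by the stand at s1.

    Customers choose the nearer stand; the midpoint between the two
    stands is where customers are equally happy.
    """
    if s1 == s2:
        return 0.5                    # same spot: split the market
    midpoint = (s1 + s2) / 2
    if s1 < s2:
        captured = midpoint - (-1)    # stand 1 serves everyone to the left
    else:
        captured = 1 - midpoint       # stand 1 serves everyone to the right
    return captured / 2               # total beach length is 2

print(market_share(0.2, 0.4))  # approximately 0.65, the 65%/35% split above
```

With the stands at 0.2 and 0.4, the midpoint is 0.3, so the first stand captures the segment from -1 to 0.3, which is 65% of the beach.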
In this example, the first stand gets 65% of the line compared to 35% for the second. This arrangement of the stands, however, is not a solution to the game, as it is possible for one of the stands to find a better position. For example, if the second stand inches closer to the middle, say, to the point 0.2, then both stands would be equally desirable options for the customers, and they would split the market 50/50. But that again is not a solution to the game, as the first stand could then retaliate by moving even closer to the center point to get more customers. There is a mathematical way to describe how each stand would react to the other's choice.

Step 2: Determine The Best Responses

A best response is a location that one stand would optimally choose in response to a given position of the other stand. More generally, a best response is a strategy that one player would choose in response to the given strategy profile of another player, or a group of players. To simplify matters, instead of dealing with market share percentages, consider the game as one that a hot dog stand can either win or lose. Imagine a hot dog stand "wins" the game by capturing a majority of the market. Suppose the first stand chooses a location k. What is a best response for the second stand? What are the locations that will capture over half of the market? There is not a single answer. As explained earlier, anything closer to the center will capture a majority of the market. These are the points whose distance from the center is less than that of k, corresponding to the numbers strictly between -k and k. Here is a diagram of the best response for a given location. If k = 0, the point in the center, then the unique best response is to pick the center. This is the solution of the game: when both pick the center, both are playing best responses to each other. The two hot dog stands will split the market evenly.

The Social Optimum

Game theory suggests an outcome for what players will do.
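The back-and-forth of best responses can be simulated to watch both stands converge on the center. One caveat: any location strictly closer to the center than the rival is a best response, so there are many valid choices; moving halfway toward the center is my own illustrative rule, not part of the model.

```python
def a_best_response(rival):
    """One best response among many: any location strictly closer to the
    center than the rival captures a majority of the market. Halving the
    rival's position is an illustrative choice of such a location."""
    return rival / 2

s1, s2 = -0.8, 0.6        # arbitrary starting locations on the beach
for _ in range(60):       # let the stands react to each other in turn
    s1 = a_best_response(s2)
    s2 = a_best_response(s1)
# both locations shrink toward 0, the Nash equilibrium of the game
```

However the stands start out, the leapfrogging toward the middle ends with both at point 0, matching the intuitive argument.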
But that is unfortunately not always the best outcome for society. The equilibrium of the hot dog game is an annoying situation for many customers. The hot dog stands are located in the center of the beach instead of spreading out and being closer to beachgoers. In fact, if the stands could be spaced out across the beach (at points -0.50 and 0.50), then everyone would be happier. Because the placement is symmetric, each stand would still get 50% of the market. The advantage of this arrangement is that the stands are located closer to more customers: no customer would have to travel more than a distance of 0.50 to reach a hot dog stand. In the solution where both stands are at 0, the customers at the endpoints have to walk a distance of 1. As desirable as this outcome is, it is not sustainable. The reason is that each stand has an incentive to deviate. Either one could choose to locate closer to the center and gain more than half the customers. The other stand would retaliate by moving closer to the center as well, and the game would go on until both end up in the center.

Gas Stations And Other Examples

The model suggests why competitors locate so close to each other and compete on real estate. Think about big burger chains, supermarkets, and coffee shops. You will almost always see them clustered, even though it would be nicer for customers if they spread out. The model can also be applied to political candidates. Imagine two candidates picking a platform on a political spectrum from -1 (very liberal) to 1 (very conservative). If voters pick the candidate closest to their own political views, and voters are spread out across the political spectrum, then both candidates have an incentive to converge to the middle, moderate position. It is no surprise that politicians seek the "average voter." The game also explains why it is so hard to tell the difference between candidates while they are campaigning for an election.
A final application of this game is local TV news. Local news channels compete for attention, and each chooses a set of stories for a given show. It would be nice if different news stations covered different topics, but that does not happen. We end up with the same story being reported on virtually all news outlets instead of hundreds of different important stories being reported. How many times have you watched a local news report about one story, and then switched to another channel to see exactly the same story reported? They all cover the same news in the same order, as news stations converge on reporting the most interesting stories. As a concluding note, that is one reason the web has been liberating. Entry is cheap, so blogs and websites can serve smaller interests, allowing niche topics to get increased coverage.

Could Price Match Guarantees Help Businesses Instead Of You?

When a store advertises it will match the price of any competitor, it sounds like the store is offering a good deal. Not only are they confident their prices are the lowest, but if you find a lower price anywhere else, they will even match it. The apparent winner of a price matching policy is the consumer. In fact, that is how price matching is often reported in the news. For example, an article in The Wall Street Journal in November 2013 claimed: "And in electronics, where the products are the same from store to store, the one real competitive weapon is price. Four of the country's biggest consumer electronics retailers (Best Buy, Wal-Mart, Target Corp. and Staples Inc.) are all pledging to match their rivals' in-store prices during most of the holidays if shoppers ask." But does price matching always mean lower prices? One would have to do a detailed study of prices to understand the full effects of price matching. But even without that, a game theory model can suggest how price matching might affect prices.
One game paradoxically suggests price matching can do the opposite: it can help prices stay high!

Monopoly (One Business)

Consider a hypothetical example where Lears is a monopolist and makes refrigerators at a cost of $200. Lacking competition, Lears can raise the selling price of the refrigerator until it maximizes revenue. Say this price is $300. Lears is happy, but society would be better off with a lower price.

Duopoly (Two Businesses)

Lucky for the consumer, Sowe's has decided to enter the market, and it is able to make an identical refrigerator for the same cost of $200. What price should Sowe's set? If it sets a price of $300, customers are indifferent between it and Lears, so Sowe's will acquire half of the market and split the profits with Lears. But if Sowe's sets a lower price of $299, all consumers will prefer the lower price and switch to that store. Sowe's will effectively capture the entire market at a price just below the monopoly price. But Lears will not be happy to lose all of its customers. It would respond with an even lower price of $298 so that it can recapture the market. Since the firms cannot trust each other and cannot simply agree on a price because of antitrust laws, they are forced to compete. Ultimately, both firms keep bidding down the price until it drops all the way to $200. At this point, both businesses would lose money by lowering the price further, so they will not do that. And neither can raise the price, as the other company would then be cheaper. The game presented above is known as the Bertrand Duopoly. It is characterized by two firms competing solely on price for customers who have no loyalty and simply pick the firm with the lowest price. It is often thought that a market with just a few firms will have high prices, as the firms will collude to keep profits high. The stunning result of the Bertrand Duopoly model is that a market with only two firms can end up competing down to the lowest price due to a bidding war.
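The undercutting dynamic can be simulated directly. The $200 cost, the $300 starting price, and the store names follow the text; the undercut-by-one-dollar best response is a simplifying assumption of this sketch.

```python
COST = 200             # marginal cost of a refrigerator
MONOPOLY_PRICE = 300   # the price Lears charges with no competition

def undercut(rival_price):
    """Best response on a $1 price grid: undercutting by $1 captures the
    whole market, but pricing below cost loses money, so stop at cost."""
    return max(COST, rival_price - 1)

lears, sowes = MONOPOLY_PRICE, MONOPOLY_PRICE
while True:
    new_sowes = undercut(lears)
    new_lears = undercut(new_sowes)
    if (new_lears, new_sowes) == (lears, sowes):
        break          # neither firm wants to move: equilibrium reached
    lears, sowes = new_lears, new_sowes

print(lears, sowes)    # both prices end at cost: 200 200
```

The loop stops only when neither firm can profitably undercut the other, which happens exactly when both prices hit the $200 cost, the Bertrand result.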
How you feel about that largely depends on your perspective. Someone who comparison shops is happy with the result, since it means lower prices. Someone who runs a business is scared that prices will drop, which is why businesses refer to the "ruinous effects of a price war." Ideally, businesses would hope to keep prices high for everyone so they could share larger profits. And it turns out that price matching is a policy that could allow them to do precisely that.

How Price Matching Affects The Game

One thing to note is that the Bertrand Duopoly model does not mean heavily concentrated markets are price competitive. Businesses do not simply compete on price for customers; they take actions to maintain profits by locking in customers and trying to increase loyalty. Theoretically, a price match guarantee could also help businesses keep prices high. To see why, suppose Sowe's and Lears advertise price matching policies, and even offer an extra 10% of the price difference. That sure seems pro-consumer, but how are the firms' incentives changed? Recall that at first Lears and Sowe's both start with a price of $300. They split the market and have healthy profits. In the standard Bertrand Duopoly, each competes by lowering the price to gain customers. But with price matching guarantees at both stores, will a bidding war begin? What happens if Sowe's decides to lower its price to $299? When Sowe's lowers its price, customers would not actually go to Sowe's. They would choose to shop at Lears and get an additional discount from the price matching policy. In essence, when Sowe's lowers its price, it cannot gain customers as it did in the standard Bertrand Duopoly. In effect, price matching has ruined the incentive to start a bidding war, and the bidding war will never begin! Both firms have tacitly cooperated to keep prices high.

Credible Threats

There are also other reasons a store might institute a price matching policy. For one, it is a good public relations technique.
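A small calculation makes the change in incentives concrete. The 10% bonus on the price difference follows the text; the function itself is my own sketch of how such a match-plus-bonus policy would work.

```python
def price_paid(store_price, rival_price, bonus=0.10):
    """What a shopper actually pays at a store that matches any lower
    rival price and refunds an extra 10% of the difference."""
    if rival_price < store_price:
        gap = store_price - rival_price
        return rival_price - bonus * gap   # match, plus 10% of the gap
    return store_price

# Sowe's cuts to $299 while Lears keeps its sticker price at $300:
at_sowes = price_paid(299, 300)   # Sowe's matches nothing: shoppers pay 299
at_lears = price_paid(300, 299)   # Lears matches and adds the bonus: 298.90
```

Shoppers do even better at Lears than at the store that cut its price, so Sowe's price cut wins it no customers, and the bidding war never starts.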
Businesses want to say, "Don't go to my competition. I have the lowest price." But there is no reason to trust a business whose main motivation is profit. So a business could try to make the claim more credible with a price matching policy. The policy has the effect of a business saying, "Look, I have the lowest price. I'm so sure of it, just look at my price matching policy. Heck, I'll even give you a 10% discount." Now that sure sounds nice. The positive image of price matching guarantees is perhaps one reason that most stores have them. And while we feel like all the stores are helping us, it is possible they are using price matching as a subtle method to increase profits.

Source

Fitzgerald, Drew, and Paul Ziobro. "Price War Looms for Electronics." The Wall Street Journal. 20 Nov 2013. Web. http://online.wsj.com/news/articles/SB10001424052702303755504579208382032930724

Dominated Strategies

We will now change gears. The first examples in this book have been about business applications. But game theory is a tool for making smart decisions at any time, as the following story illustrates. One night I was trying to catch a cab in San Francisco. I decided it best to go near a popular intersection but stay away from the crowd. I was lucky, and soon a cab approached. Just as I was getting in, a woman complained that she was outside first and I was "stealing" her cab. I politely replied that I had not seen her, but since another cab was coming after mine, she should hail that cab. She scowled before taking my advice. I thought over the incident from the cab. I was particularly puzzled as to why she was yelling at me. If she was primarily concerned with getting a cab, it seemed yelling was about the worst thing she could do. Why is that? It is because she needed to think about the game from my perspective. You see, since I was closer to the approaching cab, I had full control over taking the cab. Given that, she had a range of choices to increase her chances of getting a cab.
She could have asked me politely to give up the cab, moved to another location, or even called a cab on her phone. All of these choices would have improved her chances of getting a cab. But by yelling instead, she rubbed me the wrong way and lowered her chances. In game theory, when a player's action (like her yelling) leads to a worse outcome than another action (like hailing the other cab going by) for every possible way the other players might act (in each situation, the chance I would give her the cab), the action is said to be dominated. There is an obvious lesson from game theory: never choose a dominated action. If you learn one thing from this book, let it be this lesson: please, never, never play dominated strategies. Dominated strategies are not just bad decisions; they are the worst possible decisions. Buying lottery tickets is a losing bet and generally not smart, but even then you have a chance to win. So you can think of playing a dominated strategy as worse than buying a lottery ticket. You are always better off avoiding dominated strategies.

Winning A "Beauty Contest"

The idea of not picking dominated strategies sounds simple, but the theoretical prediction is not always the same as the practical outcome. I will illustrate this with an example from when I was a student at Stanford. In a memorable lecture, my game theory professor staked $250 to teach a lesson about crowd behavior. The lecture began innocently enough. We were going to play a simple game with the following rules.

1. Everyone would secretly submit a whole number from 0 to 20.
2. All entries would be collected and averaged together.
3. The winning number would be two-thirds of the average, rounded to the closest whole number. For instance, if the average of all entries was 3, then the winning number would be 2. Or if the average was 4, the winning number would be 3 (rounded from 2.66...).
4. Entries closest to the winning number would get a prize of meeting with the professor over a $5 smoothie. (The textbook version of the game has multiple winners split the prize. My professor was being generous.)

Before you read on, I would like you to seriously consider what number you might pick. Imagine you are sitting in a lecture hall and actually playing this game. You seek the glory of outsmarting 49 other students, and you really want to meet with the professor since you find game theory fascinating. You have 10 seconds to decide before ballots are collected. Which number would you pick?

Some Guiding Logic

The game is called a "p-beauty contest." The "p" refers to the proportion the average is multiplied by; in this case, p is two-thirds. It turns out the game has a similar outcome for any value of p less than 1. Why is it called a beauty contest? The name comes from the fact that the game is the numbers analog of a beauty contest described by John Maynard Keynes. Imagine a newspaper runs a contest to determine the prettiest face in town. Readers can vote for the prettiest face, and the face with the most votes will be the winner. Readers who voted for the winning face will be entered in a raffle for a big prize. How does the game play out? Keynes wanted to point out the group dynamics. The naive strategy would be to pick the face you found to be the most attractive. A better strategy would be to pick the face that you thought other people would find attractive. The number "beauty contest" has the same kind of logic. You do not pick a number you like. You should pick a number based on what others will pick, so that your number is closest to two-thirds of the average. The twist of both games is that your guess affects the average outcome. And each person is trying to outsmart everyone else. Given the subtlety of the game, my professor was banking on paying out to only a few winners. Although it was mathematically possible for each of us to win, he was taking that risk.
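The scoring rule is simple enough to write down directly. This helper is my own sketch of rule 3 above; note that Python's built-in round() rounds exact halves to the nearest even number, which does not matter for the examples in the text.

```python
from statistics import mean

def winning_number(entries, p=2/3):
    """Two-thirds of the average of all entries, rounded to the nearest
    whole number, as in rule 3 of the classroom game."""
    return round(p * mean(entries))

print(winning_number([3, 3, 3]))   # average 3 -> two-thirds is exactly 2
print(winning_number([4, 4, 4]))   # average 4 -> 2.66... rounds to 3
print(winning_number([20] * 50))   # everyone picks 20 -> 13.33... rounds to 13
```

The last line previews the argument to come: even if every one of the 50 students submits the maximum of 20, the winning number is only 13.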
In fact, he knew that if we were all rational, we would all win. He would have to pay out a $5 smoothie to each of 50 students—that is, he was making a $250 gamble by playing this game. Why was he so confident? Let us explore the solution to the game and see why it is hard to be fully rational.

Numbers You Should Not Pick

Even though it is not possible to know what other people are guessing, this game has a solution. If everyone acts completely logically, there are only two possible winning numbers. It takes some crafty thinking, but the result is based on two principles I think you will accept.

Principle 1: Do Not Play Stupid Strategies (Eliminate Dominated Strategies)

The first principle is that players should avoid writing down numbers that could never win. That sounds logical enough, but it is not always what people do. We can all agree that writing a number that could never win is a very bad strategy. You are picking an option that is inferior to something else, which, if you recall, is called a dominated strategy. Are there any dominated strategies in the beauty contest? To answer that question, we need to figure out which numbers could never win. A natural question is: what is the highest winning number? You would never want to pick a number larger than that, unless you wanted to lose. You know that the highest number anyone can pick is 20. If every single person picked 20, then the average would be 20. The winning number would be two-thirds of 20, which is 13 when rounded. Should you ever find yourself submitting 20? The answer is no—there is always a better choice, like the number 19. The only time 20 wins is precisely when everyone else picks it and everyone shares the prize. In that case, you would be better off writing 19 and winning the prize unshared. Plus, by writing 19, you can possibly win in other cases, like when everyone else picks 19. You are always better off writing 19 than 20. The guess of 20 is a dominated strategy. You should never choose 20. And your rational opponents should be thinking the same way. So here is the big result: you can reason that no player would ever choose 20.

Principle 2: Trim The Game And Iterate

By principle 1, no player would ever choose 20. Therefore, you can essentially remove 20 as a choice. The game trims to a smaller beauty contest in which everyone is picking a number between 0 and 19. The smaller game has survived one round of principle 1. Now, repeat! Ask yourself: in the reduced game, are there any dominated strategies? Now 19 takes the role that 20 played in the last analysis. Since 19 is the highest possible average, it will never be a good idea to guess it. Applying principle 1, you can reason that 18 is always a better choice than 19. Thus, 19 is dominated and should be eliminated as a choice for every player. The game is now trimmed to picking numbers from 0 to 18. This is the result of two iterations of principle 1. There is no reason to stop now. You can iterate principle 1 to successively eliminate the choices of 18, 17, 16, and so on. The process ends after eliminating all numbers except 0 and 1. There is a name for this thought process. It is aptly named, but a mouthful: iterated elimination of dominated strategies (IEDS). The idea is to eliminate bad moves, trim the game, and iterate the process to find the surviving moves. These remaining strategies are considered rationalizable moves, that is, moves that can possibly win. Here is a schematic for IEDS.

The Equilibria

The only strategies that survive IEDS are the numbers 0 and 1. Is either a better choice? This is unfortunately where IEDS cannot give insight. It is possible for 0 to be a winning number: if all 50 students picked 0, then the winning number would be 0. Similarly, it is possible for 1 to be a winning number: if all 50 students picked 1, then the winning number would be 2/3, which rounds to 1. What actually happens depends on what people think everyone else will be guessing.
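The elimination loop can be sketched as follows (a simplified sketch of my own that leans on the rounding rule: the highest remaining guess is dominated whenever it exceeds the winning number it would produce even if everyone picked it):

```python
def surviving_strategies(max_guess=20):
    # Iterated elimination of dominated strategies (IEDS) for the
    # two-thirds beauty contest: repeatedly drop the highest remaining
    # guess while it could never equal the winning number.
    choices = list(range(max_guess + 1))
    while round((2 / 3) * choices[-1]) < choices[-1]:
        choices.pop()  # the top guess is dominated; trim the game
    return choices
```

Running this returns [0, 1], the two strategies that survive elimination.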
Both equilibria—all 0 and all 1—are achievable.

Back To The Classroom

None of us in the class had this deep understanding of IEDS. We were just learning game theory—it was only our third lecture. My professor was pretty sure our guesses would be all over the place. But Stanford kids can be crafty. One student used some sharp thinking and realized that coordination would help; he asked if we could talk to each other. The professor, still feeling we were novices, confidently replied with a smile, "Sure. Go ahead." We only had 10 seconds to write down our answers anyway. Before the professor could change his mind, the student quickly shouted to all of us, "If we all write down 0, we all win." It was remarkable. He figured out the equilibrium and told us what to do! He couldn't be tricking us, because the math was clear: if we all picked 0, we would all have winning numbers.

How Smart Are Stanford Kids?

The professor was relieved after he tallied the votes. He told us that, admirably, most of us wrote down the number 0 (I was among those who did). But there were larger answers too, ranging from 1 to 10. Someone actually wrote down 10! And this was after being told the answer. After all was said and done, the winning number turned out to be 2, and the prize was awarded to three students. Thanks to our lack of coordination, my professor only paid out a prize of $15. And it got better. My professor questioned the students who wrote down larger numbers. They all squirmed and gave explanations like "it was my lucky number" or "I don't know. I wasn't really thinking."

The Practical Lesson

What is going on? This was a group of smart students that was told the answer to the game. The example illustrates a flaw of IEDS. It gives reasonable answers only if you think players are reasoning out further and further in nested logic. We often do not have an infinite capacity to reason logically, only a bounded ability to reason rationally. The practical answer to what you should write depends on the book answer plus your subjective beliefs about what other people will do. It is the combination of book smarts plus social smarts that matters. The people who wrote down the winning numbers told the class they suspected some people would deviate for irrational reasons. And they were rewarded for not confusing theory with practice.

Dominant Strategies

A dominated strategy is one that you should never play. The flip side is a dominant strategy, a strategy that always makes sense and should be played. A dominant strategy is one that gives you the best outcome regardless of what other people are doing: it is at least as good as any of your other choices no matter how everyone else acts. A simple example is choosing the best checkout line in a store. Say there are two lines, A and B, with equally competent cashiers. You see that line A has a wait of one person while line B has no wait. What should you do? It is clear that you should go to line B, where there is no wait: this is a dominant strategy. A more nuanced example is picking the best lane while driving. Imagine the right lane has one car in front of you while the left lane has no cars, and there are no other cars around. All you want to do is drive straight for several miles. What do you do? Picking the left lane is generally the better option: the person in the right lane might drive slowly, or might slow down to turn at an intersection. In this example, picking the left lane is a dominant strategy. Some examples of dominant strategies are obvious. But others are harder to see. A few years ago, I was running errands and remembered I needed to visit the bank. But the time was 5:02pm, so I was not sure if the bank had closed already. I was debating whether to take a few minutes to drive by my bank, or to take care of other errands, one of which would take 20 minutes and the other 40 minutes. I wanted to get everything done, and I really wanted to start off by succeeding in my first task.
Which one should I do first?

a. check the bank
b. do the 20 minute errand
c. do the 40 minute errand

The answer is not difficult, and in fact you can probably figure it out without any knowledge of game theory. But I find the situation instructive for the idea of a dominant strategy. I was with my dad, who instantly came up with the proper decision (I jokingly said it was because I've taught him some game theory). The proper decision is to do the 20 minute errand first. Why? Consider the likely times the bank would close and how each option performs.

Case 1: The bank already closed at 5:00. The choice of checking the bank first is bad, since I want to succeed in my first task.

Case 2: The bank closes at 5:30. Within 30 minutes I could check the bank and do the 20 minute errand in either order. I could not do the 40 minute errand first, as the bank would close. So the 40 minute errand is not a good choice for the first task.

Case 3: The bank closes at 6 (or later). Now it is possible to do one errand and visit the bank, in any order, and then do the other errand.

As you can see, visiting the bank first is bad in case 1, and doing the 40 minute errand first is bad in case 2. Doing the 20 minute errand first is sensible in all of the cases. Therefore, doing the 20 minute errand first is at least as good as, or better than, doing anything else first: it is a dominant strategy. Once a dominant strategy is identified, the decision-making process is simple: you should play the dominant strategy. In this example, the closing time of the bank was an important variable in identifying the dominant strategy. The closing times themselves reveal another game theory concept called focal points.

Focal Points (Schelling Points)

When I saw the time was 5:02pm, I was worried the bank might not be open. Why was that? Notice that in theory banks can close at any time they wish. If they choose to close at 5:03pm every day, they are legally allowed to do so. But there is something very strange about picking a closing time like 5:03pm. Employees would be annoyed at having to stay 3 minutes past the hour, and customers might be confused as to why a bank would pick such an unusual number. It is customary and more natural for banks to close at a "round" time, like 5:00, 5:30, or 6:00. These closing times are examples of the game theory concept of a focal point, also referred to as a Schelling point in honor of the economist Thomas Schelling, who described them. A focal point is a time or strategy that is natural or special in some way. Focal points are important because they allow people to coordinate without communication. In the classic example by Thomas Schelling, people in an experiment were told they were to meet a stranger in New York City. Where would they choose to meet? Overwhelmingly, people chose to meet at the Grand Central train station at noon—they would try to pick a prominent spot at a special time to increase the odds of meeting. To return to the problem at hand, the concept of focal points allows us to infer the bank would close at a round time, such as 5:00, 5:30, or 6:00. You may say you know this by experience, but deep down that psychological reason has a strategic element to it. The interesting part of focal points is that they help us coordinate. Let us now explore a few more classic examples of focal points.

Focal Points (Schelling Points)

Let's do a brief experiment. For the following seven tasks, your goal is to answer so as to match a partner who is separately answering the same questions. What would you pick?

1. Pick one: 'heads' or 'tails.'
2. Pick a number from the list: 7, 13, 99, 100, 261, 555.
3. We are to meet in a city but we cannot communicate in advance. Which of the following cities would you choose to have the highest chance of meeting me: Rome, Berlin, Paris, New York, London?
4. We have agreed to meet on a specific date, but the time was left unspecified and we cannot communicate.
We have to meet at an exact minute. Which time will you choose?

5. Write a positive number.
6. Name an amount of money.
7. You are given 100 dollars to split into piles A and B. If your split of piles A and B matches your partner's, you get the amount in pile A and your partner gets the amount in pile B. How will you split up the money?

What Is The Purpose Of The Quiz?

These seven questions have a common theme: the goal in each is to coordinate on an outcome without communicating. Situations like this are more generally called coordination games, where the strategy is to match what the other party is doing. Questions similar to those above were asked of a group of 199 people with different cultural backgrounds. Here are the results, which indicate some choices were more "obvious" and popular than others.

Results

1. Pick one: 'heads' or 'tails.' The common answer was 'heads,' chosen by 69 percent of the people in the group.
2. Pick a number from the list: 7, 13, 99, 100, 261, 555. The common answer was 7, chosen by 36% of the people. This was followed by 100 (17%), 13 (14%), 99 (13%), 261 (11%), and 555 (9%).
3. We are to meet in a city but we cannot communicate in advance. Which of the following cities would you choose to have the highest chance of meeting me: Rome, Berlin, Paris, New York, London? The top answers were nearly evenly divided between Paris (27%) and London (26%).
4. We have agreed to meet on a specific date, but the time was left unspecified and we cannot communicate. We have to meet at an exact minute. Which time will you choose? The most common answer was 12 noon (30%). The next most common answer, 2pm, had only 6%.
5. Write a positive number. The most common answer was 7 (16%), followed by 2 (14%). It seems there was not much consensus on this question.
6. Name an amount of money. The top answer of 1 million was chosen by 30% of people, followed by 100 with 11%.
7. You are given 100 dollars to split into piles A and B. If your split of piles A and B matches your partner's, you get the amount in pile A and your partner gets the amount in pile B. How will you split up the money? The answer of splitting the piles 50/50 was given by 80% of people.

Coordination

Although these questions can be answered in many different ways, it turned out that many people gave the same answers. This is an example of how certain choices are more "natural," and their existence can help people coordinate in the absence of communication. In the next section, we give an application of focal points to public safety.

Source

1. Abitbol, Pablo. "An Experiment on Intercultural Tacit Coordination - Preliminary Report." MPRA paper. October 2009. http://mpra.ub.uni-muenchen.de/23474/1/MPRA_paper_23474.pdf.

The Game Of Bicycle Collisions

Bike safety is a big topic. Wear a helmet. Follow traffic laws, like halting at stop signs. Do not go too fast, and make sure your brakes work. This is all useful advice. It has been drummed into my head since elementary school. Nonetheless, it did not keep me from getting into bike accidents during college. To tackle that problem, I took advice that depended on game theory and the concept of focal points.

The Bike Game

Avoiding bike crashes is a type of coordination game. Bikers want to coordinate to avoid occupying the same real estate at the same time. Consider a game of two bikers moving in opposing directions. Imagine each biker has a choice of going straight, moving to his left, or moving to his right. Crashes occur when both bikers choose straight or when they both swerve towards each other on either the left or right side. The remaining combinations, say one goes straight and the other swerves right, result in a safe exchange. There are six "safe" cases compared to the three "crash" cases depicted in the following figure. Thanks to the nature of the game, by random chance alone it is twice as likely that the bikers will avoid each other rather than crash.
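The count of safe versus crash cases can be checked by enumeration (a sketch under my own modeling assumption that a crash happens exactly when the riders end up on the same side of the road):

```python
from itertools import product

MOVES = ("left", "straight", "right")

def crash(a, b):
    # The riders approach head-on, so rider A's left is the same side
    # of the road as rider B's right. They collide when both go
    # straight or when they swerve toward each other.
    opposite = {"left": "right", "straight": "straight", "right": "left"}
    return b == opposite[a]

outcomes = list(product(MOVES, repeat=2))
crashes = sum(crash(a, b) for a, b in outcomes)
safe = len(outcomes) - crashes  # 6 safe cases vs. 3 crash cases
```

Note that both riders swerving to their respective rights counts as safe, which is the focal point discussed next.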
Furthermore, there is a focal point that can reduce crashes. We Americans drive on the right-hand side of the road, so it is natural that both bikers would swerve to their respective right sides. This rule of thumb helps avoid crashes. Plus, it doubles as a good tip for walking in the hallways of American offices. Unfortunately, our friendly human nature tends to undermine the safe focal point. What many of us do is get nervous and try to make eye contact with the other biker. When you make eye contact, you feel the need to mirror what the other person is doing. Because you have limited time to respond, you do not react fast enough. You can even ruin the random chance of a safe passage and make a crash the likely outcome. And that is why the best advice I ever got about bike safety was this rule: avoid eye contact with an oncoming biker. It is not mind-blowing advice, nor does it always seem nice, but it is very practical. When you do not stare at the other person, you both will rely on the focal point of swerving to the right. (My friends and I joked in college that the "no stare" advice is virtually impossible to follow when the oncoming biker is attractive. But then the game is different, as you probably wouldn't mind bumping into that person and striking up a conversation.)

Some Students Make The Game More Difficult

As I mentioned earlier, focal points depend on culture. While avoiding eye contact gets Americans to the "swerve right" equilibrium, it fails for many international students who drive on the left side of the road and have the instinct to "swerve left." Then there is another twist. Some bikers aren't playing a game of coordination at all. They view the interaction more like a game of chicken. These people will never swerve, because they do not want to slow down, and often they enjoy the opportunity to yell at others. The game is even more complicated because there are more variables. The realistic situation involves four-way intersections with high traffic. Now each biker has to coordinate with more than one other biker, and there are more bikers on the road. Even if natural bike safety reduced the crash rate to one percent, an intersection with 1,000 such games played per day would still see about 10 crashes, which is quite a lot. What can be done? Well, think back to the idea of focal points. A designer who hopes to increase the odds of success could nudge bikers in the right direction by creating a focal point. The implication is that the bike game can be improved with better design. Here is one implementation: create a bike circle with arrows to indicate the flow of one-way traffic. If enough bikers follow the rules, then traffic will flow one way. Enforcement should come naturally, as new bikers will follow the traffic flow since that is the safest route. For its part, Stanford University did implement a traffic circle in a busy spot that was called the "Intersection of Death." It is not clear whether the number of accidents has gone down, as the same spot is now called the "Circle of Death."

Splitting A Soda Evenly

Continuing with the theme of coordination, I want to offer a story about sharing with siblings. I have to thank my fifth grade math teacher for the story, which unintentionally introduced me to game theory. The game theory is hidden in the following extra-credit problem that he asked us.

My mother would often give a can of soda to me and my two brothers and tell us to split it. Naturally, we all wanted more soda, but our mom told us to be fair and split it—without arguing. After we failed, she came upon a solution that suited all of us. What method did we use to split the soda?

Most of us in the class thought mathematically and submitted answers about pouring 1/3 of the volume into each glass. My teacher told us these answers were incomplete because they described an outcome but not how the outcome would be achieved. Who would pour the soda? And in what order do people pick?
And how do you make everyone in the group trust each other? Here is the solution the mom devised: one person was chosen to do the pouring. After the soda can was empty, the person who did the pouring would be the last to choose his glass. The method proved to be successful—the soda was always split evenly. Why does the method work? It is because the method gives the person pouring an incentive to make the glasses as even as possible. If he does not pour the soda evenly, he will suffer, because the other brothers will pick the fuller glasses first. Another way of thinking about the solution is that the other brothers are made to trust the person pouring. And that is a remarkable feat, because the brothers' interests are diametrically opposed. This problem is an example of mechanism design, which is the study of creating rules and incentives to allocate resources in what the designer sees as an efficient or fair way. Mechanism design is the theoretical basis for making markets work when they are not perfect, but it also comes up in many everyday situations, like how airlines creatively price tickets. The theory of mechanism design can also be used for your personal finances. I'm going to explain a motivational example based on the following simple question.

How Should You Split Your Paycheck?

How do you answer this question? Or in other words, what is your mechanism for achieving good financial outcomes? Unless you plan, you are likely to use a greedy mechanism in which you spend all of your paycheck (or even go into debt) for instant gratification. You will not have savings for expected purchases, like a down payment on a house, or for retirement. If you are unhappy with your outcomes, why not change the design of your mechanism? Just as the three boys were made to split the soda evenly, you can create a mechanism to improve your finances. In this example, it is instructive to think about yourself as three distinct people: the inner child of instant gratification, the inner teenager who can think at most five years into the future, and the inner adult who wants to have money during retirement. And think about your paycheck as the proverbial soda that your three "selves" need to split up. Now it is very clear why people fail without planning: they satisfy only the inner child and leave nothing for the future selves. Their strategy is the equivalent of letting the person who pours the soda be the first one to pick. Take a lesson from the soda mechanism: the person who chooses last should be the person pouring. Your retirement self is the one picking last, and by analogy, that is the self who should decide how to divide the check. An example might be to deduct 10% for retirement, 10% for the medium term, and keep the rest for current expenses. This strategy is commonly known as "paying yourself first," but you are really "paying your later self first," like a game theorist might.

How Game Theory Solved A Religious Mystery

While we are on the topic of debt and fair division, consider the following situation. A man owes debts of 100, 200, and 300, but dies with insufficient funds to pay everyone. How should his estate be divided? As we all know, there might not be one correct answer. Fair division is a concept that depends as much on logic as it does on social custom. To see why, consider the following three situations, which afford very different solutions.

1. A parent promises gifts to his children, but has to back off when a bonus is smaller than expected.
2. A publicly traded company issues shares of stock and bonds, but soon goes bankrupt in an accounting scandal.
3. Partygoers order items at a restaurant, with promises to pay, and then end up arguing over the best way to split the bill.

There is not a single right way to approach any of these problems.
That's what family fights, lawsuits, and restaurant arguments demonstrate every day. The conflict is a matter of perspective. Some people prefer proportional division that depends on debt size. An example is the classic "pay what you ordered" method in restaurants, where guests put in money based on the food they ordered. As logical as this sounds, not everyone desires this method. Others prefer splitting things up equally. They argue it is the person—not the debt size—that matters. Equal division is a common method for dividing gifts among children. During Christmas or holiday time, parents may choose to give every child the same gift size regardless of age or behavior. What gets accepted depends on social custom. Getting everyone to agree is an exercise in persuasion, not in economics. It is possible for emotionally pleasing methods to beat more logically consistent systems. One of the earliest discussions of fair division comes from the Babylonian Talmud, a record of discussions about Jewish laws and customs. The Talmud discusses a bankruptcy problem in the context of a man who owes his wives debts in excess of his assets. The Talmud answer is not immediately obvious, and in fact, the answer baffled academics for almost 2,000 years. Let's see why.

The Talmud Answer

How should an estate be divided among three creditors claiming sums of 100, 200, and 300? The Talmud offers answers through three examples. The text does not contain a general rule, which is what makes these answers seem contradictory. The three cases are when the estate size is 100, 200, and 300. In the first case, when the estate size is 100, the Talmud awards 33 1/3 to each party. The division suggests a principle of equal division, which is easy mathematically and holds social appeal. But strangely, this is not the idea used in the other cases. In the third case, an estate of 300, the Talmud offers a division of 50, 100, and 150. The math here is a proportional division based on the size of the debt. In modern times, proportional division holds wide appeal among lawyers and economists. But in this example, the puzzle is why the 300 case is treated differently from the 100 case. If that question bothers you, then get ready for another surprise in the division for 200. In this case, the estate is supposed to be divided as 50, 75, and 75. The division is neither an equal division nor a proportional division; it is simply a curious decision altogether. Why should the second and third creditors be given the same amount of money? And where do the numbers come from? Before I proceed, it is worth summarizing the claims in a table. We can think about the Talmud answers as a table that illustrates how an estate would be divided. I provide an illustration below, in which the rows are estate sizes, the columns are claims, and the table entries are the division sizes. The division defied a proper explanation for almost 2,000 years, filling volumes of critical review. Some scholars had essentially given up and suggested the 200 case might be an issue of faulty transcription. And this is the unlikely background against which game theory enters and possibly saves the day.

Game Theory Offers An Answer

In the 1980s, Professors Robert Aumann and Michael Maschler wrote a paper claiming to have cracked the mystery. They suggest there is no inconsistency in the Talmud answer. Aumann and Maschler demonstrate that the Talmud answer can be viewed as a consistent application of a game theory principle. Why was game theory used? It turns out the Talmud answer is the solution (the nucleolus) of a properly defined coalitional game (the subject of what is sometimes called cooperative game theory). Aumann and Maschler explain the concept in lay terms as a single, consistent principle: equal division of the contested sum. It is worth being skeptical before proceeding. Is the explanation simply a coincidence? After all, there are probably an infinite number of explanations that might produce the same split. Aumann and Maschler justify their answer by examining other Talmudic passages and suggesting the same principle is applied in many topics. "Equal division of the contested sum" was apparently a social custom, and that would help explain why it might seem strange to us but could have been natural for their culture.

Equal Division Of The Contested Sum

The Talmud examines a situation that might have been common in its time. Suppose two people are arguing over a garment. One claims half belongs to him while the other claims the whole is his. A judge is asked to decide who gets what. What would you do? There are naturally various answers. One could propose an even split (1/2, 1/2) or a proportional split (1/3, 2/3). But the Talmud offers a different answer, an answer that turns out to be an equal division of the contested sum: (1/4, 3/4). How does this principle work? There are three stages. First, decide what portion of the cloth is being disputed. In this case, exactly half of the garment is claimed by both parties. Second, split the disputed portion evenly between the two parties—so 1/4 of the cloth is awarded to each. And third, give the remaining cloth—the "undisputed" portion—entirely to the person whose claim to it is not disputed. This logic yields a split of 1/4 for the person claiming half of the garment and 3/4 for the person claiming the whole. This answer might seem strange, but remember that fair division methods depend on social custom. The same method can be used for any problem between two parties, using the same three steps.

1. Determine which portion is contested, that is, claimed by both parties.
2. Split the contested portion equally.
3. Assign the uncontested portion to the sole person claiming it.

How else might this principle be applied?
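The three-step rule for two claimants can be sketched as follows (a rough sketch of my own; the function name is illustrative):

```python
def divide_contested(estate, claim_a, claim_b):
    # Step 1: the part of the estate that party A does not claim is
    # conceded to B, and vice versa; the remainder is contested.
    conceded_to_b = max(estate - claim_a, 0)
    conceded_to_a = max(estate - claim_b, 0)
    contested = estate - conceded_to_a - conceded_to_b
    # Steps 2 and 3: split the contested part evenly, and give each
    # side its uncontested part in full.
    return (conceded_to_a + contested / 2,
            conceded_to_b + contested / 2)
```

For the garment, divide_contested(1, 0.5, 1) gives (0.25, 0.75), the Talmud's quarter/three-quarter split.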
It can actually be applied to many situations, like when the claims are larger than the asset to be divided, as in the case of dividing an estate. Equal Division Of The Contested Sum, Two Creditors It is worth going through a few examples to get a feel for the idea. Let's examine how to divide estates of various sizes with two creditors claiming 100 and 300. Example 1: (estate 66 2/3) If the estate is 66 2/3, then the entire estate is contested. The split should be even at 33 1/3 going to each party. Example 2: (estate 125) If the estate is 125, then the first 100 is contested by both parties and divided evenly. The remaining 25 is entirely awarded to the 300claimant. Hence, the division is 50 and 75. Example 3: (estate 200) Finally, if the estate is 200, then again the first 100 is contested by both parties and divided evenly. The remaining 100 is entirely awarded to the 300claimant. Hence, the division is 50 and 150. Here are the divisions in a table. Why stop there? Here are some examples when the claims are (100, 200) and (200, 300). Note that these are the remaining pairs of claims for the threeperson split that is motivating this article. Explaining The Talmud Puzzle Let's go back to the Talmud division for the three creditors. In the case of a 200 estate, the division was 50, 75, and 75 for parties that claimed debts of 100, 200, and 300. To analyze this answer, let's do the following exercise. Take any two creditors and consider how they might split the total money awarded to them. Why would we do that? It is a check of consistency. It makes sense that pairs of creditors should have claims divided in a manner consistent with the way a disputed garment would be divided. Consider the pair of creditors claiming 100 and 200. Together they are awarded a sum of 125. How is that sum split? It is split as 50 and 75. And amazingly, that is matches the work we did in examples above: this answer is consistent with an equal division of the contested sum! 
The logic is that the first 100 is contested by both parties and split evenly, and the uncontested 25 is awarded to the 200claimant. In fact, the same observation can be seen when considering other pairs of creditors. Look at how much the 100 and 300 parties are getting. Together they receive a sum of 125, and this is split as 50 and 75. Again, this answer is consistent with an equal division of the contested sum. Finally, consider the total reward to the 200 and 300 parties. In this case, the total sum of 150 is split as 75 to each. As the total sum is contested, this once again reflects an equal division of the contested sum. In other words, when the mysterious Talmud solution is broken down by pairs of creditors, there is a consistent principle. I think this is quite remarkable. Aumann and Maschler demonstrate the method can be extended, whether the claims are for three creditors, a hundred creditors, or even a million creditors. The same condition needs to be met: the assets are divided up such that the amount received by any two people reflects the principle of equal division of the contested sum. Furthermore, the division is a unique solution. An Algorithm It is good enough to see certain divisions are pairwise equal divisions of contested sums. But how do you find them starting from scratch? Aumann and Maschler show there is in fact only one division that is consistent. And this answer can be described by the following seven step algorithm. 1. Order the creditors from lowest to highest claims. 2. Divide the estate equally among all parties until the lowest creditor receives one half of the claim. 3. Divide the estate equally among all parties except the lowest creditor until the next lowest creditor receives one half of the claim. 4. Proceed until each creditor has reached onehalf of the original claim. 5. Now, work in reverse. 
Start giving the highest claimant money from the estate until the loss, the difference between the claim and the award, equals the loss for the next highest creditor. 6. Then divide the estate equally among the highest creditors until the loss of the highest creditors equals the loss of the next highest. 7. Continue until all money has been awarded. Here is how the claims would be divided in the Talmud example. Mystery solved? I think so. Not only do the Talmud answers follow a consistent principle, but they also rely on an idea that was mentioned as a custom. It is surely interesting that a tool of logic and rationality—game theory—was needed to decode the Talmud solution, which primarily depended on social custom. Source Robert J. Aumann and Michael Maschler, "Game Theoretic Analysis of a Bankruptcy Problem from the Talmud," Journal of Economic Theory 36 (1985), pp. 195–213. Coordination Failures Focal points—choices that stand out and are attractive—are useful in coordination games because they help everyone make the same choice without communication. But attractive choices are not useful in every game. In competitive games, it is often the naturally appealing choice that creates conflict. For instance, sports teams often compete for the same “star” player in bidding wars; journalists and news outlets seek to cover the most prominent story, trying to top each other in coverage; bright students apply to the best ranked colleges, competing for limited seats; and commonly people seek to win the love of the most attractive mates, sometimes leading to vicious tactics. In a coordination game, a naturally attractive option is a good thing. In a competitive game, each person seeks to be the winner, leading to a breakdown of cooperation. The rest of this section will be devoted to competitive games in which individual interest is often at odds with group interest, leading to failures of coordination. We will look at how coordination failures arise in several common situations.
The first example is of a group of men hoping to win a date at a bar. Nash Equilibrium We jump right in with a stylized version of a bar scene. You and three male friends are at a bar trying to pick up women. Suddenly one blonde and four brunettes enter in a group. What's the individual strategy? Here are the rules. Each of you wants to talk to the blonde. If more than one of you tries to talk to her, however, she will be put off and talk to no one. At that point it will also be too late to talk to a brunette, as no one likes being second choice. Assume anyone who starts out talking to a brunette will succeed. This scene is depicted in the movie A Beautiful Mind, a Hollywood dramatization of the book of the same name about the game theorist John Nash. The blonde woman is like a focal point in that everyone finds her naturally the most attractive. However, the fact that all of them want the blonde is a problem. If they all go for her, then at most one of them might succeed. Furthermore, the men will have ruined their chances for the night, as the brunettes will feel offended at being second choice. What is the group to do? In the film, John Nash proposes a scheme to the group on how they should cooperate. He suggests, what if everyone goes for a brunette to start? Then each person has a high chance of succeeding, and everyone ends up with a good option. So what do you think: is this a good plan? Let's analyze the situation strategically. Definition: Nash Equilibrium This is one of the most important concepts of game theory, so it will be useful to take a break to define it and consider some examples. A Nash equilibrium is a situation in which no person can improve his or her payoff, given what others are doing. To put it another way, given the choices that everyone else is making, you are making the best possible choice that you can—the formal term, if you recall from Hotelling's game, is that you are picking a best response.
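The definition can be made concrete with a small brute-force check. Below is a Python sketch of a toy, discretized version of Hotelling's game (the three-location setup and the one-customer-per-spot payoffs are my own simplification, not from the book): each stand picks one of three spots on a beach, customers walk to the nearest stand, and ties are split evenly. A profile is a Nash equilibrium when each stand's location is a best response to the other's.

```python
from itertools import product

LOCATIONS = (0, 1, 2)   # three possible spots along the beach
CUSTOMERS = (0, 1, 2)   # one customer standing at each spot

def payoffs(a, b):
    """Customers walk to the nearest stand; ties are split evenly."""
    pay_a = pay_b = 0.0
    for x in CUSTOMERS:
        if abs(x - a) < abs(x - b):
            pay_a += 1
        elif abs(x - a) > abs(x - b):
            pay_b += 1
        else:
            pay_a += 0.5
            pay_b += 0.5
    return pay_a, pay_b

def is_nash(a, b):
    """Neither stand can win more customers by moving unilaterally."""
    best_a = max(payoffs(alt, b)[0] for alt in LOCATIONS)
    best_b = max(payoffs(a, alt)[1] for alt in LOCATIONS)
    return payoffs(a, b) == (best_a, best_b)

equilibria = [(a, b) for a, b in product(LOCATIONS, repeat=2) if is_nash(a, b)]
print(equilibria)  # [(1, 1)] -- both stands in the center
```

Even in this tiny version, the only profile that survives the best-response check is both stands in the center, matching the Hotelling result recapped below.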
Let's recap the concept of Nash equilibrium with some of the examples already covered in the book. 1) In Hotelling's game, the two hot dog stands sought the most customers. We saw that, given where one hot dog stand was located, the other hot dog stand preferred to locate more centrally to win more customers. As each wishes to locate more centrally, they both end up preferring to be exactly in the center. At this point, neither could improve its profits, as that would mean locating non-centrally. That is why both locating in the center was a Nash equilibrium. 2) In the Bertrand Duopoly model, two firms compete solely on price. Each firm finds it profitable to undercut the other firm's price. The result is that each ends up picking the lowest possible price. At this point, neither firm can lower its price (as that would mean negative profits), nor can either firm raise its price (as that would lose all customers to the other firm). They are “stuck” in this outcome of a low price, and hence that is a Nash equilibrium. As seen in these examples, a Nash equilibrium is not necessarily socially optimal (as we argued before, it would be better if the hot dog stands spread out on the beach), nor is it the best for the players in the game (firms in a Bertrand Duopoly game do not wish their prices were so low). The point of a Nash equilibrium is that it is the result of competition when players take into account what others will do and how they can influence the game. Some Nash equilibria will seem fair while others will not. However, when the Nash equilibrium is an undesirable outcome, we will not be fatalistic and pessimistic. We will explore in Section II techniques to change the game, which can bring players out of traps and bring about coordination. For now, we will work on understanding the concept of a Nash equilibrium in more games. We must first understand why coordination failures happen and develop ideas on how best to fix them. Go For The Blonde?
With the definition in mind, let us solve for the Nash equilibrium of the bar scene. When four guys are competing for brunettes and a single blonde, is it a smart strategy for everyone to go for a brunette? We will answer this question by doing a small thought experiment. Let's suppose that everyone else in the group follows the plan, and your three friends agree to go for the brunettes. What is your best response? That is, what is the best thing you can do, given that they are going for the brunettes, and no one is going for the blonde? You have two choices in this situation: you can either go for a brunette or the blonde. If you go for a brunette, you have a good shot at getting her. But then a thought creeps into your mind. With your friends going for brunettes, you have no competition if you go for the blonde. You realize that you now have a good shot at the blonde, and you thus would prefer to go for the blonde. So the answer is clear: if your friends go for the brunettes, you should go for the blonde. Now the question is, is this a Nash equilibrium? What we have to check is that no single person can do better given what everyone else is doing. We already derived that you are playing a best response. It remains to check if your friends can do any better. Each of your friends has two choices: each can either stick with a brunette, or each can try talking to the blonde. But what happens if another person goes for the blonde? The stylized rules indicated that when two people go for the blonde—the friend and you—then both people would strike out. So clearly it is better for each of your friends to stick with their initial choice. Therefore, we can conclude that you going for the blonde, and each friend going for a brunette, is a Nash equilibrium. Given what everyone has chosen, no one can do better. (As a sidebar, the scene in the movie A Beautiful Mind does not actually depict a Nash equilibrium.
John Nash proposes they all go for the brunettes, but then he has a flash of an idea and leaves the bar.) This example also raises another point. In Hotelling's Game and the Bertrand Duopoly, each had a unique Nash equilibrium. In this bar game, as described, there are in fact several Nash equilibria. It was a Nash equilibrium when you went for the blonde, and everyone else went for the brunette. It is also a Nash equilibrium when exactly one person (not just you) goes for the blonde and everyone else goes for the brunette. The logic is symmetric to the Nash equilibrium we just derived: when exactly one person goes for the blonde, obviously that person is playing a best response; additionally, no other person can do better by switching from a brunette to the blonde. So what is the actual strategy in this game, given that there are several Nash equilibria? This is a bit tricky, but we can take a hint from what happens in real life. Very likely each player is going to try to convince the others to go for the brunettes first so that he can go for the blonde. Another complication is that in practical matters it will be hard to achieve the equilibrium where only one person goes for the blonde. There is going to be competition, and someone in the group will probably sabotage the mission. So there are several ways people use strategies outside of this stylized game. One strategy is to ignore the current group and wait for another group of blondes (the classic “wait and see” strategy). Another is to let a random group member go for the blonde as the others distract the brunettes (also practiced as “wingman theory”). But before taking the story too much to heart, remember this entire example treats the women extremely inaccurately and simplistically, as passive players who merely respond to what the men do. As we all know, dating is a game in which both sides can employ sophisticated strategies.
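Since the bar game is small, the claim about its equilibria can be verified by brute force. Here is a Python sketch; the numeric payoffs (3 for getting the blonde alone, 1 for a brunette, 0 for striking out) are my own assumptions, chosen only to respect the ordering implied by the stylized rules.

```python
from itertools import product

N = 4  # four friends
BLONDE_ALONE, GOT_BRUNETTE, STRUCK_OUT = 3, 1, 0  # assumed payoffs

def payoff(i, profile):
    """Payoff to player i under the story's stylized rules."""
    if profile[i] == "brunette":
        return GOT_BRUNETTE  # anyone who starts with a brunette succeeds
    # Going for the blonde only works if no one else does.
    return BLONDE_ALONE if profile.count("blonde") == 1 else STRUCK_OUT

def is_nash(profile):
    """No single player can gain by unilaterally switching."""
    for i in range(N):
        for alt in ("blonde", "brunette"):
            deviation = profile[:i] + (alt,) + profile[i + 1:]
            if payoff(i, deviation) > payoff(i, profile):
                return False
    return True

equilibria = [p for p in product(("blonde", "brunette"), repeat=N) if is_nash(p)]
print(len(equilibria))                                  # 4
print(all(p.count("blonde") == 1 for p in equilibria))  # True
```

The search confirms the analysis: the equilibria are exactly the four profiles in which a single man goes for the blonde, and the all-brunette plan from the movie is not among them.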
The point of this discussion is to remember that game theory is often about solving stylized and hypothetical games. The extent to which it can be applied depends on practical considerations. We will discuss much more of this in examples throughout this book. Should You Buy Used Products? Many financial articles praise used products, like "The Stuff I Never Buy Used" (Wisebread) or "Why First-Rate Folks Love Second-Hand Stuff" (MSN Money). At face value, this advice is good because used products are usually cheaper. But how useful is the advice actually? It is very, very important to think about advice critically in a game theory context. Recall that in the bar scene from A Beautiful Mind, the person who said "you should go for the brunette" really meant "I hope this sucker goes for the brunette so that I can go for the blonde." It turns out that the bar game might have an analogy in this situation of financial advice. Let's run through a thought experiment to see why. Imagine you and all the readers of those articles actually follow the advice to buy used products. It is possible that as a consequence there is an increased demand for used products and a decreased demand for new products. With enough demand, the price of used products would rise. Continuing with the logic, a sudden shift in demand would actually make used products relatively expensive, and consequently, new products would be relatively less expensive (both because demand for them dwindles and because those new products would have a better resale value, as buying used products becomes fashionable). In the extreme case, if the advice to buy used were widely adopted, it would actually make buying new the correct choice! Now comes the kicker. The first people to recognize that new products are a good deal would be personal finance experts. And once this trend is clear, the same advisers would turn around and write articles advising you to buy new products, starting the cycle once again!
The overall point is that friendly financial advice has implications. The advice might be slightly good for you, but it is often better for the adviser. This is why you should be skeptical of personal finance advice; it is the reason I find a lot of advice insulting. So much of it is the equivalent of a friend telling me to go for the brunette while they go for the blonde. The Prisoner's Dilemma The Prisoner's Dilemma is arguably the most famous example of game theory. People have written entire books about it, such as William Poundstone's Prisoner's Dilemma (1992). The game was developed at the RAND Corporation in the 1950s. While the game has a tragic implication, it helped popularize game theory because the story is captivating and it has many connections to common experience. The classic version of the Prisoner's Dilemma is set at a police station. Two suspects are being questioned for a crime. While the police are pretty sure the suspects are guilty, they lack physical evidence and need at least one confession for a strong conviction. The suspects are separated and interrogated in different rooms. The police do not use the usual tactics of bluffing to gain a confession. They instead tell each suspect that each will be rewarded or penalized based upon how each person acts. Here are the possible scenarios. If both suspects conceal information, then each will serve a 1 year sentence based on the minimal physical evidence. If both disclose information and confess, then both will be convicted and serve 3 years. If only one discloses information, then that suspect will be rewarded by being set free while the other partner will serve a heavier 4 year sentence as a penalty for not confessing. What will be the outcome of this situation? At first glance, the suspects seem to be in a powerful position. If both stay quiet, then the police have little evidence and they both serve a light sentence.
This is the best joint outcome, and they are completely in control of their fates. But will it happen if each is thinking strategically? At this point a diagram will help in illustrating the game. The table summarizes the four possible outcomes based on whether each suspect conceals or discloses information (as each suspect has two choices, there are four outcomes in all). The two numbers in each cell indicate the payoff for each suspect as an ordered pair: (jail term for suspect 1, jail term for suspect 2). The numbers are negative to indicate that a jail term is an undesirable outcome. This table is an example of a game theory matrix, and it will be used extensively throughout this book. So what will happen in this game? Recall that a Nash equilibrium is about each person playing a best response to the other person. In other words, the way to approach the decision is to consider the best thing to do in response to each choice the other person might make. In this game, each suspect has to consider what is in his best interest when the other person discloses or conceals information. Let's analyze each case by playing the role of a suspect. First, if the other suspect discloses information, what is the best thing for you to do? If you disclose information as well, then you face a sentence of 3 years. If, on the other hand, you conceal information, then you will face a heavy sentence of 4 years. Clearly it is a best response to disclose if you know your partner is disclosing information by confessing. Now, what is the best thing to do if your partner stays quiet and conceals information? In that case, you can either also conceal information, and face a 1 year sentence; or you can disclose information and cooperate with the police, in which case you will be set free. Again, it is better to disclose information. You come to the conclusion that you should disclose information, as it is better for you regardless of what your partner does.
In other words, you have concluded that disclosing information is a dominant strategy, and the lesson from before was that if you have a dominant strategy, then you should play it. So you go ahead and disclose information. As it turns out, your partner also went through the same logic and also concluded that disclosing was a dominant strategy. Both of you disclose information about the crime and end up serving 3 years each. The strange part is, if you had both simply stayed quiet (and ignored the dominant strategy), then you both would have only served 1 year each. In other words, by playing the best response, you both ended up in a worse outcome! That, in a nutshell, is the dilemma that the prisoners face. When each person thinks about individual interest, the result is a Nash equilibrium that is worse for everyone in the group. At this point a number of practical issues might jump into your mind. First, it is not true in general that suspects always confess. Partly this is because they may fear being called a snitch, and that comes with its own negative payoff (not modeled in this game). Second, if the suspects could communicate, then it might be possible that they both agree to stay quiet and avoid disclosing information. It is a key part of this game that the suspects are held separately exactly so they cannot communicate. Additionally, even if they agreed to work together, there is no contract that binds them to doing so. In a way, talk is cheap, and one of the suspects might say he will stay quiet but then secretly change his mind and backstab the other. The Prisoner's Dilemma is both tragic, in that the suspects could not cooperate, and delightful, in that the logic of each person was impeccable and the temptation to act selfishly was almost unavoidable. The next few sections will illustrate several applications in everyday situations where the Prisoner's Dilemma affects how people act.
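The best-response reasoning above can also be checked mechanically. Here is a Python sketch (the dictionary layout and function names are my own) that encodes the jail terms as negative payoffs and searches for a strategy that beats its alternative against every choice the partner could make:

```python
# Jail terms from the story, stored as (suspect 1 payoff, suspect 2 payoff).
PAYOFFS = {
    ("conceal", "conceal"):   (-1, -1),
    ("conceal", "disclose"):  (-4,  0),
    ("disclose", "conceal"):  ( 0, -4),
    ("disclose", "disclose"): (-3, -3),
}
CHOICES = ("conceal", "disclose")

def payoff(player, mine, other):
    """Payoff to `player` when he plays `mine` and his partner plays `other`."""
    pair = (mine, other) if player == 0 else (other, mine)
    return PAYOFFS[pair][player]

def dominant_strategy(player):
    """A strategy that is strictly best against everything the other might do."""
    for mine in CHOICES:
        alternatives = [alt for alt in CHOICES if alt != mine]
        if all(payoff(player, mine, o) > payoff(player, alt, o)
               for o in CHOICES for alt in alternatives):
            return mine
    return None

print(dominant_strategy(0), dominant_strategy(1))  # disclose disclose
# Yet (-3, -3) from mutual disclosure is worse for both than the
# (-1, -1) of mutual concealment. That is the dilemma.
```

Both suspects have "disclose" as a dominant strategy, even though mutual concealment would leave each of them better off.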
An Ancient Indian Proverb While the Prisoner's Dilemma was formulated in the 1950s, the conflict between individual and group interest has been well known since antiquity. There is an interesting example of the Prisoner's Dilemma in A Collection of Telugu Proverbs (Telugu is one of the languages spoken in India). The proverb must be hundreds of years old, as this book was published in 1868. The proverb is “Cheating with sand, cheating with cow dung,” and it refers to the following story. Two travelers met and exchanged goods concealed in opaque bundles. One trader was offering raw rice in exchange for the other's parboiled rice. They agreed to the trade, and immediately ran off in different directions. But in the end, each found himself outwitted by the other. One trader ended up with sand instead of rice, and the other ended up with cow dung instead of parboiled rice. This is a direct example of a Prisoner's Dilemma! To see why, consider the incentives of each trader. If you think the other trader will be honest, then it is more profitable to cheat and offer nothing than to honestly offer up your good. If you think the other trader will cheat you, then it is obvious you want to cheat in return, rather than giving up something for nothing. Each trader has a dominant strategy to cheat, and they do. But had they both played fairly, then they would have traded and actually ended up with the exchange they initially wanted. The story is a warning that any bilateral transaction could possibly degenerate into a Prisoner's Dilemma if the parties do not trust each other. It's a testament to modern society and finance that a variety of mechanisms (like reputation and punishment) can generate cooperation and trust. Left to fend for ourselves, we might never accomplish anything. As the Spanish proverb goes: “One trick is met by another.” Visiting The Doctor “So take these pills, get a blood report, and see me in a few weeks,” said my doctor. I was somewhat taken aback.
It was my first physical as an adult. I was healthy overall, but I was being advised to take a prescription drug as a precaution, one that possibly would be needed for the rest of my life. I was not sure what to do. Could I trust my doctor? Generally doctors can be trusted and are good people. However, patients may be skeptical about the necessity of drug prescriptions, tests, and surgeries, because American health care tends to pay for activity instead of health outcomes. The issue of trust can become a Prisoner's Dilemma. The Medical Consultation Game Consider an obese adult who requires but does not want medical attention. At the request of friends, he visits a doctor. The doctor has two choices when he meets with the patient. He can spend 5 minutes prescribing mildly effective medicines, or he can spend 15 minutes describing more effective lifestyle changes to diet and exercise (extra effort). The adult also has two choices when he meets the doctor. He can either choose to ignore the advice and get a second opinion, or he can follow the doctor's recommendation (extra effort). There are four possible outcomes from these choices. Both put in the extra effort: the doctor gives lifestyle advice and the patient complies. This is the best outcome for both. Only the doctor puts in the extra effort: the doctor gives lifestyle advice and the patient ignores it. Only the patient puts in the extra effort: the doctor gives medicine and the patient takes it. Neither one puts in the extra effort: the doctor gives medicine and the patient does not take it. The best outcome is that both put in the extra effort. But how might each think about the situation strategically? Unfortunately, the payoffs resemble the Prisoner's Dilemma, and both the doctor and the patient are more likely to avoid putting in the extra effort. Here is why. The patient thinks about whether to ignore the advice and possibly seek a second opinion.
If the doctor prescribes ineffective medicine, the patient is better off ignoring the advice. If the doctor gives the good advice on lifestyle, the patient is still tempted to ignore it, as the patient can always seek corroboration with a second opinion. The doctor's perspective is somewhat similar. Knowing the patient will ignore the advice, the doctor is better off dispensing medicine quickly, as that at least uses less of the doctor's time. Even suspecting the patient will follow the advice, the doctor is still tempted to prescribe medicine, as in general there is a belief that patients do not change their lifestyle but might take their prescriptions. In the end, it is a dominant strategy for neither party to put forth extra effort. The result is that doctors write prescriptions and patients do not take them or they get a second opinion. So in spite of the possible gains, the best outcome, where both cooperate, cannot be achieved. The bad news is this outcome is not too much of a stretch from what commonly happens in a medical consultation. The good news is that the situation is a model, and not all doctors and patients behave this way. Some doctors are great and will make the extra effort, as will some patients. There is also a potential gain if patients learn to trust their doctors, as that will make both more likely to put in the extra effort. The issue of trust is important, and even in the examination room the Prisoner's Dilemma influences outcomes. Source The above story was adapted from a 2004 article in Quality and Safety in Health Care called “Models of the medical consultation: opportunities and limitations of a game theory perspective” by C. Tarrant, T. Stokes, and A. M. Colman. Prisoner's Dilemma At The Casino My friend Jamie is a professional poker player, and he is used to analyzing problems strategically. Once while he was in Las Vegas, he noticed a bit of game theory in the entry rules. Jamie was registering for a tournament at Caesar's Palace.
The tournament rules specified a $65 entry fee that came with 2,500 chips. But there was an option to buy an additional 500 chips for $5 more. Should he buy the extra chips? Jamie did a quick mental calculation. The initial buy-in translated to 2.6 cents per chip, whereas the optional buy-in translated to getting chips at the rate of 1 cent per chip. This made the optional buy-in a no-brainer, and Jamie bought the chips. But there was a catch. The optional buy-in did not add to the prize pool of the tournament. The optional buy-in went directly to the casino. This detail made the game a multi-person Prisoner's Dilemma. Why is that? Think about the decision to buy extra chips. Each individual player thinks as follows: if the other players do not buy the extra chips, then it is certainly a good idea to buy the extra chips at a bargain price. Having extra chips at a poker table (being the “big stack”) can be a huge advantage in terms of betting power. What if the other players do buy the extra chips? Well, then it is again a no-brainer: the extra buy-in is needed just to stay even with the other players. In the end, every poker player finds it a dominant strategy to buy the extra chips. The problem is that all the extra money goes directly to the casino, not the prize pool. Thus, when everyone does the optional $5 buy-in, everyone starts out with the same size stack of 3,000 chips, but the prize pool remains the same. The net effect is everyone has paid $5 extra to compete for the same amount of prize pool money. This is a remarkable example of the Prisoner's Dilemma. Jamie bought the extra chips, and I have no doubt I would do the same. Jamie hears that most casinos in Las Vegas have a similar policy of optional buy-ins. And that would make sense, as it is a smart move for the casino. JC Penney Loses $163 Million JC Penney wanted to change the game. The large retailer felt customers were sick of complicated clearance sales and annoying pricing tricks.
In 2012, JC Penney launched a simplified pricing scheme with predictable low prices every day. They even used whole numbers, like selling a shirt for $7 instead of $6.99. The move was considered a big risk, and it turned out the gamble did not pay off. In 2011, JC Penney made $64 million in the first quarter. In 2012, just one year later, it lost $163 million in the first quarter. What went wrong? I think partly this can be understood as an example of the Prisoner's Dilemma. Honesty Versus Trickery There is nothing inherently wrong with JC Penney's pricing. In fact, I'd say honest pricing is a refreshing change from the standard nickel-and-dime tactics of most stores. The problem, however, was the context of its strategic move. The problem was that other companies did not adopt honest pricing. To illustrate why, consider the following game. Imagine two companies that can each choose to use “honest” pricing or “tricky” pricing. Suppose the game has the following characteristics: The marketplace is worth 100 units of profit. It costs money and resources (10 units) to play “tricky” pricing. If both play “honest” or both play “tricky,” each splits the market profits. If one company is “honest” and the other “tricky,” the tricky company gets nearly all the market (netting 80) while the honest company gets little (10). Here are the net payoffs to the game. One thing you will notice is this: if both companies play “honest,” the total value of the marketplace is 100. If they both play “tricky,” however, then each loses 10 units for the cost of constantly running promotions. So they both only get 40 units, and the market is only worth 80 units in all. The same deadweight loss happens when one company is “honest” and another is “tricky.” The “honest” company gets 10, but the “tricky” one gets a net 80, so the total marketplace is worth 90—with 10 units lost for the cost of being “tricky.” How does this game play out?
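The payoffs just listed can be tabulated and checked directly. This Python sketch (the encoding is my own) prints each profile's total market value and flags the profiles from which neither firm wants to deviate:

```python
# Net profits from the setup: a 100-unit market, with "tricky"
# pricing costing its firm 10 units to run.
PAYOFFS = {
    ("honest", "honest"): (50, 50),
    ("honest", "tricky"): (10, 80),
    ("tricky", "honest"): (80, 10),
    ("tricky", "tricky"): (40, 40),
}
CHOICES = ("honest", "tricky")

for (s1, s2), (p1, p2) in PAYOFFS.items():
    # A profile is stable if neither firm gains by switching alone.
    gain1 = any(PAYOFFS[(alt, s2)][0] > p1 for alt in CHOICES)
    gain2 = any(PAYOFFS[(s1, alt)][1] > p2 for alt in CHOICES)
    stable = "  <-- equilibrium" if not (gain1 or gain2) else ""
    print(s1, s2, "| market value:", p1 + p2, stable)
```

Only the all-tricky profile survives the deviation check, even though it has the lowest total market value (80 versus 100 when both firms price honestly).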
The Prisoner's Dilemma Of Honest Pricing In theory, both companies could realize the destructive nature of running “tricky” pricing: they each have to waste 10 units to run the promotions, which are a net waste for society. If both played the “honest” strategy, then each would get 50. But what is the strategy to the game? Here is how a company might think. If the other company plays “honest,” then I can either get 50 units for playing “honest,” or I can get 80 units for playing “tricky.” Clearly it is better to play “tricky.” If the other company plays “tricky,” then I can either get 10 units for playing “honest,” or I can get 40 units for playing “tricky.” Again, it is better to play “tricky.” The conclusion is clear: it is best to play a “tricky” pricing strategy in this game, regardless of what the other company does! A nice article from MSNBC quotes behavioral economist Xavier Gabaix on why fair pricing was not a good idea: “Once you educate consumers on the right way to shop, they will seek out the lowest cost store, and that will be the one with the shrouded prices. Once they are savvier consumers, you make less money from them.” This problem is analogous to the Prisoner's Dilemma: both companies could benefit if they played honestly, but instead they are tempted to discount and steal customers from the other company. The result is an equilibrium of discounting and lots of time spent shopping for discounts. And that is probably a bad thing for everyone, JC Penney and consumers alike. Source Sullivan, Bob. “Nbc News Technology.” NBC News. 25 May 2012. Web. http://www.nbcnews.com/business/consumer/fairsquarepricingthatllneverworkjcpenneywebeingf794530. Unbranding So far we have been discussing the Prisoner's Dilemma and how it is bad for the players. This example will be a twist in which there is a beneficiary from the incentives.
A celebrity endorsement is usually a win-win: the company increases visibility for its product, and the celebrity gets paid in compensation. But things get trickier when the celebrity has a controversial or trashy image: the company may want to distance itself, but it does not wish to alienate the celebrity's fans either. Jersey Shore was a short-lived reality TV show on MTV from 2009 to 2012. The premise of the show was to have Americans of Italian heritage party in the summertime at the Jersey Shore. The show was tremendously popular, and it received record ratings for MTV. The partying on the show was wild, and often members of the cast were shown partying too hard. One example was Nicole Polizzi, nicknamed “Snooki,” who was often shown vomiting in designer handbags like Gucci and Coach. Evidently the handbag companies were worried the negative publicity might tarnish their reputations. And so they started playing a little game, by means of “unbranding.” Snooki was allegedly receiving handbags from both Gucci and Coach. But NBC Philadelphia reported there was a twist in this paid celebrity endorsement: “The kicker: Coach is not sending [Snooki] Coach bags. They're sending her Gucci bags, and any other competing designer product they can...” And yet, it is funny how each company was fighting by trying to tarnish the competitor. They were probably thinking along the lines of “the enemy of my enemy is my friend.” But on closer analysis, the game is not good for the companies. In fact, this type of brand warfare is a Prisoner's Dilemma. Why is that? Think about the game between Coach and Gucci. Each company has the choice of sending nothing, or spending money to hurt the competitor. How will each play the game? It is pretty easy to see the dominant strategy is for each to try and backstab the other. If the other company is backstabbing you by sending Snooki your handbags, you will retaliate by sending her theirs.
And if the other company does nothing, all the more reason to try to unbrand your company by sending Snooki a competitor's handbag. In the end, both companies end up sending Snooki a designer handbag of the other company. But because they both do this, both of their images get hurt anyway, and they have spent money to participate. In spite of the seemingly clever strategies, the companies are the clear losers of this unbranding game. And the winner of the game is...Snooki! It is not often that a negative image leads to getting free designer handbags. But that is the power of the Prisoner's Dilemma: if you can get others to play it, you can end up a winner. Source Masterson, Teresa. “No Backsies! Designers Unload Competitors' Swag on Snooki.” NBC 10 Philadelphia. NBCUniversal Media, 20 Aug. 2010. Web. http://www.nbcphiladelphia.com/news/local/NoBacksiesDesignersUnloadCompetitorsSwagonSnooki101166409.html. The Competitive Edge I'm going to shift gears from the TV show Jersey Shore to the esteemed periodical The Economist (I bet no one has written that sentence before!). But there is a connection between Snooki's accidental winning strategy and The Economist's subscription promotion strategy, as the Prisoner's Dilemma applies to both. I like reading The Economist, and occasionally the magazine will contain promotional inserts to encourage gift subscriptions for students. The ad copy tends to focus on how the magazine offers “world-class insights” that “inform and inspire.” But one of the ads I saw was a bit more interesting from a strategic perspective. The text of the ad is “Give the student the competitive edge with a gift subscription to The Economist.” It was the specific phrase “competitive edge” that caught my attention. It struck me as a clever way for The Economist to frame the game of subscribing as a Prisoner's Dilemma, with the magazine as the winner. Let's think about the game to understand why. What Is Good Trivia?
Consider the question, “What is the first song that played on MTV?” As far as trivia goes, this has all the makings of a good question. It's about an interesting moment in TV history, and the answer is fitting: “Video Killed the Radio Star.” But there's just one glaring issue with this potential question: everyone knows the answer, and that makes it fairly useless trivia. People who smugly bring up the fact are not seen as smart; they are quietly mocked for not realizing that everyone else already knows it.

The point of this story is that sometimes knowledge can be a competitive good. To have an edge and win in trivia, you can't just answer things everyone else knows. You have to know something and hope that other people don't know it. And that brings us back to the claim made by The Economist.

The Prisoner's Dilemma Of Reading

What is the benefit of subscribing to The Economist? One perspective is to consider individual gain. In that decision, you only need to judge whether the magazine provides value in excess of its subscription cost. But you can also evaluate the decision in terms of gaining a competitive edge. In that case, you have to think about the decisions of other people. That is where game theory comes into play.

Consider the game where you choose to “read” or “not read” the magazine. Other people are making the same decision. The payoff will depend on the actions that both parties choose. Here are the possible scenarios that can take place:

Neither party reads and they stay in the status quo (a payoff of 0 to each).

One party reads and the other doesn't. The party that reads gets a competitive edge (a payoff of +1) and the party that doesn't is at a competitive disadvantage (a payoff of −2).

Both parties read. In this case there is no “edge” since both parties learn the same knowledge. Furthermore, both parties incur the costs of the magazine subscription fee and the time expended to read.
The net result is that both parties are worse off than if neither had read (a payoff of −1).

Here is a game matrix of the payouts (each cell lists the payoffs as: you, other party):

                     Other doesn't read     Other reads
  You don't read          0, 0                −2, +1
  You read               +1, −2               −1, −1

What is the strategy in this game? Imagine how each party thinks. If the other party doesn't read the magazine, then it is better to read (+1) than to not read and stay in the status quo (0). On the other hand, if the other party does read the magazine, then it becomes a game of not being left behind. You would rather expend the energy and effort to read the issue (−1) than be lazy and fall into a competitive disadvantage (−2). Both parties see that reading is a dominant strategy, and they both end up subscribing to The Economist. In the end, both are worse off than the status quo, as they spend money and time just to make sure the other does not gain a competitive edge. The winner of the game is The Economist, which gained a couple of new subscriptions!

Caveats

The analysis above is strictly limited to the question of whether reading a highly circulated magazine could provide a competitive advantage. There are, of course, other ways to read the news, for free or at a low cost, which have definitely hurt magazine subscriptions. And just because someone subscribes to a magazine does not mean the person reads it or understands it. It is possible for both parties to gain as well, if one person benefits from the political articles while another person benefits from the business articles. Finally, knowledge can be mutually beneficial too. It's a good thing when society becomes more educated and people think more critically. Plus, on a personal note, it's fun to talk to people who read interesting books or magazines. The students might lose the Prisoner's Dilemma of gaining a competitive edge, but it seems they, and society, can win in the long run.

Southwest Airlines Makes $144 Million

A pattern is emerging from the past few examples.
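The pattern can be checked mechanically. Below is a minimal Python sketch of the reading game, using the payoffs given above; the strategy labels and the helper function are my own illustrative names, not from the book. It confirms that “read” is a best response to either move, which is exactly what makes it a dominant strategy.

```python
# Dominant-strategy check for the "reading game" Prisoner's Dilemma.
# Payoffs come from the text, written as (row player, column player).
# Strategy names and best_response() are illustrative, not from the book.

PAYOFFS = {
    ("dont_read", "dont_read"): (0, 0),    # status quo
    ("read", "dont_read"):      (1, -2),   # reader gains the edge
    ("dont_read", "read"):      (-2, 1),   # non-reader falls behind
    ("read", "read"):           (-1, -1),  # both pay the cost, no edge
}

STRATEGIES = ("read", "dont_read")

def best_response(opponent_move):
    """Return the row player's payoff-maximizing reply to a fixed opponent move."""
    return max(STRATEGIES, key=lambda s: PAYOFFS[(s, opponent_move)][0])

# "read" is the best response no matter what the other party does,
# so it is dominant, yet (read, read) pays worse than (dont_read, dont_read).
for move in STRATEGIES:
    print(move, "->", best_response(move))
```

Running the loop shows both best responses are “read,” even though mutual reading (−1 each) is worse than the status quo (0 each). The same four-cell check applies to the Coach/Gucci unbranding game and, as we will see, to Southwest's boarding game.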
My friend Jamie and the retailer JC Penney both lost money when participating in a Prisoner's Dilemma. It is generally bad to be a player in a Prisoner's Dilemma, as selfish motives, and dominant strategies, lead to a destructive outcome for the group. On the other hand, Snooki accidentally engaged Coach and Gucci in a Prisoner's Dilemma and she benefited tremendously. Similarly, The Economist's advertising entices new readers to play a Prisoner's Dilemma, hoping to increase subscriptions.

The general lesson is that you do not want to play the Prisoner's Dilemma. What you want to do is create a situation where others have to play it and you can profit. The players' loss can be your gain. This is a very important business lesson that will be illustrated in this section.

Southwest Airlines is unique in its boarding process. Rather than providing assigned seats, Southwest has a policy of “open seating.” This means that during the boarding process travelers are free to sit in any available seats. As an aside, people have a love/hate relationship with open seating. The bad part is that groups and families cannot reserve seats and might get split up. The good part is that open seating is much faster than assigned seating. Shorter times at the gate save Southwest money, and that indirectly keeps airfares low. The cost savings are a major reason Southwest has employed open seating for its entire 42-year history.

Since 2007, however, there have been a couple of notable changes. One change, in particular, translated into revenue of $98 million in 2010 and $144 million in 2011. And, as I'll explain below, the revenue is a consequence of pitting customers against each other in a Prisoner's Dilemma.

Recent Changes To Open Seating

While Southwest does not assign individual seats, it does have an organized procedure for how passengers board the aircraft. Prior to 2007, passengers generally boarded on a first-come, first-served basis.
That is, travelers who arrived at the gate earliest could board first and pick the most favorable seats. This led to passengers “camping out” at the gate to secure a good boarding position, and it was accompanied by minor arguments as people tried to save seats or cut in line.

In 2007 Southwest decided to end this “cattle call” process. The new process assigned each traveler a boarding group (A, B, or C) and a boarding number. Travelers in group A went first, then B, and then C. Within each group, a traveler with a lower boarding number got on the airplane first. Travelers could still choose seats once on board the airplane, but there was much less chaos at the gate because people could line up in a specified order.

How were travelers assigned a boarding number? The number was based on the time at which the traveler checked in for the flight. Someone who checked in at the earliest time, 24 hours in advance of the flight, could secure a favorable boarding assignment. A traveler who forgot to check in online was often doomed and would have to wait at the end of the line.

The boarding assignment changed the game in a very interesting way. Instead of rewarding passengers who waited at the gate the longest (people who did not value their time), the boarding assignment generally rewarded people who could check in online in advance (people who were organized and generally well-off). A secondary market sprang up to capitalize on the technology of online boarding, with some websites offering to automatically check a traveler in at the earliest time, 24 hours before a flight. The service was reliable and cost $1, a very appealing offer for the busy business traveler.

These third-party websites were tolerated until Southwest shut them down and decided that it should be the one profiting. In 2009, Southwest unveiled the most recent change to open boarding, called EarlyBird Check-In. It was this option that led to millions of dollars in extra fees.
The EarlyBird Prisoner's Dilemma

Southwest explained EarlyBird Check-In in a press release: “Don't race. We'll save your place! Southwest is proud to announce its newest product, EarlyBird Check-in, which gives Customers the option to score an early boarding position by adding an additional $10 to the price of a one-way fare. The low-cost service automatically reserves a boarding position for Customers prior to general check-in...”

Instead of playing the seating lottery, customers who were willing to pay could secure priority seating for $10 per one-way flight. The nuance was that EarlyBird Check-In did not guarantee a good seat. It only meant the airline checked that traveler in automatically. In many travel columns, people wondered if the service provided a good value. But on a la