The point of the ultimatum game is that it is explicitly set up as a one-shot experiment (as you point out, an iterated version would have a completely different Nash equilibrium). If players are really predicating their decisions on something that they know to be false, then I would have no problem with labelling that irrational.
However, while the experiment ends after one iteration, the participants continue to exist. The person who accepts a lowball offer will have to live with their decision. They may go home and yell at the kids, kick the dog or generally make the world a slightly worse place in any number of ways. The tools provided by economics have no way of dealing with this, so economics models what it can rather than what is.
You could argue that we should consider the process instead—a person is rational if they explicitly weigh these factors when making a decision, and irrational if they simply rely on heuristics that have served them well in other situations. This does not help us, though, because the difference is not observable. In any case, what is a rational weighting of these competing demands?
Here is my real point: we may redefine ‘rationality’ in any way we choose, but who decides which heuristics are ‘rational’ and which are not? Is it rational, in this definition, to offer a 50:50 split if the voices in your head tell you to? I would argue yes—hearing voices makes you crazy; listening to them seems perfectly sensible. If God speaks to you, you will do what He says: that’s just common sense. But I bet I could find somebody who disagrees, and therein lies the rub. The definition is arbitrary, varies from person to person and from time to time, and is therefore the legitimate subject of political debate.
So we may fix the definition of rationality and have its role in society determined by politics, as I argue in the essay. Alternatively, we may fix its role in society and have the definition determined by politics. To allow one group to determine both is dictatorship.
Nick Butcher writes:
Good stuff. Two comments, which I’ll post separately so each can be replied to on its own, as they’re on two different topics. The first: rationality in ‘ultimatum’ games.
I think the alleged “irrationality” of the typical person’s response in these games is unjustly assigned. As you say, the usual representation is that “people act irrationally from a perspective of pure commercial self-interest, therefore they must be considering other factors”. I think this is a flawed interpretation, as a rational person knows that their actions are not perceived in isolation. A purely rational person has a strong incentive to make sure they get a “fair deal”, because a rational person knows that this deal will influence their positioning in future deals. By fostering behaviours in others that promote fair trading (i.e. if it’s not at least close to fair then no-one gets anything) they increase the likelihood of getting fair deals in future, and hence their overall return. Even if they have to refuse 19 people offering them 98/2 splits to get the message home, they recover it all with the next person who offers them 60/40.
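That break-even arithmetic checks out. A quick sanity check (a sketch only; the $100 pot is my assumption, implied by the 98/2 and 60/40 figures):

```python
# Sketch of the reputation-building break-even claim, assuming a $100 pot.
LOWBALL = 2     # responder's share of a 98/2 split
FAIRER = 40     # responder's share of a 60/40 split
REFUSALS = 19   # lowball offers turned down "to get the message home"

forgone = REFUSALS * LOWBALL   # 19 * $2 = $38 given up by refusing
gained = FAIRER - LOWBALL      # $40 - $2 = $38 extra on the next deal

print(f"forgone by refusing: ${forgone}")  # $38
print(f"gained on next deal: ${gained}")   # $38
# The strategy breaks even at exactly 19 refusals; with any fewer,
# the reputation-builder comes out ahead.
```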
I think it’s also useful to consider the stakes. People will only adopt the above “rational” behaviour if the event frequency is high enough that they can play the averages game. The average person would probably say “no” to a $99/$1 split… but I think the average person would pause and think very hard before saying “no” to a $990Mil/$10Mil split. In fact, I think the average person would say yes: despite getting ripped off, it’s probably the only chance they’ll ever have to be ripped off for such a strong net reward.
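One way to put rough numbers on that intuition: under a concave utility of wealth, refusing $1 is nearly free while refusing $10 million is not (a sketch only; the log utility and $50,000 starting wealth are illustrative assumptions, not anything stated above):

```python
# Sketch: why the stakes matter, assuming log utility of wealth
# and a starting wealth of $50,000 (both purely illustrative).
import math

WEALTH = 50_000

def cost_of_rejecting(offer):
    """Utility forgone by turning the offer down."""
    return math.log(WEALTH + offer) - math.log(WEALTH)

print(cost_of_rejecting(1))           # ~0.00002 -- trivially cheap to refuse
print(cost_of_rejecting(10_000_000))  # ~5.3 -- life-changing; very hard to refuse
```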
So, both cases actually are rational. Usually “irrational” behaviour is either attributed by observers who haven’t considered the appropriate externalities… or observed in actors who aren’t considering them.
Hence my proposal to my old man, an economic consultant for ~30 years (who was defending the “people always act in their own best interests” stance), for a revision of the rule to: “People act in their own best interests to the best of their ability”. An ability which, with the complexity of most systems these days, is pretty poor.
Should be an app for that…