Economists who buy lottery tickets: Reflections on Less Wrong
Faithful reader Robert asked me to respond to an article by Eliezer Yudkowsky of “Less Wrong” called “Outside the Laboratory.” The article deals with the rationality (and irrationality) of scientists. Robert Gressis added that he too would like to see my response, though “it seems like a massive undertaking.”
Always up for a challenge, I read through the article and then began writing a response. However, I found myself getting hung up on a point only a third of the way through the article. That point concerned whether economists can rationally buy lottery tickets. As we will see, this simple example requires some unpacking before we consider the rest of the article in a subsequent post.
“‘Outside the laboratory, scientists are no wiser than anyone else.’ Sometimes this proverb is spoken by scientists, humbly, sadly, to remind themselves of their own fallibility. Sometimes this proverb is said for rather less praiseworthy reasons, to devalue unwanted expert advice. Is the proverb true? Probably not in an absolute sense. It seems much too pessimistic to say that scientists are literally no wiser than average, that there is literally zero correlation.
“But the proverb does appear true to some degree, and I propose that we should be very disturbed by this fact.”
The reader’s interest is piqued. What might this less-than-rational stance of a scientifically minded individual look like? Yudkowsky then provides the following example, and it is here that we shall focus:
“Now suppose we discover that a Ph.D. economist buys a lottery ticket every week. We have to ask ourselves: Does this person really understand expected utility, on a gut level? Or have they just been trained to perform certain algebra tricks?”
Yudkowsky seems to assume that there is some inconsistency between the purchase of a lottery ticket and knowledge of expected utility. But what precisely is that assumption? Let’s try formulating “Yudkowsky’s Assumption”:
(YA) “If an individual really understands expected utility s/he will not buy a lottery ticket.”
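The intuition behind (YA) rests on a standard calculation: for a typical lottery, the expected monetary value of a ticket is well below its price. A minimal sketch (the odds and prize figures here are hypothetical, chosen only for illustration):

```python
# Expected monetary value of a hypothetical lottery ticket.
# All figures are illustrative, not drawn from any real lottery.
ticket_price = 2.00          # dollars
jackpot = 10_000_000.00      # dollars
p_win = 1 / 14_000_000       # odds of matching all numbers

expected_winnings = p_win * jackpot          # ignore minor prizes for simplicity
net_expected_value = expected_winnings - ticket_price

print(f"Expected winnings:  ${expected_winnings:.2f}")
print(f"Net expected value: ${net_expected_value:.2f}")
```

On these numbers each ticket loses roughly $1.29 in expectation, which is the sense in which the purchase looks irrational to someone who “really understands expected utility.”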
I think it is quite clear that (YA) is not in fact Yudkowsky’s assumption because his article is motivated by the very problem that some scientists hold prima facie irrational beliefs and engage in prima facie irrational behavior precisely like that described in the economist example.
Let’s try a second version of the assumption:
(YA-2) “If an individual really understands expected utility and is psychologically integrated and rationally consistent s/he will not buy a lottery ticket.”
This formulation allows for the economist who understands expected utility and still purchases a lottery ticket due to a lack of psychological integration or other irrational forces.
Unfortunately, (YA-2) is false. You see, it is very easy to think of cases where an economist understands expected utility, is psychologically integrated, and still rationally purchases a lottery ticket. For example, that economist may have an ailing mother in the retirement home who has asked him to purchase a ticket for her. The economist recognizes that the anticipation of the draw provides much-needed excitement for his mom’s otherwise dreary life, and so he willingly purchases the ticket.
Let’s try another formulation of Yudkowsky’s assumption to see if we can make it come out true.
(YA-3) “If an individual really understands expected utility and is psychologically integrated and rationally consistent s/he will not buy a lottery ticket with the expectation that s/he has a reasonable chance of winning.”
(YA-3) allows for the purchase of a lottery ticket so long as the economist does not believe that ticket has a reasonable chance of winning. While this might seem to provide a true principle, unfortunately one can readily envision situations where (YA-3) is false. And here I won’t bother to mention cases where, for example, only ten tickets were sold. Instead I’m assuming that we’re dealing with a typical lottery where millions of tickets have been purchased. Even in that case (YA-3) is not true. For example, consider the following scenario:
Don the economist’s old high school friend Mack is now President and CEO of the Lottery Corporation. One day Don sees Mack at a coffee shop. Don is unloading on Mack the need to secure capital to invest in his micro-lending agency which is working in Bangladesh. Suddenly Mack chuckles and replies with a twinkle in his eye: “Don’t worry about it, Don. Just make sure you purchase a ticket for the ‘Lucky 9’ draw this Thursday.” Don replies incredulously: “Mack, I’m an economist. I understand expected utility.” Mack’s trademark mischievous grin flashes across his face. “Trust me, Don. Just buy your ticket for the ‘Lucky 9’ draw.”
Based on this exchange would it be rational for Don to purchase a lottery ticket? Assuming that Don has good reason to trust Mack, it would indeed be fully rational for Don to purchase the ticket. But this means (YA-3) too is false.
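In Bayesian terms, Mack’s tip raises Don’s subjective probability of winning far above the baseline odds, and even a modest increase in credence can flip the expected value of the ticket from negative to positive. A toy calculation, with all figures hypothetical:

```python
# How an insider tip changes the expected-value calculation.
# All figures are hypothetical, chosen only for illustration.
ticket_price = 2.00
jackpot = 10_000_000.00

p_baseline = 1 / 14_000_000   # ordinary odds of winning
p_after_tip = 0.5             # Don's credence that Mack has arranged a win

ev_baseline = p_baseline * jackpot - ticket_price   # negative: the usual case
ev_after_tip = p_after_tip * jackpot - ticket_price # enormously positive

print(f"EV at baseline odds: ${ev_baseline:,.2f}")
print(f"EV given Mack's tip: ${ev_after_tip:,.2f}")
```

The point is not the particular numbers but the structure: expected utility is computed over the agent’s own credences, so new evidence (like a trusted insider’s assurance) can make the very same purchase rational.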
So now we come to a fourth and final revision:
(YA-4) “If an individual really understands expected utility and is psychologically integrated and rationally consistent, s/he will not buy a lottery ticket with the expectation that s/he has a reasonable chance of winning, unless s/he has some overriding reason to believe it is likely that s/he will win the lottery.”
Finally, we have arrived at a principle which seems likely to be true. Of course, I have no idea whether (YA-4) was in fact Yudkowsky’s assumption or not. But this little exercise in finding a defensible assumption does have an important payoff. You see, it reminds us that it is actually more difficult than we might think to identify a belief or action as irrational. A person may hold a belief or engage in an action which strikes us on the surface as irrational or otherwise inconsistent, but which may be fully rational given all the knowledge, reasonable beliefs, and experiences to which that individual has access. We will have to keep this in mind as we consider the rest of Yudkowsky’s article.