Monday, September 23, 2013

Context and Decision Making


The following short clip from The Wire, a drama about the Baltimore drug trade, perfectly illustrates the role of context in decision making. Wallace, a boy employed on one of the lowest rungs of the drug trade, is woken by one of his younger siblings for help with her maths homework. It's a simple problem about the number of passengers on a bus. Nonetheless, the girl struggles. (Link to clip here). 

At 1:30 in the video, Wallace reframes the problem into a context that is much more relevant to their world: the mathematics of working as a low-level drug runner. The girl, previously unable to “do the math”, answers him instantly. The scene might be intended as a portrayal of the world they live in (poor formal education from the schooling system, and a shockingly early introduction to the illegal industry that surrounds them), but it also serves as a great example of how context can greatly affect our ability to reason correctly about the world.

Some psychologists, who see the human mind as being inherently limited and often unable to cope with the demands of the world around us, tend to underplay the importance of context in decision making. The Nobel prizewinner Daniel Kahneman, for example, as one of the leaders of the “heuristics and biases” programme of research in psychology and economics, has done a lot of work on situations where average responses to certain problems are systematically wrong. Although these problems are usually more complicated than the examples in the clip, they nonetheless also have a “correct” answer which people predictably stray from. Predictable errors are known as biases, and the common explanation for these mistakes is that people use “heuristics”: shortcuts in decision making which, although not intended to yield a perfect answer, are often required to simplify the computational burden that these problems impose. Kahneman has recently written a non-technical book about his life’s work called Thinking, Fast and Slow.

The Wason selection task is one problem which has stimulated much debate in the psychological community due to the errors in reasoning that many people show. There are four cards in the task. Each has a number on one side, and a letter on the other. The cards are displayed with only one side showing, as follows:

A K 4 7

People are then asked to turn over only the cards necessary to test the proposition that “if there is a vowel on one side, then there is an even number on the other”. Have a think about which cards you would turn over before proceeding.

It should be immediately obvious that the A card requires inspection, and the vast majority of people do turn this card over in experiments. But are there any other cards that should be turned over? Many people also turn over the 4 card. This seems to make sense: the question does, after all, mention vowels and even numbers. However, there is no way that turning this card over can disprove the hypothesis. If the card is turned to display a vowel, this appears to be evidence in favour of the hypothesis; but turning the 4 over to show a consonant tells you nothing, because the statement begins with “if there is a vowel” and says nothing about what sits behind a consonant. Now consider the 7 card, which is infrequently selected in this experiment. Finding a consonant tells you nothing on its own, though together with the result from the A card it is enough to test the rule. But what if you find a vowel? In that case the hypothesis is false, and the question is settled.
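To make the logic concrete, here is a minimal sketch (my own illustration, not part of the original task) that checks each visible face against the rule “if there is a vowel on one side, then there is an even number on the other”: a card is worth turning over only if its hidden face could falsify the rule.

```python
# Which cards could possibly falsify "if vowel on one side, then even number on the other"?
VOWELS = set("AEIOU")

def worth_turning(visible_face):
    """A card can falsify the rule only if the visible face is a vowel
    (the hidden number might be odd) or an odd number (the hidden letter
    might be a vowel)."""
    if visible_face in VOWELS:
        return True          # hidden side could be an odd number
    if visible_face.isdigit() and int(visible_face) % 2 == 1:
        return True          # hidden side could be a vowel
    return False             # consonants and even numbers can never break the rule

cards = ["A", "K", "4", "7"]
print([card for card in cards if worth_turning(card)])  # -> ['A', '7']
```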

Few people reason this way (often fewer than 5%). So in that respect, the heuristics and biases programme is correct: human reasoning is flawed, at least when it comes to problems like this. But why do people make these errors? It has been shown that this is partly due to the problem's abstract content. It is a pure test of logical deductive reasoning, and there is little knowledge from the outside world that can be brought to bear on it (except for textbooks on logic). And just like in the clip from The Wire, transforming the problem into a more familiar context leads to a huge improvement in performance.

The following, known as the drinking age problem, is logically equivalent to the abstract Wason selection task. Again, there are four cards, each representing a person: on one side is what they are drinking (either alcoholic or not), and on the other is their age (either over 18 or not). The rule is, “if someone is drinking alcohol, then they must be over 18”. The four cards are shown as follows:

drinking beer   |   drinking coke   |   25   |   16

In this context the correct solution jumps out from the page at you, even though the problem is logically identical to the abstract task. The beer drinker must be checked (card A in the problem above). The coke drinker can be skipped (K). The 25-year-old can obviously be skipped (4); interestingly, this is the card that causes so many problems in the abstract task. The rule is asymmetric in both problems, and can be presented in the general form: if P, then Q. If there is a vowel on one side, there is an even number on the other. If someone is drinking alcohol, then they must be over 18. In the abstract format, many people check Q, even though this can never tell you anything about the validity of the rule. Not looking at Q is obvious in the drinking age problem: of course you don't have to check the 25-year-old! Furthermore, it is also obvious in the drinking age problem that the 16-year-old's drink should be checked (7), which corresponds to not-Q. This is because if a P is found on the other side (someone drinking alcohol, or a vowel), the rule is invalidated. In the abstract task people look for evidence that confirms the rule; in the drinking age problem people correctly look for information which might disprove it.
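The same check can be written once for the general “if P, then Q” form and pointed at either framing. The sketch below (again my own illustration, with the card labels taken from the text above) picks out the P and not-Q cards for the drinking age version; fed the abstract cards instead, the same function returns A and 7.

```python
# The shared "if P, then Q" structure: the only cards worth turning are those
# showing P (the hidden side might be not-Q) and those showing not-Q (the hidden
# side might be P). Cards showing not-P or Q can never falsify the rule.

def cards_to_turn(cards, shows_p, shows_not_q):
    """Return the visible faces that could conceivably falsify "if P, then Q"."""
    return [card for card in cards if shows_p(card) or shows_not_q(card)]

# Drinking age version: P = drinking alcohol, Q = over 18.
print(cards_to_turn(
    ["drinking beer", "drinking coke", 25, 16],
    shows_p=lambda c: c == "drinking beer",
    shows_not_q=lambda c: isinstance(c, int) and c < 18,
))  # -> ['drinking beer', 16]

# The abstract version is the same call with P = "is a vowel" and
# Q = "is an even number", and it returns ['A', '7'].
```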

Simplifying a problem by adding a familiar context is just one example of a broader approach to improving judgements. The heuristics and biases programme focuses on situations where heuristics (decision-making shortcuts) lead to errors. The programme of ecological rationality looks at situations where heuristics lead to accurate decisions despite their simplicity. When it comes to the drinking age problem, it is argued that we have developed a set of heuristics for navigating the social world and enforcing social norms (such as the legal drinking age) against cheaters. (This paper by Cosmides and Tooby is the essential reference for performance on the abstract Wason selection task and various context-laden versions.)

Interestingly, this strand of research has been largely left on the riverbed by the current of research in psychology, economics, and the other behavioural sciences aiming to improve real-world outcomes through policy changes and other “nudges”. It is the heuristics and biases strand that behavioural economists draw on almost exclusively when adding psychological realism to their models, and it is also the strand that has been picked up most by the media and popular press. The irony is that while heuristics and biases describe mistakes, a number of ecological rationality effects have shown robust ways of decreasing errors. In my own opinion, this paper by Gerd Gigerenzer should be read much more widely due to its survey of how a number of famous cognitive illusions can be reduced.

The positive result for the field is that there are a number of psychological effects that could be added to the behavioural scientist's toolbox of nudges. Many researchers are running randomised controlled trials in the field, often drawing from a small set of techniques such as default options or plain financial incentives. But we should also consider following Wallace's strategy from The Wire: seeking to add real-world context to novel and unusual decision problems.

Take the world of retirement saving. Trends in the industry are pushing individuals to make a number of their own investment decisions, either alone or with a financial adviser. This is a truly novel and difficult problem for many of us: financial theory is mathematically nuanced. We should focus on creating an optimised portfolio of many different assets to minimise risk for a given return; this is called diversification. Many people diversify completely naively, putting an equal amount into each asset. In contrast, optimal diversification depends on the correlation between assets: risk reduction is greatest when assets are very different and hence display low correlations. Smart investors should therefore diversify broadly between different asset classes such as stocks, cash and property. Within a single asset class, however, such as property, we should optimise (by buying the best house possible). This is because there is much less diversification benefit from buying two houses (especially if they are in the same area) than there is from buying one house and owning a portfolio of stocks.
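A rough numerical sketch makes the point about correlation. The figures below are my own and purely illustrative, using the standard two-asset volatility formula: a 50/50 split between two assets barely reduces risk when they move together (like two houses in the same area), but cuts it substantially when they are weakly correlated.

```python
# Illustrative two-asset diversification: portfolio volatility falls as the
# correlation between the assets falls.
from math import sqrt

def portfolio_volatility(w1, w2, vol1, vol2, correlation):
    """Two-asset portfolio volatility: sqrt(w1^2*s1^2 + w2^2*s2^2 + 2*w1*w2*rho*s1*s2)."""
    variance = (w1 * vol1) ** 2 + (w2 * vol2) ** 2 + 2 * w1 * w2 * correlation * vol1 * vol2
    return sqrt(variance)

vol = 0.20  # assume each asset has 20% annual volatility
for correlation in (1.0, 0.9, 0.3, 0.0):
    print(f"correlation {correlation:.1f}: portfolio volatility "
          f"{portfolio_volatility(0.5, 0.5, vol, vol, correlation):.1%}")
# correlation 1.0 -> 20.0% (two near-identical assets: no risk reduction)
# correlation 0.0 -> 14.1% (genuinely different assets: substantial risk reduction)
```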

The correct strategy is fairly simple to implement, but it may just not occur to many people too busy to think about retirement, a time often far in the future and lacking in salience. But perhaps adding familiar context can be used as a frame to simplify the problem, as Wallace did and as in the drinking age problem. Here's one possibility: optimal diversification is similar to healthy eating. There are three broad food groups: protein, carbohydrates and fats. Each food group has further subgroups, such as saturated and unsaturated fats. Each group is essential to a healthy diet to meet the body's needs, but the best subcategories should be emphasised (e.g. unsaturated fats, starchy over sugary carbohydrates, and high-quality sources of protein). Diversify between groups, and optimise within them.

It's similar to the optimal investment strategy. Maybe investment advice could be promoted to the population with this useful context. People are nowadays forced to make their own decisions on often unfamiliar problems, where important information is presented in an unfamiliar way. Financial decision making is a game with its own rules and mathematical structure, but the same applies in, say, medical decision making, where the framing of risk is critical to good outcomes. When it comes to “nudge” policy, there are many ways of potentially adding context to simplify important decision-making tasks. My argument is that this possibility should be explored in greater depth as a complement to more popular nudges such as the use of default options, because the current standard techniques fail to make use of many potentially beneficial psychological effects. Maybe we could even start educating children as young as Wallace's sister in this way.
