Decision-making is part of politics, and indeed of every individual’s life. While science is not meant to provide ready-made solutions for politicians, it can reveal how processes and policies can be improved by looking at their underlying foundations. In the same way, science can examine decision-making itself: how individuals arrive at certain conclusions, why they hold particular beliefs, and to what extent those are products of rational thinking or of biases.
One book comes to mind in this respect, namely Daniel Kahneman’s bestseller ‘Thinking, Fast and Slow’. A Nobel Prize winner in economics and a psychologist by training, Kahneman shows how our decisions are often shaped by unconscious biases. To illustrate his point, he describes an individual ‘self’ as consisting of two systems. ‘System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control’. It generates impressions, feelings, and inclinations. System 2 is our ‘conscious reasoning self that has beliefs, makes choices, and decides what to think about and what to do’. Kahneman continues by saying that when the impressions and feelings produced by System 1 are endorsed by System 2, they become beliefs, attitudes, and intentions. It has by now become common knowledge that we often retrospectively rationalize or explain our actions and decisions so they fit with subsequent developments. The book, however, shows that this often happens on a much subtler level, much sooner after an action or decision has been taken, if not almost concurrently with it, and certainly without our control over the process.
While the separation into Systems 1 and 2 is artificial, it makes its point: humans are far from being able to act purely rationally and should not be treated as such by the social sciences either (as economists, for example, treated them for a long time). Emotions and intuitive judgements play a much larger role in how we make decisions than we tend to think.
In his book, Kahneman dissects cognitive biases. Bias is not to be understood negatively; it is a way for us to make sense of reality. We put things into categories and assign labels to them in order to process the vast amounts of information pouring in on us every day. While useful, biases can often lead to fallacies in our thinking.
Another core theme of the book is the role of effort in how we approach information. The mind tends to avoid spending much effort on analysis; the law of least effort applies to mental as well as physical activity. System 2 is the ‘lazier’ and slower system, but it is the one able to tackle complexity. Below, I give some examples of cognitive biases that Kahneman discusses extensively in the book. While he mostly speaks of the individual level, one can also extrapolate his findings to the societal, group level.
Affect heuristic. Emotions play a larger role in shaping our beliefs and opinions than we might think. ‘Your political preference determines the arguments that you find compelling. If you like the current health policy, you believe its benefits are substantial and its costs more manageable than the costs of alternatives’. The dominance of conclusions over arguments is most pronounced where emotions are involved. Politics knows many such episodes, where convictions rather than the information at hand drove the action. An infamous example is the 2003 invasion of Iraq. It was justified by the ‘evidence’ possessed by the US intelligence services that Saddam Hussein had developed a weapons-of-mass-destruction program. The American political establishment was convinced of this, ignoring plenty of facts that spoke to the contrary. In hindsight it seems obvious, but how were so many high-ranking officials led to believe the false information at the time?
Randomness vs. causality. We tend to find causes and effects in phenomena that are in reality mere coincidences, a confluence of events. Our associative System 1 looks for connections and explanations in things that have no causal relationship. The following example illustrates this. A study of the incidence of kidney cancer in over three thousand US counties found that the incidence of the illness is lowest in counties that are mostly rural, sparsely populated, and located in traditionally Republican states. Presented with these findings, we intuitively start looking for connections between the kidney cancer incidence rate and the characteristics of those counties. We will probably soon dismiss the idea that the low rate is somehow connected to the inhabitants’ political attitudes. But the fact that the counties are rural strikes us most, and so we conclude that people in rural areas are less likely to get kidney cancer.
However, as Kahneman explains, rural counties have small populations, so the samples drawn from them are also smaller. Smaller samples are less reliable for broad generalizations because they produce more extreme results, that is, higher variance. The sample in that particular study happened to show a lower kidney cancer incidence rate; if the study were repeated, for example every year, the same rural counties might well show a much higher rate. The randomness of sampling needs to be taken into account. ‘The small population of a county neither causes nor prevents cancer; it merely allows the incidence of cancer to be much higher (or much lower) than it is in the large population’.
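This statistical point can be made concrete with a minimal simulation. The figures below (a uniform incidence of 1 case per 1,000 people, county sizes of 1,000 and 100,000) are illustrative assumptions, not data from the study; the point is only that small counties produce ‘extreme’ observed rates far more often than large ones, even when the underlying risk is identical everywhere.

```python
import numpy as np

rng = np.random.default_rng(0)
RATE = 0.001  # hypothetical true incidence, identical in every county

def observed_rates(population, n_counties):
    """Draw case counts for n_counties of equal population and
    return the per-county observed incidence rates."""
    cases = rng.binomial(population, RATE, size=n_counties)
    return cases / population

small = observed_rates(1_000, 5_000)    # small rural counties
large = observed_rates(100_000, 5_000)  # large urban counties

def share_extreme(rates):
    """Share of counties observing zero cases or at least double
    the true rate -- despite uniform underlying risk."""
    return float(np.mean((rates == 0) | (rates >= 2 * RATE)))

print(f"small counties with extreme rates: {share_extreme(small):.0%}")
print(f"large counties with extreme rates: {share_extreme(large):.0%}")
```

Roughly six in ten of the simulated small counties land at zero or double the true rate, while virtually no large county does, which is exactly why the rural counties appear at both the top and the bottom of such rankings.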
Optimistic bias and overconfidence. Connected to causality is the so-called optimistic bias. We often overestimate the control we have over situations and fail to account for randomness, call it luck or misfortune.
Optimistic people tend to overestimate their odds of success. The probability of success for a small business in the US is about 35%, yet over 80% of aspiring business owners assess their odds of success at 7 out of 10 or higher. To account for risks and assess the environment more objectively, one needs to take an ‘outside view’. Kahneman reminds us that, in the end, only a few succeed, but it is they who end up influencing our lives far more than anyone else. Optimistic individuals are ‘the inventors, the entrepreneurs, the political and military leaders – not average people. They got to where they are by seeking challenges and risks.’ Yet they too overestimate the role of their skill relative to chance, luck, and other factors outside their control (such as competition).
This has moral implications on the one hand and practical ones on the other. Merit is rewarded, as failure is punished, but neither should be perceived as purely the result of an individual’s effort: ‘circumstances’ also play a role. In policy terms, this is an argument for a state welfare system that, as it is indeed often portrayed, supports the ‘less lucky’ ones. One study on the role of luck is particularly relevant in this respect. Using computer modeling, researchers at the University of Catania demonstrated that the wealthiest people in the world are not the most talented ones (although they do possess a certain level of talent), but the luckiest ones.
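The flavor of such a model can be sketched in a few lines. This is not the Catania researchers’ actual code; the agent count, event probabilities, and payoffs below are illustrative assumptions. Agents differ modestly in talent, random events arrive over time, and talent only affects whether a lucky event can be exploited, so capital is driven largely by the sequence of events an agent happens to encounter.

```python
import random

random.seed(1)

N_AGENTS, N_STEPS, P_EVENT = 1_000, 80, 0.5

# Talent is normally distributed and clipped to [0, 1] (illustrative).
talents = [min(max(random.gauss(0.6, 0.1), 0.0), 1.0) for _ in range(N_AGENTS)]
capital = [10.0] * N_AGENTS  # everyone starts equal

for _ in range(N_STEPS):
    for i, talent in enumerate(talents):
        if random.random() < P_EVENT:          # an event occurs
            if random.random() < 0.5:          # lucky event ...
                if random.random() < talent:   # ... exploited with prob = talent
                    capital[i] *= 2
            else:                              # unlucky event halves capital
                capital[i] /= 2

richest = max(range(N_AGENTS), key=capital.__getitem__)
most_talented = max(range(N_AGENTS), key=talents.__getitem__)
print(f"talent of richest agent:      {talents[richest]:.2f}")
print(f"highest talent in population: {talents[most_talented]:.2f}")
```

Run repeatedly with different seeds, the richest agent is typically moderately talented rather than the most talented one, because compounding a streak of lucky events matters far more than a small edge in talent.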
Another cognitive bias is abbreviated as WYSIATI: ‘what you see is all there is’. It refers to our tendency to reduce a situation or problem to the information available to us, or dominant, at the moment of decision-making. Overconfidence also arises from this bias. We focus on what we know, what we are capable of, and what our plan is, but we are far less conscious of the plans, skills, and emotions of others that also shape the given situation.
Can overconfident optimism be overcome by training? Kahneman is skeptical. The studies so far have shown only modest success. ‘The main obstacle is that subjective confidence is determined by the coherence of the story one has constructed, not by the quality and amount of the information that supports it… Organizations may be better able to tame optimism of individuals than individuals are’.
What takeaways might Kahneman’s study of individual behavior hold for politicians? A few come to mind. Be aware that overconfidence can be harmful and that, however informed and qualified your advisors are, there are things you still might not know; try to take an outside view. Distinguish between causes and effects, and remember that many events are not necessarily connected to each other, however coherent a picture they create when put together: causality is convincing, but randomness is more pervasive. Make the effort to think about important issues with your System 2 instead of being led by System 1, that is, by emotions, feelings, and immediate reactions. Politics usually happens at high speed, but those who manage to slow down will be rewarded.
I thank Jakub Simek for his insightful comments and examples.
Marylia Hushcha is a Research Assistant at the International Institute for Peace in Vienna and is a board member of Think Tank Ponto. She previously worked at Pontis Foundation in Slovakia, where she managed a capacity-building project for NGOs in Russia. Marylia has completed training and fellowship programmes at the United Nations Office in Belarus, the European Academy of Diplomacy in Warsaw, and the University of San Diego. She holds a Master’s degree in European Studies from Comenius University in Bratislava.