In this series of articles I share ideas on how we make decisions as individuals, groups, and organisations, and offer advice on designing decision-making processes that combine traditional methods of analysis with more flexible, adaptive approaches, better suited to the problems we face in the Age of Uncertainty. The applications are numerous: individual choices, business management, public policy, security and conflict management, and international development assistance. In the posts to follow I will share practical solutions I have developed for various situations, practice domains, and issues. I hope you find them useful: they are intended to be.
Decisions We Make
In everyday life we make most of our decisions intuitively (some say automatically), without much deliberate computation. If a situation seems familiar, we tend to draw on past experience, using simple techniques such as rules of thumb (heuristics) to make judgements.
Moreover, we employ the same techniques when making judgements under uncertainty, when we know little about an object or have limited past experience, if any. This saves time and lets us navigate situations that are not too complex, or in which time is short and information is scarce. In most cases it works well.
However, behavioural scientists have found that our intuitive decision-making suffers from a number of biases (systematic errors) which hinder our ability to choose the optimal option. They also claim that these biases are predictable (and thus manageable). Drawing on years of experimentation, the psychologists Amos Tversky and Daniel Kahneman developed prospect theory in the 1970s to explain how we make certain biased (statistically flawed) assumptions, especially when weighing the probability of an event in the face of uncertainty. Years later, Kahneman was awarded the Nobel Memorial Prize in economics for their research.
Their approach is frequently referred to as the ‘heuristics-and-biases’ programme. I am sure many of these biases are familiar to you, not only from individual experience but also from that of groups, business organisations, and government bodies (we remain human at work, don’t we?):
We rely on stereotypes when judging the probability that an object or event belongs to a certain category: this is the representativeness heuristic. For example, we judge someone’s behaviour by the degree to which their actions are representative of a particular category. Apparently, so do juries in courts of law when categorising a defendant’s alleged crime. How many of us, as customers or in an entrepreneurial capacity, have been fooled by someone’s demeanour or the appearance of a firm’s business premises, simply because they ‘represent’ our idea of what a successful businessman or business should look like? Richard Thaler, the behavioural economist and co-author of ‘Nudge’, gives an example: ‘People can nudge you for their own purposes. Bernie Madoff [the Ponzi scheme fraudster] was a master in the art of winning people’s confidence and taking advantage of it. I don’t think he needed to read my book. I think he could have written a better version of it himself.’
Stereotypes are powerful; just look around to see how they shape interpersonal and intergroup relations in society. Understanding how, through the representativeness heuristic, they influence our judgements is very important for preventing crime (especially hate crime, which is on the rise in both Europe and the US) and for improving relations in the workplace and interactions in public spaces and undertakings.
We adjust from known values (using them as anchors) to estimate unknown ones or make predictions. Sometimes the initial value is suggested automatically by the way the problem is formulated. What if you defined the problem inaccurately? Or the value of the similar case you rely upon is too specific to serve as an adjustable baseline for your case? As experiments have shown, we also tend to use whatever information is available (often what comes to mind first) without critically examining it for relevance. Whatever adjustments you make afterwards won’t help, because the baseline is already wrong.
Have you noticed this when negotiating the price, terms, and conditions of a deal (be it a salary, loan conditions, or a business contract)? Whoever first suggests a value (numerical, percentage, or monetary, as relevant to the topic) sets the anchor, and the rest of the discussion becomes a bargaining exercise around that very value, however distant it is from the other side’s initial idea. Try it, if you haven’t before, and you will see that it works. I see it as a narrow application of the saying that ‘whoever sets the agenda controls the outcome’: here you control the outcome by establishing a reference value that is favourable to you.
We use mental shortcuts based on the events or facts that come to mind first when making inferences: the availability heuristic. This applies equally to facts that evoke positive and negative emotions. We perceive them as more familiar and common (or rare, depending on the perspective), and the more we dwell on them, the more convinced we become. You see a car crash, and you start driving cautiously (at least for a while). You read a shocking story (with pictures) about food poisoning from certain products, and you avoid consuming them. A couple of your former colleagues lose their jobs, and you immediately conclude that unemployment in your sector is high.
Another well-known example is the lottery. Do you know how many people are encouraged to buy lottery tickets, or to spend far beyond their normal limit, immediately after hearing that X won forty million ‘just like that’? This effect runs contrary to rational logic: after an unexpectedly large jackpot is won, the prize resets to a lower level, so the chance of winning so much actually decreases.
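The arithmetic behind that last point can be made concrete. Here is a minimal sketch with hypothetical numbers (a 1-in-14-million draw and a $1 ticket are illustrative assumptions, not figures from any real lottery): the odds of winning do not change after a big win, but the expected value of a ticket drops sharply once the jackpot resets.

```python
# Illustrative assumptions: a 1-in-14-million draw, a $1 ticket,
# and no smaller prizes. Not the odds of any real lottery.
p_win = 1 / 14_000_000
ticket_price = 1.0

def expected_value(jackpot):
    """Expected profit of one ticket: probability-weighted payout minus cost."""
    return p_win * jackpot - ticket_price

# Just after X's $40m jackpot was won, a ticket (for that draw) looked attractive...
print(expected_value(40_000_000))
# ...but after the jackpot resets to $2m, the same ticket is strictly worse,
# even though the media coverage of X's win is what drives people to buy.
print(expected_value(2_000_000))
```

The availability of X’s win makes the purchase feel timely; the numbers say the opposite.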
Entrepreneurs know this well: the news is full of stories that Y got very rich, and quickly, by producing, serving, or selling something, and we immediately see it as the most promising opportunity for our own business growth, overestimating the likelihood of success and overspending. The availability bias similarly affects investment decisions: in the years immediately following the financial crash of 2008, investors’ persistent perception of a dire market environment caused them to view opportunities through an overly negative lens and to avoid risk in favour of ‘safe’ investments, however small their returns.
We tend to overestimate our own strengths and capacity while underestimating potential barriers. This is partly explained by our memory’s leaning toward positive experiences rather than failures (we work hard to forget, block, or at least paint in rosier tones our unpleasant past experiences, don’t we? This is no good for learning, though). So when planning or estimating scenarios, we draw on the best-ever achievements in our internal archive. It is quite natural: as psychologists note, we assess ourselves by our best intentions, while others judge us by our worst deeds.
This selective reliance on positive examples makes us prone to be overly optimistic in situations where a more cautious approach is advisable. It frequently results in unrealistic plans (in time, effort, and cost) which then have to be revised, sometimes repeatedly. Been there? Building a vacation house, developing or implementing a project, weighing plans to enter a new market? It applies to all endeavours, big and small. Take, for example, the Sydney Opera House. Budgeted at an initial cost of $7 million, it ended up costing more than $100 million and took more than a decade to construct, making it one of the most expensive cost blowouts in the history of mega-projects around the world.
One useful way to tackle the optimism bias is to review the initial plan in light of the findings of a risk assessment. Scrutinise the timeline, budget, supplies, and so on, item by item, against quantified operational, political, technological, customer, and other applicable risks. You will be surprised to see how the ‘shining’ numbers shrink and the plan immediately drops from ‘highly advantageous’ to some lower category, if it is not abandoned altogether. I have seen projects whose net present value (the NPV of a cost-benefit analysis) went, as a result, from confidently positive to negative, prohibitive in fact.
Well, I think that is better than being fooled by your own (or your team’s) overconfidence. You can still go ahead with implementation, but with eyes open this time around. The UK Treasury even uses software to mitigate optimism bias (especially in the government’s infrastructure and capital investment projects).
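The effect described above can be sketched in a few lines. All the figures below are hypothetical (upfront cost, cash flows, discount rate, and the yearly risk factors are illustrative assumptions, not data from any real project); the point is only to show how scaling each year’s inflow by its quantified probability of materialising can flip a plan’s NPV from positive to negative.

```python
# Illustrative only: hypothetical cash flows and risk factors.

def npv(rate, cash_flows):
    """Net present value: cash_flows[0] is the year-0 outlay (negative),
    later entries are yearly inflows, discounted at the given rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Optimistic plan: $1,000 upfront, $300 per year for 5 years, 8% discount rate.
optimistic = [-1000] + [300] * 5

# Risk-adjusted plan: each year's inflow is scaled by the probability it
# materialises (operational, political, technological risks combined into
# a single assumed success factor that decays over time).
risk_factors = [1.0, 0.9, 0.8, 0.7, 0.6, 0.5]
adjusted = [cf * f for cf, f in zip(optimistic, risk_factors)]

print(round(npv(0.08, optimistic), 2))  # confidently positive
print(round(npv(0.08, adjusted), 2))    # drops below zero
```

Same project, same discount rate; only the assumption that every inflow arrives in full has been removed, and the ‘highly advantageous’ plan no longer clears zero.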
Or take the phenomenon known as loss aversion. When evaluating options and assessing their potential benefits and risks, we weigh losses more heavily than gains. It appears that losing something we possess hurts roughly twice as much as an equivalent gain makes us feel good. So if the potential win and loss are of the same magnitude and the probabilities of winning and losing are equal, chances are high that we will decline the risk. Psychologists like this bias because it is easy to demonstrate in controlled laboratory settings: simply toss a coin. Try it yourself. Imagine you were offered a gamble on the toss of a coin (that is, a 50/50 chance) in which you might lose $50. What payoff would you demand to accept it? I bet you would demand much more than the potential loss, most probably somewhere around $100.
This partly explains another phenomenon, the sunk cost fallacy: the more we invest in something, the harder it is to abandon it (and we readily find reasons to justify our ‘rational’ decision). Just recall your own experience. A very simple case: you drive to an outlet mall to buy a certain product but don’t find it there, so you buy something else. Of course you will tell yourself you needed it, but in reality you just didn’t want to go home empty-handed after driving so far.
Or think how many companies you know of that suffered heavy losses by stubbornly refusing to give up a project or product in which they had invested considerable funds. One famous example: the British and French governments continued funding their joint Concorde venture even when it was crystal clear that aircraft sales would fall short of the returns needed to keep the business going. Or think of overseas wars: ‘We have invested too much in this campaign for too long. Too many lives lost, too much money spent. We simply cannot stop now; it would amount to surrender, so we must fight on to a victorious end.’
…and many others
There are many other quirks of our mind’s work that will amaze you. One of my favourites is the halo effect: our tendency to assume that if product X is good for doing Z, it must be perfectly suitable for Y and W, too. Or that if product S of a certain company is good, its other products N, M, L, and W (even from a totally different product line) will be equally good. Or that because a certain person is good at doing D, they will be good at doing A, B, and C (or the reverse: because they are bad at A, they will be bad at B, C, and D). Sound familiar from the workplace?
* * *
Although everyone seems to agree that we use various mental shortcuts to make decisions easier, not everyone accepts that our intuitive decision-making is deficient (compared with computed solutions) and error-prone. Many management practitioners rely more on so-called ‘naturalistic’ methods, which make the best of our built-in cognitive capabilities, and rightfully so. As for big data and sophisticated software, some argue that an abundance of information is costly and often confusing, while much better decisions (especially under pressing circumstances) are made with less but more relevant information.
Both sides have a point, and I think we should show more flexibility, adopting a variety of analytical methods and using them in a complementary manner. Remember: it is not the quantity or even the quality of data that matters, but the quality of the decisions we make with it.