Corporate Decision Making & Sustainability, with Olivier Sibony

Olivier Sibony teaches Strategy, Decision Making and Problem Solving at HEC Paris. He is also an Associate Fellow of Saïd Business School at the University of Oxford. Olivier specialises in strategic decision making and behavioural strategy. As he prepares to release his new book, Noise, co-authored with Cass Sunstein and Nobel Memorial Prize winner Daniel Kahneman, Olivier explains in this interview how to make effective group decisions and shares his views on sustainability.

Behaven — Hi Olivier, it’s a pleasure to have you as our second guest! First of all, can you tell us more about your area of expertise?

Olivier Sibony — Hi. Before becoming a professor of strategy and decision-making, I was a strategy consultant: I spent 25 years at McKinsey. It was the observation that strategic decisions are not always what they seem to be that led me to change careers and teach strategy. The decisions of consumers or economic agents are not always rational, at least not in the sense of traditional economic rationality. And strategic decisions that we think are made with care, attention and method are just as prone to errors, biases and distortions as any other decisions.

My area of expertise is the application of behavioural sciences to strategic decision-making. Behavioural strategy is to strategy what behavioural economics is to economics, behavioural marketing to marketing, and behavioural finance to finance.

Getting to the heart of the matter, we know that our individual decisions are influenced by our cognitive biases. Can group decision making within an organisation avoid the influence of these biases?

Group decisions are the best, but also the worst. In fact, all strategic decisions made within organisations are taken as a group. Decisions made by a CEO alone at the top of an ivory tower only exist in the movies.

In reality, there is always a management team, people who prepare decisions, people who advise on them, and boards of directors which supervise them. A big part of behavioural strategy is to examine how these groups, and not just individuals, work together, and why the decisions they produce are not always optimal.

Why are these decisions not always optimal?

First, there are incentives that are perfectly rational in the sense of traditional economics. When you are the investment bank advising a company on an acquisition, the fact that you get paid only if the acquisition goes through will naturally lead you to recommend it rather than advise against it. Psychologically speaking, this is nothing new, but these incentives are what they are, they don't change, and that doesn't seem to bother anyone. These are subjects that deserve to be examined. In the same vein, it has long been known that leaders have empire-building tendencies and want to expand their organisations. There is an economic explanation: you are almost always better paid when you run a large company than a small one. But more prosaically and immediately, it is better to be the one on the cover of the magazine for winning the battle to acquire Company X than the one nobody talks about for losing it. These incentives mean that groups do not necessarily make optimal decisions.

There is also a group dynamic that means groups can sometimes make decisions much worse than those of the individuals who compose them. On the one hand, groups can become polarised: they reinforce a belief that was held by the majority at the start and accentuate it. They end up with an opinion more extreme than the starting median opinion, and trust this extreme opinion more than they did at the beginning. On the other hand, groups can align themselves with the perspective of the dominant actor who, in a company, is usually the leader. They will try to work out the leader's position and align with it.

On top of that, a natural tendency towards groupthink can emerge when people take turns speaking. It's a silly thing that you see in all organisations. Say you go around the table and it happens that the first, second and third people to speak are all in favour of the proposal on the table. It then becomes extraordinarily difficult for the fourth or fifth person, who thinks exactly the opposite, to say so. This is true for reasons of social pressure, but also for a perfectly rational reason. When you are among people you presume to be competent, who speak about a subject they may know better than you - because they spoke first, or because it is their field and not yours - and they say an idea looks very good, it is quite normal, logical and rational to adjust your beliefs to reflect what they think, even if you think differently. Only fools never change their minds, especially when the advice comes from people presumed competent and well-intentioned. So people conform for excellent, rational reasons.

To sum up, in addition to social pressure and hierarchical pressure, groups will tend to converge towards a dominant opinion which is not necessarily the right one.

How can we get around this?

Getting people to decide as a group is important, but it should be done carefully. My prescription is collective action and method: collegiality and a good decision-making process. One without the other is worse than nothing at all. All organisations have some form of collegiality or group work, but that doesn't necessarily lead to good decisions. It can be quite the opposite.

For a group to produce collective intelligence, having intelligent people around the table is not enough. Contrary to what many people think, having people who are demographically or sociologically diverse is not enough either. It can help, but demographic diversity may not translate into cognitive diversity or diversity of opinion. And when it does result in a diversity of opinions, these opinions might not always be expressed. Diversity may be necessary, but it is not sufficient. Collective action and method are crucial.

What about experts in these group decisions? Do they change the dynamic?

Experts will not necessarily carry more weight in the decision-making process. In many organisations, conviction and self-confidence influence opinions far more than actual expertise. The person who speaks with great self-confidence will be listened to; the person with the most skills may or may not speak confidently. The influence of experts varies depending on the relationship between their self-confidence and assertiveness on the one hand, and their competence on the other.

A lot of decisions are made by people who are poorly qualified but very vocal. We see it in organisations, and we also see it in public debate - for example, in debates about the current health crisis. People whose incompetence is obvious to experts can speak with such confidence that you end up wondering whether you should believe them.

In behavioural sciences, the illusory truth effect describes the fact that a statement, true or false, becomes more credible when it is repeated. Fazio, Rand and Pennycook (2019) ran a study in which they tested absolutely absurd assertions. The first time around, only 5% of people believed these claims. When the claims were repeated a second time to the same people, this number increased from 5% to 10%. It's not much, but the number of people believing something obviously wrong doubled, just because they heard it twice.

When you say things with confidence and repeat them, you give them a certain credibility, regardless of any expertise. This is true everywhere, but especially in organisations. You sometimes see beliefs absolutely devoid of factual basis take hold in organisations that should be rational.

Is it possible to get around these biases?

I'm tempted to tell you that we can't. One thing I always tell my friends, my students and the executives I train is never to go and explain to their boss or colleagues that they are all biased and all making very bad decisions. You won't make friends that way, and you won't influence anyone.

The only person who can make a difference is the one leading the decision-making process. You can change the way decisions are made when the decisions depend on you. When you are leading a team, you can change the way that team works. When you are part of a team led by someone else, you cannot. You can suggest options, you can try to influence, you can give my book to your boss (laughs). My book is called 'You're About to Make a Terrible Mistake' and it makes a great gift for a boss. I think at least half of the people who buy the book give it to their boss. It's a way of delivering a subtle message that works very well.

All kidding aside, what you can do is change the way a decision is made when it's up to you. This requires establishing a decision-making process that gives everyone an opportunity to express themselves, so that the opinions of the first person to speak do not influence those of the second and third.

There are loads of practical techniques for addressing cognitive biases in group decision making. An extraordinarily simple and surprisingly rarely applied example: before going around the table, take 30 seconds and ask everyone to write down what they think. If you are the fourth person to speak and the first three said white when you wrote black, you can always change your mind and say white, but it's difficult when you have a sheet in front of you detailing the five reasons why you thought black. Before starting this exercise, it is also useful to explain that its purpose is not to reach a consensus, but rather to hear varied and dissonant points of view. Alfred Sloan, CEO of General Motors, said: “If we are all in agreement on the decision, then I propose we postpone further discussion of this matter until our next meeting, to give ourselves time to develop disagreement and perhaps gain some understanding of what the decision is all about.” Instead of aiming for consensus in your group discussion, try to have dissonance and debate, and signal this clearly so that everyone has their say.

Another simple idea is to do the opposite of what leaders usually demand: ask everyone to speak in a nuanced way rather than to take a clear and definite position. When the subjects discussed are complicated, we have the right, and even the obligation, to nuance our opinions. Asking people to say that everything is white or black, just because everyone must have a clear position, amounts to pushing your responsibility as a leader onto others. As a leader, your responsibility is to make a decision, and to do that, it is in your best interest to hear different points of view instead of asking for strong opinions. And if an issue is neither white nor black but grey, you want to know why it's grey, and what shade of grey it is.

Changes like these alter the dynamic and tone of the debate, and they allow a group to make better decisions.

Let’s move to the subject of environmental sustainability. In your opinion, what are the cognitive biases that prevent us from making the decisions that will lead us to change our behaviour and act in a pro-environmental way?

This is a deceptively simple question. If we had wanted to design an ideal problem that cognitive biases would make insoluble, we would have chosen climate change. First of all, it's a long-term problem, and we know that people are much more sensitive to short-term than to long-term issues. It is also a problem of collective action: if I do things that others don't do, I penalise myself, and that feels unfair. Why should I give up a 35-euro flight for my vacation because it pollutes, while everyone else is still going on vacation?

Certain phenomena of social mobilisation, and in some countries political polarisation, lead people to believe whoever they want to believe, whoever looks like them. Lots of good and honest people think there is no climate problem. And they're not all crazy or stupid. From our positions as academics or intellectuals, we tend to say that these people are irrational. But in reality, they are doing exactly the same thing as us: they trust the people they trust to tell them the truth. I have not looked at the raw data behind the IPCC reports on climate change; I trust the IPCC scientists who tell me that, if we do nothing, the temperature will rise by several degrees with tragic consequences. I chose to belong to this community, and I believe these people with the same naive fervour as the people who believe the parish priest who promises them eternal life, or those who believe the scientists paid by Exxon who affirm that climate change does not exist. We cannot simply say that there are rational people on one side and irrational people on the other. Some issues are very complicated and cannot be explained simply.

On top of that, we currently live in a world where, for many people, the main sources of information have fallen into the hands of a few monopolies, namely Google and Facebook. These monopolies have an economic interest in strengthening these group phenomena. It is in their interest that people who use them as sources of information only go to them for information, and that these people remain in groups that share similar opinions. They have an interest in these groups expressing their opinions ever more forcefully and in increasingly extreme ways. We know that negative feelings, such as outrage, promote engagement, and engagement keeps people online in front of content that generates advertising revenue. That's the business model of Facebook, YouTube, Google, and so on. With social networks, it is therefore now possible never to leave echo chambers in which false beliefs are constantly reinforced. The assault on the Capitol a few weeks ago is a striking example. What's incredible about this story is not that people managed to storm the Capitol, but the fact that they lived in such a bubble that they imagined the whole country was behind them at that moment.

The same is true in the environmental domain. We can completely disconnect ourselves from reality by being informed exclusively by a group of people who share our opinions and take them to an extreme. This disconnection from reality has no precedent in history. A lot of people are not at all convinced by the environmental problem. Awareness is growing, but slowly, and it has not reached most people. In France, we are not the most reluctant, but in the United States the number of people who do not believe in climate change is considerable, even though there is no scientific debate on the subject.

Another problem - because the list is endless - is that when it comes to environmental matters, intuitive solutions aren't necessarily good ones. For example, I believe that in France 69% of people (and 86% of those aged between 18 and 34) think that the nuclear industry contributes to global warming, even though it emits far less CO2 than fossil fuels. The affect heuristic makes us think that nuclear = bad thing, CO2 = bad thing, and therefore nuclear = CO2, when nuclear is obviously part of the solution to climate change. Today, and in the short term, we don't know how to produce energy without emitting CO2 other than with nuclear power. And yet some countries are closing nuclear power plants and opening coal-fired power plants instead, which increases emissions. It's depressing.

In order to reach the people who are locked into their own information system or echo chambers, wouldn't it be useful to convince some charismatic people, ‘messengers’ part of their system, to disseminate environmental messages?

It is one of the levers for change. When opinions about the environment shift, it's not because of the IPCC reports; it's because singers, actors and Swedish high school girls, perceived as credible messengers, take hold of the subject. The good news is that they do promote awareness. The bad news is that the solutions people turn to are not necessarily the ones we would always want to promote.

The bottom line, and the real question to ask ourselves, is: which behaviours should we encourage? What do we want people to do? When we look at where people's CO2 emissions come from, we realise that they come from heating their homes, using their cars, eating meat, having children and, perhaps, going on vacation - even if this last one is not a huge part of individual behaviour.

Realising the problem is one thing, but what do I personally do to fight climate change? What am I changing today? We may have great confidence in and great admiration for this or that opinion leader with whom we identify, but we still do not quite know what to do.

Do you have an opinion on which behaviours should be encouraged?

If you want behaviours to change, you have to suggest a specific behaviour. If you tell people to throw their plastic waste in the yellow bin, they will eventually do so. But if you tell people to be more eco-responsible, they don't know what to do. When you tell people precisely which behaviour to change and give them the right nudges and incentives at the right time, it is not impossible to get them to change. Take the example of the past few months: if we had told people “there is a virus out there, be careful”, nothing would have changed. By advising them not to shake hands, to wash their hands for 20 seconds, and to stay three feet apart, we got results.

When it comes to the environment, the argument that every action counts is obviously true but also obviously stupid. Every action counts, but some count a lot more than others. For example, a Paris-New York flight in economy class represents more than all the coffee capsules - however polluting - that I could use in my life. Even if I were to drink three coffees a day until my 80th birthday, I would emit less CO2 from capsules than from one Paris-New York flight. You cannot ask people to change all their behaviours as if they were of equal importance. You can't change everything all the time.
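Sibony's flight-versus-capsules comparison is just back-of-envelope arithmetic, and it is easy to reproduce. A minimal sketch, assuming purely illustrative emission factors (the interview gives none, and real figures vary widely); the point is the relative magnitudes, not the exact numbers.

```python
# Back-of-envelope comparison of one long-haul flight with a lifetime of
# coffee capsules. Both emission factors are illustrative assumptions made
# for this example, not figures from the interview or any dataset.

FLIGHT_KG_CO2 = 1000.0   # assumed: one economy round trip Paris-New York, kg CO2e
CAPSULE_KG_CO2 = 0.01    # assumed: one aluminium coffee capsule, kg CO2e

capsules_per_day = 3
years = 50               # roughly "three coffees a day until my 80th birthday"

total_capsules = capsules_per_day * 365 * years
capsules_kg_co2 = total_capsules * CAPSULE_KG_CO2

print(f"{total_capsules:,} capsules ≈ {capsules_kg_co2:,.0f} kg CO2e")
print(f"one flight       ≈ {FLIGHT_KG_CO2:,.0f} kg CO2e")
```

Under these assumed factors, fifty years of daily capsules still emit less than a single flight, which is the shape of the argument: a few behaviours dominate an individual footprint, so they are the ones worth changing first.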

Another insoluble problem is the issue of social and environmental responsibility, which forces us to make trade-offs. For example, I choose to drink Nespresso coffee because I am convinced - having studied the subject - that their coffee purchasing policy is one of the best there is: it supports small producers and fights deforestation. But these capsules contain aluminium which, although fully recyclable, is not fully recycled, because people do not always bring their capsules back for recycling. So in the end, is it better to buy these capsules or to buy coffee beans that I know nothing about?

The complexity and cognitive load of buying a single packet of coffee become absolutely untenable. Social, environmental and geopolitical concerns, CO2, water, packaging recycling and alternative packaging options should all be taken into account. If it's not aluminium, it will be plastic - is that better? And then there is the question of buying a kilo of coffee versus 100g packets, and therefore of the number of trips to the supermarket needed to buy them. Buying a simple packet of coffee becomes a puzzle that is absolutely beyond my cognitive abilities.

What solutions do you suggest to move towards a more eco-responsible society?

We always talk about changing consumer behaviour. But the best way to change behaviour is still to tax, prohibit and subsidise. Why do people smoke less today than 10 years ago? It's not because we used nudges; it's because we increased prices and heavily taxed tobacco. And it worked very well.

Any serious behavioural scientist will tell you that there are loads of problems that require structural and collective tools. On important topics such as the environment, consumers can make a difference, but it's up to regulators to decide whether to subsidise renewables or ban cars from city centres. If we don't ban them, people will continue to use them.

I think the real issue lies in the political acceptability of top-down measures, not primarily in encouraging bottom-up change through individual behaviours.

Despite this, do you practice any ecological gestures on a daily basis?

Like everyone else, I try to do what I can, but like everyone else, I'm a little lost. I recycle my waste, I try to eat less meat - which is also better for my health - and I usually drive a hybrid car, though I haven't been driving at all and have been doing all my meetings virtually for a year or so.

I have probably reduced my carbon footprint tremendously, but I am fully aware that it is still much larger than it could be. My house is bigger than it needs to be, so I heat more square metres than I need to. Should I be doing more, or doing things differently? I find it an incredibly complicated problem.

Last question, do you have any suggestions of articles or books that might be of interest to our readers?

I would recommend the very good paper “Sleepwalking into Catastrophe: Cognitive Biases and Corporate Climate Change Inertia” [link to our summary below]. It is the corporate version of what we've just talked about: an article on the cognitive biases that prevent companies from taking the consequences of their actions into account and from changing their behaviour to tackle climate change.

There's also Noise, a book I've been writing for the past five years with Daniel Kahneman and Cass Sunstein, which will be released in May. Noise raises a problem that we don't talk about much - and that we hope we'll talk about more - which is not the problem of bias but the problem of noise. Noise is an additional source of error, complementary to and distinct from bias, but one that deserves to be treated just as seriously and that, moreover, is relatively easy to deal with.

Read also: our summary of the article by Mazutis and Eckardt on Cognitive Biases and Corporate Climate Change Inertia


Fazio, L.K., Rand, D.G. and Pennycook, G. (2019) Repetition increases perceived truth equally for plausible and implausible statements. Psychonomic Bulletin & Review, 26(5), pp. 1705-1710.
Mazutis, D. and Eckardt, A. (2017) Sleepwalking into Catastrophe: Cognitive Biases and Corporate Climate Change Inertia. California Management Review, 59(3), pp. 74-108.