In a bustling agricultural export office in Kakamega, Kenya, Wafula sat behind his desk, convinced that he held all the answers. He felt confident that maize prices would soon skyrocket, even though market data suggested otherwise.
Wafula rarely consulted reports or spreadsheets, relying instead on his own intuition. "You see, I have been in this industry for 15 years, I know the trends like the back of my hand," he would often declare. In stark contrast to Wafula, Nekesa, the data analyst, often cautioned against making decisions without proper data analysis.
Despite having key market indicators at her fingertips, Nekesa felt as though she were talking to a wall whenever she approached Wafula. His overconfidence led to a series of poor decisions that eventually hurt the company's revenues.
Several years ago, for example, the company invested heavily in maize, ignoring Nekesa's market data warning of an oversupply.
By the end of that season, maize prices had plummeted, just as the data had predicted. Wafula shrugged it off as a one-off, refusing to acknowledge that his decision-making process was deeply flawed. Meanwhile, Nekesa and other colleagues grew increasingly frustrated with his refusal to consider data-driven approaches.

Continuing Business Talk's miniseries on overconfidence and its negative consequences for business, recent research by Simone Lackner, Frederico Francisco, Cristina Mendonça, André Mata, and Joana Gonçalves-Sá shows that overconfidence often correlates with a dismissive attitude toward scientific data and evidence. The study suggests that individuals with intermediate, rather than advanced, levels of scientific knowledge tend to exhibit overconfidence, which in turn breeds a negative view of science and data. Essentially, such individuals think they know enough to make decisions without further investigation, sidelining crucial empirical evidence. The researchers examined four large surveys, spanning three decades and covering Europe and the United States.
Instead of relying on participants' self-reports or comparisons with others, which arguably skewed previous studies of overconfidence, they introduced a new confidence metric. It gauges overconfidence by how often respondents give incorrect answers to questions about scientific facts rather than answering "don't know", aiming for a purer measure of what someone actually knows.
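To make the idea concrete, here is a minimal sketch of the intuition behind such a metric. This is a hypothetical illustration, not the authors' actual formula: it simply treats an incorrect answer, where "don't know" was available, as a signal of overconfidence.

```python
# Hypothetical sketch (not the study's actual metric): among the items a
# respondent did NOT actually know, what share did they answer wrongly
# instead of admitting "don't know"?

def overconfidence_score(responses):
    """responses: list of 'correct', 'incorrect', or 'dont_know'."""
    incorrect = responses.count("incorrect")
    dont_know = responses.count("dont_know")
    unknown_items = incorrect + dont_know  # items the respondent didn't know
    if unknown_items == 0:
        return 0.0  # answered everything correctly: no evidence either way
    # Higher score = more willingness to guess wrongly than admit ignorance
    return incorrect / unknown_items

# A respondent who guesses wrongly on two of the three items they don't know:
score = overconfidence_score(["correct", "incorrect", "incorrect", "dont_know"])
print(round(score, 2))  # roughly 0.67
```

On this toy scale, a humble respondent who answers "don't know" whenever unsure scores 0, while a confident guesser approaches 1, which captures the spirit of measuring overconfidence behaviourally rather than by self-report.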
The study found that the highest levels of overconfidence occurred among those with intermediate knowledge of scientific facts: people who knew enough to feel sure of themselves, but not enough to recognise how much they were missing. Psychologist Jay Van Bavel interprets these findings as a warning about people who claim to "do their own research". He points out that such overconfidence can be a significant hurdle, especially in the workplace.
People who think they know enough often disregard the limitations of their own knowledge and overlook the importance of empirical evidence.
This can be especially detrimental in an organisational setting, where data-driven decisions are crucial. What does overconfidence look like in other industries? Consider a lecturer who strongly believes that grade deflation, issuing higher proportions of failing grades, motivates students better than awarding more top grades, despite evidence to the contrary.
Such a lecturer will often proclaim that "the research is wrong" or "the data were collected incorrectly", all without actually reading the studies: the same pattern as Wafula, in a different industry.
Or take a marketing and sales team, where a manager berates and blames staff for flat sales despite macroeconomic data showing dire conditions squeezing consumer purchasing power.

If you find yourself working with a colleague like Wafula, it's essential to exercise caution. A useful first step is to tactfully encourage such colleagues to consider alternative perspectives, especially when substantial evidence contradicts their viewpoint.
Second, open a dialogue; discussion can sometimes mitigate the consequences of overconfidence and steer the decision-making process back toward a data-centric model. Do not present evidence combatively, as that will shut overconfident colleagues down. Instead, acknowledge their point first, for example: "Thank you for sharing. I hear you saying that [insert their main point]. Perhaps consider the latest research, collected with strong methodology, which shows [insert your main point]." Validate that you hear them, not that you agree with them. Then politely and calmly present the opposing evidence.
For managers and organisations, on the other hand, dealing with overconfident employees requires a multi-pronged strategy. First, it may prove beneficial to introduce mandatory training programmes that highlight the importance of data-driven decision-making.
Furthermore, organisations can implement systems that require employees to provide empirical evidence for their decisions, particularly when those decisions could have long-term implications.
Finally, cultivate a culture that champions data-driven decisions, for instance by rewarding the best data-based reasoning.

In conclusion, overconfidence can often lead to a disconnect between belief and actual knowledge, causing individuals to make decisions that disregard scientific facts and data. The dismal implications affect not only colleagues and organisations but also society at large.
By being aware of the dangers of overconfidence, and implementing strategies to counter it, we can foster a more enlightened, evidence-based approach to decision-making in our workplaces and communities.
Have a management or leadership issue, question, or challenge? Reach out to Dr Scott through @ScottProfessor on Twitter or on email at [email protected].