“What Happens to Heroes?” EPISODE #7: The Unsung Heroes of the Digital World

The Psychological Impacts of Cyberattacks
This is the seventh episode of a series about individuals who, in a matter of moments, transition from “employees” to “rescuers” in the immediate aftermath of a destructive cyberattack.
These are the people I will call the “Heroes.”
Cognitive Cybersecurity & prevention: The human factor at the heart of cyberattacks
“We were carefree, thinking that it only happens to other people.”
“For me, these were things you only saw in the movies. I never thought I’d experience one in my life.”
Excerpts from Interviews with Heroes
In this episode, I would like to highlight a contradiction that strikes me every day. Despite massive investment in cybersecurity technologies (anti-phishing filters, advanced firewalls, intelligent detection solutions, AI algorithms, and more), human error remains the biggest weakness in our defense. Numbers vary between studies, but we can consider that between 75% and 95% of cyber incidents originate from human failure. Yet the resources allocated to this human factor are disproportionately low compared to the rest of the investment. Why is our reasoning flawed?
Cognitive Biases in Cybersecurity: What Are Our Blind Spots?
Today, in this highly connected world, cyber threats are evolving at an unprecedented rate. Yet despite the latest technological advances and state-of-the-art security measures, one persistent and often overlooked vulnerability remains: the human mind. Cybersecurity isn’t just a technical challenge. It is a cognitive one. Human behavior, shaped by unconscious biases, creates openings that no firewall can close. Organizations that want to strengthen their cyber resilience must understand these biases.
Let’s delve into four cognitive biases that can compromise cybersecurity: unrealistic optimism, complacency, availability bias, and overconfidence in one’s own ability to handle a crisis.
BIAS #1: Unrealistic Optimism: “It Won’t Happen to Me”
Unrealistic optimism, sometimes called exaggerated confidence, is the tendency to believe that negative events are more likely to happen to others than to oneself. In cybersecurity, this mindset can lead individuals, teams, and even entire organizations to underestimate their susceptibility to attacks.
The impacts include underinvestment in security measures, insufficient training, and warning signs ignored until it is too late.
BIAS #2: Complacency: “We’ve Always Been Fine”
Complacency sets in when past success, or simply the absence of incidents, creates a false sense of security. When everything seems to be going smoothly, people stop anticipating issues and cease taking precautions. As the well-known disclaimer goes, “past performance does not guarantee future results.”
The impacts will include a gradual decrease in security hygiene, an increase in blind spots, and an overestimation of the quality of IT operations.
BIAS #3: Availability Bias: “I’ll Worry About What I’ve Heard Of”
Availability bias is the tendency to judge the likelihood of an event by how easily examples come to mind, which usually means the most recent or most publicized incidents. This leads us to overestimate the probability of similar occurrences while ignoring other threats, like wearing blinders.
This will result in misallocated resources, ineffective risk management, and overlooked vulnerabilities.
BIAS #4: Overestimating Expertise and Crisis Management Abilities: “We’ll Handle It When It Happens”
Many IT teams believe they can respond effectively to a cyber crisis despite lacking any real preparation. This overconfidence in crisis response capabilities is dangerous precisely because it is only tested when the crisis is already underway.
Impact: Delayed response, poor coordination, reputation damage, regulatory penalties, and higher recovery costs.
What are some potential solutions?
• Encourage threat awareness by sharing examples of breaches experienced by similar organizations.
• Conduct tabletop exercises that simulate “what if it were us” scenarios to challenge optimistic assumptions.
• Build a culture of continuous improvement. Implement regular reviews of security controls and embed a mindset that “no news” doesn’t mean “no risk.”
• Conduct regular incident response exercises that involve both technical and non-technical teams. Test decision-making under pressure. Identify and fix gaps in processes, procedures, communication, escalation, and external coordination.
Cybersecurity awareness is not just about instructing people on what to do. It is about helping them think differently and challenging their cognitive biases. This is a crucial step in enhancing the human defense layer.
By openly discussing biases such as unrealistic optimism, complacency, availability bias, and overconfidence, organizations can develop smarter, more resilient teams. These teams will be better at seeing risks and acting proactively.
🔐 Remember: The biggest vulnerability in cybersecurity isn’t a zero-day exploit — it’s the assumption that “this doesn’t apply to me” or “we’ll be ready when it happens.”
About the Author
Didier Annet is an Operational & Data Resilience Specialist and a Certified Professional Coach dedicated to empowering individuals and teams to navigate the complexities of an ever-changing digital landscape.
Find him on LinkedIn: Didier Annet
Learn more in his book:
📖 Guide de survie aux cyberattaques en entreprise et à leurs conséquences psychologiques: Que fait-on des Héros ? (French Edition) – Available on Amazon
English version:
“Survival Guide – The Human Impact of Cyberattacks and the Untold Story of Those Who Respond”
“What Happens to Heroes?”
Available on Amazon