No one will remember you for the mistakes you make. It’s how you deal with them afterwards that defines you. The most mature people own and correct their slip-ups, learning from them to avoid future repetitions. When it comes to cybersecurity, though, Mark Sangster says that companies aren’t that good at the learning part, meaning that the corrections are often only cosmetic.
The VP and industry security strategist at security firm eSentire spent much of this year thinking about why we fail to learn from our cybersecurity mistakes while writing his debut book, No Safe Harbor: The Inside Truth of Cybercrime and How To Protect Your Business. He launches the title this October in his talk at this year's virtual SecTor conference: Only After Disaster Can We Be Resurrected: Field Lessons in Cyber Incidents.
The book looks behind the scenes of several data breaches, documenting oversights that the news headlines miss. He explains how fissures in our corporate culture allow cyber criminals to worm their way past our defences.
Scapegoats and easy answers
Our cybersecurity mistakes often stem from deep-seated human weaknesses, Sangster explains. People struggle with several biases when investigating data breaches, each of which prevents them from learning properly from their mistakes and hardening their defences against future attacks. One of the most pernicious is outcome bias, he warns: we look for a person to blame because it lets us explain the problem quickly. Blaming stops us from tracking down the root cause of our cybersecurity slip-ups.
“We do a great job of blaming but we don’t do a good job of learning,” he explains. “You’re not really solving the problem, which means you’re likely to repeat it.”
The problem with scapegoats is that they're rarely the sole cause of an incident. Security mistakes are usually the product of many decisions taken further upstream, spanning business factors as well as technical ones.
Failing to look deeper and identify a chain of cause and effect can lead to hindsight bias, in which we exaggerate our ability to predict an event. 'Sure,' a CISO says after an incident occurs. 'Anyone could have seen that this was going to happen if we didn't patch our software.' Except no one did, which is why the incident happened in the first place.
We tend to look for scapegoats and easy answers on well-trodden paths because decision makers inside a company often share cultural and professional backgrounds that cause them to think the same way.
“It’s not just micro-groupthink within the organization. It’s macro-groupthink across industries,” Sangster warns. If most other companies handle cybersecurity a certain way, then others fall into line and structural flaws develop because no one looks for them.
Broadening our perceptions
In some cases, these flaws can be physical and life-threatening. In the book, Sangster cites a famous disaster-that-almost-was: the 601 Lexington skyscraper in Midtown Manhattan. The building, erected in 1977 as the Citicorp Center, was built on stilts to accommodate a church underneath. An engineering student spotted a weakness in its design that could have collapsed it in strong winds. The structural engineer had overlooked the fault, and crews rushed to reinforce the building before high winds could bring it down, potentially taking out several city blocks.
Stopping the cycle of mistakes means investigating further, stepping back and looking at all the factors that led to an incident. That can lead us down some uncomfortable paths in which we ask difficult questions. Why did the company cut its budget for this change management process but not that one? What stopped employees from escalating a suspicious physical security incident to their superiors? Is the company investing in the right kind of cybersecurity awareness training? And why, really, is the RDP port left open by default on Windows boxes that don't need it?
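That last question also lends itself to a quick technical check. As a minimal sketch (not taken from Sangster's book, and using placeholder host addresses), a few lines of Python can flag machines that still accept connections on the default RDP port, 3389:

    import socket

    def rdp_exposed(host: str, port: int = 3389, timeout: float = 2.0) -> bool:
        """Return True if the host accepts a TCP connection on the RDP port."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # Placeholder addresses: swap in hosts you are authorised to scan.
    for host in ["10.0.0.12", "10.0.0.15", "10.0.0.23"]:
        print(f"{host}: RDP {'exposed' if rdp_exposed(host) else 'closed or filtered'}")

Even a crude check like this surfaces the kind of exposure that the harder 'why' questions then have to explain.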
To explore security problems more deeply, it's time to shake things up by encouraging fresh perspectives. You can use several mechanisms to do that. "One of them is an empowered employee base that can report these incidents or make recommendations," Sangster says. We need to circumvent incumbent power structures and listen to those who don't have a voice. After all, it was an engineering student who discovered the flaw in the Citicorp Center and struggled to get the problem onto the senior engineer's desk.
Organisations have even created formal methodologies to promote this kind of thinking by forcing dissent. Sangster points to the Israel Defense Forces' 'tenth man' strategy: if nine out of ten people agree on a course of action, it's the tenth person's role to play devil's advocate and challenge that consensus.
Another tool is regular vulnerability assessment, perhaps in the form of a red/blue or purple team exercise to highlight flaws in a company’s cybersecurity strategy.
A third is to encourage regular audits and attestations, but these should be backed up by self-attestation, in which an officer of the company personally vouches for a cybersecurity statement. This is something that the New York Department of Financial Services (DFS) already includes in its 23 NYCRR 500 cybersecurity regulations for financial services companies.
Perhaps that kind of executive accountability will help companies to close the circle, creating a cycle of constant improvement. It’s no coincidence that the one place we tend to see companies learning from their mistakes and detailing improvement plans is in more regulated industries where they are subject to more scrutiny.
Hear Sangster’s stories from the cybersecurity front lines in his talk at SecTor’s virtual conference in October. You can register here.