Each year, the guardians of the English language over at Merriam-Webster must accommodate new words in a changing world. We revel at the thought of conservative editors sitting in a dusty office, wincing as they write up the definitions for ‘humblebrag’, ‘woo-woo’, ‘binge-watch’ and ‘photobomb’. In 2018, we think it’s time they added another: ‘breach fatigue’.
Data breaches seem to be getting worse. If you open your favourite news site and find yourself breezing past the latest multi-million record compromise without even bothering to facepalm, then you aren’t the only one. After all, a person can only take so much weak sauce.
One authoritative guide to historical data breaches is the dynamic chart over at the Information is Beautiful data blog. Every data breach is inherently ugly, yet this site manages to make them pretty. The data, gleaned from databreaches.net, the ID Theft Center, and press reports, extends back to 2005 and shows breaches increasing in both frequency and severity over the years.
After around 2010, the bubble chart turns into a veritable spume of breaches, with new ones constantly surfacing. The number of large compromises also increases, thanks to mega-breaches from the likes of Yahoo, Spambot, River City Media and Friend Finder. The final metric, sensitivity, grows too, with more breaches over time revealing highly sensitive data.
More scrutiny = more discovery
This may not be a sign that data breaches are happening more. Instead, some experts at SecTor believe that they may simply be more visible. We may be paying more attention to data breaches for a couple of reasons. One of them is the evolution of data breach notification rules.
Most states in the US have some form of data breach notification law, but many of them have evolved over the last few years to include different definitions for terms like personal information. For example, California, home to nearly an eighth of the US population, expanded its definition with Senate Bill 46 in 2014. In many cases, these changes alter the notification requirements for companies suffering data breaches.
Another suggested reason for this increased visibility is that we are simply better at finding breaches because our security tools are improving, and because we are looking more for them. We have people like Chris Vickery, a self-taught security researcher who made a name for himself finding massive data sets that people had left out in the open for anyone to see.
Why are people leaving these data sets out in the open? It’s because the cloud enables them to. Vickery couldn’t have found nearly 94 million Mexican voter registration records unless someone had left them on an Amazon cloud server in plain text. He couldn’t have discovered River City Media’s 1.4bn records unless they had been left publicly exposed online.
The evolution of public cloud infrastructure makes it possible for people to store all kinds of information online, and they don’t always understand the tools that they are using to do it. Throwing around large files full of sensitive information on infrastructure that you don’t understand is likely to result in a breach. SecTor speaker Sean Cassidy has pointed out this problem before, and spoke about it at the 2017 conference.
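Misconfigured storage permissions are one common way such exposures happen. As a hypothetical illustration (the grant data below is invented, loosely modelled on the shape of Amazon S3 ACL grants; this is a sketch, not a real audit tool), a script can flag a bucket as world-readable when any of its access-control grants gives READ access to an "all users" group:

```python
# Hypothetical sketch: flag storage buckets whose access-control grants
# make them world-readable. The grant dictionaries mimic the shape of
# S3 ACL grants, but the example data here is invented for illustration.

PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def publicly_readable(grants):
    """Return True if any grant gives READ access to a public group."""
    for grant in grants:
        grantee = grant.get("Grantee", {})
        if (grantee.get("Type") == "Group"
                and grantee.get("URI") in PUBLIC_GROUPS
                and grant.get("Permission") in ("READ", "FULL_CONTROL")):
            return True
    return False

# Invented example grants for two buckets
private_grants = [
    {"Grantee": {"Type": "CanonicalUser", "ID": "owner-id"},
     "Permission": "FULL_CONTROL"},
]
exposed_grants = private_grants + [
    {"Grantee": {"Type": "Group",
                 "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
     "Permission": "READ"},
]

print(publicly_readable(private_grants))  # False
print(publicly_readable(exposed_grants))  # True
```

The point of the sketch is how little it takes to check: the exposure Vickery keeps finding is usually a single permissive grant, not a sophisticated hack.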
More data = more compromise
Then, there’s the fact that there is simply more data to steal. Dave Millier, CEO of Uzado, believes that there is so much more data now that companies simply aren’t taking as much care to protect it as they should.
Allison Miller, who was product manager for security and privacy at Google when we interviewed her in November 2017, believes that the rate of breaches is in fact speeding up, in large part because more people are collecting data that could result in a breach. Companies are emerging that are entirely data-driven and natively mobile, enabling them to collect new kinds of data in new ways.
Uber, which lost data on 57 million customers and 600,000 drivers in November 2017 and then tried to cover it up, is a case in point. While the company didn’t spill geolocation data on its users during that hack, it collects vast amounts of information including customer location data via its mobile app.
The creation of massive data lakes and the growing opportunities for user or administrator error combine with other factors, including increased attacker automation. Iain Paterson, managing director of Cycura, says that hackers are rivalling enterprise IT departments in their sophistication, enabling them to gain access to more data than they did in the past.
So, there’s a perfect storm when it comes to data breaches. Whether or not more are happening, they are hitting the headlines more frequently. How do we stop them?
Expensive tools won’t stop the rot, says Paterson. The answer is the same as it always has been: better cybersecurity hygiene. Software patching, application whitelisting, proper access management, multi-factor authentication, application hardening and user training can all help to stop data leaking from an organization, as the Australian Signals Directorate explains in its ‘Essential Eight’.
Until organizations take notice, we’ll continue to see more data breach train wrecks. Don’t customers deserve something better?
See SecTor’s experts talk about the increasing frequency and severity of data breach stories and what lies behind them here: