A proposal that makes it easier to submit bug reports is close to getting the final stamp of approval – but even when it does, there’s still a lot of work to do.
Edwin Foudil first submitted security.txt as an Internet Draft to the Internet Engineering Task Force (IETF) in September 2017. It’s a format for telling cybersecurity researchers how to interact with a website: by publishing it on your site, you make it easy for them to report bugs, because it tells them who to contact, what to include, and what to expect. Last month the IETF issued a final call for comments on the draft.
The proposal imagines a security.txt file placed in the .well-known directory of a website. That’s where sites are supposed to store common policy documents that visitors can use to understand how to engage with the site. Robots.txt, the file that tells web crawlers what to index and what to leave alone, serves a similar purpose, although it lives at the site root rather than in .well-known.
At a minimum, the security.txt format requires contact details so that people know who to get in touch with when reporting a security bug. It also defines optional fields: a URL for bug reporting instructions and/or a vulnerability disclosure policy, and encryption keys that researchers can use to communicate securely (important when sending sensitive bug details). You can also include the URL of a page that acknowledges researchers for their contributions, and the URL of a hiring page that details security jobs at your organization.
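Putting those fields together, a security.txt file might look something like the following sketch. All of the addresses and URLs here are placeholders, and the exact field names follow the draft as described above:

```
# Served from https://example.com/.well-known/security.txt
# All addresses and URLs below are illustrative placeholders.
Contact: mailto:security@example.com
Encryption: https://example.com/pgp-key.txt
Policy: https://example.com/disclosure-policy.html
Acknowledgments: https://example.com/security-thanks.html
Hiring: https://example.com/security-jobs.html
```

Only the Contact field is required; everything after it is optional, which keeps the barrier to publishing a minimal file low.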
Thinking – or rethinking – cybersecurity relationships
A ratified standard for outlining bug reporting procedures is a great step, but it won’t be enough on its own. Companies are still dropping the ball by not thinking about how to deal with bug reports at all.
One commentator on the draft proposal, the CEO of mobile and IoT security company Copperhorse, had automatically scanned 331 IoT vendor sites looking for a /security directory or a security.txt file. The latest scan, in 2019, found just 0.9% of them serving a security.txt file from .well-known, and only 4.2% offering a /security page or a redirect to their actual security page. Worse still, some companies seemed to be hiding, either by not using a security@ email address at all or by using obscure addresses that seem almost designed to elude researchers.
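The kind of scan described above is easy to reproduce. A minimal sketch, using only the Python standard library, might check the .well-known location first and fall back to the site root (which the draft also permits); the domain names are whatever the scanner is fed:

```python
# Sketch of a security.txt scan: probe the .well-known location,
# then the site root as a fallback. Purely illustrative.
from typing import Optional
from urllib.request import Request, urlopen
from urllib.error import URLError


def candidate_security_txt_urls(domain: str) -> list:
    """Return the locations where a site's security.txt might live."""
    return [
        f"https://{domain}/.well-known/security.txt",
        f"https://{domain}/security.txt",
    ]


def find_security_txt(domain: str, timeout: float = 5.0) -> Optional[str]:
    """Return the first candidate URL that answers 200, or None."""
    for url in candidate_security_txt_urls(domain):
        try:
            req = Request(url, method="HEAD")
            with urlopen(req, timeout=timeout) as resp:
                if resp.status == 200:
                    return url
        except (URLError, OSError):
            continue  # unreachable or missing; try the next location
    return None
```

Running `find_security_txt` across a list of vendor domains and counting the non-None results would give numbers comparable to the survey quoted above.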
In some cases, companies have thought about how to deal with cybersecurity researchers but end up being deliberately combative. Take FireEye, which in 2015 sued a researcher who found critical flaws in its products so that he wouldn’t disclose the technical details. Or PWC, which sent a cease and desist letter to researchers who had contacted it about a flaw and given it three months to fix the tool in question.
There is an appetite for more structured guidelines around security bugs, or at least for dealing with security researchers in a less tone-deaf way. In the US, the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) has proposed a directive that would forbid agencies from threatening researchers who search out and report bugs, which it says is still common in .gov land. Moreover, it would force them to create a security.txt file and keep researchers updated on how they’re dealing with the flaws.
Ratifying security.txt is unlikely to change the security landscape overnight. What’s needed is a bigger change in the way that companies think about bug reporting. The increasing popularity of bug bounty companies that run organized bug finding and reporting campaigns on behalf of their clients is helping to move the needle there, and it’s perfectly possible to point researchers to a bug bounty program in a security.txt file, effectively outsourcing the whole bug finding and disclosure process.
Underpinning conversations about bug reporting, though, are others around liability and publicity. How long should a company ask researchers to wait before disclosing a bug, or does the researcher get to decide? What constitutes a fix, and what happens if the fix doesn’t work? How long should a company get to persuade customers to adopt a patch before a bug goes public? These are questions involving lawyers, technologists, marketing types, and compliance officers. It’s no wonder that we’re still so early on in our efforts to normalize bug reporting, and that companies still make missteps. Standardizing the format that points to the policy is just the beginning of a much longer journey.