Should information be treated like weaponry? The world isn’t sure – and it’s causing cybersecurity companies and researchers real headaches. Hopes for a resolution on export controls were dashed when annual talks concluded in December, leaving some key questions unresolved.
The talks were part of the Wassenaar Arrangement, an agreement between 41 countries to govern the export of dual-use technologies, with a view to discouraging sales to regimes that might abuse them.
History is littered with dual-use technologies, which by definition can be used for good or bad. Nuclear fission can be used to power cities or destroy them, for example. On a smaller scale, chlorine bleach can sanitize swimming pools or be used to make chlorine gas.
Cybersecurity tools can also be constructive or destructive, depending on how people use them. In the right hands, digital intrusion tools can legitimately highlight holes in network defenses. In the wrong hands, they become digital assault weapons, enabling people to listen in on other people’s conversations and steal their data.
That’s why in 2013, the Arrangement was updated to include Internet-based surveillance technologies, thanks in large part to the use of these technologies in countries with less-than-stellar records on human rights.
The concept was well-intentioned: Blue Coat technology has been sold to authorities in Syria, for example, while Italy’s Hacking Team exported its intrusion software to countries including Ethiopia and Sudan. Hacking Team had its Wassenaar license to export intrusion software outside Europe revoked by its government in 2015.
The Arrangement is not a binding treaty, so countries implement these controls at their own pace, in their own way. The US moved to implement the changes in May 2015, when the Bureau of Industry and Security published a proposed rule on the matter.
The cybersecurity industry was less than happy with that idea, arguing that the proposed US implementation was too broad. Vendors created the Coalition for Responsible Cybersecurity to push back against the proposed rules. They were worried about four main areas:
A chilling effect on cybersecurity research
Experts worry that the rule would make it difficult for researchers to test networks and share vulnerability information across borders. Hacker-lawyer and SecTor speaker Brendan O’Connor, who runs his own consulting firm Malice Afterthought, warns that someone who wants to present their new exploitation technology at an international conference could be in a tricky position.
“If I open sourced all that information before I get on the plane, then I’m fine,” he said. “If I don’t until I’m standing on the podium in Amsterdam, then I’m not fine because I travelled with this exploitation information on my laptop across a national boundary.”
Difficulties in making cybersecurity tools available around the world
“The process of getting an export license is at best onerous and at worst impossible for people to do,” said O’Connor. “It becomes an exercise of arbitrary discretion by a government who may or may not have your best interests at heart.”
Difficulty in developing perimeter security technologies
If you can’t sell technologies outside North America, it becomes harder to justify developing them, which could hinder cybersecurity companies’ attempts to do business.
Hindrances to cybersecurity collaboration
The worry here is that communicating cybersecurity information to non-US citizens would contravene the rules.
“Imagine having to do a citizenship check for every single person at every single corporation to whom you sold information,” said O’Connor. “Or even set up a bug bounty program, or sold the commercial version of Metasploit [to]. This is debilitating at best.”
A lot of people voiced these concerns. Cybersecurity and tech firms testified at government hearings. Professors at Dartmouth published a paper on the potential adverse effects of the rule, and members of Congress lobbied for a rethink. The Chaos Computer Club also explained its own concerns in comments to the EU, which implemented Wassenaar’s intrusion tool provisions in early 2015. The scope of the controls and the definitions used were among its key worries.
These uncertainties have led to some strange situations. Researchers have been nervous about participating in hacking competitions overseas, lest they be prosecuted for exporting dual-use technologies without a license.
The government listened to these concerns and said in early 2016 that it would try to renegotiate the intrusion tool provisions in the Arrangement, specifically targeting ‘removal of the technology control’ as a goal. Unfortunately, it couldn’t push through the changes in the talks, which happen annually. This leaves the US in limbo for another twelve months, creating uncertainty for researchers and security companies alike.
So, what happens now? For US and Canadian cybersecurity researchers and firms, not much. For the next year, we’re stuck without any clear decision on the rules. The US government hasn’t implemented its rule yet, and seems set on thrashing things out in the plenary sessions before it does so.
The problem is that nothing is certain, and uncertainty is never a good thing for academics or for companies trying to plan their business activities. With an unpredictable administration due to take office, the whole dual-use export issue remains as clear as mud for the cybersecurity industry.