2020 SecTor keynote speaker Dr Tracy Ann Kosa got into privacy for personal reasons: she suffered her own data breach.
“I was working on my master’s thesis at the University of Manitoba, and my hard drive died,” she recalls. She called a repair company for help. The visiting technician removed her hard drive and, back at the office, scraped all the data he could from it.
“About six months after I graduated, I got a call from a guy in Newfoundland,” she says. “He had bought my hard drive from a third-party retail site, as part of a bucket of spare computer parts.”
Those stories about people uncovering personal data from second-hand drives bought through eBay? They’re all true.
“He opened up my resume and my address book and called me to ask if I wanted him to mail the drive back to me,” she says. “Thanks to that lovely fellow, my career in privacy was born.”
When Kosa’s hard drive died, Google was just starting to sell ads. Mark Zuckerberg had not yet even started at Harvard. This was a very different world, but the experience made her realise that the future of privacy lay in bits and bytes, not paper. She changed her career track, moving from public policy into a PhD in computer science, focusing on the scientific application of privacy principles. Since then, she’s worked as the team lead at the Ontario Information and Privacy Office, director of privacy compliance at Microsoft, and now as staff privacy engineer at Google. But when speaking with us, she made it clear that all her views were her own.
There has never been a better time to work in privacy. It’s a field undergoing unprecedented change as we move from paper to electronic data. Since Kosa’s hard drive disaster, software has chewed up the world. It has foregrounded new problems that we just didn’t think about when our information was stored on paper. In 2001, metadata was something only data scientists talked about. More than a decade later, Edward Snowden explained to a shocked public how much it had changed the game.
Privacy experts like Kosa find themselves trying to adapt to technology and its implications on the fly. She constantly asks herself complex questions. “How do you take these rules that we’ve written, these expectations that users have around how they interact with machines, and the very concrete way in which those machines work?” she asks. “How do you try to shove those three things together?”
As technology development speeds up, big tech companies are in the driving seat. They created technologies that were indisputably cool, but as she points out, no one stopped to think about whether we should build them. Systems ranging from browser fingerprinting to smartphone location tracking and facial recognition all fall under that banner.
What does a privacy advocate do about that? Turning your back on the system altogether isn’t an option. Instead, as someone who works for large technology companies, she gets a seat at the table during privacy discussions, seeing close up how privacy decisions are made. The process isn’t as intentional as you might think. Big tech bigwigs aren’t deliberately trying to work out what they can get away with.
“What’s actually happening is much more nuanced, and probably a little scarier, which is we’re not even thinking about the consequences,” she says. “Some people are beginning to now, largely I would say because of facial recognition and the controversies around that. But for the most part, those conversations until very recently weren’t happening at all.”
Part of this was down to a lack of oversight. “You could see that nobody was looking at the 10,000-foot view,” she says. Instead, small privacy decisions that seem innocuous on their own accumulate like radioactive material until they trigger a chain reaction. “This little slice of the pie might be great, and this one and this one, but then when you put them all together, you have a serious problem.”
The complexity of the privacy issue compounds this problem. The implications of privacy decisions are complex and far-ranging, and growing larger as the technology’s power grows. The frameworks and definitions in which privacy operates are intricate and hard to navigate. Yet the engineers creating the technology aren’t trained ethicists.
How do we make these discussions simpler so that we can address these problems early? Recently, she’s been mulling the idea of distilling privacy discussions down to three questions as a starting point.
“Whenever you’re doing something, whatever it is, ask who’s in the story? Who benefits from this story? Who’s left out of the story?” she says. “That’s it. Just ask those questions.”
This is a concept she’s still refining. Like any good idea, it’s one that will evolve over time. It has to, because the implications of our technology developments are evolving too. In 2020, there’s everything to play for.