Roger Dingledine of Tor Project talks privacy and COVID-19 apps

Malea Lamb-Hall 13 Aug 2020

A key figure at the heart of the debate around online privacy, Dingledine gives us his views on staying safe in the age of COVID-19

Roger Dingledine is a key figure at the heart of the debate around online privacy. In 2006, Roger co-founded the Tor Project, a highly influential nonprofit that develops free and open-source software to protect people from censorship, tracking and surveillance online.

Today, Roger is president of the project and respected as a leading researcher in the field of online anonymity. He will be a keynote speaker at CyberSec&AI Connected this October — a virtual conference organized by Avast to discuss the latest research and thinking around AI and privacy. We spoke with Roger to preview his talk and get his views on privacy in the age of COVID-19.

Do you think the general public is far more educated about online privacy these days? Or is it still too abstract a concept for many people to worry about?

When we first started working on Tor in the early 2000s, one of the challenges was getting people to care about privacy. But after the Snowden revelations and the Cambridge Analytica scandal, and more recently the outrage about Silicon Valley pushing facial recognition technologies, I think we've turned a corner.

The general public now understands that privacy is about who you are as a human being: it's all the data that defines you, and how dangerous that can be if abused.

The challenge now is to build real solutions that scale to help everybody around the world, that protect us from both over-reaching companies and over-reaching governments, and that align with user incentives and user flows so they aren't fighting against what users are trying to do. Technology alone cannot solve this problem: we need social and political change too.

Part of the response to COVID-19 has been the use of tracing apps to help monitor the spread of the virus and recognize patterns of infection. This has both positive and potentially negative consequences in relation to AI, cybersecurity, and data. Is the speed at which these solutions are being pushed out a concern?

I'm always worried when big companies see another chance to push their own agenda on us, yes. Done poorly, this is yet another opportunity for companies and governments to build comprehensive databases about our friends, our social graphs, our movement habits. Over and over the trend is that some crisis emerges, and some group uses it as an excuse to try to strengthen their controls over society, whether it's for political gain or financial gain (if you can even draw a line between the two anymore).

The part where I'm optimistic here is that several groups of scientists have shown that we can build privacy-preserving COVID tracing apps, and they're actually getting adopted in some European countries. The intuition behind these designs is that, rather than having every phone tell the central database everywhere it has been, instead, each phone broadcasts a sequence of random numbers. If you later test positive, you can publish the random numbers your phone saw so then everyone else can privately, on their own phone, check to see if any of the numbers match the ones their phone sent out.
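To make that intuition concrete, here is a minimal sketch of the decentralized design Roger describes, written in Python with hypothetical names (Phone, broadcast, check_exposure): phones exchange random tokens over the air, and all matching happens locally on each device rather than in a central database.

```python
import secrets

class Phone:
    """Toy model of the decentralized tracing idea described above:
    each phone broadcasts random numbers and remembers what it heard."""

    def __init__(self):
        self.sent = set()    # random numbers this phone broadcast
        self.heard = set()   # random numbers received from nearby phones

    def broadcast(self):
        # A fresh random token; real protocols rotate these frequently
        # and derive them from keys, a detail omitted in this sketch.
        token = secrets.token_hex(16)
        self.sent.add(token)
        return token

    def receive(self, token):
        self.heard.add(token)

    def tokens_to_publish_if_positive(self):
        # Following the description in the interview: a phone whose owner
        # tests positive publishes the tokens it saw nearby.
        return self.heard

    def check_exposure(self, published_tokens):
        # Every other phone checks locally whether any published token
        # matches one it sent -- no central database of locations needed.
        return bool(self.sent & set(published_tokens))


# Two phones near each other exchange tokens.
alice, bob = Phone(), Phone()
bob.receive(alice.broadcast())
alice.receive(bob.broadcast())

# Bob tests positive and publishes the tokens his phone heard.
published = bob.tokens_to_publish_if_positive()

# Alice checks privately, on her own device, whether she was exposed.
print(alice.check_exposure(published))  # True
```

Deployed protocols such as DP-3T and the Apple/Google exposure notification framework invert which side publishes (the diagnosed user uploads the keys behind the tokens they broadcast, and others match against tokens they received), but the privacy property is the same: no central database of who was where.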

In June, I moderated an online panel of experts looking at the intersection of COVID tracing apps and the international protests against systemic racism.

Your presentation for CyberSec&AI Connected is entitled “Surfing the Web Securely and Privately, Using Tor Anonymizing Network.” Could you give us any insight into the kinds of things you’ll be addressing?

I want to give people a better intuition about tracking, surveillance, and censorship online, as part of explaining why Tor's "distributed trust" and transparency are important building blocks for strong privacy. Tor is a free-software anonymizing network that helps people around the world use the internet in safety. Tor's 7,500 volunteer relays carry traffic for millions of daily users, including ordinary citizens who want protection from identity theft and prying corporations, corporations that want to look at a competitor's website in private, people around the world whose internet connections are censored, and even governments and law enforcement.

I plan to talk about online surveillance and how to avoid it, and about the difference between network-level security ("where your internet traffic goes") and application-level security (e.g. "what your browser gives away about you"). I'll also bring in internet censorship: how it isn't just a problem for far-away countries, and how the technical mechanics of surveillance and censorship are more similar than people realize.
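As a concrete illustration of the network-level half of that distinction, here is a minimal sketch of pointing one application's traffic at Tor's SOCKS5 proxy. It assumes a Tor client is already running locally on its default SOCKS port 9050 and that the requests library is installed with SOCKS support (pip install requests[socks]).

```python
# A minimal sketch: routing one application's traffic through a local
# Tor client's SOCKS5 proxy (Tor listens on 127.0.0.1:9050 by default).
import requests

TOR_PROXY = "socks5h://127.0.0.1:9050"  # socks5h = resolve DNS through Tor too

session = requests.Session()
session.proxies = {"http": TOR_PROXY, "https": TOR_PROXY}

# check.torproject.org reports whether the request arrived via a Tor exit.
resp = session.get("https://check.torproject.org/api/ip", timeout=30)
print(resp.json())  # e.g. {"IsTor": true, "IP": "<exit relay address>"}
```

Routing traffic this way only addresses where the traffic goes; application-level leaks such as cookies and browser fingerprinting are a separate problem, which is why Tor Browser modifies the browser itself rather than simply proxying it.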

What are some of the more recent trends and developments around privacy that have caught your eye?

I think the continued trend toward end-to-end encryption is critical for future society to be safe. We made a giant leap ten years ago when we succeeded at getting the world to think of HTTPS as an ordinary security layer that everybody should expect, rather than thinking of web encryption as something scary that only bad people would want. But we're still fighting that same fight today, with Signal and WhatsApp trying to offer safety to their users while governments drum up fear about how civilization will collapse if they can't decrypt anything and everything.

The reality is that giving people real encryption makes society safer, not less safe. That's because taking encryption away from the masses hurts ordinary users without slowing down the bad people. Security and privacy go together; they're not opposites.

To read more of our interview with Roger, check out the full interview.


CyberSec&AI Connected is a virtual event taking place on October 8, 2020, that can be attended from anywhere in the world. To see the packed agenda, including speakers such as Chess Grandmaster Garry Kasparov, visit the conference website.
