The tech giants are partnering on a tool for public good, but critics worry it will ultimately get used for predatory surveillance
If the devastating health and economic ramifications weren’t enough, individual privacy is also in the throes of being profoundly and permanently disrupted by the coronavirus pandemic.
Apple and Google are partnering up to bring technology to bear on COVID-19 contact tracing efforts. The tech giants are laudably setting aside their competitive rivalry to co-develop a solution that combines mobile operating system, Bluetooth and GPS technologies to help us all get past the burgeoning health crisis.
However, in an apparent effort to live down Google’s abjectly poor track record respecting consumer privacy, the Apple-Google partnership is treading lightly to avoid anything that might hint at an undue invasion of individual privacy. In doing so, their proposed solution has a number of glaring technical and privacy-protection shortcomings, according to several technologists I spoke with. In fact, the Apple-Google project has exacerbated a privacy controversy that flared up in Europe in the early stages, one that has more recently been picking up steam in the U.S., as well. Here’s how technologists and privacy experts see things stacking up:
Infected persons will be able to use their iPhones or Android devices to make their status known to a central server, which then correlates an anonymized identifier of the infected person to anonymized IDs of non-infected persons who happen to be in close proximity. The server then alerts the non-infected persons to self-quarantine.
“It is a ‘mostly de-centralized’ approach, where most of the data never leaves the user’s device, in order to protect the user’s privacy as much as possible,” says Alban Diquet, the head of engineering at Data Theorem, a supplier of application security solutions. “The only data the server gets is a privacy-preserving ‘Bluetooth ID’ of the users who agree to share their Bluetooth ID with the server. And the data is deleted after 14 days.”
“The core idea is pretty simple,” says Ambuj Kumar, CEO of Fortanix, a supplier of advanced encryption systems. “All smartphones are constantly transmitting their identities using Bluetooth. Each Bluetooth ID is represented by a set of random numbers, and there is no easy way to reverse map the Bluetooth IDs back to specific phone users. And each phone keeps a log of all of the Bluetooth IDs of phones that come within its range.”
“When a person gets sick, they can choose to reveal that information via their phone identity,” Kumar says. “Voila! That phone identity checks in and Bluetooth communication is used to send alerts to anyone who came into close proximity of the infected person.”
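The flow Diquet and Kumar describe can be sketched in a few lines of Python. This is a simplified illustration under stated assumptions, not the actual Apple-Google specification (the real protocol derives rotating proximity identifiers from daily keys rather than storing raw random IDs); the `Phone` class and its method names are hypothetical, invented here for clarity.

```python
import os
import time

ROTATION_SECONDS = 15 * 60           # broadcast ID rotates periodically
RETENTION_SECONDS = 14 * 24 * 3600   # data is discarded after 14 days

class Phone:
    """Hypothetical sketch of one device in a decentralized contact tracer."""

    def __init__(self):
        self.my_ids = []      # (timestamp, id) pairs this phone has broadcast
        self.heard_ids = []   # (timestamp, id) pairs heard over Bluetooth

    def new_broadcast_id(self):
        # A fresh random identifier: nearby phones can log it, but cannot
        # easily map it back to a specific user.
        rid = os.urandom(16)
        self.my_ids.append((time.time(), rid))
        return rid

    def hear(self, rid):
        # Each phone keeps a local log of Bluetooth IDs that come into range.
        self.heard_ids.append((time.time(), rid))

    def prune(self, now=None):
        # Enforce the 14-day retention window on both logs.
        cutoff = (now or time.time()) - RETENTION_SECONDS
        self.my_ids = [(t, r) for t, r in self.my_ids if t >= cutoff]
        self.heard_ids = [(t, r) for t, r in self.heard_ids if t >= cutoff]

    def ids_to_publish(self):
        # An infected user opts in to uploading only their own broadcast IDs.
        return [r for _, r in self.my_ids]

    def check_exposure(self, published_ids):
        # Matching happens on-device: the server never learns who was nearby.
        published = set(published_ids)
        return any(r in published for _, r in self.heard_ids)

# Two phones pass each other; a third is never nearby.
alice, bob, carol = Phone(), Phone(), Phone()
bob.hear(alice.new_broadcast_id())

# Alice tests positive and chooses to publish her broadcast IDs.
server_list = alice.ids_to_publish()
print(bob.check_exposure(server_list))    # True: Bob logged Alice's ID
print(carol.check_exposure(server_list))  # False: Carol was never in range
```

The point of the sketch is where the data lives: the server holds only the opt-in list of IDs from infected users, while the proximity log and the matching stay on each handset.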
The Apple-Google project is proceeding on two tracks. First, the Apple-Google team released an API to the development community and invited any and all software developers to design contact-tracing apps leveraging Bluetooth IDs. Meanwhile, the Apple-Google team is focusing on designing systems that will be needed to embed a variety of contact-tracing apps into Apple’s iOS and Google’s Android platforms. Presumably this will get done for all of the models in wide use, not just the latest models.
Chloé Messdaghi, vice president at Point3 Security, a supplier of workforce training systems, notes that third party developers can include local government agencies, as well as for-profit software developers.
“Apple-Google is trying to help the pandemic by creating a solution that brings better transparency and the development community together at this time,” Messdaghi observes. “But I’m not the biggest fan of third parties creating the app. In general, many companies and government agencies do not have great security. We are also trusting they will not store the data themselves and/or sell the data.”
Furthermore, technology and privacy experts say it’s not a sure thing that iPhone and Android users will even trust the Apple-Google solution. Some might be all too familiar with Google’s long-standing campaign to collect and monetize health data. Google, for instance, drew a rebuke from the UK’s Information Commissioner’s Office (ICO) after the search giant scooped up records for 1.6 million patients of London’s Royal Free hospital. This too appeared to be for a good cause – it was part of creating a Google healthcare app, called Streams, designed to assist persons recovering from acute kidney injuries. However, the ICO ruled that the transfer of patient data to Google failed to comply with UK data protection law.
Caitlin Gruenberg, a privacy and cybersecurity analyst at CyberGRX, which supplies risk assessment tools, told me she believes a lot of folks might be hesitant to voluntarily use the Apple-Google contact tracing app. “Unless the population is properly educated about this solution and the app is executed properly, the general population may be hesitant to opt in,” Gruenberg says.
Even if a public awareness campaign is carried out effectively, Gruenberg wonders about limitations of the app, as described thus far. “Once all of the privacy concerns have been properly addressed and security controls implemented, I wonder if the data the app collects will be enough?” she ponders. “Or will the solution require more detailed personal data to be effective?”
Political leaders pursue mixed agendas; it’s what they do. Apple and Google reward their shareholders; it’s what they do. With so many moving parts needing to come together, the temptation to tilt toward the familiar could prove to be irresistible. And even if the design of the contact-tracing app stays very, very basic, there are the social media trolls to consider, observes Fortanix’s Kumar.
There’s nothing to stop someone seeking bragging rights from stirring controversy on Twitter or Facebook simply by downloading the app and then falsely declaring themselves infected.
“So assume someone who takes a busy public transport to a big office building for work, and for fun declares themselves infected,” says Kumar. “This will create lots of alerts. Unfortunately, you only need a handful of these trolls who can overwhelm the system with all the false positives.”
Kumar wonders if a contact tracing app that incorporates the oversight of healthcare professionals might not make a lot more sense. “It might reduce the possibility of such abuse,” he told me.
Also, Kumar points out that some form of location-based tracking will have to be part of the Apple-Google contact tracer. That opens up a whole new can of privacy concerns, something the American Civil Liberties Union already is scrutinizing. The ACLU issued this report detailing the many ways proactive smartphone location tracking, even for a good cause, can trample privacy.
For one thing, precisely pinpointing a person’s location, moment-to-moment, pivoting off Bluetooth IDs, is not easy to pull off. The ACLU report points to problems arising in an Israeli contact tracing app, similar to the one Apple and Google are working on:
“The Israeli system apparently acts on the basis of nothing more than an automated look at proximity. In Israel, one woman was identified as a ‘contact’ simply because she waved at her infected boyfriend from outside his apartment building — and was issued a quarantine order based on that alone. Such a system is likely to make many such mistakes; it won’t know that a bank teller is shielded from transmission because they’re behind plexiglass, or that two people close together in a building are actually in separate apartments divided by a wall.”
Beyond the difficulty of designing and distributing an accurate contact tracing app – one that will actually do some good and not add to confusion -- there is a much bigger privacy concern looming. Privacy advocates worry that government authorities and the tech giants inevitably will use COVID-19 as an excuse to intensify surveillance, overly broadly. Critics say this could conjure a slippery slope scenario whereby a proactive surveillance app that gets pushed out by Apple-Google, ostensibly for a good cause, eventually gets used by governments and corporations for manipulative and predatory purposes. It wouldn’t be the first time for Google, at least. The Wall Street Journal reported just last November how Google’s ‘Project Nightingale’ gathered personal medical data from more than 50 million Americans in 21 states, in cahoots with the nation’s second largest healthcare provider, Ascension, without informing the patients. Amazon, Apple and Microsoft are also aggressively pursuing revenue from the healthcare industry, the Journal reported.
As usual, those with the most to lose -- the elderly, the poor and young people -- are the most vulnerable. Sen. Josh Hawley, R-MO, for one, doesn’t trust Apple and Google to do the right thing. Hawley recently wrote a letter to Tim Cook, CEO of Apple, and Sundar Pichai, CEO of Google, demanding that the CEOs accept personal liability for any privacy shortcomings in their new tool. Hawley wrote:
“Americans are right to be skeptical of this project. Even if this project were to prove helpful for the current crisis, how can Americans be sure that you will not change the interface after the pandemic subsides? Once downloaded onto millions of phones, the interface easily could be edited to eliminate previous privacy protections. And any privacy protection that is baked into the interface will do little good if the apps that are developed to access the interface also choose to collect other information, like real-time geolocation data. When it comes to sticking to promises, Google’s record is not exactly reassuring.”
COVID-19 quite clearly is forcing the privacy issue, which could be a good thing. Either governments and corporations will get away with using the global pandemic as an excuse to permanently deepen permission-less, smartphone surveillance -- or consumers will rise up and wrest back control of their privacy – by demanding stronger data privacy regulations.
One or the other seems certain to unfold. I’ll keep watch.