Facebook, Uber and Telegram have all faced challenges to user safety this week – we look at those challenges and what we can learn from them.
Facebook founder Mark Zuckerberg finally broke cover after nearly a week of silence as the fallout grew from the row over how a data company, Cambridge Analytica, was able to access the profiles of some 50m people who hadn’t consented to share their data with it.
Zuckerberg offered perhaps the best reply to one question – was this technically a breach? – when he said that the way in which Facebook had allowed Cambridge Analytica to access data without consent was “a breach of trust.”
For those catching up, the nub of the row, which first surfaced last weekend in reports from the UK’s Observer newspaper (the Sunday arm of The Guardian) and Channel 4 News, is this: in 2013, Aleksandr Kogan, a researcher from Cambridge University, built a personality-testing app that used a feature of Facebook’s API to collect not only the data of the 300,000-odd people who completed the quiz, but also the public data of those people’s friends.
Kogan then passed on that data to Cambridge Analytica, which used it to target American voters with detailed, targeted advertising that many believe helped Donald Trump win the 2016 US presidential election.
Sharing data with third parties without consent in the way that Kogan did is both a breach of trust and a breach of Facebook’s rules.
Facebook, aware that its API allowed apps to harvest the data of users’ friends, closed off that feature in 2014.
Of the many takeaways from this ongoing saga, one that might have escaped your notice is that Kogan’s app used the Facebook login feature, which allows Facebook users to sign in to apps and websites using their Facebook credentials.
It’s easy to see why people choose to do this: remembering passwords is hard, especially when we all have so many sites we have to log in to: your own laptop, your work laptop, your email, your bank, Amazon, eBay, social platforms such as Twitter and Snapchat, your Google password … the list is endless.
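Under the hood, Facebook Login works along the lines of the OAuth 2.0 authorization-code flow: the app redirects you to Facebook with a list of permissions (“scopes”) it wants, and you grant or deny them. Here is a minimal sketch of how an app might build such an authorization URL – the app ID, redirect URI and scope names below are illustrative assumptions, not real values:

```python
from urllib.parse import urlencode

# Hypothetical values for illustration only
CLIENT_ID = "your-app-id"
REDIRECT_URI = "https://example.com/auth/callback"

def build_authorization_url(scopes):
    """Build an OAuth-style authorization URL asking the user to grant `scopes`."""
    params = {
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "response_type": "code",       # ask for an authorization code
        "scope": ",".join(scopes),     # permissions the app is requesting
    }
    return "https://www.facebook.com/dialog/oauth?" + urlencode(params)

url = build_authorization_url(["public_profile", "user_friends"])
```

The key point is that everything the app can later do with your account is governed by the scopes you approve on that consent screen – which is why a quiz app asking for access to your friends’ data deserves a second look.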
What this shows is that you don’t have to have a breach – in the sense of a cybercriminal illegally accessing systems and stealing data – for your data to be taken and used without your knowledge or consent.
We set out some of the ways in which you can lock down your Facebook account earlier this week to mitigate the risk of your data being used without your consent, but here’s another tip: don’t sign in to apps or websites using any of your social sign-ins – not your Facebook credentials, your Twitter credentials, your Google credentials or any other credentials.
Create separate logins for every site and app you use, and create a strong, unique password for each of those.
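That advice is easy to automate: a password manager, or even a few lines of Python, can mint a strong random password for every site. A minimal sketch using the standard-library `secrets` module (the length and character set here are arbitrary choices):

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Return a random password drawn from letters, digits and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One strong, unique password per site, never reused:
passwords = {site: generate_password() for site in ("bank", "email", "shop")}
```

`secrets` uses the operating system’s cryptographic random source, unlike the `random` module, which is predictable and unsuitable for passwords.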
This controversy isn’t going away, but while it rumbles on, there are valuable lessons for us all in what happens to our data and the steps we can take to look after it.
Uber, too, was coming to terms with the risks its technology creates after a woman was killed by one of its cars driving in autonomous mode.
Elaine Herzberg died when she was hit by a self-driving Uber SUV in Tempe, Arizona, on Sunday night after the car’s lidar technology apparently failed to detect her as she wheeled her bicycle across the road.
Herzberg’s death raises some important points. The first is that self-driving cars have some way to go before we’re comfortable with them on our roads. San Francisco suspended Uber’s permission to test self-driving cars in the city last February after several incidents where cars ignored red lights and made unsafe turns in cycle lanes, and Uber has now halted all of its tests of self-driving cars in Phoenix, Pittsburgh, San Francisco and Toronto as well as in Tempe.
It’s understandable that Uber has pulled the tests for the time being: everyone needs to feel confident that they’re safe either in a self-driving car or out on the roads that they use.
However, it’s worth remembering that humans are not good at evaluating risk. The US National Safety Council points out that on average, 103 people die and 11,800 are injured in car crashes in the US every day – yet we cheerfully get in cars and drive to work, drive home, drive to family and friends, despite the very real risk of injury every time we do.
The technology that allows cars to drive without human input is still evolving. We don’t yet have much data on whether it’s safer than having fallible humans behind the wheel, but wide acceptance of the technology if it stacks up in the safety stakes will be life-changing for many people – the disabled, the blind, the housebound.
However, it looks as if the road to that destination might be longer than we thought.
Lawmakers around the world would very much like to have access to what people say in encrypted messages on platforms such as WhatsApp, Signal and Telegram: last year, Britain’s Home Secretary, Amber Rudd, sparked a storm of protest when she said that “real people” don’t need encryption.
And then the Australian prime minister, Malcolm Turnbull, doubled down on that by declaring that “The laws of mathematics are very commendable, but the only law that applies in Australia is the law of Australia.”
Turnbull’s comments came as he introduced plans to compel internet companies to decrypt messages sent with end-to-end encryption, seemingly not understanding that the laws of math, which govern cryptography, can’t really be changed to suit the government’s purpose.
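Those “laws of math” are easy to demonstrate. In a one-time pad, the simplest provably secure cipher, anyone holding the key can decrypt, and without the key the ciphertext is unrecoverable – no legislation changes that. A toy illustration (real messengers use vetted protocols such as the Signal protocol, not this sketch):

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each data byte with the corresponding key byte.
    With a truly random, single-use key this is a one-time pad."""
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))   # random key, used once

ciphertext = xor_bytes(message, key)
plaintext = xor_bytes(ciphertext, key)    # XOR is its own inverse
assert plaintext == message
```

In end-to-end encryption the key lives only on the users’ devices, so a provider compelled to “decrypt” messages simply has nothing to decrypt them with – unless it is also compelled to hand over or weaken the keys themselves.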
The latest skirmish in this ongoing stand-off between technology providers and their users and lawmakers came on Tuesday, when Russia’s supreme court ordered Telegram to hand over encryption keys used by the app to the FSB, Russia’s security service.
The FSB told the court that holding the encryption keys doesn’t compromise the security of the app’s users, saying it would still need a court order to use them to access someone’s messages.
The move is part of Vladimir Putin’s efforts to crack down on terrorism, but Telegram founder Pavel Durov said on Twitter that “Telegram will stand for freedom and privacy.”
This isn’t the end of the road, as Durov will undoubtedly challenge the court ruling further.