The Cambridge Analytica scandal and what we need to learn from it to assure internet freedom
The Cambridge Analytica/Facebook scandal that has captured the media spotlight in recent weeks is a reminder that online security threats are amplified by the ever-expanding reach and power of the digital world. They may seem abstract and less urgent than dangers in the physical world, but their consequences are no less alarming. I and many others have been talking about how to address them for a long time, and I want to take this opportunity to underline the two main pillars of defense we can put up to protect ourselves, collectively and individually.
But first let’s recap what happened, and how such a breach could have occurred—if “breach” is really the right word. A Cambridge academic, Aleksandr Kogan, working with the political data firm Cambridge Analytica, built an app that harvested the data of what has now been revealed to be up to 87 million Facebook users. Even those who opted in to taking the app’s survey and sharing their data believed that it would only be used for academic purposes, not for political targeting. Moreover, the app collected information not only from users who had downloaded it, but from their friends as well. Facebook initially tried to shift responsibility for the scandal to Cambridge Analytica and Kogan, for violating its policies and transferring academic data into the hands of a profit-driven company.
While this is technically true, the more important point is that Facebook made it far too easy for companies to gather massive amounts of user data and do essentially whatever they pleased with this information. Even if criminal or other liability can be proved—not at all simple, as in so many of these cases—the damage cannot be undone, or the user data scrubbed away. The internet never forgets, as the saying goes.
Even as he voiced agreement with calls for industry regulation, Facebook’s CEO Mark Zuckerberg denied the problematic nature of his company’s business model. He described Apple CEO Tim Cook’s criticisms of Facebook as “not at all aligned with the truth,” when the reality is that Facebook, a free service, makes its profit not by selling products but by selling information about its users, mostly to advertisers. There have been various creative schemes proposed that would upend the ad-based digital ecosystem, but, for now, it’s important to safeguard privacy in the environment we have. That’s why I advocate for a two-pronged approach of carefully crafted government regulation and smart individual cyber-hygiene practices.
For the former, we have an example, however imperfect, in the data privacy regime entering into force in May in the European Union. The General Data Protection Regulation (GDPR) is too heavy-handed, and parts would violate America’s protected right to freedom of speech, but it nevertheless sets a precedent for taking privacy seriously. Of course, I would prefer that companies regulate themselves, that they recognize their important role in society and all of the responsibility that it entails. Unfortunately, we have seen a failure on their part to do so, and the inevitable reaction is for legislators to get involved to quell consumer outrage and prevent future data misuse.
Apart from any damage done to the companies themselves—their stock price and reputation—there are other potential downsides, such as regulatory capture, in which industry lobbyists from the leading companies will shape legislation to their advantage to shut out competitors. In spite of these tradeoffs, the current situation calls for action on all fronts to demand more transparency and accountability from an industry with enormous social, political, and economic influence. I hope that, as has happened in the past, fear of over-strict regulations pushes companies into doing a better job of regulating themselves.
The second of my suggestions is perhaps less enticing, because it requires real effort from every internet user, rather than seeking refuge in the government. Part of the problem arises from the nature of mass data manipulation. In a classic tragedy of the commons scenario, individuals usually suffer few consequences from being cavalier about their data. In aggregate, however, these troves of consumer information can create enormous social upheaval if they fall into the wrong hands.
Data scientist Michal Kosinski, who served as deputy director of the Cambridge Psychometrics Centre when its data were misappropriated by Aleksandr Kogan, knows very well the power of such data stockpiles. He has been a pioneer in the field of psychometric data analysis and modeling. He and his team at Cambridge refined their algorithms to the point of being able to predict individuals’ positions on the Big Five personality traits (openness, conscientiousness, extroversion, agreeableness, and neuroticism) based solely on their number of profile pictures and friends on Facebook. It is not hard to see how this disturbing degree of precision can be utilized for targeted advertising—and political manipulation.
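Kosinski’s actual models were far more sophisticated, but the core mechanic—fit a statistical model mapping observable profile features to a personality score, then apply it at scale—can be sketched in a few lines. Everything below is a synthetic toy under invented assumptions (the features, weights, and scores are made up for illustration), not his method or data:

```python
# Toy sketch of psychometric-style modeling: fit a linear model that
# maps two coarse profile features to a (synthetic) trait score.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic profiles: [number of friends, number of profile pictures]
features = rng.integers(low=10, high=1000, size=(200, 2)).astype(float)

# Invent a hidden linear relationship plus noise to stand in for a
# trait score (e.g., "extroversion" on an arbitrary scale).
true_weights = np.array([0.05, 0.8])
scores = features @ true_weights + 5.0 + rng.normal(0.0, 2.0, size=200)

# Ordinary least squares: recover the weights from observed data.
X = np.column_stack([features, np.ones(len(features))])  # add intercept
weights, *_ = np.linalg.lstsq(X, scores, rcond=None)

# Once fitted, the model can score any new profile instantly—which is
# what makes such techniques so easy to apply to millions of users.
new_profile = np.array([350.0, 12.0, 1.0])
predicted_score = new_profile @ weights
```

The unsettling part is not the math, which is elementary, but the scale: the same few lines that score one profile score eighty-seven million just as cheaply.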
What can we do then in the face of ubiquitous data collection, whether by tech companies, researchers, or political agents? To start, accept that you will have to give up a degree of convenience for greater security. Read my blog post from a few months back, where I link to tips from Avast on keeping yourself safe online. Another suggestion: next time you are asked to sign off on a company’s terms of service, try to understand what you are agreeing to! No one has time to read hundreds of pages of intentionally opaque legalese, but do your best to be clear on the companies’ policies and privacy record. For example, people were shocked that the Facebook Android app was logging their phone call and text history; well, technically, they agreed to this when they downloaded the app and clicked “install” after seeing its permissions and terms of service.
Facebook and its founder-CEO Mark Zuckerberg are now under intense pressure from users and legislators worldwide about these practices, but only because Facebook is a global giant. Its size gives it tremendous power, but also makes it vulnerable and therefore accountable. What about the thousands, even millions, of other apps you are giving the same permissions to? Lawmakers aren’t going to investigate how every silly game on your phone is also logging your calls, or what its maker is doing with that data.
So think twice before adopting every enticing new product and feature that comes out on the market. Yes, it’s fun to unlock your phone with face recognition software, but is it worth the potential security risks? Alexa and its kin are useful tools, but they also introduce constant monitoring into your home. Nor is it all-or-nothing, a false choice between living in the present without privacy and security or living off the grid in a cave. It’s essential to invest the time to find the right middle path. Decide what’s worth it to you—and remember, it’s not just your individual privacy that’s at stake, but, as we have seen all too jarringly, the future of our open society.