An Ugly Truth: A book review

David Strom 29 Jul 2021

Deeper dives into recent political events and misinformation frame the debate over the role that Facebook should play in our society

New York Times reporters Sheera Frenkel and Cecilia Kang have been covering the trials and tribulations of Facebook for the past several years, and their reporting forms the basis of their new book, An Ugly Truth: Inside Facebook's Battle for Domination. The book draws on hundreds of interviews with key players, showing the roles numerous staffers played in various events and how the company has fallen short in protecting our privacy and in the decisions it has made about the evolution of its products. Even if you have been following these events, reading this book will be an eye-opener. If you are concerned with your personal security or how your business uses its customer data, this should be on your summer reading list.

The book lays out many of the global events where Facebook’s response changed the course of history. 

The book details the Russian-backed hacking of the Clinton campaign emails (and how the perpetrators learned to manipulate Facebook’s advertising system to disseminate disinformation), the Cambridge Analytica scandal, and the role the social network played in perpetuating genocide against the Rohingya Muslims in Myanmar.

“When the internet landed in a country where a social network became the primary and most widely trusted source of news,” the authors write, “Facebook was designed to throw gas on the fire of any speech that evoked an emotion because its algorithms favored sensationalism.”

Learning from the historical record

In each situation, the book details how Facebook staffers sounded alarms and proposed changes to various procedures and algorithms, only to have those proposals quashed by management, even when managers took the issues to Mark Zuckerberg himself.

The issue, Frenkel and Kang argue, is that conflict, disinformation, and hate speech make the company money by increasing engagement and selling more ads, hence the book’s title. And “Facebook’s great strength was the unparalleled engagement of its users,” the authors state. Time and again, both Zuckerberg and COO Sheryl Sandberg were driven by a misplaced assumption that “Free expression demands the only way to fight bad information is with good information.”

This philosophy has recently come under fire, particularly as it affects decisions on whether to get vaccinated against Covid-19. Rather than remove misinformation (or block high-profile sources, as Twitter has done and continues to do), Facebook leaves these posts intact. As US Surgeon General Vivek Murthy stated in a recent report, “Misinformation is often framed in a sensational and emotional manner that can connect viscerally, distort memory, align with cognitive biases, and heighten anxiety. People can feel a sense of urgency to react to and share emotionally charged misinformation with others, enabling it to spread quickly and go viral.” He recommends that Facebook and other social media platforms assess the benefits and harms of their products and take responsibility for addressing those harms.

The authors cite the work of Renee DiResta, Research Manager at the Stanford Internet Observatory, who says that Facebook “had built the perfect tool for extremist groups.” She studied the anti-vaccine movement in detail before Covid-19 and has commented more recently, in this Twitter thread, on ways to sift through misinformation. (Side note: If you'd like to find out more about DiResta's ongoing research, you can sign up to receive weekly briefings.)

The book presents various internal debates on the algorithms behind Facebook’s central News Feed (what the authors call “all the news from your friends that you never knew you wanted”). These debates illustrate how Facebook ultimately treats your personal data: as a commodity that helps it sell ads.

But these deeper dives into the Russian hacks and election misinformation also frame the debate over the role that Facebook should play in our society and the level of trust we place in the platform. Time and again, security specialists at Facebook were left outside the decision chain and absent from discussions about data security. “The fact that the product teams were not part of any security conversations was a big problem,” says Alex Stamos, former Chief Security Officer at Facebook, in reference to the 2016 Russian interference scandals.

Takeaways for running your business better

1. The devil is in the details when it comes to implementing data security

One security staffer said, “They honestly treated security like it was something they wanted taken care of quietly, in a corner, where they didn’t have to regularly think about it.” The authors write: “Zuckerberg talked a good game about the platform’s commitment to security and data privacy, but growth came first, and these issues were an afterthought.”

The authors document how carelessly Facebook has made critical decisions, with policies created organically or as ad hoc responses to particular problems. We’ve talked on this blog before about building security into your practices: be more deliberate about how these policies are set, be transparent about the process, and foster a more collaborative effort among your CEO, your product team, and your IT management.

2. Take diversity and equity issues seriously

The authors cite several examples where Zuckerberg demonstrates a tin ear on this issue, such as how “if you were Black, you might attract ads for predatory loans, or if you were lower-income, you might attract ads for junk food.” At the same time, it's worth noting that Facebook has taken steps to improve diversity within its own staff.

Some action items to consider

Even if you have followed many of our suggestions on protecting your privacy, you may still want to take an additional step. The book shows that swapping out the SIM card in your phone is no guarantee of privacy if the phone is running any of Facebook’s apps (including WhatsApp, Messenger, and Instagram). It documents how Facebook staffers were able to identify the participants in the 2016 Russian operation from the data trails left behind on the hackers’ phones, even though those users tried to mask their locations or used different SIM cards. If you are concerned about your privacy, the only way to stop the platforms from having any visibility into your actions is to delete these and other social media apps from your phone.


Further reading:
You Can Stop Stupid: A book review
Boost your application security knowledge with this guide
