Promoting negative sentiments about good things is just as malicious and dangerous as promoting harmful things
It has become normal to talk about political “tribes” these days, instead of mere parties or policies. The implication is a deeper feeling of affiliation, one that approaches family and is very hard to break. Your entire identity is wrapped up in the tribe and once you are a member, leaving can come with dire consequences.
The rise of online tribes requires new strategies for countering the spread of misinformation. Simply putting accurate information online isn’t enough; it must target the communities most in need of it and do so in a way that will succeed, not push them further away.
But how do we convince people to accept facts when their personal beliefs and doubts get in the way? That is exactly what I discuss with the well-known American pollster Frank Luntz in the latest conversation in my lockdown series.
Notice the trend: the shrinking, increasingly specific criteria for tribal affiliation. The loyalty and fanaticism once reserved for a family connection or a supreme being can now be found in the supporters of a politician or a few policy positions.
The passion in these tribal groups can border on religious, which is dangerous in its own right. When you think your beliefs are not only right, but that those who disagree with them are apostates, it leads easily to escalation and even violence.
Thanks to social media, tribal alliances can now be formed almost instantly, globally, with no shared background beyond the ability to access the internet. This broad availability also makes these tribes vulnerable to manipulation and amplification by bad actors, from con men trying to make a buck to agents of influence spreading disinformation to sow chaos or weaken an adversary.
Another aspect of online tribalism is how it is fueled by the platforms themselves, automatically, algorithmically. Many studies have demonstrated how social media sites act as “radicalization engines,” pushing people toward more extreme content. The system is designed to create engagement, to give people content they like, to keep them on the site as long as possible to sell more ads. It sounds innocent enough, but we see the results.
Did you like a video or group about the dangers of immigration, or about street protests? Here is a similar, popular one, this time with racist content, or one supporting a violent movement. Did you share a post about a controversial medical procedure? The algorithm serves you a few more, but these might attack scientifically settled treatments, or suggest that the real cause of Covid-19 is 5G cell towers.
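The feedback loop described above can be sketched in a few lines of toy code. Everything here, the catalog, the "extremeness" scores, the engagement function, is a hypothetical simplification for illustration, not how any real platform's ranker works. The point is structural: a system that optimizes engagement alone, with no other objective, will keep surfacing whatever engages most, regardless of content.

```python
# Toy model of an engagement-maximizing recommender, purely illustrative.
# Assumption (ours, not any platform's): more "extreme" content tends to
# generate more engagement, so a ranker optimizing engagement alone
# drifts the feed toward extremes without any editorial malice.

def recommend(items, engagement_score, n=3):
    """Return the n items with the highest predicted engagement."""
    return sorted(items, key=engagement_score, reverse=True)[:n]

# Hypothetical catalog: (title, extremeness in [0, 1]).
catalog = [
    ("local news roundup", 0.1),
    ("heated policy debate", 0.5),
    ("outrage-bait conspiracy clip", 0.9),
]

# If predicted engagement grows with extremeness, the most extreme
# item always tops the feed.
feed = recommend(catalog, engagement_score=lambda item: item[1], n=2)
```

Run millions of times a day, even a small bias like this compounds, which is the ripple effect the next paragraph describes.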
Repeat that effect millions of times per day, per hour, and you only need a small percentage of people to buy in to create a powerful ripple effect. Tribes form around these unfounded and often dangerous beliefs, which become part of a shared identity. The believers become more isolated from the non-believers, and antagonistic toward them. The online conspiracy movement known as “QAnon” or just “Q” shows how quickly these things can grow and have impact in the non-virtual world.
We see the political results of these online tribes all around us, and there are other effects as well, dangerous for everyone. A good example is what is often generalized as "the anti-vax movement," people who believe vaccines are dangerous, or even that they are part of a nefarious conspiracy. They come from all backgrounds, from Greens who say vaccines are unhealthy, to religious zealots who don't want to "interfere with the works of God," to anti-government types who view any public initiative with suspicion.
Calling most of them vaccine “skeptics” is a disservice to the honorable tradition of skepticism, which seeks evidence and logic to support theories and assertions. That’s nothing less than the foundation of all science. The conspiracies and baseless rumors most anti-vax sites traffic in are the opposite. These groups existed long before the internet, of course, and before hostile foreign actors got involved fanning the flames. But the internet’s ability to accelerate trends and bring disparate individuals together is unprecedented. As is the ability of outside instigators to easily insert themselves into a tribe online.
That most of these groups are ad hoc and spread out doesn’t make them less dangerous, especially when they become affiliated with a larger tribe, like a political one. That’s what we’re seeing in the polls in the US now, with 50% of Republicans who supported Donald Trump saying they won’t get the Covid-19 vaccine. These are mostly the same people who don’t want to wear masks or obey lockdowns. (There is also considerable overlap with those who say the 2020 election was stolen from Trump, an even more dangerous conspiracy theory.)
Vaccination is based on inoculating a critical mass of people, enough to cause “herd immunity” and the decline and eventual disappearance of the disease. If a sizable bloc of citizens refuse to be vaccinated, it is much harder or even impossible to reach that goal. It’s as if a group of hackers spread a conspiracy that VPNs and antivirus software were really dangerous apps and had to be uninstalled and avoided! (Not to give them any ideas.)
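The "critical mass" mentioned above has a standard back-of-envelope formula: assuming a homogeneously mixing population, the herd-immunity threshold for a disease with basic reproduction number R0 is 1 − 1/R0. The numbers below (R0 = 3, 90% vaccine efficacy) are illustrative assumptions, not measured figures for Covid-19:

```python
# Back-of-envelope herd-immunity arithmetic. The simple threshold
# formula 1 - 1/R0 assumes homogeneous mixing; real epidemiology
# is messier, but the formula shows why refusal blocs matter.

def herd_immunity_threshold(r0):
    """Fraction of the population that must be immune: 1 - 1/R0."""
    if r0 <= 1:
        return 0.0  # each case infects at most one other; outbreak fades
    return 1.0 - 1.0 / r0

def required_coverage(r0, vaccine_efficacy):
    """Vaccination coverage needed when the vaccine is imperfect."""
    return herd_immunity_threshold(r0) / vaccine_efficacy

# Illustrative only: with R0 = 3, two thirds of people must be immune;
# with a 90%-effective vaccine, that means vaccinating roughly 74%.
coverage = required_coverage(r0=3.0, vaccine_efficacy=0.9)
```

With numbers like these, a refusal bloc of even a quarter of the population leaves no margin at all, which is exactly the danger the paragraph above describes.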
Promoting negative sentiments about good things is just as malicious and dangerous as promoting harmful things. It’s a crisis, so it’s not enough to write off the deniers. Somehow, they must be convinced or incentivized, since attempts at coercion only play into their conspiracy theories.
How to do this is the subject of the conversation with my guest on the next episode of "Garry on Lockdown" for Avast — a master class with a master of public opinion and how to measure and influence it, the American pollster Frank Luntz. He recently held one of his famous focus groups with Trump supporters who are wary of getting the vaccine. The results provided some insight into fixing this particular problem, and also into combating online tribalism in general.
Facts aren’t enough if the target audience is too suspicious or antagonistic to listen. One key takeaway from Luntz's focus groups is that the audience must participate in the conversation, not feel like they are being lectured to or given orders by politicians and experts they do not trust. Often they trust social media friends, or even strangers in their “tribe” with no formal expertise at all, over subject-matter experts. To be heard, you have to message from within the group, not only from the outside. It’s a tricky process, but one we must improve at, or more people will fall to the fringes, beyond the reach of traditional communications.
Education, or deprogramming, isn’t enough, and can make things worse if not done correctly. You have to have the right message, delivered in the right way. With that in mind, maybe we can have the algorithms work for us this time by finding the pro-vaccination advertisements that have the best results. As you know, I’m always happier to be working with machines instead of against them!