Hate speech on Telegram is on the rise

David Strom 25 Feb 2021

How divisive platforms are designed and moderated

Earlier this month, Parler went back online, after several weeks of being offline. Its return has me thinking more and more about the ideal platform for divisive content and hate speech.

It appears that there are two essential elements: the ability to recruit new followers to hate groups, and the ability to amplify their message. The two are related: you ideally need both. For all the talk about its hate-mongering, Parler isn’t really the right technical solution. Instead, Telegram has recently stepped into the limelight as a suitable home for promoting hate speech.

This blog post comes out of email discussions that I have had with Megan Squire, who studies hate groups as a security researcher and computer science professor. She gave me the idea when we were discussing this report from the Southern Poverty Law Center (SPLC) on how Telegram has changed the nature of hate speech. It is a chilling document that tracks the rise of hate groups over the past year. But the SPLC isn’t the only one paying attention: numerous other computer science researchers have tracked the explosive growth of these hate groups since January's Capitol riot and other seminal events within the "hate landscape".

Why Telegram?

Telegram’s rise in numbers doesn’t tell the complete story. Telegram has crafted a more complete social platform for distributing hate speech and recruiting new followers. Certainly, Facebook still has the largest userbase, but its tech hate stack (if you want to give it a name) is nowhere near as well developed as Telegram’s. Parler’s tech hate stack is a distant third. Let's compare the three networks below in terms of both amplification and recruitment elements:

| Criteria | Parler | Facebook | Telegram |
| --- | --- | --- | --- |
| Type of service | Microblog | Social network | Messaging+ |
| Coherent and transparent reporting process for hate speech | No | Mostly and improving | No |
| Support email inbox | No | Yes | No |
| Content moderation team | It depends | Yes | It depends (see below) |
| Appeals process | Yes | Yes | No |
| Encrypted messaging | No | Separate app | Built-in |
| Corporate HQ location | USA (for now) | USA | Dubai |
| Growth in English-speaking hate group followers | Unknown | Unknown | Huge growth (SPLC report) |
| Group cloud-based file storage | No | No | < 2 GB |
| Group-based sticker sets | No | No | Yes |
| Bot infrastructure and in-group payment processing | No | No | Yes |


“Telegram is absolutely the platform of choice right now for the harder-edged groups. This is for technical reasons as well as access/moderation reasons,” says Squire. You can see the dichotomy in the table above: most of the moderation features that are (finally) part of Facebook are nowhere to be found or are implemented poorly on Telegram, and Parler is pretty much a no-show.

Telegram’s file sharing feature, for example, “allows hate groups to store and quickly disseminate e-books, podcasts, instruction manuals, and videos in easy-to-use propaganda libraries.” I've included links in the chart above to descriptions on why the bot infrastructure and sticker creation features are so useful to these hate groups.

What about moderating content?

Here we have conflicting information. For this reason, I've labeled the boxes for Parler and Telegram as "it depends." Telegram has said that its users do content moderation, yet in its FAQ it claims to have a team of moderators. Parler's community guidelines document says in one place that it doesn't moderate or remove content, and in another that it does. (My personal guess is that both do very little moderation.)

The picture for Parler is pretty bleak. If it does succeed in keeping its site up and running (which isn’t a foregone conclusion), it has almost none of the elements that I call out for Facebook and Telegram. Using the Twitter micro-blogging model doesn’t make Parler very effective at amplifying its messages (at least, not until some of its personalities can bring over huge crowds of followers) or at recruitment, especially now that its mobile apps have been neutered.

Two technical items are especially useful to Telegram: its encrypted messaging feature and the difference between its mobile app and web interfaces. Much has been written about the messaging features of the different social networks (including in my related post). Telegram both does a better job of protecting its users’ privacy than Facebook Messenger and has much tighter integration between messaging and its main social network code.

The second item is how content can be viewed by Telegram users. To get approval for its app on Google Play and the App Store, Telegram has put in place self-censorship “flags” so that mobile users can’t view the most heinous posts. However, all of this content is easily viewed in a web browser. Parler could choose to go this route, if it can get its site consistently running.

As you can see, defining tech hate stacks isn't a simple process, and it only continues to evolve as hate groups better figure out how to attract viewership.