Telegram remains one of the most important social media platforms for the extreme segments of the far right. HOPE not hate has previously highlighted how it allows such extremists to congregate and spread violent propaganda – but little has changed since we did so, says Patrik Hermansson.

Taken from issue 43 of HOPE not hate magazine


FOLLOWING THE horrific March 2019 terrorist attack in Christchurch, New Zealand, large US-based platforms such as Facebook, YouTube and Twitter cleared out some of the most visible far-right activists from their platforms. Media and many members of the public had been outraged by the attacker using Facebook Live to broadcast his mass murder in real-time.

As a result, some far-right movers and shakers migrated to the Telegram messaging app, which has since gained significant popularity among these figures and their networks. For far-right activists craving mainstream attention, such as Stephen Yaxley-Lennon (“Tommy Robinson”), the move represented a major loss, as Telegram’s user base in the UK is relatively small.

However, for the terroristic and explicitly fascist parts of the far right, Telegram’s growth has offered opportunities to reach a new audience. Its group chats provide community and a means to organise, while its one-way broadcast channels allow far-right groups to spread propaganda widely, much like Twitter but with far sparser moderation.

“Far more extreme content proliferates on Telegram” 

As a result, far more extreme content proliferates on Telegram than on mainstream platforms. On the platform, users frequently romanticise far-right terrorists, urge others to commit violent attacks and share manuals on how to build weapons. Increased pressure from HOPE not hate and media organisations has had little effect.

In a press release from Europol on 25 November last year, titled “EUROPOL AND TELEGRAM TAKE ON TERRORIST PROPAGANDA ONLINE”, Europol explained that it was collaborating with Telegram and had conducted a “coordinated action focused on the dissemination of online terrorist content” and that “Telegram is no place for violence, criminal activity and abusers”.

Despite the promising words, in practice the concrete effects of this coordinated action have been lacklustre, at least in terms of far-right content. The terror propaganda that is supposedly being tackled continues to spread across the platform with little resistance.

There are some examples of the platform taking action, but these are few and have done little to disrupt the terroristic far right’s use of Telegram. Last year, certain channels stopped being visible on Apple and Android devices. Attempting to view these channels in Telegram’s app now displays a message that the channel breaks the respective company’s guidelines on what content is allowed in its app store.

“Similar weakness is evident in many of Telegram’s other moderation practices” 

By doing this, Telegram acknowledges that it knows what type of content these channels spread, yet instead of removing them it merely hides them on certain devices. The channels remain visible when Telegram is accessed from other devices.

Similar weakness is evident in many of Telegram’s other moderation practices. In March this year, coinciding with a far-right activist making threats against the Al Noor Mosque in Christchurch, which had been attacked a year earlier, certain messages on terroristic Telegram channels disappeared, replaced with a notice that they broke the platform’s guidelines. These were relatively extreme messages that called for violence, particularly against Jews and Muslims.

However, only a few individual messages were removed; the channels themselves were left online, and equally extreme messages on them remained. After a few weeks, the moderation streak seemed to stop and the targeted channels were left to continue encouraging terrorism.

In late February this year, HOPE not hate provided Telegram with a list of terroristic channels on the app. By June, all but two of the approximately 90 channels we provided were still online. Discouragingly, the evidence suggests that the channels that were removed were taken down by the users themselves or as a result of police investigations into the activists behind them.