Telegram – a tool for hostile actors or a force for good?
Telegram has been no stranger to media scrutiny since its inception in 2013, owing to controversy over the app's adoption by extremists and terrorists around the world. The app was a favoured messaging and propaganda dissemination tool for Daesh terrorists globally; it has been used as a safe haven by the far right and was recently pivotal in facilitating the 'storming of the Capitol'.
Yet the same mechanics that make it an appealing proposition to extremists and terrorists (such as end-to-end encryption and anonymity) have enabled users to employ Telegram as a tool for social good. The app has been used in protest settings across the globe in areas such as Hong Kong and Belarus, playing a central role in coordination and communication. In this post, we explore whether Telegram is the tool of global bad actors or a force for good.
What is Telegram, and how does it work?
Telegram is a cloud-based instant messaging service, launched in 2013, that competes with other messenger platforms such as WhatsApp and Facebook Messenger. Its primary USP is the security and privacy it offers users. The platform supports end-to-end encryption, self-destructing messages, and number cloaking (allowing users to disguise their mobile numbers from other participants in channels and groups).
- 35 million Monthly Active Users (MAU) in March 2014
- 400 million MAU in April 2020
- Telegram was in the top 10 most downloaded apps globally in Q2 and Q3 2020, with ~80M downloads in each quarter
- July 2019 saw a 323% year-on-year increase in first-time Telegram installs in Hong Kong, during pro-democracy protests
- Facebook Messenger had over 3x as many MAU in October 2020, while WhatsApp had around 4.5x as many MAU during the same month
- The app is currently banned in Iran, China, and Pakistan (Russia lifted its largely ineffective ban in June 2020)
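To make the security features above more concrete: in an end-to-end encrypted chat, only the two endpoints hold the key, so the relaying server sees nothing but opaque bytes. The toy Python sketch below illustrates the general principle only; it is emphatically not Telegram's actual MTProto cryptography and is not secure for real use.

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from the shared key (SHA-256 in counter mode)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the message with a key-derived stream; toy scheme for illustration only.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR with the same keystream reverses the operation

# Only the two endpoints know shared_key; a relay server handles ciphertext only.
shared_key = secrets.token_bytes(32)
ciphertext = encrypt(shared_key, b"meet at the station")

assert ciphertext != b"meet at the station"                    # server sees only this
assert decrypt(shared_key, ciphertext) == b"meet at the station"  # endpoint recovers it
```

The point of the sketch is the trust model, not the cipher: because the key never touches the server, authorities compelling the platform to hand over traffic would receive only ciphertext.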
Users can also create and contribute towards channels and groups, which is a key distinguishing feature of Telegram. Channels have traditionally been used as one-way, broadcasting platforms, allowing individuals, brands or organisations to publish content to an unlimited number of subscribers.
Groups are community spaces that can have up to 200,000 users and are built for active engagement and participation from all group members. Links to groups and channels can also be shared externally, allowing for external promotion of these community spaces. While the core functionality of Telegram is not new, the messaging app has boomed in popularity since 2013 thanks to its distinctive features, with over 400M monthly active users currently.
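The one-to-many broadcast model described above is also exposed programmatically: a channel owner can post via Telegram's public Bot API with a single HTTPS call to the `sendMessage` method. The Python sketch below (standard library only) builds such a request; the bot token and channel name are placeholders, and a real token would come from @BotFather, with the bot added as a channel admin.

```python
import json
import urllib.request

# Placeholder credentials -- hypothetical values for illustration.
BOT_TOKEN = "123456:ABC-PLACEHOLDER"
CHANNEL = "@example_channel"  # hypothetical public channel username

def build_broadcast_request(token: str, chat: str, text: str) -> urllib.request.Request:
    """Build (but do not send) a Bot API sendMessage request.

    A single call publishes the message to every subscriber of the channel --
    the one-to-many broadcast model Telegram channels are built around.
    """
    url = f"https://api.telegram.org/bot{token}/sendMessage"
    payload = json.dumps({"chat_id": chat, "text": text}).encode("utf-8")
    return urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )

req = build_broadcast_request(BOT_TOKEN, CHANNEL, "Hello, subscribers!")
# urllib.request.urlopen(req)  # uncomment with real credentials to actually post
```

The asymmetry is the point: publishing costs the broadcaster one request regardless of whether the channel has a hundred subscribers or a million, which is what makes channels attractive to media outlets and protest organisers alike.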
The positive side of Telegram
Telegram was the platform of choice throughout the Hong Kong protests, allowing the public to coordinate and spread vital information without fear of the authorities gaining access to sensitive details, such as names and phone numbers, that could later be used for persecution.
Additionally, the app has been widely adopted by Belarusians during protests against the de facto dictatorship of Lukashenko after the recent fraudulent elections. Nexta Live, one of the most highly subscribed-to channels on Telegram, has been the nucleus for coordinated and informed responses, even during the complete shutdown of the internet by the government after the elections.
The app has also been used to steer large groups of Belarusian protestors away from danger and conflict with the police. Crowd organisers and participants were able to respond in real time to updates in community channels, such as advice to disperse and avoid clashes with riot police. It is hard to deny the tangible, live impact the messaging app has had in supporting the coordination of protest movements globally.
The dark side of Telegram
But it is precisely the features that protected Hongkongers and Belarusians that attracted terrorist groups to the same platform, which has built a reputation for fuelling terrorist and extremist communications across the globe. A report by HOPE not hate published last year explored the prevalence of extreme far-right channels on Telegram, recording a surge in the platform's popularity among these groups following the Christchurch massacre of March 2019. In an article published in September 2020, HOPE not hate state that Telegram remains one of the most important hubs for these groups, due to its security features and its lack of proactivity in removing them from the platform.
Telegram groups can be small in size but still represent serious risks. The group 'Atomwaffen' used Telegram to spread violent propaganda, and several members have been convicted on charges including planning terrorist attacks, hate crimes and murder. The app's reposting features enable interconnectivity between these far-right groups, allowing them to grow from disjointed entities into woven, unified networks.
Telegram in the context of social change
Telegram can be seen as both a catalyst for good and as a tool for harm. For efforts that seek to achieve long-term, sustainable outcomes, Telegram does show potential for enabling proactive communications and being a safe space for discussion, knowledge-sharing and action. Beyond global protest, it is possible to envisage this model being applied to appropriate social campaigns aiming to inform, unify and empower vulnerable audiences over time, such as in countering violent extremism or in the fight against the spread of public health disinformation.
For issues related to information ecosystems, capacity building and resilience to disinformation, there could be strategic advantages to independent media outlets building a presence on Telegram, using it to disseminate high quality, accurate information. The benefits of the broadcasting capabilities are immediately obvious, while Telegram groups could facilitate a more personal relationship between media outlets and their subscribers.
What does the future hold for the messaging app?
More recently, Telegram has started to take more responsibility for the channels that exist on the platform. In 2019, many Daesh-connected groups and supporters were removed in a wide-reaching purge that dealt significant blows to branches of terrorist networks. Telegram also hosts channels such as ISIS Watch and Stop Child Abuse, which report on the number of bots and channels being removed daily.
However, if Telegram can thwart an entire terrorist community from hijacking its platform to further their aims, why didn't this happen sooner, and why hasn't the same approach been applied to other extreme threats and online harms?
While platforms such as Telegram are investing in efforts to ensure they are not being used by hostile actors, there are significant challenges to overcome, such as the sheer scale of moderation and the work required to prevent platforms from being hijacked in the first place. Bad actors spread their presence across multiple alternative communication channels, and this work is applicable across the entire digital spectrum, not just Telegram. If ground can be gained in these areas, the positive uses of these technologies, such as coordinating and supporting genuine protests, could one day begin to outweigh the negatives.