A version of this article originally appeared in Post-Soviet Affairs.
Mariëlle Wijermars is a CORE Fellow at the Helsinki Collegium for Advanced Studies of the University of Helsinki. Her research focuses on the human rights implications of internet policy and platform governance, in particular in authoritarian states.
Tetyana Lokot is Associate Professor in Digital Media and Society in the School of Communications at Dublin City University, Ireland. She researches threats to digital rights, networked authoritarianism, digital resistance, internet freedom, and internet governance in Eastern Europe.
Information and communication technologies (ICTs), and social media in particular, play an important role in contentious politics. Yet scholarship on protests, especially as regards authoritarian states, insufficiently acknowledges the agency of such digital platforms. Platforms are seen as primarily instrumental: operationalized as the space where discontent and political participation happen, as the tools that enable citizens to mobilize against authoritarian rule, or as means for regime actors to suppress public discontent.
These arguments presuppose that citizens and regimes use digital technologies in a rational manner, for instance, based on their features. In reality, the reputation or public framing of a platform also shapes how it is perceived and used. In our 2022 article on the messaging app Telegram, we argue that digital platforms should be understood as actors with agency that participate in contentious politics and can shape the outcomes of political transformations. It is crucial to understand how a platform’s reputation is created and perceived by the various actors involved in protests and what the consequences might be if these perceptions are incorrect.
To understand how and why certain tech companies come to be seen as “harbingers of freedom” and how this reputation impacts their role as actors in contentious politics, we examine Telegram’s performance and practices during the 2020 Belarus protests against presidential election fraud, during which the Belarusian state sought to restrict protesters’ use of the internet. Large technological corporations often promote the idea that they serve to protect internet freedom and democratization around the globe. Yet platforms’ adherence to corporate responsibility principles is highly uneven and PR-driven. In authoritarian contexts, users are even more dependent on platforms claiming to be “safe” from state interference and are vulnerable to significant real-life consequences should their trust in these companies be undermined.
In many post-Soviet countries, the internet is perceived as a “liberator” and as an alternative to traditional media that have been co-opted by elites—a reputation long predicated on the internet’s relative deregulation. Across the region, ICTs have indeed played an important role in activist movements and large-scale protests. The last decade, meanwhile, has seen a significant growth in internet controls across former Soviet states, including repressive regulations specifically aimed at citizen expression online.
The tendency to view internet platforms as “harbingers of freedom” is bolstered by the fact that most top social media networks (except the Russian VKontakte) are run by Google, Facebook, or other Western companies, signaling to post-Soviet users that these platforms may be more concerned than domestic ones with free expression and user privacy. Platforms’ claims, performances, and practices in the context of civic and political activity should be closely studied, along with how their “actorness” is co-constructed by state actors, media, and the public. Doing so allows us to gain a fuller picture of the dynamic unfolding of protest events and helps better explain user motivations and patterns of platform use.
Telegram emerged in the context of the Kremlin’s crackdown on activism and political expression in Russia and, as Ksenia Ermoshina and Francesca Musiani noted in 2021, was likely “created as a tool to protect their communication in the context of political persecutions.” Telegram offers end-to-end encrypted messaging options, in addition to public “channels” and groups, and prides itself on lenient moderation policies. In its public-facing messaging, the platform highlights support for “user security, privacy, and freedom of expression” as part of its mission.
However, the company has so far failed to demonstrate in any verifiable way that it lives up to these commitments. It lacks concrete transparency and accountability measures and bases its credibility on its users’ blind trust in the platform team’s stated credentials. Telegram’s functionality and stated values have proven to be especially popular with activists and dissidents based worldwide, from Iran and Hong Kong to Russia and Belarus. Political dissenters appreciate the app’s relative anonymity and security, its efforts to remain accessible in the face of state censorship, and its lax approach to content moderation. But the same reasons have also earned Telegram a loyal following among extremist and terrorist groups and have made it popular with state-affiliated propaganda operations in authoritarian regimes.
The 2020 Belarus protests offer an opportunity to examine how Telegram and other actors co-construct the platform’s actorness and credibility, helping to create the appearance of political agency. Protest participants make decisions about digital media by weighing functionality against assumed risks. Risk assessment of particular platforms may also be informed (sometimes incorrectly) by how users assess a platform’s reputation. To analyze platforms as actors, we propose the conceptual framework of platform actorness comprising three dimensions: first, practices (enacted actorness); second, performance (framed actorness); and third, perception (perceived actorness). Our research unpacks these dimensions and evaluates how they might shape unfolding protest events like the Belarus uprising.
The first term in our list, “platform practices,” refers to the actions and decisions taken by a platform that determine how users engage with its features, which can in turn potentially impact contentious political events. We differentiate between four categories of practices.
- Technical facilitation of use refers to platforms facilitating access for users. This aspect can involve creating or promoting censorship circumvention tools, such as VPNs or proxies, and may affect accessing or sharing information, including for mobilization purposes. Such features are crucial in the event of internet shutdowns.
- Content moderation involves platforms’ policies and decisions about removing or blocking accounts or content, often at the request of repressive states. Since platforms have the capacity to act as moderation arbiters, their decisions have political salience and moderation decisions are often interpreted as political themselves. While (over)moderation receives the most attention, we argue that refraining from moderation equally constitutes a political act.
- Data sharing refers to platforms’ data storage practices and decisions about sharing user data with third parties, including governments. Data sharing by platforms poses a risk to individual users, groups and movements organizing against repressive regimes, who may be targeted for their activities.
- Finally, platform design and affordances refer to how platforms change their features or functionality. Whether technical (facilitating anonymity), communicative (affording new forms of communication), or symbolic (changes to color schemes), such changes can reshape platforms’ communicative functions and affect the formation and consolidation of (virtual) communities.
The second aspect of platform actorness, “performance,” refers to how platforms communicate their actions and values and how they frame decisions or new features. Studying how platforms “perform” their actorness is important because users typically have limited insight into their workings. Public statements concerning corporate values and decisions can shape users’ understanding of the platform’s “operating logics” and suitability to their aims. As platform reputation factors into users’ risk assessment, misleading or incomplete platform performance may obfuscate vulnerabilities that arise from using it.
Finally, “platform perception” refers to how a platform’s actions and performances are perceived by other political actors. How states perceive platforms’ actorness informs whether they see the platform as a threat and what regulatory action they may take to influence it. It also helps regimes decide whether platforms can be used to their advantage while intervening in political events. How citizens perceive a platform’s actorness informs whether they use the platform, their assessment of what they can (safely) say or share, and how reliable the shared information is, as well as how well the platform shields against manipulation and surveillance.
Analyzing Telegram’s enacted, performed, and perceived actorness in the protests, we found that its performance and practices resonated with Belarusian citizens and led them to form affective connections to the platform, since they perceived Telegram as an ally in their struggle against repression and digital censorship. Meanwhile, the Belarusian state used Telegram’s aversion to censorship and preference for limited content moderation to co-opt the civic protest space, while also using Telegram as a convenient scapegoat to blame for the discontent.
Given Belarus’s heavily state-co-opted media landscape, protesters relied on independent (largely online) media and informal communication channels. As Belarusian users weighed pragmatic considerations against their perceptions of risk, Telegram’s reputation as a free speech champion, as well as the reputation of particular channel and group administrators, played an important role in decisions about using the platform. The platform’s technical facilitation of access enabled community-building, but also helped build a sense of allegiance necessary for sustained use. At the same time, the platform’s lack of content moderation made protesters’ communications, and by extension their embodied selves, vulnerable to state manipulation and provocations.
The case of Telegram demonstrates that digital platforms indeed have agency in ways that are relevant to contentious politics. Affective relations are central to consolidating political communities of action—and, we argue, to citizens’ relationships with platforms. The role of social media platforms in contentious politics in the post-Soviet space is complicated. To understand it, we should not only examine citizens’ perceptions of platforms, but also remain aware that post-Soviet autocracies may co-opt and exploit them. Platforms’ actions and the (lack of) debate around their work in authoritarian states (especially those considered “low priority”) may perpetuate otherwise discredited strands of thought about the role of platforms in democracy promotion.
While our analysis builds on a case of contentious politics in an autocratic regime, our conceptual framework for platform actorness can equally elucidate the triangle of platform-state-citizen relations in other regime types. For example, Telegram’s ambiguous actorness has been highlighted in the context of Russia’s 2022 invasion of Ukraine. Here we see the platform being used both by Ukrainian resistance activists to gather open-source intelligence and evidence of war crimes, and by Ukrainian officials for humanitarian support. But the platform has equally become a home for Russian pro-war bloggers and state propagandists vying for a greater share of the wartime information space. In the same way, Twitter has emerged as a platform that fosters public diplomacy and crowdfunding, but also disinformation. Its reputation has recently moved to more uncertain ground due to Elon Musk’s acquisition of the company.
Digital platforms are increasingly central to political processes, from protests to elections, in democracies as well as authoritarian states. Our platform actorness framework can serve as a useful heuristic for investigating the differing degrees of actual and perceived platform power, transparency, and accountability with respect to different target audiences and markets.