OSA: ‘Chilling’ impacts on digital/social media content creators

19 Sep 2025 | BY Ruwan Laknath Jayakody


  • 46-65% have modified content strategies, adjusted content practices; 32-45% have explicitly engaged in self-censorship (political satire, cultural criticism); 22-30% avoid content related to sensitive topics (politics/religion/nat. security)
  • Creators have deleted/archived previously posted content that could now be deemed controversial, intentionally depoliticised language, withdrawn from social commentary (esp. on X/TikTok)
  • Content now increasingly filtered, less confrontational, more brand safe
  • Pseudonymous/parody/persona creators and performance artists fear being misclassified, which has led some to retire fictional/satirical accounts
  • Vague terms like ‘false statements’, ‘nat’l security’, and ‘religious feelings’ make it difficult for content creators to distinguish between legal compliance and potential infringement
  • Many creators have withdrawn from producing critical content, not due to disagreement with the OSA’s goals, but due to uncertainty and the fear of arbitrary enforcement
  • Protective mechanisms/safeguards – a transparent appeals process, independent oversight committees – needed


In the wake of the enactment of the Online Safety Act, No. 9 of 2024 (OSA), 46-65% of digital/social media content creators have modified their content strategies and adjusted their content-related practices due to fear, 32-45% have explicitly engaged in acts of self-censorship (political satire, cultural criticism), and 22-30% avoid content related to sensitive topics such as politics, religion, and national security. Several creators have deleted or archived previously posted content that could now be deemed controversial, while others have intentionally depoliticised their language or withdrawn from social commentary, especially on the social media platforms X and TikTok.

These findings were made in ‘A study on the impact of the OSA on social media content creators’ which was authored by K.H.E. Gunasingha (attached to the Peradeniya University’s Arts Faculty’s Fine Arts Department), and published in the National Institute of Social Development’s Journal of Social Sciences’ third volume’s first issue, this month. 

In the digital era, social media has revolutionised the landscape of communication, enabling individuals to create, share, and engage with content on an unprecedented scale. Platforms such as Facebook, Instagram, TikTok, X, and YouTube have become central to the public discourse, shaping not only entertainment and lifestyle trends but also political narratives, social movements, and cultural identities (D. Boyd’s ‘It’s complicated: The social lives of networked teens’ and J. van Dijck and T. Poell’s ‘Social media platforms and education’). 

Within this dynamic ecosystem, content creators ranging from influencers and independent journalists to educators and comedians play a pivotal role in generating and amplifying diverse viewpoints. Their influence extends beyond personal branding to shaping collective perceptions and sparking critical conversations on issues of national and global significance (C. Abidin’s ‘Internet celebrity: Understanding fame online’ and H. Jenkins, S. Ford and J. Green’s ‘Spreadable media: Creating value and meaning in a networked culture’).

However, the rapid growth of digital communication has also introduced significant challenges. The spread of misinformation, hate speech, online harassment, and content that may incite violence or social unrest have prompted governments worldwide to consider regulatory interventions (T. Gillespie’s ‘Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media’ and C. Wardle and H. Derakhshan’s ‘Information disorder: Towards an interdisciplinary framework for research and policy making’). 

Against this backdrop, Sri Lanka introduced the OSA as a legislative framework aimed at addressing these pressing concerns. The Act seeks to create a safer online environment by curbing the dissemination of harmful or misleading information, preventing online impersonation, and protecting individuals and institutions from cyber threats. The OSA is notable for its comprehensive scope and stringent provisions. It criminalises the publication or transmission of prohibited statements, including those deemed false, malicious, or inciteful toward violence or rioting. 

Furthermore, it targets content that threatens national security, incites religious or ethnic hatred, or tarnishes the reputation of public officials. The law also grants wide-ranging powers to regulatory bodies, including the ability to block access to platforms, issue takedown orders, and prosecute individuals for online offences. These mechanisms are intended to uphold public order, social cohesion, and national integrity in the face of digital disruption.

While the goals of the Act may align with the need to ensure accountability and safety in online spaces, they also raise significant concerns regarding the freedom of expression, digital rights, and creative autonomy. This is particularly true for social media content creators who often operate at the intersection of art, opinion, and activism. Their content may touch on sensitive political, cultural, or religious themes, making them vulnerable to regulatory scrutiny. The fear of legal repercussions can result in self-censorship, a chilling effect that undermines the openness and vibrancy of digital discourse (J. Penney’s ‘Internet surveillance, regulation, and chilling effects online: A comparative case study’ and Z. Tufekci’s ‘Facebook said its algorithms do help form echo chambers, and the tech press missed it’ and ‘Algorithmic harms beyond Facebook and Google: Emergent challenges of computational agency’). Additionally, the ambiguity in defining what constitutes a ‘false statement’ or ‘incitement’ leaves room for selective enforcement and the potential misuse of the law.

Legal provisions influence creative practices, editorial decisions, and public engagements. Does this Act foster a climate of fear that stifles innovation and marginalises dissenting voices? The State leads efforts to regulate digital harm, but there is a democratic imperative to preserve open expression in online spaces (D. Kaye’s ‘Speech police: The global struggle to govern the internet’ and E.B. Laidlaw’s ‘Regulating speech in cyberspace’).

The regulation of digital platforms has emerged as a critical arena where state authority, corporate interests, and civil liberties intersect. As social media becomes increasingly central to civic participation and personal expression, it prompts necessary inquiries into the limits of legal oversight. 


Global approaches to online safety legislation


Across the world, governments are navigating the tension between online safety and the protection of civil liberties. Laws and digital safety frameworks such as Germany’s Network Enforcement Act, the United Kingdom’s (UK) Online Safety Bill, India’s Information Technology (IT) Rules, and Singapore’s Protection from Online Falsehoods and Manipulation Act empower regulators to compel the rapid removal of illegal content and enforce platform accountability. These laws are often justified on the grounds of protecting children, preventing hate speech, and countering misinformation. However, critics argue that such frameworks may pave the way for overreach. Gillespie describes this phenomenon as the rise of platform governance, where digital platforms become quasi-regulators under state pressure. Similarly, N.P. Suzor’s ‘Lawless: The secret rules that govern our digital lives’ highlights that legal ambiguities often lead to inconsistent enforcement, risking the suppression of legitimate expression. United Nations Special Rapporteur D. Kaye cautions that vague definitions of “harmful content” can be misused to target dissent, especially in politically fragile contexts.


Freedom of expression and the chilling effect


One of the central concerns in digital regulation is the chilling effect — the deterrence of lawful expression due to the fear of legal or surveillance-related consequences. Penney shows that awareness of surveillance mechanisms correlates with a reduction in political and social engagement online. This effect is especially profound in non-democratic or hybrid regimes. Tufekci emphasises the role of digital platforms as safe spaces for activism and minority expression. However, when regulatory frameworks lack transparency or due process, these platforms risk becoming tools for oppression. P. Chakravartty and S. Roy’s ‘Media pluralism redux: Towards new frameworks of comparative media studies beyond the West’ reinforces this point with examples from the Global South, where governments have leveraged digital laws to silence critics under the pretence of national security or public order.


Role and risks of social media content creators


Unlike mainstream journalists, social media content creators operate independently — often blending entertainment, education, and activism. C. Abidin’s ‘Mapping internet celebrity on TikTok: Exploring attention economies and visibility labours’ explores how the “micro-celebrity” culture is increasingly politicised, especially among creators engaged in social commentary or satire. Their informal and accessible style of communication broadens public discourse but also exposes them to risk. S. Cunningham and D. Craig’s ‘Creator governance in social media entertainment’ characterises content creators as hyper-visible but under-protected, arguing that they often lack institutional safeguards and legal literacy. In South Asia, creators have faced criminal charges or digital harassment for content perceived as controversial. In India, Pakistan, and Bangladesh, there is a pattern of the arbitrary enforcement of IT laws, further illustrating the precariousness of digital expression in volatile environments.


Digital rights, ethics, and democracy


Any attempt to regulate online speech must adhere to the international legal principles of necessity, legality, and proportionality (T. McGonagle’s ‘Fake news: False fears or real concerns?’). When these standards are violated, the risk of overregulation becomes imminent. The Association for Progressive Communications observes that online safety laws, often introduced during national crises, become permanent instruments of censorship, particularly against dissenters and minority voices. In this context, the Global Voices organisation has highlighted the dangers of algorithmic moderation and automated content takedowns. These systems frequently fail to account for cultural nuance, disproportionately affecting marginalised communities such as lesbian, gay, bisexual, transgender, queer, and questioning plus (LGBTQ+) individuals and ethnic minorities who rely on digital platforms for self-expression and advocacy.


SL’s legislative environment and OSA


Sri Lanka’s regulatory approach to media and speech has historically fluctuated between liberalism and authoritarianism. Successive governments have employed emergency regulations, counter-terrorism laws, and cybercrime acts to curtail press freedom. The OSA introduces several problematic provisions. It criminalises vaguely defined acts such as “false statements”, “religiously offensive content”, and “incitement to unrest.” The Act grants disproportionate power to a centralised Online Safety Commission without adequate judicial oversight. Furthermore, these provisions could be used to intimidate creators, particularly those producing satire or commentary critical of the State.

Globally and locally, digital legislation is evolving in ways that simultaneously address and threaten democratic values. While the goal of online safety is legitimate, its implementation must not come at the expense of free expression and creative freedom. The OSA exemplifies the complexities of this balance. For social media content creators who act as cultural curators, activists, and public educators, the Act presents a dual challenge: navigating the boundaries of legality while maintaining their creative voice. The enactment of the OSA has generated significant concern among digital content creators, raising questions about how this legislation affects the freedom of expression, digital creativity, and platform-based livelihoods. There are ambiguities in the legal language and the wide discretionary powers granted to the Online Safety Commission.


Methodology 


Gunasingha’s study employed a mixed-methods research design, integrating both quantitative and qualitative methodologies. A convergent parallel design was adopted. A structured online questionnaire was administered to 100 social media content creators in Sri Lanka, including lifestyle and entertainment influencers, independent political and news commentators, meme page administrators, bloggers and podcasters, and social justice educators and activists. Purposive sampling was combined with snowball sampling to ensure diversity across gender, ethnicity, content genre, and platform type (e.g. YouTube, TikTok, Instagram, X). Participants were recruited via creator networks, digital rights organisations, and direct outreach. The 32-item instrument was categorised into thematic clusters: demographics and platform engagement (e.g. content type, follower count, posting frequency), awareness and knowledge of the OSA, the perceived legal and reputational risks under key sections (e.g. false statements, impersonation, religious offence), behavioural adaptations (e.g. self-censorship, use of disclaimers), and the perceptions of digital rights and expressive freedoms.

A total of 20 semi-structured interviews were conducted with a purposive subset of survey respondents. These participants represented varied content domains such as political satire, religious commentary, and youth activism, and included both micro- and macro-influencers. Key provisions such as Section 12 (false statements prejudicial to national security, which pose a threat to public health or public order, or which promote feelings of ill-will and hostility between different classes) and Section 16 (deliberately and maliciously insulting and outraging religious beliefs, religious feelings, and religions) were discussed.

A qualitative content analysis was conducted on: 75 publicly available social media posts (e.g. stories, reels, tweets), 12 news articles and five press statements by digital rights organisations, and engagement metrics such as likes, shares, and comment sentiment. The limitations of this study are that non-probability sampling limits generalisability, while social desirability bias may have influenced self-reporting.


Results and discussion

Content creator awareness and interpretation of the Act

A total of 74% of the survey participants indicated a general awareness of the Act, but only 28% could correctly identify its key provisions, such as those under Sections 12, 14 (maliciously or wantonly provoking or inciting to cause a riot), 16, and 18 (online cheating by personation). This gap in knowledge was echoed in the interviews, where many creators acknowledged a “surface level” understanding and expressed uncertainty about the legal thresholds of terms like “false statements” and “national security”. Interviewees emphasised that the ambiguous legal language left much room for subjective enforcement. A political satirist noted: “It’s not the law itself that silences me, but the not knowing of where the line is.” This legal precarity serves as a significant deterrent to expressive content, particularly in politically or culturally sensitive domains (A. Perera’s ‘The digital muzzle: Free speech and self-censorship under the OSA’).


Behavioural change and self-censorship


Content related to politics, religion, or national security is the most likely to invoke Sections 12 and 16 of the legislation. As one creator explained: “I used to comment on national events with humour. Now, I just post recipes”. This chilling effect mirrors patterns observed under other digital laws with vague enforcement parameters, suggesting that legislation meant to promote safety may unintentionally foster cultural and political silence (T. Harris and P. Nguyen’s ‘Digital rights in a restrictive landscape: A study of online expression’).


Impact on expressive freedom and digital creativity


A recurring theme in the interviews was that content is now increasingly filtered, less confrontational, and more “brand-safe”. This is particularly detrimental to genres like satire, religious commentary, and activist media, where engagement often depends on bold expression. A striking concern surrounded Section 18, which criminalises online impersonation. While the Section targets fraud and malicious identity use, pseudonymous creators and performance artists fear being misclassified. For instance, a lifestyle blogger using a pseudonym reported: “I’m afraid that my parody character might be interpreted as impersonation. That persona was my brand.” This concern has already led some creators to retire their fictional or satirical accounts, reducing the diversity and dynamism of Sri Lanka’s digital content space (J. Smith’s ‘Identity and creativity in digital cultures: Risks of misinterpretation’ and R. Smith’s ‘Freedom versus safety: Regulating online speech in a digital world’).


Reactions to specific clauses and perceived risks


A nuanced insight emerged around Section 14, which penalises content inciting violence or rioting. Unlike other provisions, this Clause received moderate support: 62% of the respondents agreed that such controls are necessary in the age of misinformation and viral hate speech. However, many interviewees cautioned against the overreach of this provision, noting that terms like ‘incitement’ can be weaponised in politically polarised contexts. Independent journalists and alternative media creators expressed concern that legitimate criticism could be reframed as incitement or subversion. As one journalist stated: “There’s a difference between calling for justice and being accused of instigating unrest. The line has blurred.” This dilemma reflects how digital safety laws may disproportionately affect dissenting voices.


Comparative legislative perceptions


Participants expressed awareness of similar laws in India and Singapore but noted the lack of stakeholder consultation in the Sri Lankan context. Respondents perceived the Online Safety Commission as opaque and politically aligned, raising concerns about enforcement-related impartiality. Interviewees contrasted this with the UK’s Online Safety Bill, which they noted involves greater civil oversight and public engagement, suggesting that procedural transparency could mitigate fear and resistance.


Interpretation


The overall findings point to a paradoxical effect: Although the OSA seeks to enhance user safety and reduce harmful content, it simultaneously imposes implicit constraints on digital creativity, particularly in areas involving political and cultural critique. The lack of clear definitions, the fear of arbitrary enforcement, and the perceived political utility of the Act have transformed the digital landscape into one of hyper-vigilant self-regulation. Unless mitigated through legislative clarification, transparent enforcement, and stakeholder dialogue, the law risks becoming a tool of preventive suppression, limiting not just harmful speech, but also critical discourse that is vital for democratic engagement.


Conclusion


The findings suggest that while the Act is well-intentioned in its goal to address misinformation, online harm, and digital incitement, its broad and ambiguous language has inadvertently fostered a climate of fear and self-censorship among digital content producers. One of the most pressing outcomes identified is the prevalence of self-censorship, particularly among creators involved in political commentary, religious discussions, or social critique. Sections of the Act, especially those addressing vague terms like “false statements”, “national security”, and “religious feelings” have made it difficult for content creators to distinguish between legal compliance and potential infringement. As a result, many creators have withdrawn from producing critical content, not due to disagreement with the Act’s goals, but due to uncertainty and the fear of arbitrary enforcement. Moreover, the study underscores a disproportionate impact on creators who rely on satire, critique, or persona-based storytelling. These forms of content, essential to a diverse and democratic digital space, are increasingly seen as legally vulnerable, leading to a notable decline in both creative experimentation and civic engagement online.

There is a growing tension between state efforts to regulate harmful online behaviour and the constitutional rights to free speech and expression. In the Sri Lankan context, where social media serves as an important channel for alternative viewpoints and grassroots activism, this tension is particularly acute. Hence, there is a need for a rights-based digital policy framework that acknowledges both state security concerns and the democratic imperative of open discourse.

Practically, the findings of this study point to several urgent reforms that should be considered to balance safety and freedom: legal clarity and precision (the Act must be reviewed to replace vague terms with clearly defined legal standards as ambiguity not only opens the door to misuse but also leaves creators vulnerable to arbitrary interpretation), protective mechanisms for content creators (safeguards should be introduced to protect creators engaging in legitimate critique, such as a transparent appeals process, independent oversight committees, or journalistic exemptions for political commentary), stakeholder engagement (policy reform should include consultations with content creators, digital rights advocates, legal experts, and the civil society to ensure that the law reflects both public safety and civil liberties), and public awareness campaigns (many content creators operate without full understanding of their rights or the legal framework and therefore, educational initiatives can empower them to navigate the digital space more confidently).

Without these critical changes, the OSA risks creating a chilling effect, a phenomenon where the fear of potential legal repercussions leads to pre-emptive silence, stifling Sri Lanka’s vibrant, diverse, and politically active digital culture. Responsive digital legislation that is both protective and permissive, upholding public safety while safeguarding democratic freedoms, is essential.
