The European Commission (EC) adopted the Digital Services Act package (DSP) in 2022 (The Digital Services Act Package, n.d.), opening a new era of internet governance in the 21st century. This regulatory tightening has set off a major reform in how platforms approach the European market. The main question here is how effective this reform is at protecting the European online population from exposure to harmful and/or illegal content.
Figure 1: (Jones, 2023)
So, what is an online platform? And why does the EC want more control over the content distributed on these large platforms?
Gillespie (2018) views a platform as a “natural gathering of the wisdom of the crowd”. Platforms are built on this participatory nature and profit from it. However, the issue of harmful content arises because many platform designers initially overlooked the problem, blinded by optimism bias (Gillespie, 2018). This naivete created fertile ground for harmful content to spread on those platforms, calling for moderation to ensure that society can consume content safely.
In the same chapter, Gillespie also argues that the responsibility for content moderation lies on the shoulders of the platforms themselves. Platforms are uniquely positioned to set the rules, scale their services, and craft appropriate regulations that constitute a safe online space for public mediation (Gillespie, 2018).
However, platforms have not used this power to the public’s advantage or to protect the well-being of the public’s content consumption. Platform algorithms are deliberately designed to promote content that users find alluring, driving engagement and, in turn, revenue. By the time harmful content attracts the attention of a platform’s community content moderators, which usually takes a few days, it has circulated far beyond the reach of the original publisher, highlighting the limits of content moderation organised by the platforms themselves (Goldstein et al., 2023).
An example of how platforms foster toxic content is Reddit. The platform is designed as an open online space where people can anonymously create communities of interest and start or contribute to any discussion within them. Highly upvoted posts are boosted to more prominent positions, attracting more attention from viewers. However, this anonymity and the platform’s design establish solid ground for toxic technocultures to flourish and promote inequalities, despite the platform’s public disapproval and its efforts to curb the growth of such technocultures (Massanari, 2016).
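To make this boosting mechanic concrete, here is a minimal Python sketch modelled on the “hot” ranking formula from Reddit’s formerly open-sourced codebase; the function name and simplifications are mine, and Reddit’s production ranking has since changed, so treat this as an illustration rather than the platform’s actual code.

```python
from math import log10

def hot_score(ups: int, downs: int, posted_ts: float) -> float:
    """Simplified 'hot' ranking, modelled on Reddit's old open-source formula.

    Net votes contribute logarithmically (the first 10 votes count as much
    as the next 90), while newer posts receive a steadily growing time bonus.
    """
    score = ups - downs
    order = log10(max(abs(score), 1))              # diminishing returns on votes
    sign = 1 if score > 0 else -1 if score < 0 else 0
    return round(sign * order + posted_ts / 45000, 7)  # ~12.5h of recency ≈ 10x the votes

# A day-old post with 1,000 net upvotes still outranks a brand-new post with
# 10, illustrating how early viral traction keeps content visible long after
# it is published.
old_viral = hot_score(1010, 10, posted_ts=0)
fresh_small = hot_score(10, 0, posted_ts=24 * 3600)
print(old_viral > fresh_small)  # True
```

The point of the sketch is that visibility compounds: once a post accumulates votes, only the passage of time (not moderator review) erodes its rank, which is why harmful content can saturate feeds before moderation catches up.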
The theory and example above highlight how deeply flawed platform-operated content moderation is, owing to how the platforms were built. The internet therefore needs an improved form of content moderation to protect the public from harmful content, especially younger people and minorities.
Enter the EC in the content moderation game. The DSA was created to affirm European digital sovereignty, among other goals (Turillazzi et al., 2023). On its website, the package is described as protecting European users, including digital service providers, business users, and citizens of EU member states, from systemic risks. Notably, the DSA is also meant to reduce online exposure to “illegal content” and “societal harms” (“The Digital Services Act”, n.d.). Concretely, this means that “very large online platforms” (“The Digital Services Act”, n.d.), platforms reaching more than 10% of European internet consumers, are obliged to adjust how they address harmful content published on their services. TikTok, Facebook, and Instagram are among the platforms on this list (Roth, 2023).
Within the discussion of the DSA’s impact, its approach to harmful content is what differentiates the Act from internet governance in other jurisdictions. In contrast to the usual reliance on self-regulation where hate speech is concerned, the DSA pursues a “delete first, think later” (Turillazzi et al., 2023) approach to minimise the amount of harmful content that remains published. However, this approach risks undermining freedom of expression on digital platforms (Turillazzi et al., 2023).
How have the affected platforms reacted to the reform?
The DSA calls for the cooperation of large tech corporations, including Meta, Google, and ByteDance (TikTok’s parent company), to put the Act into effect. The companies have concurrently released press releases addressing their new approaches and policies in response to the Act.
TikTok has improved its reporting feature to make flagging “illegal content” easier (“An update on fulfilling our commitments”, 2023), given users more control over their For You page preferences, and changed its advertising policies. Meta’s President of Global Affairs, Nick Clegg, has publicly confirmed Meta’s support for the DSA’s mission, announcing updates similar to TikTok’s (Clegg, 2023).
Beyond steps similar to those taken by Meta and TikTok, Google has gone further, expanding data access for researchers and its transparency provisions, including around ads policies and the public’s access to information on Google, to comply with the DSA’s requirements. Additionally, Google has enhanced its risk analysis, reporting potential issues to European regulators and independent auditors before publishing a public summary (Richardson, 2023).
These public statements show the strong initiative the EC is taking in holding online platforms accountable for ensuring public safety online. The DSA’s model of content moderation sounds promising on paper, yet it also comes with a certain set of drawbacks.
Is the DSA problematic in any way?
While the DSA is intended to benefit society, it is hard for any act to be completely flawless when it comes to online content moderation. A popular debate around the DSA’s effects concerns hate speech, where there is a fine line between self-expression and radicalisation to a harmful and/or illegal level. If escalated far enough, hate speech disguised as self-expression can incite violence and other illegal actions (Farrand, 2023). Hence, the “delete first, think later” (Turillazzi et al., 2023) approach can be useful for preventing potential illegal activities sparked by posts framed as self-expression.
However, the Act has been called “problematic” (Turillazzi et al., 2023) for this tendency to over-delete user-generated content. The definition of hate speech is inherently dependent on social circumstances and cultural background (Farrand, 2023). By following the “delete first, think later” approach, the DSA unavoidably encroaches on freedom of speech, a basic human right, for the European citizens affected by the Act. The reality of content moderation is therefore far more complex than mere community protection, given concerns over the suppression of freedom of expression.
Figure 2: (Irungu, 2023)
Conclusion
Because the Act has only recently come into force, we will need to wait a few more years to see how its benefits unfold and how different stakeholders in the digital world respond to its consequences. Only then will we have enough data to conclude whether the Act is more effective than content moderation operated by the platforms themselves, and what the future of content moderation in the digital world looks like.
Bibliography
Clegg, N. (2023, August 22). New features and additional transparency measures as the Digital Services Act comes into effect. Meta. https://about.fb.com/news/2023/08/new-features-and-additional-transparency-measures-as-the-digital-services-act-comes-into-effect/
The Digital Services Act package. (n.d.). European Commission. https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package
The Digital Services Act: Ensuring a safe and accountable online environment. (n.d.). European Commission. https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act-ensuring-safe-and-accountable-online-environment_en
Farrand, B. (2023). ‘Is this a hate speech?’ The difficulty in combating radicalisation in coded communications on platforms. European Journal on Criminal Policy and Research, 29(3), 477–493. https://doi.org/10.1007/s10610-023-09543-z
Gillespie, T. (2018). All platforms moderate. In Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media (pp. 1–23). Yale University Press.
Goldstein, I., Elderson, L., Nguyen, M.-K., Goga, O., McCoy, D., & Lauinger, T. (2023). Understanding the (in)effectiveness of content moderation: A case study of Facebook in the context of the U.S. Capitol riot. arXiv. https://doi.org/10.48550/arXiv.2301.02737
Irungu, J. (2023, August 23). The dark side of content moderation: Unveiling the hidden toll on moderators. Medium. https://medium.com/@jirungu231/the-dark-side-of-content-moderation-unveiling-the-hidden-toll-on-moderators-add4c67756f6
Jones, R. (2023, January 26). Meta has a TikTok problem, and it’s hiring BuzzFeed to help. Observer. https://observer.com/2023/01/meta-has-a-tiktok-problem-and-its-hiring-buzzfeed-to-help/
Massanari, A. (2016). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346. https://doi.org/10.1177/1461444815608807
Richardson, L. (2023, August 24). Complying with the Digital Services Act. Google. https://blog.google/around-the-globe/google-europe/complying-with-the-digital-services-act/
Roth, E. (2023, August 25). The EU’s Digital Services Act goes into effect today: Here’s what that means. The Verge. https://www.theverge.com/23845672/eu-digital-services-act-explained
TikTok. (2023, August 16). An update on fulfilling our commitments under the Digital Services Act. Newsroom. https://newsroom.tiktok.com/en-eu/fulfilling-commitments-dsa-update
Turillazzi, A., Taddeo, M., Floridi, L., & Casolari, F. (2023). The Digital Services Act: An analysis of its ethical, legal, and social implications. Law, Innovation and Technology, 15(1), 83–106. https://doi.org/10.1080/17579961.2023.2184136