Regulating Social Media in the EU

Estelle Sutherland | Europe and Eurasia Fellow

Image Credit: Soumil Kumar

With the rapid spread of social media and Big Tech, the European Union (EU) has attempted to fill regulatory gaps with legislation governing digital services. Yet it is unclear whether this legislation will effectively address the discrimination rampant across social media platforms.

Finnish Prime Minister Sanna Marin recently faced widespread criticism in the wake of a publicly shared video of her dancing with friends at a private party. Later that week, Marin also apologised for a controversial photo taken at her official residence. Citing a week of intense work pressure, Marin defended her actions, describing her “longing for joy, light and fun”. Yet these social media posts sparked backlash among some, with speculation that Marin was not fit for office and was neglecting her duties.

Commentators have been quick to compare this criticism to the reaction to social media posts in August 2022 depicting Australian Prime Minister Anthony Albanese skolling beer at a music concert. The reception to this video was largely positive. Notably, no questions were raised regarding Albanese’s ability to perform his professional duties. The stark contrast between the reactions to these two leaders’ personal lives can likely be attributed to differences in gender and age. The amplification of these misogynistic and ageist attitudes across social media is of particular concern. Dr Sonya Palmieri, gender policy fellow with the College of Asia and the Pacific at the Australian National University, notes the particularly “vicious and more personalised” criticism towards Prime Minister Marin encouraged by social media platforms.

It is clear that Big Tech is inextricably linked to modern politics. So, is the new age of social media a benefit or a hindrance to global politics? Social media has its benefits: it engenders a more informal relationship between elected leaders and the populace, and it allows for instantaneous sharing of information and greater transparency in political decision making. Yet the case of Marin demonstrates that the proliferation of social media can amplify vocal discrimination, such as sexism and ageism. This discrimination works to suppress the voices of marginalised groups and undermine their legitimate political interests. The rise of misinformation on social media is also of concern: recent years have seen an upsurge in election misinformation as well as widespread false reporting in the context of Covid-19.

Thus far, multilateral law-making has been slow to address the rapid evolution of Big Tech. In an effort to combat these concerns, the European Parliament adopted the Digital Services Act package of legislation on 5 July 2022 in what has been described as a ‘transformative’ move. This package is comprised of two pieces of legislation: the Digital Services Act (DSA) and the Digital Markets Act (DMA). The European Commission describes the twofold aims of this legislation: firstly, to create a “safer digital space” which protects the fundamental rights of all users and secondly, to create a fair playing field in order to encourage “innovation, growth, and competitiveness” in the European market and globally.

The DSA, yet to be adopted by the Council of the European Union, contains horizontal rules and obligations regulating online intermediaries and services operating inside the EU. The DSA imposes specific rules on very large online platforms (VLOPs), classed as platforms reaching over 10 per cent of the 450 million digital consumers in the EU. Platforms which fall within this category include Meta (Facebook and Instagram), Google, and YouTube, among others. Additional requirements for VLOPs include risk management obligations, the ability for users to opt out of recommendations based on profiling, and data sharing with authorities and researchers. Essentially, through the DSA, the EU aims to increase public oversight of online platforms and protect users’ human rights.

The DSA certainly has its benefits. As the first set of common rules regulating online intermediaries in the EU, the DSA provides consistency and predictability across the region. Moreover, the positive duties imposed on VLOPs promote greater accountability. Under the DSA, VLOPs will be required to undergo annual risk assessments covering various areas including impacts on mental and physical health, privacy, and the right to non-discrimination. The DSA further obliges VLOPs to assess the impact of factors such as advertising and recommender systems on these risks. Once identified, VLOPs must demonstrate that they are taking effective measures to mitigate these risks.

Yet this legislation is far from perfect. The penalty for non-compliance (fines of up to 6 per cent of a VLOP’s annual global turnover) is arguably lenient, with companies potentially preferring to absorb fines as a cost of doing business rather than shoulder the financial burden of compliance. Others have flagged the ability of social media users to avoid regulation by migrating to alternative underground platforms.

It is also unclear whether the DSA’s anti-discrimination measures will be effective in cases such as Prime Minister Marin’s. Precise content moderation is extremely challenging to achieve. Content moderation by human workers is costly for businesses and damaging to workers’ health. Further, AI moderation, while somewhat effective, is unable to make accurate value judgments in areas such as anti-discrimination. Moreover, studies suggest that AI moderation is inherently biased due to its inability to recognise social context and its frequent mischaracterisation of language employed by minority groups. This form of moderation may therefore amplify, rather than suppress, the very discrimination the DSA aims to mitigate. As such, while the DSA provides rules regulating social media hazards, it is unclear how VLOPs will actually address growing social issues such as online discrimination and misinformation.

While it is too soon to gauge the effectiveness of the DSA in practice, it certainly holds potential to provide a consistent and comprehensive framework for online services across the EU. As Big Tech gets bigger, it is imperative that legal mechanisms are in place to ensure that human rights are protected. While the DSA does impose positive anti-discrimination obligations, it is less clear how social media platforms will actually meet the promises of the DSA, particularly in the context of relatively soft penalties for non-compliance.

Estelle Sutherland is the Europe and Eurasia Fellow for Young Australians in International Affairs.