
The future of EU social media regulation is in Ireland
With the Digital Services Act (DSA) in place, the country has become the EU’s new social media regulator.
Ireland’s attractive corporate tax rates and its educated, English-speaking population have made it Europe’s tech hub: 13 of the 22 very large platforms regulated by the DSA have their European headquarters in Ireland. Under the DSA, which came into force in November 2022, the member state that hosts a platform’s European headquarters is responsible for monitoring its compliance. This effectively means that Ireland is now responsible for overseeing the large majority of what is posted on social media in the EU.
“If a French person files a complaint about a TikTok account promoting hate speech with the French watchdog, the latter then has to notify its Irish counterpart,” explains Seamus Allen, researcher at the Institute for International and European Affairs in Dublin. “The Irish regulator will then decide whether it should be taken down in regards to the plaintiff country’s national law on hate speech.”
The role Ireland has to play is controversial. Its Data Protection Commission (DPC) has often been accused of not properly enforcing GDPR, with some tying this back to Ireland’s economic dependence on Big Tech (https://www.europeancorrespondent.com/story?s=stuck-between-two-chairs). The blame has also been put on the European Commission: in 2021, the Irish Council for Civil Liberties filed a complaint against the Commission for failing to monitor and hold the DPC accountable for GDPR enforcement.
Who is the watchdog?
The regulator in Ireland is Coimisiún na Meán (CNAM), the country’s media commission, created in March 2023. Currently made up of 90 people, it aims to grow to more than 200 by the end of 2024, thanks to the €7.5 million budget set aside for its establishment. It is also in the process of drafting an Online Safety Code, which the 10 video-sharing platforms it has identified will have to follow alongside the DSA.
Not yet in place, the Code could significantly change social media as we know it. One measure in particular has received widespread support, including from several MEPs: platforms could be required to turn off by default the recommendation algorithms that personalise the content users see. Infringement of the Code could lead to a fine of up to €20 million or 10% of profits.
This sounds like smooth sailing. Are there any difficulties?
In practice, whether the CNAM is up to the challenge will only become clear over the next few years. As member states set up their own watchdogs, requests to take down illegal content will start flooding in. Then, as Allen puts it, “Ireland will have to deal with the most contentious and polarising points of the DSA”.
With controversies surrounding potentially anti-democratic laws arising in several countries, Ireland might end up stuck between two chairs. Although the DSA provides that any moderation should be done according to Union law, some points remain hazy. For example, neither hate speech nor illegal content is defined by the Act. As a result, the CNAM may quickly be faced with requests to take down content from a country that is abusing its hate speech law to clamp down on online opposition.
This haziness could also adversely affect the EU’s digital landscape: although the DSA was meant to unify regulation, the reliance on national law could fragment social media content, with posts banned in one country remaining visible in a neighbouring one. For now, the CNAM is still finding its feet – and we will soon find out whether it will lose its balance.