The European Parliament passed the Digital Services Act (DSA) last week, the most significant attempt so far to regulate the internet. But could it have consequences for public media?

“A new global golden standard for tech-regulation that will inspire other countries and regions,” is how the Digital Services Act was described by MEP Christel Schaldemose. The European Parliament approved its position on the DSA last week, but the text is still open to change, as the Parliament will now negotiate with the European Commission and Council.

The DSA is a broad piece of legislation with the purpose of improving “the mechanisms for the removal of illegal content and for the effective protection of users’ fundamental rights online, including the freedom of speech.” Most affected are very large online platforms (VLOPs) – such as Google and Facebook – which will become more responsible for the content placed on their platforms. The European Commission described the DSA as “a common set of rules on intermediaries’ obligations and accountability … [which] will open up new opportunities to provide digital services across borders, while ensuring a high level of protection to all users, no matter where they live in the EU.”

Enforcing transparency is another key component of the legislation. VLOPs will have to share data with authorities and be more transparent about how their recommendation algorithms – the systems that determine which products or news reports users are shown – actually work. In a recent BBC investigation, reporters set up a fake anti-women troll account, which was then recommended increasingly extreme anti-women content. The legislation is designed to shed light on how these algorithms work and, in turn, to prevent users from falling into such rabbit holes of dangerous and abusive accounts.

But there has been debate over the extent to which the law could affect media organisations. Under the DSA, the onus would be on the online platforms to ensure mis- and disinformation does not circulate, and they would have the legal grounds to remove any content deemed as such. This has raised fears that media freedom could be harmed by handing the platforms too much authority over what counts as mis- and disinformation. Platforms would, for instance, be able to de-platform media organisations found to have broken their terms and conditions.

Media organisations have strongly opposed the position of power it gives to VLOPs. The President of the EBU and head of France Télévisions, Delphine Ernotte Cunci, told Politico “media companies should not be treated like regular internet users on social media.”

An amendment introduced in November 2021 sought to address this issue by providing a so-called media exemption. It was initially rejected, but during last week’s vote other amendments designed to afford greater protection to media organisations were tabled. According to Euractiv, the Parliament rejected an amendment that would have allowed media organisations to contest content moderation decisions, but approved one requiring online platforms to consider the Charter of Fundamental Rights, including media freedom, when drafting their terms and conditions.

The arguments for and against a media exemption:  

For the media exemption:

The DSA “does not take into account the power that platforms wield over lawful content disseminated over their networks which is under the editorial control and legal liability of press publishers (or broadcasters)”
– Angela Mills Wade, executive director of the European Publishers Council

Against the media exemption:

“Inserting a media exemption in the DSA is equivalent to cancelling much of the progress against fighting disinformation in the past years.”
– Alexandre Alaphilippe, executive director at EU DisinfoLab


Where does this leave the media?

“The European news media sector regrets that the European Parliament have put more power with platforms at the expense of media freedom,” was the verdict of News Media Europe after the vote. Its Executive Director, Wout van Wijk, commented: “We call on the co-legislators to achieve the right level of regulatory ambition in the coming months.”

Meanwhile, the Head of the EBU’s Brussels office, Wouter Gekiere, welcomed the amendment which will now ensure media freedom is considered by platforms when it comes to content moderation. But he added, “Media organisations regularly experience abuses by online platforms. They suspend business accounts, remove editorial content or set other limitations to the content our Members put online. … When platforms add additional layers of control over editorial content, media operators should be able to challenge these decisions.”

The inability of media organisations to contest decisions made by online platforms is a concerning facet of this legislation. The codes of conduct, which must take media freedom into account, will be crucial to ensuring unjust decisions are not made against media organisations. Additionally, content produced by public media and other public interest media organisations already has to meet those organisations’ own high editorial standards, and should therefore, by its nature, not break the rules set by the VLOPs. Undoubtedly, this legislation enshrines VLOPs as the gatekeepers of news content, with the ultimate authority over what is allowed and what is not. Accountability is therefore imperative, and the law will hopefully make the decisions and choices VLOPs take more transparent.

But the other core aim of this legislation is to clean the web of sources of mis- and disinformation and replace them with trustworthy, informed reporting. In that environment, public media is well placed to provide such credible journalism.


Header image: Row of EU Flags in front of the European Union Commission building in Brussels. Credit: VanderWolf Images / Shutterstock.com