The Digital Services Act’s ambition to clean up Big Tech will only be as strong as its enforcement.
With the GDPR, Europe rewrote the rules of online privacy. Now, it’s turning its attention to making the internet safer.
After setting far-reaching benchmarks for privacy standards, the EU is ready to usher in rules that could reshape the web, forcing Big Tech companies like Facebook, YouTube and Instagram to curb the spread of toxic content on their platforms and increase their operations’ transparency.
As soon as Friday, 19 of the world’s largest social media companies, e-commerce platforms and search engines will be required to comply with the Digital Services Act (DSA) — or else face sweeping fines of up to 6 percent of their global annual revenue.
Daunting though the new rulebook may be, it will only be as powerful as its enforcement on Big Tech firms, which is largely in the hands of the European Commission. In the opposite corner stand the tech giants themselves and the weight of expectations — and impatience — across Europe.
“Europe is now effectively the first jurisdiction in the world where online platforms no longer benefit from a ‘free pass’ and set their own rules,” said Internal Market Commissioner Thierry Breton, who spearheads the Commission directorate that will enforce the DSA.
“Technology has been ‘stress testing’ our society. It was time to turn the tables and ensure that no online platform behaves as if it was ‘too big to care.’”
Reshaping users’ online experience
Some of the requirements companies face include swiftly removing illegal content; stopping the use of people’s sensitive data, like their health information and sexual orientation, to show them personalized ads; and revealing previously secret information about how they operate. Companies will have to tell users if they remove their content, limit its visibility, or stop its monetization — and explain why.
Social media networks like Instagram and Facebook have already announced that European users will be able to tailor their feeds to see posts shared by accounts they follow, or in chronological order. TikTok said its users could choose to be shown videos based on their location, or on worldwide popularity, instead of based on the company’s own algorithm. Other companies like Snapchat said they were making it impossible for advertisers to use teenagers’ data to show them personalized ads.
Companies will also have to identify, and implement concrete measures to counter, major long-term risks that their platforms pose for society, such as disinformation and negative effects on mental health, under the scrutiny of the Commission, auditors and vetted researchers.
“My expectation is that throughout the DSA enforcement saga, we will see a change in the business structures of platforms,” said Renate Nikolay, the deputy director of the directorate supervising the law, in a late-June public discussion with activists, consumer representatives, lawyers, tech executives and academics.
The Commission plays cop
It’s one thing to come up with an ambitious rulebook. It’s another to successfully enforce it.
The content-moderation law has serious potential to bite. The law provides for stronger fines than its GDPR sister rulebook — 6 percent of companies’ annual revenue, compared with 4 percent. Led by the team that wrote the law — and knows it inside and out — the Commission will have broad enforcement powers, similar to antitrust investigators’, to oversee and ensure the compliance of the biggest tech firms. It will also receive extra yearly funding — an estimated €45 million for 2024 — funded through an annual levy from the Big Tech firms themselves.
The teams in Brussels will be backed by dozens of artificial intelligence and computer scientists at the Commission’s European Centre for Algorithmic Transparency (ECAT). And the Commission will also cooperate with national EU digital regulators, including in Ireland, where most of the affected tech firms have their EU headquarters.
“My services and I will thoroughly enforce the DSA, and fully use our new powers to investigate and sanction platforms where warranted,” Breton said in comments shared with journalists.
Priorities will include checking whether the designated companies are doing enough to protect children online and to fight disinformation campaigns, especially ahead of crucial national elections in Slovakia and Poland next year, as well as those for the European Parliament in June 2024, Breton added.
After years marred by tech scandals and criticism that the world’s biggest companies lack proper accountability, the Commission will face widespread demands to show its EU law has teeth.
“There’s a lot of pressure on the Commission, and the Commission has acknowledged that publicly, that it needs to deliver and deliver early and fast,” said Julian Jaursch, digital policy expert at the Berlin-based think tank Stiftung Neue Verantwortung.
But the list of obstacles should not be underestimated. For one, observers fear that the Commission’s enforcement teams could lack the expertise, staff and cash to confront Big Tech firms. The Commission plans to have 123 full-time staff enforcing the DSA in 2024 and estimates it will need roughly 30 more. The algorithmic transparency center has a staff of 30. For reference, the British regulator estimates it will need 350 people to oversee 30 to 40 tech companies under its own content law, the Online Safety Bill.
Already, Amazon and European fashion company Zalando have challenged the Commission in court, arguing they aren’t very large online platforms — and shouldn’t face the ensuing extra obligations. Facebook and Twitter have previously fought activists and gone to court to avoid opening up about how they operate.
“We shouldn’t be fooling ourselves. Companies will come with their armies of lawyers to find all of the smallest procedural flaws to crush cases just like they do in competition or data protection cases,” said Suzanne Vergnolle, a professor of technology law at the Conservatoire national des arts et métiers (the Cnam Institute).
Still under construction
Some pieces of the DSA’s enforcement puzzle are not yet fully in place, which could arguably make the Commission’s work harder in the first months. EU countries still have until February 2024 to designate their national watchdogs, which will be in charge of parts of the law, like vetting researchers who will be able to access platforms’ data. The network of national regulators, the Board of Digital Services Coordinators, will also approve more detailed standards for platforms when it comes to fighting disinformation under the DSA.
“They’re kind of in a limbo,” said Martin Husovec, professor of law at the London School of Economics and Political Science. “They don’t have the full DSA operational because of these missing institutions.”
While the DSA empowers users to challenge potential suspensions or the takedown of their content, the full process hasn’t yet taken shape. Google has warned that EU out-of-court settlement bodies have yet to be finalized.
Some of the more detailed rules and processes laying out how large companies need to assess and limit major societal risks have yet to be decided. And the Commission is still hashing out details on the auditing of companies’ assessment and mitigation reports.
“Auditing is really going to be key and right now, it looks like only the Big Four can do it,” said Sally Broughton Micova, academic co-director at think tank CERRE and professor at the University of East Anglia, referring to the tetrad of auditing behemoths — PwC, Deloitte, Ernst & Young, and KPMG.
Commission officials have said in briefings with journalists in recent months that initial cases under the DSA would likely focus on low-hanging fruit, like explicit legal obligations requiring platforms to release specific information by a deadline. Tackling major risks for society, like disinformation or negative mental health effects, would take longer, they acknowledged.
“A lot of our work will be deciding: Is this enough? Will this stand in court?” said a European Commission official who was granted anonymity as a civil servant in a briefing. “There are areas such as the risks for disinformation, civic discourse or mental health of kids where the evidentiary thresholds aren’t immediately obvious.”
Observers from activists to experts said the DSA had strong potential to make the internet safer for users, but that it was important for the Commission to tread carefully.
“The risk would be to issue a penalty very fast that wouldn’t be upheld by the Court of Justice because it doesn’t respect procedural rules and it would be even worse than taking some time because it would raise legitimacy issues,” said Vergnolle.
Beyond facing off against Big Tech and building its cases, Brussels is already reckoning with some unruly capitals. Paris wants to beef up its own version of the European law, even though member countries aren’t supposed to go solo and doing so could disrupt the DSA’s enforcement. Asked about the draft French law, Breton warned that the Commission would not hesitate to go after countries adding extra obligations.
“It will require time, experience and maybe potentially some failures and errors before they will get it right,” said Eliška Pírková, Access Now’s global freedom of expression lead. “And this, of course, will be a serious challenge for the Commission as a regulator.”
Source: Politico