Officials hailed a victory against Big Tech after the European Union passed landmark rules to govern online content late last week. Now, eyes are on the EU’s executive and how it will enforce those standards against Google, Facebook and Twitter.
In a first-of-its-kind move, the European Parliament and EU countries made a sweeping change to the Digital Services Act — the bloc’s new online content rulebook — entrusting the European Commission in Brussels with the supervision of the largest digital firms.
“For us, it’s only the beginning obviously because they will be watching us to see if we are doing a proper job in getting the enforcement right,” EU Commission Executive Vice President Margrethe Vestager told reporters after the rules were agreed.
The law comes with great powers. For the first time outside the realm of antitrust and competition, the Commission will be able to impose fines of up to 6 percent of a company’s global revenue on the likes of Facebook, Google, Twitter and Amazon if they fail to curb disinformation and cyber-violence against women on their platforms, or to limit the risks their services pose to people’s fundamental rights, media diversity and users’ mental health.
For Brussels, it’s a litmus test of whether it can take on and change the tech sector’s toxic practices.
After whistleblower Frances Haugen revealed alleged harmful practices of Facebook and Instagram, which are used by over 2 billion people globally, there is a growing sense of urgency among politicians and activists in the U.S. and Europe when it comes to holding large digital companies accountable. Haugen’s documents showed a culture inside Meta, Facebook’s parent company, in which repeated warnings about how harmful content spread virally went unnoticed or were ignored. The company denies any wrongdoing.
Instagram and Facebook have also faced scrutiny for reportedly harming teenage girls’ mental health and pushing users down rabbit holes of increasingly extreme content, problems the company said it has taken steps to reduce. And on Tuesday, billionaire and self-described “free-speech absolutist” Elon Musk became the owner of Twitter, prompting worries he would weaken content moderation and let abusive speech proliferate.
Getting off the ground
Granting the Commission new powers to enforce the new content rules is a direct response to the flailing enforcement of Europe’s earlier behemoth technology law, the General Data Protection Regulation, which protects Europeans’ privacy rights.
Governments and lawmakers have expressed growing frustration over how the GDPR has failed to curb potential abuse. In large part, that’s due to the fact that the data protection rules had granted national capitals — particularly Dublin, where many of Silicon Valley’s biggest names have European headquarters, mostly for tax reasons — sole power to enforce the rulebook.
Many across the bloc have claimed Ireland and, to a lesser extent, Luxembourg failed to aggressively police companies including Google and Facebook. Both countries’ watchdogs have denied they’ve failed to enforce the rules, with Ireland’s Data Protection Commissioner citing multiple fines against Big Tech firms.
With the Digital Services Act likely to take effect sometime next year, the Commission has only a few months to hire around 150 legal, data and algorithms experts and build up a strong-enough team to rival Big Tech firms’ armies of lawyers.
To cover some of the costs of enforcement, Brussels hopes to pocket €30 million annually from a supervisory fee of up to 0.05 percent of companies’ global revenues. Part of this will be spent on hiring outside experts to do the heavy lifting on enforcement. It’s a tiny budget compared with the cash Big Tech can throw at regulatory problems.
Controlling Big Tech’s systemic risks
In the coming months, the Commission will have to ensure that about 30 of the world’s largest platforms and search engines are properly assessing and mitigating the risk that their design choices, algorithms and services pose to people’s fundamental rights, human dignity, data protection, freedom of expression and media pluralism, the protection of kids and consumer protection.
Along with enforcing these new standards, the EU’s executive body will have to make sure platforms apply an upcoming voluntary charter on disinformation with detailed objectives to address and disclose coordinated manipulation campaigns with bots or fake accounts and set up features to highlight trustworthy content. In the case of terrorist attacks, natural disasters, war or pandemics, the Commission will also be in charge of requesting that Big Tech quickly adapt its services to deal with potential disinformation.
“The Commission will be ready; we organized ourselves to have a specialist team,” Internal Market Commissioner Thierry Breton told journalists.
The French politician said 150 people from both inside and outside the Commission would be needed, a “significant number” of high-skilled specialists to monitor whether large companies violate the EU’s content rules. In comparison, the U.K.’s separate online content rulebook, known as the Online Safety Bill, has led that country’s media regulator to start recruiting up to 500 people to enforce those separate standards.
Three Commission officials, who all spoke on the condition of anonymity, acknowledged their hands were tied: the Commission’s finances had already been set, and Brussels’ new enforcement role had not been envisaged when the initial EU-wide budget was negotiated.
“We have to work with the budget we have, not the budget we want,” said one of the officials. “I would expect more money will be asked for when the next budget comes around” — in 2027.
Commission’s hurdles ahead
Breton and Vestager — the commissioners whose teams will be entrusted with making the new content rules a reality — already face a tough deadline to hire staff and ensure the new rulebook has teeth. EU negotiators gave technology firms only four months to comply from the moment Europe’s content rules become law, meaning the Commission will have to hire quickly and its staff will learn how to enforce the rules on the job.
The Commission officials expressed concerns about the new responsibility.
Two of the officials suggested that much depended on the caliber of external consultants who would have to delve into the inner workings of social media and e-commerce companies to determine whether they were abiding by the laws. The other wondered how civil society groups, many of which have been eager to help the Commission with its new role, would be able to respond, given their lack of enforcement experience and often outspoken antagonism toward the firms within the new regime’s scope.
To make it work, Brussels would build on the team that will enforce the bloc’s separate digital competition law for Big Tech, the Digital Markets Act, and tap into the expertise of national electronic communications and media regulators.
“Any fine is likely to get appealed,” said one of the officials. “The enforcement has to be done by the book.”
April 26, 2022
Clothilde Goujard, Mark Scott