Meta faces EU probe over Russian disinformation

People walk behind a Meta Platforms logo during a conference in India in 2023 (Image source: Reuters)

The European Commission has opened formal investigation proceedings into Meta over its handling of political content, including a suspected Russian influence campaign.

With elections looming in the EU and elsewhere, officials said they would assess whether the company's approach to moderating disinformation on Facebook and Instagram breached EU law.

Among the Commission's concerns is Meta's oversight of its advertising tools, and whether they had been exploited by "malicious actors."

The probe will also examine whether Meta is being transparent enough over its moderation of political content and accounts.

"We have a well-established process for identifying and mitigating risks on our platforms," Meta said in a statement.

"We look forward to continuing our cooperation with the European Commission and providing them with further details of this work."

The company is one of several tech firms designated "very large online platforms" (VLOPs) under the bloc's Digital Services Act (DSA).

VLOPs face fines of up to 6% of their annual turnover if they do not meet tougher content moderation requirements.

This includes taking action to prevent manipulation of elections and disinformation.

The Commission says it suspects Meta's current methods for moderating disinformation and political adverts do not comply with DSA obligations.

It is concerned about the impact this may have on the upcoming electoral cycle, with the European Parliament elections taking place in June.

"This Commission has created means to protect European citizens from targeted disinformation and manipulation by third countries," said Commission President Ursula von der Leyen.

"If we suspect a violation of the rules, we act. This is true at all times, but especially in times of democratic elections."

The four concerns at the heart of the EU Commission's investigation are:

  • Ineffective oversight and moderation of adverts

  • Lack of transparency over demotion of political content and accounts

  • Journalists and civil society researchers having no easy access to real-time data or tools to monitor political content during elections

  • Lack of clear and easy ways for users to report illegal content

A European Commission official said the Commission believed Meta's current approach to moderating advertisements did not meet DSA requirements.

It cited findings by the non-profit research organisation AI Forensics that a Russian influence campaign had been running adverts across the firm's platforms.

AI Forensics said it uncovered a network of 3,826 pages spreading "pro-Russian propaganda" and the campaign had reached 38 million users between August 2023 and March 2024.

It said fewer than 20% of the adverts had been moderated by Meta as political.

Meta says it has been taking action against the "Doppelganger" campaign since first exposing it in 2022, and that the campaign now sees less user engagement as a result.

The Commission has given the firm five days to respond to a request for information about tools for journalists and researchers to monitor content on Facebook and Instagram during upcoming elections.

It said it was concerned by Meta's approach to CrowdTangle, a public tool providing data and insights into content engagement on Facebook and Instagram.

The firm announced in March that the tool will no longer be available from 14 August, but says it is building new tools to provide wider access to platform data.

The EU Commission's investigation follows its launch of a similar probe into disinformation on X (formerly Twitter) in March.
